WorldWideScience

Sample records for source code display

  1. Visual search asymmetries within color-coded and intensity-coded displays.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2010-06-01

    Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information. The design of symbology to produce search asymmetries (Treisman & Souther, 1985) offers a potential technique for doing this, but it is not obvious from existing models of search that an asymmetry observed in the absence of extraneous visual stimuli will persist within a complex color- or intensity-coded display. To address this issue, in the current study we measured the strength of a visual search asymmetry within displays containing color- or intensity-coded extraneous items. The asymmetry persisted strongly in the presence of extraneous items that were drawn in a different color (Experiment 1) or a lower contrast (Experiment 2) than the search-relevant items, with the targets favored by the search asymmetry producing highly efficient search. The asymmetry was attenuated but not eliminated when extraneous items were drawn in a higher contrast than search-relevant items (Experiment 3). Results imply that the coding of symbology to exploit visual search asymmetries can facilitate visual search for high-priority items even within color- or intensity-coded displays. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  2. Transmission imaging with a coded source

    International Nuclear Information System (INIS)

    Stoner, W.W.; Sage, J.P.; Braun, M.; Wilson, D.T.; Barrett, H.H.

    1976-01-01

    The conventional approach to transmission imaging is to use a rotating anode x-ray tube, which provides the small, brilliant x-ray source needed to cast sharp images of acceptable intensity. Stationary anode sources, although inherently less brilliant, are more compatible with the use of large area anodes, and so they can be made more powerful than rotating anode sources. Spatial modulation of the source distribution provides a way to introduce detailed structure in the transmission images cast by large area sources, and this permits the recovery of high resolution images in spite of the source diameter. The spatial modulation is deliberately chosen to optimize recovery of image structure; the modulation pattern is therefore called a "code." A variety of codes may be used; the essential mathematical property is that the code possess a sharply peaked autocorrelation function, because this property permits the decoding of the raw image cast by the coded source. Random point arrays, non-redundant point arrays, and the Fresnel zone pattern are examples of suitable codes. This paper is restricted to the case of the Fresnel zone pattern code, which has the unique additional property of generating raw images analogous to Fresnel holograms. Because the spatial frequencies of these raw images are extremely coarse compared with actual holograms, a photoreduction step onto a holographic plate is necessary before the decoded image may be displayed with the aid of coherent illumination

  3. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM) codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
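
    To make the systematic LDGM idea above concrete, here is a minimal sketch (my own illustration, not the authors' construction): a random low-density generator matrix produces parity bits that are appended to the information bits. The matrix size, row weight, and seed are arbitrary choices for the example.

      import numpy as np

      rng = np.random.default_rng(0)

      def random_ldgm_matrix(k, m, row_weight=3):
          """Sparse k x m binary matrix with 'row_weight' ones per row (illustrative)."""
          P = np.zeros((k, m), dtype=np.uint8)
          for i in range(k):
              cols = rng.choice(m, size=row_weight, replace=False)
              P[i, cols] = 1
          return P

      def ldgm_encode(u, P):
          """Systematic LDGM encoding: codeword = [u | u @ P mod 2]."""
          parity = (u @ P) % 2
          return np.concatenate([u, parity])

      k, m = 16, 8                                   # info bits, parity bits (toy sizes)
      P = random_ldgm_matrix(k, m)
      u = rng.integers(0, 2, size=k, dtype=np.uint8)
      print(ldgm_encode(u, P))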

  4. Analysis of visual coding variables on CRT generated displays

    International Nuclear Information System (INIS)

    Blackman, H.S.; Gilmore, W.E.

    1985-01-01

    Cathode-ray-tube-generated safety parameter display systems in nuclear power plant control rooms have been found to be more effective when color coding is employed. Research has indicated strong support for graphic coding techniques, particularly in redundant coding schemes. In addition, findings on pictographs, as applied in coding schemes, indicate the need for careful application and for further research in the development of a standardized set of symbols

  5. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  6. Theoretical atomic physics code development III TAPS: A display code for atomic physics data

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Kramer, S.P.

    1988-12-01

    A large amount of theoretical atomic physics data is becoming available through use of the computer codes CATS and ACE developed at Los Alamos National Laboratory. A new code, TAPS, has been written to access this data, perform averages over terms and configurations, and display information in graphical or text form. 7 refs., 13 figs., 1 tab

  7. AUTOET code (a code for automatically constructing event trees and displaying subsystem interdependencies)

    International Nuclear Information System (INIS)

    Wilson, J.R.; Burdick, G.R.

    1977-06-01

    This is a user's manual for AUTOET I and II. AUTOET I is a computer code for automatic event tree construction. It is designed to incorporate and display subsystem interdependencies and common or key component dependencies in the event tree format. The code is written in FORTRAN IV for the CDC Cyber 76 using the Integrated Graphics System (IGS). AUTOET II incorporates consequence and risk calculations, in addition to some other refinements. 5 figures

  8. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.

  9. ColorPhylo: A Color Code to Accurately Display Taxonomic Classifications.

    Science.gov (United States)

    Lespinats, Sylvain; Fertil, Bernard

    2011-01-01

    Color may be very useful to visualise complex data. As far as taxonomy is concerned, color may help observing various species' characteristics in correlation with classification. However, choosing the number of subclasses to display is often a complex task: on the one hand, assigning a limited number of colors to taxa of interest hides the structure embedded in the subtrees of the taxonomy; on the other hand, differentiating a high number of taxa by giving them specific colors, without considering the underlying taxonomy, may lead to unreadable results since relationships between displayed taxa would not be supported by the color code. In the present paper, an automatic color coding scheme is proposed to visualise the levels of taxonomic relationships displayed as overlay on any kind of data plot. To achieve this goal, a dimensionality reduction method allows displaying taxonomic "distances" onto a Euclidean two-dimensional space. The resulting map is projected onto a 2D color space (the Hue, Saturation, Brightness colorimetric space with brightness set to 1). Proximity in the taxonomic classification corresponds to proximity on the map and is therefore materialised by color proximity. As a result, each species is related to a color code showing its position in the taxonomic tree. The so-called ColorPhylo displays taxonomic relationships intuitively and can be combined with any biological result. A Matlab version of ColorPhylo is available at http://sy.lespi.free.fr/ColorPhylo-homepage.html. Meanwhile, an ad-hoc distance in case of taxonomy with unknown edge lengths is proposed.
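
    A minimal sketch of the color-assignment step described above (hypothetical coordinates, not the ColorPhylo code): given 2D map positions from some dimensionality reduction, the angle around the centroid is mapped to hue and the normalized radius to saturation, with brightness fixed at 1, so that nearby taxa receive similar colors.

      import math, colorsys

      def positions_to_colors(points):
          """Map 2D embedding coordinates to (r, g, b) via the HSB/HSV color space.
          Hue <- angle around the centroid, saturation <- normalized radius, brightness = 1."""
          cx = sum(x for x, _ in points) / len(points)
          cy = sum(y for _, y in points) / len(points)
          radii = [math.hypot(x - cx, y - cy) for x, y in points]
          rmax = max(radii) or 1.0
          colors = []
          for (x, y), r in zip(points, radii):
              hue = (math.atan2(y - cy, x - cx) % (2 * math.pi)) / (2 * math.pi)
              sat = r / rmax
              colors.append(colorsys.hsv_to_rgb(hue, sat, 1.0))
          return colors

      # Example: three hypothetical taxa positions from a 2D projection of taxonomic distances
      print(positions_to_colors([(0.0, 1.0), (0.1, 0.9), (-1.0, -0.5)]))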

  10. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
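
    The core operation described above — treating the source block as an error pattern and storing only its syndrome — can be sketched with a small linear code. The (7,4) Hamming parity-check matrix below is purely an illustrative choice; a decompressor would recover the lowest-weight sequence consistent with the stored syndrome.

      import numpy as np

      # Parity-check matrix of the (7,4) Hamming code (illustrative choice of linear code)
      H = np.array([[1, 0, 1, 0, 1, 0, 1],
                    [0, 1, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

      def syndrome_compress(x, H):
          """Compress a binary source block x (length n) to its syndrome s = H x^T (length n-k)."""
          return (H @ x) % 2

      x = np.array([0, 0, 1, 0, 0, 0, 0], dtype=np.uint8)   # sparse (low-entropy) source block
      s = syndrome_compress(x, H)
      print(s)   # 3 bits stored instead of 7; a minimum-weight decoder recovers x from s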

  11. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  12. Demonstration of Vibrational Braille Code Display Using Large Displacement Micro-Electro-Mechanical Systems Actuators

    Science.gov (United States)

    Watanabe, Junpei; Ishikawa, Hiroaki; Arouette, Xavier; Matsumoto, Yasuaki; Miki, Norihisa

    2012-06-01

    In this paper, we present a vibrational Braille code display with large-displacement micro-electro-mechanical systems (MEMS) actuator arrays. Tactile receptors are more sensitive to vibrational stimuli than to static ones. Therefore, when each cell of the Braille code vibrates at optimal frequencies, subjects can recognize the codes more efficiently. We fabricated a vibrational Braille code display that used actuators consisting of piezoelectric actuators and a hydraulic displacement amplification mechanism (HDAM) as cells. The HDAM that encapsulated incompressible liquids in microchambers with two flexible polymer membranes could amplify the displacement of the MEMS actuator. We investigated the voltage required for subjects to recognize Braille codes when each cell, i.e., the large-displacement MEMS actuator, vibrated at various frequencies. Lower voltages were required at vibration frequencies higher than 50 Hz than at vibration frequencies lower than 50 Hz, which verified that the proposed vibrational Braille code display is efficient by successfully exploiting the characteristics of human tactile receptors.

  13. Attention Filtering in the Design of Electronic Map Displays: A Comparison of Color-Coding, Intensity Coding, and Decluttering Techniques

    National Research Council Canada - National Science Library

    Yeh, Michelle; Wickens, Christopher D

    2000-01-01

    In a series of experiments, color-coding, intensity coding, and decluttering were compared in order to assess their potential benefits for accessing information from electronic map displays...

  14. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information.

  15. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  16. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  17. Research on Primary Shielding Calculation Source Generation Codes

    Science.gov (United States)

    Zheng, Zheng; Mei, Qiliang; Li, Hui; Shangguan, Danhua; Zhang, Guangchun

    2017-09-01

    Primary Shielding Calculation (PSC) plays an important role in reactor shielding design and analysis. In order to facilitate PSC, a source generation code is developed to generate cumulative distribution functions (CDFs) for the source particle sample code of the J Monte Carlo Transport (JMCT) code, and a source particle sample code is developed to sample source particle directions, types, coordinates, energies and weights from the CDFs. A source generation code is developed to transform three-dimensional (3D) power distributions in x-y-z geometry to source distributions in r-θ-z geometry for the J Discrete Ordinate Transport (JSNT) code. Validations on the PSC models of the Qinshan No.1 nuclear power plant (NPP), CAP1400 and CAP1700 reactors are performed. Numerical results show that the theoretical model and the codes are both correct.
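
    As a rough illustration of the CDF-based sampling step described above (not the actual JMCT/JSNT interface), the sketch below builds a cumulative distribution from a hypothetical discrete power distribution and draws source regions by inverse-transform sampling.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical relative power in a handful of source regions (arbitrary numbers)
      power = np.array([0.8, 1.2, 1.5, 1.0, 0.5])
      cdf = np.cumsum(power) / power.sum()           # cumulative distribution function

      def sample_region(n):
          """Inverse-transform sampling: draw region indices according to the power distribution."""
          u = rng.random(n)
          return np.searchsorted(cdf, u, side="right")

      print(np.bincount(sample_region(10000), minlength=len(power)) / 10000)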

  18. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side, offering shifting processing steps, conventionally performed at the video encoder side, to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  19. Code Forking, Governance, and Sustainability in Open Source Software

    OpenAIRE

    Juho Lindman; Linus Nyman

    2013-01-01

    The right to fork open source code is at the core of open source licensing. All open source licenses grant the right to fork their code, that is, to start a new development effort using an existing code as its base. Thus, code forking represents the single greatest tool available for guaranteeing sustainability in open source software. In addition to bolstering program sustainability, code forking directly affects the governance of open source initiatives. Forking, and even the mere possibility...

  20. Obstacle Detection Display for Visually Impaired: Coding of Direction, Distance, and Height on a Vibrotactile Waist Band

    Directory of Open Access Journals (Sweden)

    Jan B. F. van Erp

    2017-10-01

    Electronic travel aids (ETAs) can potentially increase the safety and comfort of blind users by detecting and displaying obstacles outside the range of the white cane. In a series of experiments, we aim to balance the amount of information displayed and the comprehensibility of the information, taking into account the risk of information overload. In Experiment 1, we investigate perception of compound signals displayed on a tactile vest while walking. The results confirm that the threat of information overload is clear and present. Tactile coding parameters that are sufficiently discriminable in isolation may not be so in compound signals and while walking and using the white cane. Horizontal tactor location is a strong coding parameter, and temporal pattern is the preferred secondary coding parameter. Vertical location is also possible as a coding parameter but it requires additional tactors and makes the display hardware more complex, expensive, and less user friendly. In Experiment 2, we investigate how we can off-load the tactile modality by migrating part of the information to an auditory display. Off-loading the tactile modality through auditory presentation is possible, but this off-loading is limited and may result in a new threat of auditory overload. In addition, taxing the auditory channel may in turn interfere with other auditory cues from the environment. In Experiment 3, we off-load the tactile sense by reducing the amount of displayed information using several filter rules. The resulting design was evaluated in Experiment 4 with visually impaired users. Although they acknowledge the potential of the display, the added value of the ETA as a whole also depends on its sensor and object recognition capabilities. We recommend using no more than two coding parameters in a tactile compound message and applying filter rules to reduce the number of obstacles to be displayed in an obstacle avoidance ETA.

  1. Code Forking, Governance, and Sustainability in Open Source Software

    Directory of Open Access Journals (Sweden)

    Juho Lindman

    2013-01-01

    The right to fork open source code is at the core of open source licensing. All open source licenses grant the right to fork their code, that is, to start a new development effort using an existing code as its base. Thus, code forking represents the single greatest tool available for guaranteeing sustainability in open source software. In addition to bolstering program sustainability, code forking directly affects the governance of open source initiatives. Forking, and even the mere possibility of forking code, affects the governance and sustainability of open source initiatives on three distinct levels: software, community, and ecosystem. On the software level, the right to fork makes planned obsolescence, versioning, vendor lock-in, end-of-support issues, and similar initiatives all but impossible to implement. On the community level, forking impacts both sustainability and governance through the power it grants the community to safeguard against unfavourable actions by corporations or project leaders. On the business-ecosystem level, forking can serve as a catalyst for innovation while simultaneously promoting better quality software through natural selection. Thus, forking helps keep open source initiatives relevant and presents opportunities for the development and commercialization of current and abandoned programs.

  2. On the Combination of Multi-Layer Source Coding and Network Coding for Wireless Networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Fitzek, Frank; Pedersen, Morten Videbæk

    2013-01-01

    quality is developed. A linear coding structure designed to gracefully encapsulate layered source coding provides both low complexity of the utilised linear coding while enabling robust erasure correction in the form of fountain coding capabilities. The proposed linear coding structure advocates efficient...

  3. Image authentication using distributed source coding.

    Science.gov (United States)

    Lin, Yao-Chung; Varodayan, David; Girod, Bernd

    2012-01-01

    We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.

  4. The Astrophysics Source Code Library by the numbers

    Science.gov (United States)

    Allen, Alice; Teuben, Peter; Berriman, G. Bruce; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Ryan, PW; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Wallin, John; Warmels, Rein

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) was founded in 1999 by Robert Nemiroff and John Wallin. ASCL editors seek both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and add entries for the found codes to the library. Software authors can submit their codes to the ASCL as well. This ensures a comprehensive listing covering a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL is indexed by both NASA’s Astrophysics Data System (ADS) and Web of Science, making software used in research more discoverable. This presentation covers the growth in the ASCL’s number of entries, the number of citations to its entries, and in which journals those citations appear. It also discusses what changes have been made to the ASCL recently, and what its plans are for the future.

  5. Data processing with microcode designed with source coding

    Science.gov (United States)

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  6. Present state of the SOURCES computer code

    International Nuclear Information System (INIS)

    Shores, Erik F.

    2002-01-01

    In various stages of development for over two decades, the SOURCES computer code continues to calculate neutron production rates and spectra from four types of problems: homogeneous media, two-region interfaces, three-region interfaces and that of a monoenergetic alpha particle beam incident on a slab of target material. Graduate work at the University of Missouri - Rolla, in addition to user feedback from a tutorial course, provided the impetus for a variety of code improvements. Recently upgraded to version 4B, initial modifications to SOURCES focused on updates to the 'tape5' decay data library. Shortly thereafter, efforts focused on development of a graphical user interface for the code. This paper documents the Los Alamos SOURCES Tape1 Creator and Library Link (LASTCALL) and describes additional library modifications in more detail. Minor improvements and planned enhancements are discussed.

  7. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    Science.gov (United States)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal’s 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.

  8. Iterative List Decoding of Concatenated Source-Channel Codes

    Directory of Open Access Journals (Sweden)

    Hedayat Ahmadreza

    2005-01-01

    Whenever variable-length entropy codes are used in the presence of a noisy channel, any channel errors will propagate and cause significant harm. Despite using channel codes, some residual errors always remain, whose effect will get magnified by error propagation. Mitigating this undesirable effect is of great practical interest. One approach is to use the residual redundancy of variable length codes for joint source-channel decoding. In this paper, we improve the performance of residual redundancy source-channel decoding via an iterative list decoder made possible by a nonbinary outer CRC code. We show that the list decoding of VLCs is beneficial for entropy codes that contain redundancy. Such codes are used in state-of-the-art video coders, for example. The proposed list decoder improves the overall performance significantly in AWGN and fully interleaved Rayleigh fading channels.

  9. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.

  10. Optimization of Coding of AR Sources for Transmission Across Channels with Loss

    DEFF Research Database (Denmark)

    Arildsen, Thomas

    Source coding concerns the representation of information in a source signal using as few bits as possible. In the case of lossy source coding, it is the encoding of a source signal using the fewest possible bits at a given distortion, or at the lowest possible distortion given a specified bit rate. Channel coding is usually applied in combination with source coding to ensure reliable transmission of the (source coded) information at the maximal rate across a channel given the properties of this channel. In this thesis, we consider the coding of auto-regressive (AR) sources which are sources that can...... compared to the case where the encoder is unaware of channel loss. We finally provide an extensive overview of cross-layer communication issues which are important to consider due to the fact that the proposed algorithm interacts with the source coding and exploits channel-related information typically...

  11. RSAP - A Code for Display of Neutron Cross Section Data and SAMMY Fit Results

    International Nuclear Information System (INIS)

    Sayer, R.O.

    2001-01-01

    RSAP is a computer code for display of neutron cross section data and selected SAMMY output. SAMMY is a multilevel R-matrix code for fitting neutron time-of-flight cross-section data using Bayes' method. RSAP, which runs on the Digital Unix Alpha platform, reads ORELA Data Files (ODF) created by SAMMY and uses graphics routines from the PLPLOT package. In addition, RSAP can read data and/or computed values from ASCII files with a format specified by the user. Plot output may be displayed in an X window, sent to a postscript file (rsap.ps), or sent to a color postscript file (rsap.psc). Thirteen plot types are supported, allowing the user to display cross section data, transmission data, errors, theory, Bayes fits, and residuals in various combinations. In this document the designations theory and Bayes refer to the initial and final theoretical cross sections, respectively, as evaluated by SAMMY. Special plot types include Bayes/Data, Theory--Data, and Bayes--Data. Output from two SAMMY runs may be compared by plotting the ratios Theory2/Theory1 and Bayes2/Bayes1 or by plotting the differences (Theory2-Theory1) and (Bayes2-Bayes1)

  12. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible solution to obtain underlying business process models from existing information systems. Due to the fact that not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  13. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function.

  14. Optimizing height presentation for aircraft cockpit displays

    Science.gov (United States)

    Jordan, Chris S.; Croft, D.; Selcon, Stephen J.; Markin, H.; Jackson, M.

    1997-02-01

    This paper describes an experiment conducted to investigate the type of display symbology that most effectively conveys height information to users of head-down plan-view radar displays. The experiment also investigated the use of multiple information sources (redundancy) in the design of such displays. Subjects were presented with eight different height display formats. These formats were constructed from a control, and/or one, two, or three sources of redundant information. The three formats were letter coding, analogue scaling, and toggling (spatially switching the position of the height information from above to below the aircraft symbol). Subjects were required to indicate altitude awareness via a four-key, forced-choice keyboard response. Error scores and response times were taken as performance measures. There were three main findings. First, there was a significant performance advantage when the altitude information was presented above and below the symbol to aid the representation of height information. Second, the analogue scale, a line whose length indicated altitude, proved significantly detrimental to performance. Finally, no relationship was found between the number of redundant information sources employed and performance. The implications for future aircraft and displays are discussed in relation to current aircraft tactical displays and in the context of perceptual psychological theory.

  15. Towards Holography via Quantum Source-Channel Codes

    Science.gov (United States)

    Pastawski, Fernando; Eisert, Jens; Wilming, Henrik

    2017-07-01

    While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.

  16. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infra-structure and

  17. An efficient chaotic source coding scheme with variable-length blocks

    International Nuclear Information System (INIS)

    Lin Qiu-Zhen; Wong Kwok-Wo; Chen Jian-Yong

    2011-01-01

    An efficient chaotic source coding scheme operating on variable-length blocks is proposed. With the source message represented by a trajectory in the state space of a chaotic system, data compression is achieved when the dynamical system is adapted to the probability distribution of the source symbols. For infinite-precision computation, the theoretical compression performance of this chaotic coding approach attains that of optimal entropy coding. In finite-precision implementation, it can be realized by encoding variable-length blocks using a piecewise linear chaotic map within the precision of register length. In the decoding process, the bit shift in the register can track the synchronization of the initial value and the corresponding block. Therefore, all the variable-length blocks are decoded correctly. Simulation results show that the proposed scheme performs well with high efficiency and minor compression loss when compared with traditional entropy coding. (general)
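
    To make the connection to entropy coding concrete, here is a toy sketch (my own illustration, not the authors' scheme) in which a short binary block is encoded as a point in [0, 1) by interval refinement and decoded by iterating the corresponding piecewise linear map; finite floating-point precision is exactly why such schemes work with variable-length blocks bounded by the register length.

      def encode(bits, p0):
          """Shrink [low, high) according to the symbol probabilities; any point strictly
          inside the final interval (here its midpoint) represents the whole block."""
          low, high = 0.0, 1.0
          for b in bits:
              split = low + p0 * (high - low)
              low, high = (low, split) if b == 0 else (split, high)
          return (low + high) / 2

      def decode(x, p0, n):
          """Iterate the piecewise linear map; the branch visited at each step reveals a symbol."""
          bits = []
          for _ in range(n):
              if x < p0:
                  bits.append(0)
                  x = x / p0
              else:
                  bits.append(1)
                  x = (x - p0) / (1.0 - p0)
          return bits

      msg = [0, 0, 1, 0, 0, 0, 1, 0]        # skewed binary source, p(0) = 0.8
      x = encode(msg, 0.8)
      print(x, decode(x, 0.8, len(msg)) == msg)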

  18. Coded aperture imaging of alpha source spatial distribution

    International Nuclear Information System (INIS)

    Talebitaher, Alireza; Shutler, Paul M.E.; Springham, Stuart V.; Rawat, Rajdeep S.; Lee, Paul

    2012-01-01

    The Coded Aperture Imaging (CAI) technique has been applied with CR-39 nuclear track detectors to image alpha particle source spatial distributions. The experimental setup comprised: a 226Ra source of alpha particles, a laser-machined CAI mask, and CR-39 detectors, arranged inside a vacuum enclosure. Three different alpha particle source shapes were synthesized by using a linear translator to move the 226Ra source within the vacuum enclosure. The coded mask pattern used is based on a Singer Cyclic Difference Set, with 400 pixels and 57 open square holes (representing ρ = 1/7 = 14.3% open fraction). After etching of the CR-39 detectors, the area, circularity, mean optical density and positions of all candidate tracks were measured by an automated scanning system. Appropriate criteria were used to select alpha particle tracks, and a decoding algorithm applied to the (x, y) data produced the de-coded image of the source. Signal to Noise Ratio (SNR) values obtained for alpha particle CAI images were found to be substantially better than those for corresponding pinhole images, although the CAI-SNR values were below the predictions of theoretical formulae. Monte Carlo simulations of CAI and pinhole imaging were performed in order to validate the theoretical SNR formulae and also our CAI decoding algorithm. There was found to be good agreement between the theoretical formulae and SNR values obtained from simulations. Possible reasons for the lower SNR obtained for the experimental CAI study are discussed.
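
    A highly simplified sketch of the correlation-decoding idea (not the authors' Singer-difference-set mask or their track-scanning pipeline): a planar source is convolved with a random binary mask to form the coded image, and an estimate of the source is recovered by correlating the coded image with a balanced (mean-removed) version of the mask. The mask density, image size, and source shape are arbitrary example values.

      import numpy as np

      rng = np.random.default_rng(1)

      n = 64
      mask = (rng.random((n, n)) < 0.15).astype(float)       # open fraction ~15% (illustrative)

      source = np.zeros((n, n))
      source[20:24, 30:34] = 1.0                              # small bright alpha-emitting region

      # Coded image: each open mask hole projects a shifted copy of the source (cyclic model)
      coded = np.real(np.fft.ifft2(np.fft.fft2(source) * np.fft.fft2(mask)))

      # Decoding: correlate with the balanced mask so the flat background cancels on average
      decode_pattern = mask - mask.mean()
      recon = np.real(np.fft.ifft2(np.fft.fft2(coded) * np.conj(np.fft.fft2(decode_pattern))))

      print(np.unravel_index(np.argmax(recon), recon.shape))  # peak lies near the true source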

  19. An Efficient SF-ISF Approach for the Slepian-Wolf Source Coding Problem

    Directory of Open Access Journals (Sweden)

    Tu Zhenyu

    2005-01-01

    A simple but powerful scheme exploiting the binning concept for asymmetric lossless distributed source coding is proposed. The novelty in the proposed scheme is the introduction of a syndrome former (SF) in the source encoder and an inverse syndrome former (ISF) in the source decoder to efficiently exploit an existing linear channel code without the need to modify the code structure or the decoding strategy. For most channel codes, the construction of SF-ISF pairs is a light task. For parallelly and serially concatenated codes and particularly parallel and serial turbo codes where this appears less obvious, an efficient way for constructing linear complexity SF-ISF pairs is demonstrated. It is shown that the proposed SF-ISF approach is simple, provenly optimal, and generally applicable to any linear channel code. Simulation using conventional and asymmetric turbo codes demonstrates a compression rate that is only 0.06 bit/symbol from the theoretical limit, which is among the best results reported so far.

  20. Code of conduct on the safety and security of radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-01-01

    The objectives of the Code of Conduct are, through the development, harmonization and implementation of national policies, laws and regulations, and through the fostering of international co-operation, to: (i) achieve and maintain a high level of safety and security of radioactive sources; (ii) prevent unauthorized access or damage to, and loss, theft or unauthorized transfer of, radioactive sources, so as to reduce the likelihood of accidental harmful exposure to such sources or the malicious use of such sources to cause harm to individuals, society or the environment; and (iii) mitigate or minimize the radiological consequences of any accident or malicious act involving a radioactive source. These objectives should be achieved through the establishment of an adequate system of regulatory control of radioactive sources, applicable from the stage of initial production to their final disposal, and a system for the restoration of such control if it has been lost. This Code relies on existing international standards relating to nuclear, radiation, radioactive waste and transport safety and to the control of radioactive sources. It is intended to complement existing international standards in these areas. The Code of Conduct serves as guidance in general issues, legislation and regulations, regulatory bodies as well as import and export of radioactive sources. A list of radioactive sources covered by the code is provided which includes activities corresponding to thresholds of categories.

  1. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    2004-01-01

    The objectives of the Code of Conduct are, through the development, harmonization and implementation of national policies, laws and regulations, and through the fostering of international co-operation, to: (i) achieve and maintain a high level of safety and security of radioactive sources; (ii) prevent unauthorized access or damage to, and loss, theft or unauthorized transfer of, radioactive sources, so as to reduce the likelihood of accidental harmful exposure to such sources or the malicious use of such sources to cause harm to individuals, society or the environment; and (iii) mitigate or minimize the radiological consequences of any accident or malicious act involving a radioactive source. These objectives should be achieved through the establishment of an adequate system of regulatory control of radioactive sources, applicable from the stage of initial production to their final disposal, and a system for the restoration of such control if it has been lost. This Code relies on existing international standards relating to nuclear, radiation, radioactive waste and transport safety and to the control of radioactive sources. It is intended to complement existing international standards in these areas. The Code of Conduct serves as guidance in general issues, legislation and regulations, regulatory bodies as well as import and export of radioactive sources. A list of radioactive sources covered by the code is provided which includes activities corresponding to thresholds of categories

  2. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    Science.gov (United States)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third

  3. OSSMETER D3.2 – Report on Source Code Activity Metrics

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and initial prototypes of the tools that are needed for source code activity analysis. It builds upon the Deliverable 3.1 where infra-structure and a domain analysis have been

  4. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  5. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the in-patient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
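
    A toy sketch of the coverage measurement described above (hypothetical codes and counts, not the study data): given per-code prescription volumes and a DKB mapping table, the coverage by distinct code and the coverage by prescription volume can differ markedly because a few codes dominate the volume.

      # Hypothetical prescription volumes keyed by product code (one code is local/invented)
      prescriptions = {"00093-0058-01": 5200, "00781-1506-10": 3100, "LOCAL-XYZ": 240}

      # Hypothetical DKB mapping table: product codes the knowledge base recognizes
      dkb_codes = {"00093-0058-01", "00781-1506-10"}

      total_volume = sum(prescriptions.values())
      covered_volume = sum(v for c, v in prescriptions.items() if c in dkb_codes)
      code_coverage = sum(1 for c in prescriptions if c in dkb_codes) / len(prescriptions)

      print(f"codes covered: {100 * code_coverage:.1f}%, "
            f"volume covered: {100 * covered_volume / total_volume:.1f}%")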

  6. Joint source/channel coding of scalable video over noisy channels

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, G.; Zakhor, A. [Department of Electrical Engineering and Computer Sciences University of California Berkeley, California94720 (United States)

    1997-01-01

    We propose an optimal bit allocation strategy for a joint source/channel video codec over noisy channel when the channel state is assumed to be known. Our approach is to partition source and channel coding bits in such a way that the expected distortion is minimized. The particular source coding algorithm we use is rate scalable and is based on 3D subband coding with multi-rate quantization. We show that using this strategy, transmission of video over very noisy channels still renders acceptable visual quality, and outperforms schemes that use equal error protection only. The flexibility of the algorithm also permits the bit allocation to be selected optimally when the channel state is in the form of a probability distribution instead of a deterministic state. © 1997 American Institute of Physics.
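
    A small sketch of the bit-partitioning idea above (illustrative distortion and loss models, not the codec in the paper): for a fixed total budget, enumerate the split between source bits and channel-protection bits and keep the one that minimizes expected distortion, i.e. the distortion when the data survive plus a penalty weighted by the residual loss probability.

      import math

      TOTAL_BITS = 1000
      D_LOSS = 1.0                        # distortion when the data are lost (normalized)

      def source_distortion(r_src):
          """Illustrative operational distortion-rate curve: exponential decay in source bits."""
          return math.exp(-0.005 * r_src)

      def residual_loss_prob(r_chan):
          """Illustrative channel model: more protection bits -> lower residual loss probability."""
          return 0.3 * math.exp(-0.01 * r_chan)

      best = min(
          (source_distortion(TOTAL_BITS - r_c) * (1 - residual_loss_prob(r_c))
           + D_LOSS * residual_loss_prob(r_c), r_c)
          for r_c in range(0, TOTAL_BITS + 1, 10)
      )
      print(f"expected distortion {best[0]:.4f} with {best[1]} channel bits "
            f"and {TOTAL_BITS - best[1]} source bits")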

  7. When Content Matters: The Role of Processing Code in Tactile Display Design.

    Science.gov (United States)

    Ferris, Thomas K; Sarter, Nadine

    2010-01-01

    The distribution of tasks and stimuli across multiple modalities has been proposed as a means to support multitasking in data-rich environments. Recently, the tactile channel and, more specifically, communication via the use of tactile/haptic icons have received considerable interest. Past research has examined primarily the impact of concurrent task modality on the effectiveness of tactile information presentation. However, it is not well known to what extent the interpretation of iconic tactile patterns is affected by another attribute of information: the information processing codes of concurrent tasks. In two driving simulation studies (n = 25 for each), participants decoded icons composed of either spatial or nonspatial patterns of vibrations (engaging spatial and nonspatial processing code resources, respectively) while concurrently interpreting spatial or nonspatial visual task stimuli. As predicted by Multiple Resource Theory, performance was significantly worse (approximately 5-10 percent worse) when the tactile icons and visual tasks engaged the same processing code, with the overall worst performance in the spatial-spatial task pairing. The findings from these studies contribute to an improved understanding of information processing and can serve as input to multidimensional quantitative models of timesharing performance. From an applied perspective, the results suggest that competition for processing code resources warrants consideration, alongside other factors such as the naturalness of signal-message mapping, when designing iconic tactile displays. Nonspatially encoded tactile icons may be preferable in environments which already rely heavily on spatial processing, such as car cockpits.

  8. Combined Source-Channel Coding of Images under Power and Bandwidth Constraints

    Directory of Open Access Journals (Sweden)

    Fossorier Marc

    2007-01-01

    This paper proposes a framework for combined source-channel coding for a power and bandwidth constrained noisy channel. The framework is applied to progressive image transmission using constant envelope M-ary phase shift key (M-PSK) signaling over an additive white Gaussian noise channel. First, the framework is developed for uncoded M-PSK signaling (with M = 2^k). Then, it is extended to include coded M-PSK modulation using trellis coded modulation (TCM). An adaptive TCM system is also presented. Simulation results show that, depending on the constellation size, coded M-PSK signaling performs 3.1 to 5.2 dB better than uncoded M-PSK signaling. Finally, the performance of our combined source-channel coding scheme is investigated from the channel capacity point of view. Our framework is further extended to include powerful channel codes like turbo and low-density parity-check (LDPC) codes. With these powerful codes, our proposed scheme performs about one dB away from the capacity-achieving SNR value of the QPSK channel.

  9. Source Coding for Wireless Distributed Microphones in Reverberant Environments

    DEFF Research Database (Denmark)

    Zahedi, Adel

    2016-01-01

    Modern multimedia systems are more and more shifting toward distributed and networked structures. This includes audio systems, where networks of wireless distributed microphones are replacing the traditional microphone arrays. This allows for flexibility of placement and high spatial diversity. However, it comes with the price of several challenges, including the limited power and bandwidth resources for wireless transmission of audio recordings. In such a setup, we study the problem of source coding for the compression of the audio recordings before the transmission in order to reduce the power consumption and/or transmission bandwidth by reduction in the transmission rates. Source coding for wireless microphones in reverberant environments has several special characteristics which make it more challenging in comparison with regular audio coding. The signals which are acquired by the microphones...

  10. Asymmetric Joint Source-Channel Coding for Correlated Sources with Blind HMM Estimation at the Receiver

    Directory of Open Access Journals (Sweden)

    Ser Javier Del

    2005-01-01

    We consider the case of two correlated sources, and . The correlation between them has memory, and it is modelled by a hidden Markov chain. The paper studies the problem of reliable communication of the information sent by the source over an additive white Gaussian noise (AWGN) channel when the output of the other source is available as side information at the receiver. We assume that the receiver has no a priori knowledge of the correlation statistics between the sources. In particular, we propose the use of a turbo code for joint source-channel coding of the source . The joint decoder uses an iterative scheme where the unknown parameters of the correlation model are estimated jointly within the decoding process. It is shown that reliable communication is possible at signal-to-noise ratios close to the theoretical limits set by the combination of Shannon and Slepian-Wolf theorems.

  11. Combined Source-Channel Coding of Images under Power and Bandwidth Constraints

    Directory of Open Access Journals (Sweden)

    Marc Fossorier

    2007-01-01

    Full Text Available This paper proposes a framework for combined source-channel coding for a power and bandwidth constrained noisy channel. The framework is applied to progressive image transmission using constant envelope M-ary phase shift key (M-PSK) signaling over an additive white Gaussian noise channel. First, the framework is developed for uncoded M-PSK signaling (with M=2^k). Then, it is extended to include coded M-PSK modulation using trellis coded modulation (TCM). An adaptive TCM system is also presented. Simulation results show that, depending on the constellation size, coded M-PSK signaling performs 3.1 to 5.2 dB better than uncoded M-PSK signaling. Finally, the performance of our combined source-channel coding scheme is investigated from the channel capacity point of view. Our framework is further extended to include powerful channel codes like turbo and low-density parity-check (LDPC) codes. With these powerful codes, our proposed scheme performs about one dB away from the capacity-achieving SNR value of the QPSK channel.

  12. Comparison of DT neutron production codes MCUNED, ENEA-JSI source subroutine and DDT

    Energy Technology Data Exchange (ETDEWEB)

    Čufar, Aljaž, E-mail: aljaz.cufar@ijs.si [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Lengar, Igor; Kodeli, Ivan [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Milocco, Alberto [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Sauvan, Patrick [Departamento de Ingeniería Energética, E.T.S. Ingenieros Industriales, UNED, C/Juan del Rosal 12, 28040 Madrid (Spain); Conroy, Sean [VR Association, Uppsala University, Department of Physics and Astronomy, PO Box 516, SE-75120 Uppsala (Sweden); Snoj, Luka [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia)

    2016-11-01

    Highlights: • Results of three codes capable of simulating accelerator-based DT neutron generators were compared on a simple model where only a thin target made of a mixture of titanium and tritium is present. Two typical deuteron beam energies, 100 keV and 250 keV, were used in the comparison. • Comparisons of the angular dependence of the total neutron flux and spectrum as well as the neutron spectrum of all the neutrons emitted from the target show general agreement of the results but also some noticeable differences. • A comparison of figures of merit of the calculations using different codes showed that the computational time necessary to achieve the same statistical uncertainty can vary by more than a factor of 30 when different codes for the simulation of the DT neutron generator are used. - Abstract: As the DT fusion reaction produces neutrons with energies significantly higher than in fission reactors, special fusion-relevant benchmark experiments are often performed using DT neutron generators. However, commonly used Monte Carlo particle transport codes such as MCNP or TRIPOLI cannot be directly used to analyze these experiments since they do not have the capabilities to model the production of DT neutrons. Three of the available approaches to model the DT neutron generator source are the MCUNED code, the ENEA-JSI DT source subroutine and the DDT code. The MCUNED code is an extension of the well-established and validated MCNPX Monte Carlo code. The ENEA-JSI source subroutine was originally prepared for the modelling of the FNG experiments using different versions of the MCNP code (−4, −5, −X) and was later extended to allow the modelling of both DT and DD neutron sources. The DDT code prepares the DT source definition file (SDEF card in MCNP) which can then be used in different versions of the MCNP code. In the paper the methods for the simulation of the DT neutron production used in the codes are briefly described and compared for the case of a

  13. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

    Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-square sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.
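
    As a rough illustration of the least-squares step described above (the variable names and dimensions are assumptions for this sketch, not the authors' implementation), the error amplitudes at a hypothesized set of impulse-noise positions can be estimated from the syndrome of the oversampled expansion as follows:

      import numpy as np

      # Given the syndrome operator H of an oversampled frame expansion, the observed
      # syndrome s = H e, and a hypothesized support of corrupted samples, estimate
      # the error amplitudes in the least-squares sense and report the residual.
      def estimate_error_amplitudes(H, syndrome, support):
          H_sub = H[:, support]                        # columns for the hypothesized positions
          amplitudes, *_ = np.linalg.lstsq(H_sub, syndrome, rcond=None)
          residual = syndrome - H_sub @ amplitudes     # a small residual supports the hypothesis
          return amplitudes, np.linalg.norm(residual)

      # Toy usage with random data (all dimensions are arbitrary placeholders).
      rng = np.random.default_rng(0)
      H = rng.standard_normal((8, 16))
      e = np.zeros(16)
      e[[3, 11]] = [1.5, -0.7]                         # sparse impulse noise
      amps, res = estimate_error_amplitudes(H, H @ e, [3, 11])
      print(amps, res)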

  14. Distributed Remote Vector Gaussian Source Coding for Wireless Acoustic Sensor Networks

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2014-01-01

    In this paper, we consider the problem of remote vector Gaussian source coding for a wireless acoustic sensor network. Each node receives messages from multiple nodes in the network and decodes these messages using its own measurement of the sound field as side information. The node’s measurement...... and the estimates of the source resulting from decoding the received messages are then jointly encoded and transmitted to a neighboring node in the network. We show that for this distributed source coding scenario, one can encode a so-called conditional sufficient statistic of the sources instead of jointly...

  15. Solar active region display system

    Science.gov (United States)

    Golightly, M.; Raben, V.; Weyland, M.

    2003-04-01

    The Solar Active Region Display System (SARDS) is a client-server application that automatically collects a wide range of solar data and displays it in a format easy for users to assimilate and interpret. Users can rapidly identify active regions of interest or concern from color-coded indicators that visually summarize each region's size, magnetic configuration, recent growth history, and recent flare and CME production. The active region information can be overlaid onto solar maps, multiple solar images, and solar difference images in orthographic, Mercator or cylindrical equidistant projections. Near real-time graphs display the GOES soft and hard x-ray flux, flare events, and daily F10.7 value as a function of time; color-coded indicators show current trends in soft x-ray flux, flare temperature, daily F10.7 flux, and x-ray flare occurrence. Through a separate window up to 4 real-time or static graphs can simultaneously display values of KP, AP, daily F10.7 flux, GOES soft and hard x-ray flux, GOES >10 and >100 MeV proton flux, and Thule neutron monitor count rate. Climatologic displays use color-valued cells to show F10.7 and AP values as a function of Carrington/Bartel's rotation sequences - this format allows users to detect recurrent patterns in solar and geomagnetic activity as well as variations in activity levels over multiple solar cycles. Users can customize many of the display and graph features; all displays can be printed or copied to the system's clipboard for "pasting" into other applications. The system obtains and stores space weather data and images from sources such as the NOAA Space Environment Center, NOAA National Geophysical Data Center, the joint ESA/NASA SOHO spacecraft, and the Kitt Peak National Solar Observatory, and can be extended to include other data series and image sources. Data and images retrieved from the system's database are converted to XML and transported from a central server using HTTP and SOAP protocols, allowing

  16. Test of Effective Solid Angle code for the efficiency calculation of volume source

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of); Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    It is hard to determine a full energy (FE) absorption peak efficiency curve for an arbitrary volume source by experiment. That is why simulation and semi-empirical methods have been preferred so far, and many works have progressed in various ways. Moens et al. introduced the concept of the effective solid angle by considering the attenuation effect of γ-rays in the source, media and detector. This concept is based on a semi-empirical method. An Effective Solid Angle code (ESA code) has been developed over several years by the Applied Nuclear Physics Group at Seoul National University. The ESA code converts an experimental FE efficiency curve, determined by using a standard point source, to that for a volume source. To test the performance of the ESA code, we measured point standard sources and voluminous certified reference material (CRM) γ-ray sources, and compared them with the efficiency curves obtained in this study. The 200-1500 keV energy region is fitted well. NIST X-ray mass attenuation coefficient data are currently used to check the effect of linear attenuation only. We will use the interaction cross-section data obtained from the XCOM code to check each contributing factor, such as the photoelectric effect, incoherent scattering and coherent scattering, in the future. In order to minimize the calculation time and simplify the code, optimization of the algorithm is needed.
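
    As a hedged sketch of the conversion the ESA code performs (the symbols are assumed here, not quoted from the paper), the Moens-style effective solid angle method scales a measured point-source full-energy-peak efficiency to a volume source by the ratio of effective solid angles, which already fold in attenuation in the source, intervening media and detector:

      % Hedged sketch of the effective solid angle conversion (assumed notation):
      \[
        \varepsilon_{\mathrm{vol}}(E) \;\approx\; \varepsilon_{\mathrm{point}}(E)\,
        \frac{\bar{\Omega}_{\mathrm{vol}}(E)}{\bar{\Omega}_{\mathrm{point}}(E)},
      \]
      % where each effective solid angle is an integral over the source volume (or the point
      % position) weighted by the gamma-ray attenuation and detection factors.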

  17. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Directory of Open Access Journals (Sweden)

    Pierre Siohan

    2005-05-01

    Full Text Available Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.

  18. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Science.gov (United States)

    Guillemot, Christine; Siohan, Pierre

    2005-12-01

    Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.

  19. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    Energy Technology Data Exchange (ETDEWEB)

    Santos-Villalobos, Hector J [ORNL; Gregor, Jens [University of Tennessee, Knoxville (UTK); Bingham, Philip R [ORNL

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps at around 50 µm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
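
    A hedged sketch of the model-based least squares formulation referred to above (the symbols are assumptions for illustration, not the authors' exact notation): the reconstruction solves

      \[
        \hat{x} \;=\; \arg\min_{x \ge 0}\;
        \bigl\lVert\, y - A(s)\,x \,\bigr\rVert_2^2 \;+\; \lambda\, R(x),
      \]
      % where y is the measured coded radiograph, A(s) is the system matrix assembled from the
      % coded-mask geometry and the modelled source flux distribution s, R is an optional
      % regularizer and lambda its weight; integrating the measured CG1D source model then
      % corresponds to refining s inside A(s).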

  20. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    Science.gov (United States)

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  1. Code of conduct on the safety and security of radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    The objective of this Code is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this Code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost.

  2. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    2001-03-01

    The objective of this Code is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this Code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost.

  3. The Astrophysics Source Code Library: Supporting software publication and citation

    Science.gov (United States)

    Allen, Alice; Teuben, Peter

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software and ideas and efforts from other disciplines that may be useful to astronomy.

  4. Distributed Source Coding Techniques for Lossless Compression of Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Barni Mauro

    2007-01-01

    Full Text Available This paper deals with the application of distributed source coding (DSC) theory to remote sensing image compression. Although DSC exhibits a significant potential in many application fields, up till now the results obtained on real signals fall short of the theoretical bounds, and often impose additional system-level constraints. The objective of this paper is to assess the potential of DSC for lossless image compression carried out onboard a remote platform. We first provide a brief overview of DSC of correlated information sources. We then focus on onboard lossless image compression, and apply DSC techniques in order to reduce the complexity of the onboard encoder, at the expense of the decoder's, by exploiting the correlation of different bands of a hyperspectral dataset. Specifically, we propose two different compression schemes, one based on powerful binary error-correcting codes employed as source codes, and one based on simpler multilevel coset codes. The performance of both schemes is evaluated on a few AVIRIS scenes, and is compared with other state-of-the-art 2D and 3D coders. Both schemes turn out to achieve competitive compression performance, and one of them also has reduced complexity. Based on these results, we highlight the main issues that are still to be solved to further improve the performance of DSC-based remote sensing systems.

  5. Remodularizing Java Programs for Improved Locality of Feature Implementations in Source Code

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    Explicit traceability between features and source code is known to help programmers to understand and modify programs during maintenance tasks. However, the complex relations between features and their implementations are not evident from the source code of object-oriented Java programs.... Consequently, the implementations of individual features are difficult to locate, comprehend, and modify in isolation. In this paper, we present a novel remodularization approach that improves the representation of features in the source code of Java programs. Both forward- and reverse restructurings...... are supported through on-demand bidirectional restructuring between feature-oriented and object-oriented decompositions. The approach includes a feature location phase based on tracing program execution, a feature representation phase that reallocates classes into a new package structure based on single...

  6. Distributed coding of multiview sparse sources with joint recovery

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Deligiannis, Nikos; Forchhammer, Søren

    2016-01-01

    In support of applications involving multiview sources in distributed object recognition using lightweight cameras, we propose a new method for the distributed coding of sparse sources as visual descriptor histograms extracted from multiview images. The problem is challenging due to the computati...... transform (SIFT) descriptors extracted from multiview images shows that our method leads to bit-rate saving of up to 43% compared to the state-of-the-art distributed compressed sensing method with independent encoding of the sources....

  7. Revised IAEA Code of Conduct on the Safety and Security of Radioactive Sources

    International Nuclear Information System (INIS)

    Wheatley, J. S.

    2004-01-01

    The revised Code of Conduct on the Safety and Security of Radioactive Sources is aimed primarily at Governments, with the objective of achieving and maintaining a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations; and through the fostering of international co-operation. It focuses on sealed radioactive sources and provides guidance on legislation, regulations and the regulatory body, and import/export controls. Nuclear materials (except for sources containing 239Pu), as defined in the Convention on the Physical Protection of Nuclear Materials, are not covered by the revised Code, nor are radioactive sources within military or defence programmes. An earlier version of the Code was published by IAEA in 2001. At that time, agreement was not reached on a number of issues, notably those relating to the creation of comprehensive national registries for radioactive sources, obligations of States exporting radioactive sources, and the possibility of unilateral declarations of support. The need to further consider these and other issues was highlighted by the events of 11th September 2001. Since then, the IAEA's Secretariat has been working closely with Member States and relevant International Organizations to achieve consensus. The text of the revised Code was finalized at a meeting of technical and legal experts in August 2003, and it was submitted to IAEA's Board of Governors for approval in September 2003, with a recommendation that the IAEA General Conference adopt it and encourage its wide implementation. The IAEA General Conference, in September 2003, endorsed the revised Code and urged States to work towards following the guidance contained within it. This paper summarizes the history behind the revised Code, its content and the outcome of the discussions within the IAEA Board of Governors and General Conference. (Author) 8 refs

  8. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    Anon.

    2001-01-01

    The objective of the code of conduct is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost. (N.C.)

  9. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks.

    Science.gov (United States)

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-07-09

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.

  10. Distributed Remote Vector Gaussian Source Coding with Covariance Distortion Constraints

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2014-01-01

    In this paper, we consider a distributed remote source coding problem, where a sequence of observations of source vectors is available at the encoder. The problem is to specify the optimal rate for encoding the observations subject to a covariance matrix distortion constraint and in the presence...

  11. IllinoisGRMHD: an open-source, user-friendly GRMHD code for dynamical spacetimes

    International Nuclear Information System (INIS)

    Etienne, Zachariah B; Paschalidis, Vasileios; Haas, Roland; Mösta, Philipp; Shapiro, Stuart L

    2015-01-01

    In the extreme violence of merger and mass accretion, compact objects like black holes and neutron stars are thought to launch some of the most luminous outbursts of electromagnetic and gravitational wave energy in the Universe. Modeling these systems realistically is a central problem in theoretical astrophysics, but has proven extremely challenging, requiring the development of numerical relativity codes that solve Einstein's equations for the spacetime, coupled to the equations of general relativistic (ideal) magnetohydrodynamics (GRMHD) for the magnetized fluids. Over the past decade, the Illinois numerical relativity (ILNR) group's dynamical spacetime GRMHD code has proven itself as a robust and reliable tool for theoretical modeling of such GRMHD phenomena. However, the code was written ‘by experts and for experts’ of the code, with a steep learning curve that would severely hinder community adoption if it were open-sourced. Here we present IllinoisGRMHD, which is an open-source, highly extensible rewrite of the original closed-source GRMHD code of the ILNR group. Reducing the learning curve was the primary focus of this rewrite, with the goal of facilitating community involvement in the code's use and development, as well as the minimization of human effort in generating new science. IllinoisGRMHD also saves computer time, generating roundoff-precision identical output to the original code on adaptive-mesh grids, but nearly twice as fast at scales of hundreds to thousands of cores. (paper)

  12. Designing display primaries with currently available light sources for UHDTV wide-gamut system colorimetry.

    Science.gov (United States)

    Masaoka, Kenichiro; Nishida, Yukihiro; Sugawara, Masayuki

    2014-08-11

    The wide-gamut system colorimetry has been standardized for ultra-high definition television (UHDTV). The chromaticities of the primaries are designed to lie on the spectral locus to cover major standard system colorimetries and real object colors. Although monochromatic light sources are required for a display to perfectly fulfill the system colorimetry, highly saturated emission colors using recent quantum dot technology may effectively achieve the wide gamut. This paper presents simulation results on the chromaticities of highly saturated non-monochromatic light sources and gamut coverage of real object colors to be considered in designing wide-gamut displays with color filters for the UHDTV.

  13. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  14. Automating RPM Creation from a Source Code Repository

    Science.gov (United States)

    2012-02-01

    [Report excerpt: fragments of an RPM .spec file showing %pre, %prep, %setup and %build sections that run ./autogen.sh, ./configure --with-db=/apps/db --with-libpq=/apps/postgres and make, followed by install steps that clean $RPM_BUILD_ROOT (rm -rf $RPM_BUILD_ROOT; umask 0077) and create directories such as $RPM_BUILD_ROOT/usr/local/bin, illustrating how an RPM is built automatically from a source code repository.]

  15. Development of in-vessel source term analysis code, tracer

    International Nuclear Information System (INIS)

    Miyagi, K.; Miyahara, S.

    1996-01-01

    Analyses of radionuclide transport in fuel failure accidents (generally referred to as source terms) are considered to be important, especially in severe accident evaluation. The TRACER code has been developed to realistically predict the time-dependent behavior of FPs and aerosols within the primary cooling system for a wide range of fuel failure events. This paper presents the model description, results of a validation study, the recent model advancement status of the code, and results of check-out calculations under reactor conditions. (author)

  16. Source Coding in Networks with Covariance Distortion Constraints

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2016-01-01

    results to a joint source coding and denoising problem. We consider a network with a centralized topology and a given weighted sum-rate constraint, where the received signals at the center are to be fused to maximize the output SNR while enforcing no linear distortion. We show that one can design...

  17. Use of source term code package in the ELEBRA MX-850 system

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-12-01

    The implementation of the source term code package in the ELEBRA MX-850 system is presented. The source term is formed when radioactive materials generated in the nuclear fuel leak toward the containment and the environment external to the reactor containment. The version implemented in the ELEBRA system is composed of five codes: MARCH 3, TRAPMELT 3, THCCA, VANESA and NAVA. The original example case was used. The example consists of a small LOCA accident in a PWR-type reactor. A sensitivity study for the TRAPMELT 3 code was carried out, modifying the 'TIME STEP' to estimate the CPU processing time for executing the original example case. (M.C.K.) [pt

  18. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    Science.gov (United States)

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues by building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until the code in question is returned as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
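
    The evaluation idea can be illustrated with a toy sketch (the code list, the scoring function and all names below are assumptions, not the study's engines): generate word combinations from a code's description and find the smallest query that makes a naive full-text scorer rank that code first.

      from itertools import combinations

      # Toy in-memory "index" of ICD-10 codes and descriptions (illustrative only).
      codes = {
          "J45": "asthma",
          "J45.0": "predominantly allergic asthma",
          "J45.1": "nonallergic asthma",
      }

      def score(query_words, description):
          words = description.split()
          return sum(w in words for w in query_words)   # toy relevance: count of matched words

      def minimal_words(target):
          # Try word combinations of increasing size until the target code wins the ranking.
          words = codes[target].split()
          for k in range(1, len(words) + 1):
              for query in combinations(words, k):
                  best = max(codes, key=lambda c: score(query, codes[c]))
                  if best == target and score(query, codes[target]) > 0:
                      return query
          return None

      print(minimal_words("J45.1"))   # e.g. ('nonallergic',)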

  19. Detecting Source Code Plagiarism on .NET Programming Languages using Low-level Representation and Adaptive Local Alignment

    Directory of Open Access Journals (Sweden)

    Oscar Karnalim

    2017-01-01

    Full Text Available Even though there are various source code plagiarism detection approaches, only a few works focus on low-level representation for deducing similarity. Most of them focus only on the lexical token sequence extracted from the source code. From our point of view, a low-level representation is more beneficial than lexical tokens since its form is more compact than the source code itself. It considers only semantic-preserving instructions and ignores many source code delimiter tokens. This paper proposes a source code plagiarism detection approach which relies on low-level representation. As a case study, we focus our work on .NET programming languages with the Common Intermediate Language as the low-level representation. In addition, we also incorporate Adaptive Local Alignment for detecting similarity. According to Lim et al., this algorithm outperforms the state-of-the-art code similarity algorithm (i.e., Greedy String Tiling) in terms of effectiveness. According to our evaluation, which involves various plagiarism attacks, our approach is more effective and efficient when compared with the standard lexical-token approach.

  20. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks †

    Science.gov (United States)

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616

  1. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    Science.gov (United States)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e. the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered. One that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.

  2. Microdosimetry computation code of internal sources - MICRODOSE 1

    International Nuclear Information System (INIS)

    Li Weibo; Zheng Wenzhong; Ye Changqing

    1995-01-01

    This paper describes a microdosimetry computation code, MICRODOSE 1, based on the following methods: (1) the method of calculating f_1(z) for charged particles in unit-density tissue; (2) the method of calculating f(z) for a point source; (3) the method of applying Fourier transform theory to the calculation of the compound Poisson process; (4) the method of using the fast Fourier transform technique to determine f(z). It also gives some computed examples based on the code, MICRODOSE 1, including alpha particles emitted from 239Pu in alveolar lung tissue and from the radon progeny RaA and RaC in the human respiratory tract. (author). 13 refs., 6 figs
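
    The Fourier-transform step in method (3) can be sketched as follows (a standard microdosimetry relation, with symbols assumed here rather than copied from the paper): for a compound Poisson process with mean event number λ, the characteristic function of the multi-event distribution f(z) follows from the single-event distribution f_1(z) as

      \[
        \tilde{f}(\omega) \;=\; \exp\!\Bigl( \lambda \,\bigl[\, \tilde{f}_1(\omega) - 1 \,\bigr] \Bigr),
      \]
      % so f(z) is recovered by one forward FFT of f_1(z), the exponentiation above, and an
      % inverse FFT, which is what makes the fast Fourier transform technique attractive.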

  3. Source Code Vulnerabilities in IoT Software Systems

    Directory of Open Access Journals (Sweden)

    Saleh Mohamed Alnaeli

    2017-08-01

    Full Text Available An empirical study that examines the usage of known vulnerable statements in software systems developed in C/C++ and used for IoT is presented. The study is conducted on 18 open source systems comprised of millions of lines of code and containing thousands of files. Static analysis methods are applied to each system to determine the number of unsafe commands (e.g., strcpy, strcmp, and strlen) that are well-known among research communities to cause potential risks and security concerns, thereby decreasing a system's robustness and quality. These unsafe statements are banned by many companies (e.g., Microsoft). The use of these commands should be avoided from the start when writing code and should be removed from legacy code over time, as recommended by new C/C++ language standards. Each system is analyzed and the distribution of the known unsafe commands is presented. Historical trends in the usage of the unsafe commands of 7 of the systems are presented to show how the studied systems evolved over time with respect to the vulnerable code. The results show that the most prevalent unsafe command used for most systems is memcpy, followed by strlen. These results can be used to help train software developers on secure coding practices so that they can write higher quality software systems.
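
    A minimal sketch of the kind of static scan described above (not the authors' tool; the unsafe-function list and paths are illustrative) counts calls to well-known unsafe C functions across a project tree:

      import re
      from pathlib import Path

      UNSAFE = ("strcpy", "strcat", "sprintf", "gets", "strcmp", "strlen", "memcpy")
      CALL = re.compile(r"\b(%s)\s*\(" % "|".join(UNSAFE))

      def scan(root):
          # Purely lexical scan: it also counts occurrences inside comments and string
          # literals, which a real static analyzer would filter out.
          counts = {name: 0 for name in UNSAFE}
          for path in Path(root).rglob("*"):
              if path.suffix in {".c", ".cc", ".cpp", ".h", ".hpp"}:
                  text = path.read_text(errors="ignore")
                  for match in CALL.finditer(text):
                      counts[match.group(1)] += 1
          return counts

      print(scan("."))   # e.g. {'strcpy': 12, ..., 'memcpy': 57}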

  4. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs
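
    A hedged sketch of the kind of one-dimensional transport equation such a vault/backfill model typically solves (the form and symbols are assumptions for illustration, not quoted from the report):

      \[
        R\,\frac{\partial C}{\partial t}
        \;=\; D\,\frac{\partial^2 C}{\partial x^2}
        \;-\; v\,\frac{\partial C}{\partial x}
        \;-\; \lambda\,R\,C,
      \]
      % where C is the radionuclide concentration in the pore water, D the effective
      % diffusion/dispersion coefficient, v the pore-water velocity, lambda the decay
      % constant, and R = 1 + rho_b K_d / theta the retardation factor representing sorption.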

  5. Enabling Cognitive Load-Aware AR with Rateless Coding on a Wearable Network

    Directory of Open Access Journals (Sweden)

    R. Razavi

    2008-01-01

    Full Text Available Augmented reality (AR) on a head-mounted display is conveniently supported by a wearable wireless network. If, in addition, the AR display is moderated to take account of the cognitive load of the wearer, then additional biosensors form part of the network. In this paper, the impact of these additional traffic sources is assessed. Rateless coding is proposed to not only protect the fragile encoded video stream from wireless noise and interference but also to reduce coding overhead. The paper proposes a block-based form of rateless channel coding in which the unit of coding is a block within a packet. The contribution of this paper is that it minimizes energy consumption by reducing the overhead from forward error correction (FEC), while error correction properties are conserved. Compared to simple packet-based rateless coding, with this form of block-based coding, data loss is reduced and energy efficiency is improved. Cross-layer organization of piggy-backed response blocks must take place in response to feedback, as detailed in the paper. Compared also to variants of its default FEC scheme, results from a Bluetooth (IEEE 802.15.1) wireless network show a consistent improvement in energy consumption, packet arrival latency, and video quality at the AR display.

  6. Verification test calculations for the Source Term Code Package

    International Nuclear Information System (INIS)

    Denning, R.S.; Wooton, R.O.; Alexander, C.A.; Curtis, L.A.; Cybulskis, P.; Gieseke, J.A.; Jordan, H.; Lee, K.W.; Nicolosi, S.L.

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs

  7. Imaging x-ray sources at a finite distance in coded-mask instruments

    International Nuclear Information System (INIS)

    Donnarumma, Immacolata; Pacciani, Luigi; Lapshov, Igor; Evangelista, Yuri

    2008-01-01

    We present a method for the correction of beam divergence in finite distance sources imaging through coded-mask instruments. We discuss the defocusing artifacts induced by the finite distance showing two different approaches to remove such spurious effects. We applied our method to one-dimensional (1D) coded-mask systems, although it is also applicable in two-dimensional systems. We provide a detailed mathematical description of the adopted method and of the systematics introduced in the reconstructed image (e.g., the fraction of source flux collected in the reconstructed peak counts). The accuracy of this method was tested by simulating pointlike and extended sources at a finite distance with the instrumental setup of the SuperAGILE experiment, the 1D coded-mask x-ray imager onboard the AGILE (Astro-rivelatore Gamma a Immagini Leggero) mission. We obtained reconstructed images of good quality and high source location accuracy. Finally we show the results obtained by applying this method to real data collected during the calibration campaign of SuperAGILE. Our method was demonstrated to be a powerful tool to investigate the imaging response of the experiment, particularly the absorption due to the materials intercepting the line of sight of the instrument and the conversion between detector pixel and sky direction

  8. SOURCES-3A: A code for calculating (α, n), spontaneous fission, and delayed neutron sources and spectra

    International Nuclear Information System (INIS)

    Perry, R.T.; Wilson, W.B.; Charlton, W.S.

    1998-04-01

    In many systems, it is imperative to have accurate knowledge of all significant sources of neutrons due to the decay of radionuclides. These sources can include neutrons resulting from the spontaneous fission of actinides, the interaction of actinide decay α-particles in (α,n) reactions with low- or medium-Z nuclides, and/or delayed neutrons from the fission products of actinides. Numerous systems exist in which these neutron sources could be important. These include, but are not limited to, clean and spent nuclear fuel (UO 2 , ThO 2 , MOX, etc.), enrichment plant operations (UF 6 , PuF 4 , etc.), waste tank studies, waste products in borosilicate glass or glass-ceramic mixtures, and weapons-grade plutonium in storage containers. SOURCES-3A is a computer code that determines neutron production rates and spectra from (α,n) reactions, spontaneous fission, and delayed neutron emission due to the decay of radionuclides in homogeneous media (i.e., a mixture of α-emitting source material and low-Z target material) and in interface problems (i.e., a slab of α-emitting source material in contact with a slab of low-Z target material). The code is also capable of calculating the neutron production rates due to (α,n) reactions induced by a monoenergetic beam of α-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 43 actinides. The (α,n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 89 nuclide decay α-particle spectra, 24 sets of measured and/or evaluated (α,n) cross sections and product nuclide level branching fractions, and functional α-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code outputs the magnitude and spectra of the resultant neutron source. It also provides an
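
    The Watt spectrum parameters mentioned above enter the familiar spontaneous fission spectrum shape (a standard expression; the normalization symbol is assumed here):

      \[
        N(E) \;=\; C\, e^{-E/a}\, \sinh\!\bigl(\sqrt{\,b\,E\,}\bigr),
      \]
      % with nuclide-specific parameters a and b (units of energy and inverse energy,
      % respectively) and C chosen so that N(E) integrates to the emission rate.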

  9. Identification of Sparse Audio Tampering Using Distributed Source Coding and Compressive Sensing Techniques

    Directory of Open Access Journals (Sweden)

    Valenzise G

    2009-01-01

    Full Text Available In the past few years, a large number of techniques have been proposed to identify whether a multimedia content has been illegally tampered with or not. Nevertheless, very few efforts have been devoted to identifying which kind of attack has been carried out, especially due to the large amount of data required for this task. We propose a novel hashing scheme which exploits the paradigms of compressive sensing and distributed source coding to generate a compact hash signature, and we apply it to the case of audio content protection. The audio content provider produces a small hash signature by computing a limited number of random projections of a perceptual, time-frequency representation of the original audio stream; the audio hash is given by the syndrome bits of an LDPC code applied to the projections. At the content user side, the hash is decoded using distributed source coding tools. If the tampering is sparsifiable or compressible in some orthonormal basis or redundant dictionary, it is possible to identify the time-frequency position of the attack, with a hash size as small as 200 bits/second; the bit saving obtained by introducing distributed source coding ranges from 20% to 70%.
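
    The hashing side of the scheme can be illustrated with a short sketch (the projection count, seed and quantization below are assumptions, and the LDPC syndrome step is omitted): the provider projects a time-frequency representation onto a small set of shared random vectors and keeps only a coarse quantization of the result.

      import numpy as np

      def audio_hash(spectrogram, n_projections=50, seed=1234):
          """spectrogram: 2-D time-frequency magnitude array for one hash window."""
          rng = np.random.default_rng(seed)             # shared seed so provider and user agree
          x = spectrogram.ravel()
          A = rng.standard_normal((n_projections, x.size))
          y = A @ x                                     # random projections (compressive sensing)
          return (y > np.median(y)).astype(np.uint8)    # crude 1-bit quantization per projection

      # A tampering localized in time-frequency changes only a few coefficients of the sparse
      # difference signal, which the decoder can try to recover from the small hash.
      h = audio_hash(np.abs(np.random.default_rng(0).standard_normal((64, 32))))
      print(h.shape, h[:10])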

  10. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions beg the question of why companies choose a strategy based on giving away software assets. Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Conversely, in this paper we investigate empirically what the companies’ incentives are by means of an exploratory case study of three organizations in different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  11. Color Analysis in Air Traffic Control Displays, Part II. Auxiliary Displays

    National Research Council Canada - National Science Library

    Xing, Jing

    2007-01-01

    ...), Traffic Management Advisor (TMA), and Integrated Terminal Weather System (ITWS). For each display, we documented the background and default colors, color-coding, color usage, associated purposes of color use, and color complexity...

  12. Authorship attribution of source code by using back propagation neural network based on particle swarm optimization.

    Science.gov (United States)

    Yang, Xinyu; Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao

    2017-01-01

    Authorship attribution is the task of identifying the most likely author of a given sample among a set of known candidate authors. It can not only be applied to discover the original author of plain text, such as novels, blogs, emails, posts, etc., but can also be used to identify source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to resolving authorship disputes or software plagiarism detection. This paper aims to propose a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it first introduces a back propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, 19 dimensions in total. These metrics are then input to the neural network for supervised learning, the weights of which are produced by the PSO-BP hybrid algorithm. The effectiveness of the proposed method is evaluated on a collected dataset of 3,022 Java files belonging to 40 authors. Experimental results show that the proposed method achieves 91.060% accuracy. A comparison with previous work on authorship attribution of source code for the Java language illustrates that the proposed method outperforms the others overall, also with an acceptable overhead.
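
    As a rough illustration of the kind of lexical/layout metrics fed to the classifier (the paper uses 19 metrics; the four below and their names are assumptions, and the PSO-trained BP network is not reproduced here):

      def layout_metrics(source: str):
          # A few simple style metrics computed from raw source text.
          lines = source.splitlines() or [""]
          tabs = sum(line.startswith("\t") for line in lines)
          braces_own_line = sum(line.strip() == "{" for line in lines)
          comments = sum("//" in line for line in lines)
          return {
              "avg_line_length": sum(map(len, lines)) / len(lines),
              "tab_indent_ratio": tabs / len(lines),
              "brace_on_own_line_ratio": braces_own_line / len(lines),
              "comment_line_ratio": comments / len(lines),
          }

      print(layout_metrics("public class A {\n\tint x; // counter\n}\n"))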

  13. Eu-NORSEWInD - Assessment of Viability of Open Source CFD Code for the Wind Industry

    DEFF Research Database (Denmark)

    Stickland, Matt; Scanlon, Tom; Fabre, Sylvie

    2009-01-01

    Part of the overall NORSEWInD project is the use of LiDAR remote sensing (RS) systems mounted on offshore platforms to measure wind velocity profiles at a number of locations offshore. The data acquired from the offshore RS measurements will be fed into a large and novel wind speed dataset suitab...... between the results of simulations created by the commercial code FLUENT and the open source code OpenFOAM. An assessment of the ease with which the open source code can be used is also included....

  14. Health physics source document for codes of practice

    International Nuclear Information System (INIS)

    Pearson, G.W.; Meggitt, G.C.

    1989-05-01

    Personnel preparing codes of practice often require basic Health Physics information or advice relating to radiological protection problems, and this document is written primarily to supply such information. Certain technical terms used in the text are explained in the extensive glossary. Due to the pace of change in the field of radiological protection it is difficult to produce an up-to-date document. This document was compiled during 1988, however, and therefore contains the principal changes brought about by the introduction of the Ionising Radiations Regulations (1985). The paper covers the nature of ionising radiation, its biological effects and the principles of control. It is hoped that the document will provide a useful source of information for both codes of practice and wider areas and stimulate readers to study radiological protection issues in greater depth. (author)

  15. Low complexity source and channel coding for mm-wave hybrid fiber-wireless links

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Vegas Olmos, Juan José; Pang, Xiaodan

    2014-01-01

    We report on the performance of channel and source coding applied for an experimentally realized hybrid fiber-wireless W-band link. Error control coding performance is presented for a wireless propagation distance of 3 m and 20 km fiber transmission. We report on peak signal-to-noise ratio perfor...

  16. Fine-Grained Energy Modeling for the Source Code of a Mobile Application

    DEFF Research Database (Denmark)

    Li, Xueliang; Gallagher, John Patrick

    2016-01-01

    The goal of an energy model for source code is to lay a foundation for the application of energy-aware programming techniques. State of the art solutions are based on source-line energy information. In this paper, we present an approach to constructing a fine-grained energy model which is able...

  17. A plug-in to Eclipse for VHDL source codes: functionalities

    Science.gov (United States)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGA and/or DSP processors. An implementation based on VEditor is described. VEditor is a free-license program; thus, the work presented in this paper supplements and extends this freely licensed tool. The introduction briefly characterizes the tools available on the market which aid the design of electronic systems in VHDL. Particular attention is paid to plug-ins for the Eclipse environment and the Emacs program. The detailed properties of the written plug-in are presented, such as the programming extension concept and the results of the activities of the formatter, refactorer, code hider, and other new additions to the VEditor program.

  18. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 4: Graphical status display

    Science.gov (United States)

    Mckee, James W.

    1990-01-01

    This volume (4 of 4) contains the description, structured flow charts, prints of the graphical displays, and source code to generate the displays for the AMPS graphical status system. The function of these displays is to present to the manager of the AMPS system a graphical status display with hot boxes that allow the manager to get more detailed status on selected portions of the AMPS system. The development of the graphical displays is divided into two processes: the creation of the screen images and their storage in files on the computer, and the running of the status program which uses the screen images.

  19. WASTK: A Weighted Abstract Syntax Tree Kernel Method for Source Code Plagiarism Detection

    Directory of Open Access Journals (Sweden)

    Deqiang Fu

    2017-01-01

    In this paper, we introduce a source code plagiarism detection method, named WASTK (Weighted Abstract Syntax Tree Kernel), for computer science education. Different from other plagiarism detection methods, WASTK takes aspects other than the similarity between programs into account. WASTK first transforms the source code of a program into an abstract syntax tree and then obtains the similarity by calculating the tree kernel of the two abstract syntax trees. To avoid misjudgment caused by trivial code snippets or frameworks given by instructors, an idea similar to TF-IDF (Term Frequency-Inverse Document Frequency) in the field of information retrieval is applied: each node in an abstract syntax tree is assigned a weight by TF-IDF. WASTK is evaluated on different datasets and, as a result, performs much better than other popular methods such as Sim and JPlag.
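
    The weighting idea above can be illustrated with a highly simplified sketch. The example below is not WASTK's actual tree kernel; it merely compares Python programs by the TF-IDF-weighted overlap of their abstract-syntax-tree node types, with all helper names and the cosine-similarity shortcut being assumptions made for illustration.

        import ast
        import math
        from collections import Counter

        def node_type_counts(source):
            """Count AST node types in one program (a crude stand-in for full tree structure)."""
            return Counter(type(n).__name__ for n in ast.walk(ast.parse(source)))

        def tfidf_vectors(corpus_counts):
            """Weight each node type by TF-IDF across the corpus, echoing WASTK's node weighting."""
            n_docs = len(corpus_counts)
            df = Counter()
            for counts in corpus_counts:
                df.update(set(counts))
            return [{t: (c / sum(counts.values())) * math.log(n_docs / df[t])
                     for t, c in counts.items()} for counts in corpus_counts]

        def cosine(u, v):
            dot = sum(u[t] * v.get(t, 0.0) for t in u)
            nu = math.sqrt(sum(x * x for x in u.values()))
            nv = math.sqrt(sum(x * x for x in v.values()))
            return dot / (nu * nv) if nu and nv else 0.0

        programs = ["def f(a, b):\n    return a + b\n",
                    "def g(x, y):\n    return x + y\n",
                    "print('hello')\n"]
        vecs = tfidf_vectors([node_type_counts(p) for p in programs])
        print(cosine(vecs[0], vecs[1]))   # near-identical structure scores high (plagiarism suspect)
        print(cosine(vecs[0], vecs[2]))   # structurally unrelated code scores near zero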

  20. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    Science.gov (United States)

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is becoming increasingly necessary in program design courses in college education. However, the trick of plagiarizing with a little modification exists in some students' homework, and it is not easy for teachers to judge whether source code has been plagiarized or not. Traditional detection algorithms cannot fit this…

  1. Liquid crystal display

    International Nuclear Information System (INIS)

    Takami, K.

    1981-01-01

    An improved liquid crystal display device is described which can display letters, numerals and other necessary patterns at night using a minimal amount of radioactive material. To achieve this, a self-luminous light source is placed in a limited region corresponding to a specific display area. (U.K.)

  2. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  3. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  4. European display scene

    Science.gov (United States)

    Bartlett, Christopher T.

    2000-08-01

    The manufacture of Flat Panel Displays (FPDs) is dominated by Far Eastern sources, particularly in Active Matrix Liquid Crystal Displays (AMLCD) and Plasma. The United States has a very powerful capability in micro-displays. It is not well known that Europe has a very active research capability which has led to many innovations in display technology. In addition, there is a capability in the manufacture of displays based on organic technologies, as well as the licensed build of Japanese or Korean designs. Finally, Europe has a world-class display systems capability in military products.

  5. BOT3P5.2, 3D Mesh Generator and Graphical Display of Geometry for Radiation Transport Codes, Display of Results

    International Nuclear Information System (INIS)

    Orsi, Roberto; Bidaud, Adrien

    2007-01-01

    1 - Description of program or function: BOT3P was originally conceived as a set of standard FORTRAN 77 language programs in order to give the users of the DORT and TORT deterministic transport codes some useful diagnostic tools to prepare and check their input data files. Later versions extended the possibility to produce the geometrical, material distribution and fixed neutron source data to other deterministic transport codes such as TWODANT/THREEDANT of the DANTSYS system, PARTISN and, potentially, to any transport code through BOT3P binary output files that can be easily interfaced (see, for example, the Russian two-dimensional (2D) and three-dimensional (3D) discrete ordinates neutron, photon and charged particle transport codes KASKAD-S-2.5 and KATRIN-2.0). As of Version 5.1, BOT3P contains important additions specifically addressed to radiation transport analysis for medical applications. BOT3P-5.2 contains new graphics capabilities. Some of them enable users to select space sub-domains of the total mesh grid in order to improve the zoom simulation of the geometry, both in 2D cuts and in 3D. Moreover, the new BOT3P module (PDTM) may improve the interface of BOT3P geometrical models to transport analysis codes. The following programs are included in the BOT3P software package: GGDM, DDM, GGTM, DTM2, DTM3, RVARSCL, COMPARE, MKSRC, CATSM, DTET, and PDTM. The main features of these different programs are described. 2 - Methods: GGDM and GGTM work similarly from the logical point of view. Since the 3D case is more general, the following description refers to GGTM. All the co-ordinate values that characterise the geometrical scheme at the basis of the 3D transport code geometrical and material model are read, sorted and stored if they differ from the neighbouring ones by more than an input tolerance established by the user. These co-ordinates are always present in the fine-mesh boundary arrays independently of the mesh grid refinement options, because they

  6. D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things.

    Science.gov (United States)

    Aktas, Metin; Kuscu, Murat; Dinc, Ergin; Akan, Ozgur B

    2018-01-01

    Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce the power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose the Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of the classical DSC by employing the decoding delay concept which enables the use of the maximum correlated portion of sensor samples during the event estimation. In D-DSC, the network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. The sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of the correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications having a massive number of sensors towards the realization of Internet of Sensing Things (IoST).

  7. Documentation for grants equal to tax model: Volume 3, Source code

    International Nuclear Information System (INIS)

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations

  8. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used which is developed independently of the code generator. For this purpose ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  9. A Source Term Calculation for the APR1400 NSSS Auxiliary System Components Using the Modified SHIELD Code

    International Nuclear Information System (INIS)

    Park, Hong Sik; Kim, Min; Park, Seong Chan; Seo, Jong Tae; Kim, Eun Kee

    2005-01-01

    The SHIELD code has been used to calculate the source terms of NSSS Auxiliary System (comprising CVCS, SIS, and SCS) components of the OPR1000. Because the code had been developed based upon the SYSTEM80 design, and the APR1400 NSSS Auxiliary System design differs considerably from that of SYSTEM80 or OPR1000, the SHIELD code cannot be used directly for APR1400 radiation design. Thus hand calculation is needed for the portions affected by design changes, using the results of the SHIELD code calculation. In this study, the SHIELD code is modified to incorporate the APR1400 design changes, and the source term calculation is performed for the APR1400 NSSS Auxiliary System components

  10. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    Science.gov (United States)

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.

  11. CMCpy: Genetic Code-Message Coevolution Models in Python

    Science.gov (United States)

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367

  12. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
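
    Tangent performs this transformation for a broad syntactic subset of Python and NumPy; the toy fragment below is only a sketch of the underlying source-transformation idea, limited to sums and products of a single variable x, and does not use or resemble Tangent's actual API.

        import ast

        def d(node):
            """Symbolic derivative of an expression AST with respect to the variable 'x'."""
            if isinstance(node, ast.Name):
                return ast.Constant(1.0 if node.id == "x" else 0.0)
            if isinstance(node, ast.Constant):
                return ast.Constant(0.0)
            if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
                return ast.BinOp(d(node.left), ast.Add(), d(node.right))
            if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
                # product rule: (u*v)' = u'*v + u*v'
                return ast.BinOp(ast.BinOp(d(node.left), ast.Mult(), node.right),
                                 ast.Add(),
                                 ast.BinOp(node.left, ast.Mult(), d(node.right)))
            raise NotImplementedError(type(node.op).__name__)

        def derivative_source(expr_src):
            """Parse an expression, transform its AST, and emit new source (ast.unparse needs Python 3.9+)."""
            tree = ast.parse(expr_src, mode="eval")
            return ast.unparse(d(tree.body))

        print(derivative_source("x*x + 3*x"))   # prints an unsimplified derivative such as 1.0 * x + x * 1.0 + (0.0 * x + 3 * 1.0)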

  13. Speckle noise reduction on a laser projection display via a broadband green light source.

    Science.gov (United States)

    Yu, Nan Ei; Choi, Ju Won; Kang, Heejong; Ko, Do-Kyeong; Fu, Shih-Hao; Liou, Jiun-Wei; Kung, Andy H; Choi, Hee Joo; Kim, Byoung Joo; Cha, Myoungsik; Peng, Lung-Han

    2014-02-10

    A broadband green light source was demonstrated using a tandem-poled lithium niobate (TPLN) crystal. The measured wavelength and temperature bandwidths were 6.5 nm and 100 °C, respectively; the spectral bandwidth was 36 times broader than in the periodically poled case. Although the conversion efficiency was smaller than in the periodic case, the TPLN device had a good figure of merit owing to the extremely large bandwidth in wavelength and temperature. The developed broadband green light source exhibited speckle noise approximately one-seventh of that in the conventional approach for a laser projection display.

  14. Code of Conduct on the Safety and Security of Radioactive Sources and the Supplementary Guidance on the Import and Export of Radioactive Sources

    International Nuclear Information System (INIS)

    2005-01-01

    In operative paragraph 4 of its resolution GC(47)/RES/7.B, the General Conference, having welcomed the approval by the Board of Governors of the revised IAEA Code of Conduct on the Safety and Security of Radioactive Sources (GC(47)/9), and while recognizing that the Code is not a legally binding instrument, urged each State to write to the Director General that it fully supports and endorses the IAEA's efforts to enhance the safety and security of radioactive sources and is working toward following the guidance contained in the IAEA Code of Conduct. In operative paragraph 5, the Director General was requested to compile, maintain and publish a list of States that have made such a political commitment. The General Conference, in operative paragraph 6, recognized that this procedure 'is an exceptional one, having no legal force and only intended for information, and therefore does not constitute a precedent applicable to other Codes of Conduct of the Agency or of other bodies belonging to the United Nations system'. In operative paragraph 7 of resolution GC(48)/RES/10.D, the General Conference welcomed the fact that more than 60 States had made political commitments with respect to the Code in line with resolution GC(47)/RES/7.B and encouraged other States to do so. In operative paragraph 8 of resolution GC(48)/RES/10.D, the General Conference further welcomed the approval by the Board of Governors of the Supplementary Guidance on the Import and Export of Radioactive Sources (GC(48)/13), endorsed this Guidance while recognizing that it is not legally binding, noted that more than 30 countries had made clear their intention to work towards effective import and export controls by 31 December 2005, and encouraged States to act in accordance with the Guidance on a harmonized basis and to notify the Director General of their intention to do so as supplementary information to the Code of Conduct, recalling operative paragraph 6 of resolution GC(47)/RES/7.B. 4. The

  15. Theoretical Atomic Physics code development II: ACE: Another collisional excitation code

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Csanak, G.; Mann, J.B.; Cowan, R.D.

    1988-12-01

    A new computer code for calculating collisional excitation data (collision strengths or cross sections) using a variety of models is described. The code uses data generated by the Cowan Atomic Structure code or CATS for the atomic structure. Collisional data are placed on a random access file and can be displayed in a variety of formats using the Theoretical Atomic Physics Code or TAPS. All of these codes are part of the Theoretical Atomic Physics code development effort at Los Alamos. 15 refs., 10 figs., 1 tab

  16. Hybrid digital-analog coding with bandwidth expansion for correlated Gaussian sources under Rayleigh fading

    Science.gov (United States)

    Yahampath, Pradeepa

    2017-12-01

    Consider communicating a correlated Gaussian source over a Rayleigh fading channel with no knowledge of the channel signal-to-noise ratio (CSNR) at the transmitter. In this case, a digital system cannot be optimal for a range of CSNRs. Analog transmission however is optimal at all CSNRs, if the source and channel are memoryless and bandwidth matched. This paper presents new hybrid digital-analog (HDA) systems for sources with memory and channels with bandwidth expansion, which outperform both digital-only and analog-only systems over a wide range of CSNRs. The digital part is either a predictive quantizer or a transform code, used to achieve a coding gain. Analog part uses linear encoding to transmit the quantization error which improves the performance under CSNR variations. The hybrid encoder is optimized to achieve the minimum AMMSE (average minimum mean square error) over the CSNR distribution. To this end, analytical expressions are derived for the AMMSE of asymptotically optimal systems. It is shown that the outage CSNR of the channel code and the analog-digital power allocation must be jointly optimized to achieve the minimum AMMSE. In the case of HDA predictive quantization, a simple algorithm is presented to solve the optimization problem. Experimental results are presented for both Gauss-Markov sources and speech signals.
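
    As a rough numerical illustration of the hybrid idea (not the paper's actual scheme), the sketch below quantizes a first-order Gauss-Markov source, assumes the digital indices arrive error-free, and superimposes the scaled quantization error as an analog layer, so the reconstruction error simply tracks the actual channel SNR. The power split and all parameter values are arbitrary assumptions made for the example.

        import numpy as np

        rng = np.random.default_rng(0)
        n, rho = 10_000, 0.9
        levels = np.linspace(-3, 3, 16)               # 4-bit uniform quantizer codebook

        # first-order Gauss-Markov source
        x = np.empty(n)
        x[0] = rng.normal()
        for k in range(1, n):
            x[k] = rho * x[k - 1] + np.sqrt(1 - rho ** 2) * rng.normal()

        # digital layer: quantize (indices assumed delivered error-free by the channel code)
        idx = np.abs(x[:, None] - levels[None, :]).argmin(axis=1)
        xq = levels[idx]
        err = x - xq

        # analog layer: transmit the scaled quantization error over AWGN
        p_analog = 0.5                                # fraction of transmit power given to the analog layer
        g = np.sqrt(p_analog / err.var())             # scale the error to the analog power budget
        for csnr_db in (0, 10, 20):
            noise = rng.normal(scale=np.sqrt(p_analog / 10 ** (csnr_db / 10)), size=n)
            x_hat = xq + (g * err + noise) / g        # receiver adds back the analog correction
            print(csnr_db, "dB  reconstruction MSE:", np.mean((x - x_hat) ** 2))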

  17. Comparison of TG‐43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes

    Science.gov (United States)

    Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.

    2016-01-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross‐sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross‐sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in  125I and  103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code — MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low‐energy sources such as  125I and  103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for  103Pd and 10 cm for  125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for  192Ir and less than 1.2% for  137Cs between the three codes. PACS number(s): 87.56.bg PMID:27074460

  18. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be

  19. A content analysis of displayed alcohol references on a social networking web site.

    Science.gov (United States)

    Moreno, Megan A; Briner, Leslie R; Williams, Amanda; Brockman, Libby; Walker, Leslie; Christakis, Dimitri A

    2010-08-01

    Exposure to alcohol use in media is associated with adolescent alcohol use. Adolescents frequently display alcohol references on Internet media, such as social networking web sites. The purpose of this study was to conduct a theoretically based content analysis of older adolescents' displayed alcohol references on a social networking web site. We evaluated 400 randomly selected public MySpace profiles of self-reported 17- to 20-year-olds from zip codes, representing urban, suburban, and rural communities in one Washington county. Content was evaluated for alcohol references, suggesting: (1) explicit versus figurative alcohol use, (2) alcohol-related motivations, associations, and consequences, including references that met CRAFFT problem drinking criteria. We compared profiles from four target zip codes for prevalence and frequency of alcohol display. Of 400 profiles, 225 (56.3%) contained 341 references to alcohol. Profile owners who displayed alcohol references were mostly male (54.2%) and white (70.7%). The most frequent reference category was explicit use (49.3%); the most commonly displayed alcohol use motivation was peer pressure (4.7%). Few references met CRAFFT problem drinking criteria (3.2%). There were no differences in prevalence or frequency of alcohol display among the four sociodemographic communities. Despite alcohol use being illegal and potentially stigmatizing in this population, explicit alcohol use is frequently referenced on adolescents' MySpace profiles across several sociodemographic communities. Motivations, associations, and consequences regarding alcohol use referenced on MySpace appear consistent with previous studies of adolescent alcohol use. These references may be a potent source of influence on adolescents, particularly given that they are created and displayed by peers. (c) 2010 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  20. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures. On the other hand, Prolog's non-determinism and backtracking ease testing of different variations of the program flow without much effort. A rule-based approach with Prolog allows the verification goals to be characterized in a concise and declarative way. In this paper, we describe our approach to verify the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then further tested against verification goals. The different program-flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study, where we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain specific language (DSL) in Prolog to express the verification goals.

  1. Multi-rate control over AWGN channels via analog joint source-channel coding

    KAUST Repository

    Khina, Anatoly; Pettersson, Gustav M.; Kostina, Victoria; Hassibi, Babak

    2017-01-01

    We consider the problem of controlling an unstable plant over an additive white Gaussian noise (AWGN) channel with a transmit power constraint, where the signaling rate of communication is larger than the sampling rate (for generating observations and applying control inputs) of the underlying plant. Such a situation is quite common since sampling is done at a rate that captures the dynamics of the plant and which is often much lower than the rate that can be communicated. This setting offers the opportunity of improving the system performance by employing multiple channel uses to convey a single message (output plant observation or control input). Common ways of doing so are through either repeating the message, or by quantizing it to a number of bits and then transmitting a channel coded version of the bits whose length is commensurate with the number of channel uses per sampled message. We argue that such “separated source and channel coding” can be suboptimal and propose to perform joint source-channel coding. Since the block length is short we obviate the need to go to the digital domain altogether and instead consider analog joint source-channel coding. For the case where the communication signaling rate is twice the sampling rate, we employ the Archimedean bi-spiral-based Shannon-Kotel'nikov analog maps to show significant improvement in stability margins and linear-quadratic Gaussian (LQG) costs over simple schemes that employ repetition.

  2. Multi-rate control over AWGN channels via analog joint source-channel coding

    KAUST Repository

    Khina, Anatoly

    2017-01-05

    We consider the problem of controlling an unstable plant over an additive white Gaussian noise (AWGN) channel with a transmit power constraint, where the signaling rate of communication is larger than the sampling rate (for generating observations and applying control inputs) of the underlying plant. Such a situation is quite common since sampling is done at a rate that captures the dynamics of the plant and which is often much lower than the rate that can be communicated. This setting offers the opportunity of improving the system performance by employing multiple channel uses to convey a single message (output plant observation or control input). Common ways of doing so are through either repeating the message, or by quantizing it to a number of bits and then transmitting a channel coded version of the bits whose length is commensurate with the number of channel uses per sampled message. We argue that such “separated source and channel coding” can be suboptimal and propose to perform joint source-channel coding. Since the block length is short we obviate the need to go to the digital domain altogether and instead consider analog joint source-channel coding. For the case where the communication signaling rate is twice the sampling rate, we employ the Archimedean bi-spiral-based Shannon-Kotel'nikov analog maps to show significant improvement in stability margins and linear-quadratic Gaussian (LQG) costs over simple schemes that employ repetition.
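
    For readers unfamiliar with such maps, the fragment below is a rough, self-contained sketch of a 1:2 bandwidth-expanding double-Archimedean-spiral encoder with brute-force maximum-likelihood decoding over a grid of candidate source values. The stretch factor, grid, and noise scaling are illustrative assumptions and do not reproduce the authors' construction or results.

        import numpy as np

        rng = np.random.default_rng(1)
        delta = 2.0   # spiral stretch factor, an illustrative choice rather than a value from the paper

        def encode(x):
            """Map scalars to points on a double Archimedean spiral (1 source sample -> 2 channel uses)."""
            x = np.asarray(x, dtype=float)
            phi = delta * np.abs(x)
            pts = np.stack([phi * np.cos(phi), phi * np.sin(phi)], axis=-1)
            return np.sign(x)[..., None] * pts

        def decode(y, grid):
            """Brute-force ML decoding: pick the candidate whose spiral point is closest to each received pair."""
            pts = encode(grid)                                   # (G, 2)
            d2 = ((y[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
            return grid[d2.argmin(axis=1)]

        x = rng.normal(size=1000)
        grid = np.linspace(-4.0, 4.0, 2001)
        for snr_db in (5, 15, 25):
            noise = rng.normal(scale=10 ** (-snr_db / 20), size=(x.size, 2))  # nominal channel noise level
            x_hat = decode(encode(x) + noise, grid)
            print(snr_db, "dB  MSE:", np.mean((x - x_hat) ** 2))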

  3. Running the source term code package in Elebra MX-850

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-01-01

    The source term code package (STCP) is one of the main tools applied in calculations of the behavior of fission products from nuclear power plants. It is a set of computer codes to assist the calculation of the radioactive materials released from the metallic containment of power reactors to the environment during a severe reactor accident. The original version of STCP runs on SDC computer systems, but as it has been written in FORTRAN 77, it is possible to run it on other systems such as IBM, Burroughs, Elebra, etc. The Elebra MX-8500 version of STCP contains 5 codes: March 3, Trapmelt, Tcca, Vanessa and Nava. The example presented in this report considers a small LOCA accident in a PWR-type reactor. (M.I.)

  4. Code of practice for the use of sealed radioactive sources in borehole logging (1998)

    International Nuclear Information System (INIS)

    1989-12-01

    The purpose of this code is to establish working practices, procedures and protective measures which will aid in keeping doses arising from the use of borehole logging equipment containing sealed radioactive sources as low as reasonably achievable, and to ensure that the dose-equivalent limits specified in the National Health and Medical Research Council's radiation protection standards are not exceeded. This code applies to all situations and practices where a sealed radioactive source or sources are used in wireline logging for investigating the physical properties of the geological sequence, any fluids contained in the geological sequence, or the properties of the borehole itself, whether casing, mudcake or borehole fluids. The radiation protection standards specify dose-equivalent limits for two categories: radiation workers and members of the public. 3 refs., tabs., ills

  5. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    Science.gov (United States)

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  6. Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code

    Science.gov (United States)

    Faghihi, F.; Mehdizadeh, S.; Hadad, K.

    The neutron fluence rate of a low-intensity Pu-Be source is measured by Neutron Activation Analysis (NAA) of 197Au foils. In addition, the neutron fluence rate distribution versus energy is calculated using the MCNP-4B code based on the ENDF/B-V library. The theoretical simulation, together with our experimental work, is a new experience for Iranian researchers and builds confidence in the code for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated using the MCNP-4B code. The fast and thermal neutron fluence rates obtained by the NAA measurements and by the MCNP calculation are compared.

  7. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  8. X-33 Telemetry Best Source Selection, Processing, Display, and Simulation Model Comparison

    Science.gov (United States)

    Burkes, Darryl A.

    1998-01-01

    The X-33 program requires the use of multiple telemetry ground stations to cover the launch, ascent, transition, descent, and approach phases for the flights from Edwards AFB to landings at Dugway Proving Grounds, UT and Malmstrom AFB, MT. This paper will discuss the X-33 telemetry requirements and design, including information on fixed and mobile telemetry systems, best source selection, and support for Range Safety Officers. A best source selection system will be utilized to automatically determine the best source based on the frame synchronization status of the incoming telemetry streams. These systems will be used to select the best source at the landing sites and at NASA Dryden Flight Research Center to determine the overall best source between the launch site, intermediate sites, and landing site sources. The best source at the landing sites will be decommutated to display critical flight safety parameters for the Range Safety Officers. The overall best source will be sent to Lockheed Martin's Operational Control Center at Edwards AFB for performance monitoring by X-33 program personnel and for monitoring of critical flight safety parameters by the primary Range Safety Officer. The real-time telemetry data (received signal strength, etc.) from each of the primary ground stations will also be compared during each mission with simulation data generated using the Dynamic Ground Station Analysis software program. An overall assessment of the accuracy of the model will occur after each mission. Acknowledgment: The work described in this paper was supported by NASA through cooperative agreement NCC8-115 with Lockheed Martin Skunk Works.
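
    As a rough illustration of frame-sync-based best source selection, one might rank incoming streams as in the sketch below; the field names and the completeness-based scoring rule are assumptions made for the example, not the X-33 ground system's actual logic.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class TelemetryStream:
            station: str
            frame_sync_locked: bool
            frames_received: int
            frames_expected: int

        def best_source(streams) -> Optional[TelemetryStream]:
            """Pick the stream that has frame sync lock and the highest frame completeness."""
            locked = [s for s in streams if s.frame_sync_locked and s.frames_expected > 0]
            if not locked:
                return None
            return max(locked, key=lambda s: s.frames_received / s.frames_expected)

        streams = [TelemetryStream("Edwards", True, 9750, 10000),
                   TelemetryStream("Dugway", True, 9990, 10000),
                   TelemetryStream("Malmstrom", False, 120, 10000)]
        print(best_source(streams).station)   # Dugway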

  9. From system requirements to source code: transitions in UML and RUP

    Directory of Open Access Journals (Sweden)

    Stanisław Wrycza

    2011-06-01

    There are many manuals explaining the language specification among UML-related books, but only some of them concentrate on the practical aspects of using the UML language effectively with CASE tools and RUP. The current paper presents transitions from the system requirements specification to structural source code, useful while developing an information system.

  10. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system.

    Science.gov (United States)

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

    We demonstrate that direct sequence optical code- division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes generated by spreading in time-coherent data pulses can result from multiple reflections in the interferometers that can superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of complex amplitudes of the combined chip as the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes.

  11. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system

    Science.gov (United States)

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

    We demonstrate that direct sequence optical code- division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes generated by spreading in time-coherent data pulses can result from multiple reflections in the interferometers that can superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of complex amplitudes of the combined chip as the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes.

  12. RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations

    Science.gov (United States)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy

    RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross-platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node, have improved performance and scalability, enhanced accuracy and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.

  13. Producing EGS4 shower displays with the Unified Graphics System

    International Nuclear Information System (INIS)

    Cowan, R.F.

    1990-01-01

    The EGS4 Code System has been coupled with the SLAC Unified Graphics System in such a manner as to provide a means for displaying showers on UGS77-supported devices. This is most easily accomplished by attaching an auxiliary subprogram package (SHOWGRAF) to existing EGS4 User Codes and making use of a graphics display or a post-processor code called EGS4PL. SHOWGRAF may be used to create shower displays directly on interactive IBM 5080 color display devices, supporting three-dimensional rotations, translations, and zoom features, and providing illustration of particle types and energies by color and/or intensity. Alternatively, SHOWGRAF may be used to record a two-dimensional projection of the shower in a device-independent graphics file. The EGS4PL post-processor may then be used to convert this file into device-dependent graphics code for any UGS77-supported device. Options exist within EGS4PL that allow for two-dimensional translations and zoom, for creating line structure to indicate particle types and energies, and for optional display of particles by type. All of this is facilitated by means of the command processor EGS4PL EXEC together with new options (5080 and PDEV) with the standard EGS4IN EXEC routine for running EGS4 interactively under VM/SP. 6 refs

  14. Coded moderator approach for fast neutron source detection and localization at standoff

    Energy Technology Data Exchange (ETDEWEB)

    Littell, Jennifer [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Lukosi, Eric, E-mail: elukosi@utk.edu [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Institute for Nuclear Security, University of Tennessee, 1640 Cumberland Avenue, Knoxville, TN 37996 (United States); Hayward, Jason; Milburn, Robert; Rowan, Allen [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States)

    2015-06-01

    Considering the need for directional sensing at standoff for some security applications and scenarios where a neutron source may be shielded by high Z material that nearly eliminates the source gamma flux, this work focuses on investigating the feasibility of using thermal neutron sensitive boron straw detectors for fast neutron source detection and localization. We utilized MCNPX simulations to demonstrate that, through surrounding the boron straw detectors by a HDPE coded moderator, a source-detector orientation-specific response enables potential 1D source localization in a high neutron detection efficiency design. An initial test algorithm has been developed in order to confirm the viability of this detector system's localization capabilities which resulted in identification of a 1 MeV neutron source with a strength equivalent to 8 kg WGPu at 50 m standoff within ±11°.

  15. Source Code Stylometry Improvements in Python

    Science.gov (United States)

    2017-12-14

    Just as a person can be identified via their handwriting or an author identified by their style of prose, programmers can be identified by their code. Provided a labelled training set of code samples and features such as the corresponding abstract syntax trees (Caliskan-Islam et al. 2015), the techniques used in stylometry can identify the author of a piece of code or even

  16. Refreshable Braille displays using EAP actuators

    Science.gov (United States)

    Bar-Cohen, Yoseph

    2010-04-01

    Refreshable Braille can help visually impaired persons benefit from the growing advances in computer technology. The development of such displays in a full screen form is a great challenge due to the need to pack many actuators in small area without interferences. In recent years, various displays using actuators such as piezoelectric stacks have become available in commercial form but most of them are limited to one line Braille code. Researchers in the field of electroactive polymers (EAP) investigated methods of using these materials to form full screen displays. This manuscript reviews the state of the art of producing refreshable Braille displays using EAP-based actuators.

  17. Refreshable Braille Displays Using EAP Actuators

    Science.gov (United States)

    Bar-Cohen, Yoseph

    2010-01-01

    Refreshable Braille can help visually impaired persons benefit from the growing advances in computer technology. The development of such displays in a full screen form is a great challenge due to the need to pack many actuators in small area without interferences. In recent years, various displays using actuators such as piezoelectric stacks have become available in commercial form but most of them are limited to one line Braille code. Researchers in the field of electroactive polymers (EAP) investigated methods of using these materials to form full screen displays. This manuscript reviews the state of the art of producing refreshable Braille displays using EAP-based actuators.

  18. WE-D-9A-06: Open Source Monitor Calibration and Quality Control Software for Enterprise Display Management

    Energy Technology Data Exchange (ETDEWEB)

    Bevins, N; Vanderhoek, M; Lang, S; Flynn, M [Henry Ford Health System, Detroit, MI (United States)

    2014-06-15

    Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.

  19. WE-D-9A-06: Open Source Monitor Calibration and Quality Control Software for Enterprise Display Management

    International Nuclear Information System (INIS)

    Bevins, N; Vanderhoek, M; Lang, S; Flynn, M

    2014-01-01

    Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system

  20. BNL325 - Nuclear reaction data display program

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1994-01-01

    A computer code for the graphical display of nuclear reaction data is described. The code, which works on a computer with VMS operating system, can overlay experimental data from an EXFOR/CSISRS table-computation format with evaluated data from ENDF formatted data libraries. Originally, this code has been used at the U.S. National Nuclear Data Center to produce the well-known neutron cross-section atlas published as report BNL-325. (author). 3 tabs

  1. BNL325 - Nuclear reaction data display program

    Energy Technology Data Exchange (ETDEWEB)

    Dunford, C L

    1994-11-27

    A computer code for the graphical display of nuclear reaction data is described. The code, which works on a computer with VMS operating system, can overlay experimental data from an EXFOR/CSISRS table-computation format with evaluated data from ENDF formatted data libraries. Originally, this code has been used at the U.S. National Nuclear Data Center to produce the well-known neutron cross-section atlas published as report BNL-325. (author). 3 tabs.

  2. Securing information display by use of visual cryptography.

    Science.gov (United States)

    Yamamoto, Hirotsugu; Hayasaki, Yoshio; Nishida, Nobuo

    2003-09-01

    We propose a secure display technique based on visual cryptography. The proposed technique ensures the security of visual information. The display employs a decoding mask based on visual cryptography. Without the decoding mask, the displayed information cannot be viewed. The viewing zone is limited by the decoding mask so that only one person can view the information. We have developed a set of encryption codes to maintain the designed viewing zone and have demonstrated a display that provides a limited viewing zone.
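
    The display scheme above builds on standard visual cryptography; the sketch below shows only the generic 2-out-of-2 construction with 2x2 subpixel patterns (not the authors' display-specific encryption codes). Stacking the two shares reveals the secret image, while either share alone is indistinguishable from random noise.

        import numpy as np

        # 2x2 subpixel patterns (1 = black subpixel). Identical patterns overlay to a half-black
        # block (perceived as white); complementary patterns overlay to an all-black block.
        P = np.array([[[1, 0], [0, 1]],
                      [[0, 1], [1, 0]]])

        def make_shares(secret, rng=None):
            """secret: 2D array of 0 (white) / 1 (black). Returns two shares, each twice as large."""
            if rng is None:
                rng = np.random.default_rng()
            h, w = secret.shape
            s1 = np.zeros((2 * h, 2 * w), dtype=int)
            s2 = np.zeros_like(s1)
            for i in range(h):
                for j in range(w):
                    k = rng.integers(2)                       # random pattern choice per pixel
                    s1[2*i:2*i+2, 2*j:2*j+2] = P[k]
                    s2[2*i:2*i+2, 2*j:2*j+2] = P[k] if secret[i, j] == 0 else P[1 - k]
            return s1, s2

        def overlay(s1, s2):
            """Physically stacking transparencies is a pixel-wise OR of the black subpixels."""
            return np.maximum(s1, s2)

        secret = np.array([[1, 0], [0, 1]])
        s1, s2 = make_shares(secret)
        print(overlay(s1, s2))   # black secret pixels become all-black 2x2 blocks, white ones half-black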

  3. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
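
    A minimal sketch of the double-difference idea for the two-band case is shown below; the function names are invented for illustration and the few-line NumPy version is not the patented device's implementation. The cross-delta between two correlated bands is followed by an adjacent-delta along the result, and the inverse post-decoding reverses the two steps.

        import numpy as np

        def double_difference(band_a, band_b):
            """Cross-delta between two correlated bands, then adjacent-delta along the result."""
            cross = band_b - band_a                 # cross-delta removes inter-band correlation
            return np.diff(cross, prepend=0)        # adjacent-delta removes remaining intra-band correlation

        def recover_band_b(band_a, dd):
            """Inverse post-decoding: undo the adjacent-delta, then the cross-delta."""
            return band_a + np.cumsum(dd)

        a = np.array([10, 12, 15, 15, 14])          # one spectral band
        b = np.array([11, 14, 18, 17, 15])          # an adjacent, highly correlated band
        dd = double_difference(a, b)                # small values are cheaper to entropy code
        assert np.array_equal(recover_band_b(a, dd), b)
        print(dd)                                   # [ 1  1  1 -1 -1]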

  4. Design of Programmable LED Controller with a Variable Current Source for 3D Image Display

    Directory of Open Access Journals (Sweden)

    Kyung-Ryang Lee

    2014-12-01

    Conventional fluorescent light sources, as well as incandescent light sources, are gradually being replaced by Light Emitting Diodes (LEDs) to reduce power consumption in the image display area for multimedia applications. An LED light source requires a controller with low-power operation. In this paper, a low-power technique using adiabatic operation is applied to the implementation of an LED controller with stable constant-current, low-power and low-heat functions. From the simulation results, the power consumption of the proposed LED controller using adiabatic operation was reduced to about 87% in comparison with conventional operation with a constant VDD. The proposed circuit is expected to be an alternative LED controller which is sensitive to external conditions such as heat.

  5. Aberdeen polygons: computer displays of physiological profiles for intensive care.

    Science.gov (United States)

    Green, C A; Logie, R H; Gilhooly, K J; Ross, D G; Ronald, A

    1996-03-01

    The clinician in an intensive therapy unit is presented regularly with a range of information about the current physiological state of the patients under care. This information typically comes from a variety of sources and in a variety of formats. A more integrated form of display incorporating several physiological parameters may therefore be helpful. Three experiments are reported that explored the potential use of analogue, polygon diagrams to display physiological data from patients undergoing intensive therapy. Experiment 1 demonstrated that information can be extracted readily from such diagrams comprising 8- or 10-sided polygons, but with an advantage for simpler polygons and for information displayed at the top of the diagram. Experiment 2 showed that colour coding removed these biases towards simpler polygons and the top of the diagram, and also speeded up processing. Experiment 3 used polygons displaying patterns of physiological data that were consistent with typical conditions observed in the intensive care unit. It was found that physicians can readily learn to recognize these patterns and to diagnose both the nature and severity of the patient's physiological state. These polygon diagrams appear to have some considerable potential for use in providing on-line summary information of a patient's physiological state.
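
    An integrated polygon display of this kind can be prototyped in a few lines, as in the sketch below; the parameter names and normalized values are invented for illustration, and the layout is not the actual Aberdeen polygon format.

        import numpy as np
        import matplotlib.pyplot as plt

        # hypothetical physiological parameters, normalized so that 0.5 is mid-range
        params = {"HR": 0.55, "MAP": 0.40, "SpO2": 0.90, "RR": 0.35,
                  "Temp": 0.50, "CVP": 0.45, "Urine": 0.60, "FiO2": 0.30}

        labels = list(params)
        values = list(params.values())
        angles = np.linspace(0, 2 * np.pi, len(values), endpoint=False)

        # close the polygon by repeating the first vertex
        angles = np.concatenate([angles, angles[:1]])
        values = values + values[:1]

        ax = plt.subplot(111, polar=True)
        ax.plot(angles, values, color="tab:blue")
        ax.fill(angles, values, color="tab:blue", alpha=0.25)
        ax.set_xticks(angles[:-1])
        ax.set_xticklabels(labels)
        ax.set_yticklabels([])
        ax.set_ylim(0, 1)
        plt.show()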

  6. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the Foxbase language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consist of 11 files, one for the organ code and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself from among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data processing program was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology.
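
    A minimal sketch of the two-step lookup described above, in which the first digit of the chosen organ code selects the pathology dictionary used for the second lookup. The dictionary entries and names below are hypothetical miniatures; the real program uses Foxbase dictionary files.

```python
# Hypothetical miniature dictionaries; the real ACR dictionaries are far larger.
ORGAN_CODES = {"chest, lung": "131", "skull": "11"}
PATHOLOGY_FILES = {"1": {"pneumonia, lobar": "3661"}}  # keyed by first digit of the organ code

def build_acr_code(organ_name, pathology_name):
    """Two-step ACR lookup: organ code first, then the matching pathology file."""
    organ = ORGAN_CODES[organ_name]
    pathology = PATHOLOGY_FILES[organ[0]][pathology_name]
    return f"{organ}.{pathology}"

print(build_acr_code("chest, lung", "pneumonia, lobar"))  # -> 131.3661
```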

  7. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  8. Bit rates in audio source coding

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.

    1992-01-01

    The goal is to introduce and solve the audio coding optimization problem. Psychoacoustic results such as masking and excitation pattern models are combined with results from rate distortion theory to formulate the audio coding optimization problem. The solution of the audio optimization problem is a

  9. COMPASS: A source term code for investigating capillary barrier performance

    International Nuclear Information System (INIS)

    Zhou, Wei; Apted, J.J.

    1996-01-01

    A computer code, COMPASS, based on a compartment model approach, has been developed to calculate the near-field source term of a high-level-waste repository under unsaturated conditions. COMPASS is applied to evaluate the expected performance of Richard's (capillary) barriers as backfills to divert infiltrating groundwater at Yucca Mountain. Comparing the release rates of four typical nuclides with and without the Richard's barrier, it is shown that the Richard's barrier significantly decreases the peak release rates from the Engineered Barrier System (EBS) into the host rock.

  10. Time-dependent anisotropic external sources in transient 3-D transport code TORT-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    This paper describes the implementation of a time-dependent distributed external source in TORT-TD by explicitly considering the external source in the ''fixed-source'' term of the implicitly time-discretised 3-D discrete ordinates transport equation. Anisotropy of the external source is represented by a spherical harmonics series expansion similar to the angular fluxes. The YALINA-Thermal subcritical assembly serves as a test case. The configuration with 280 fuel rods has been analysed with TORT-TD using cross sections in 18 energy groups and P1 scattering order generated by the KAPROS code system. Good agreement is achieved concerning the multiplication factor. The response of the system to an artificial time-dependent source consisting of two square-wave pulses demonstrates the time-dependent external source capability of TORT-TD. The result is physically plausible as judged from validation calculations. (orig.)
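
    For orientation, the phrase "spherical harmonics series expansion" typically refers to an expansion of the following form (standard transport notation; the exact truncation order and normalisation used in TORT-TD are not given in the abstract):

```latex
Q(\mathbf{r},\boldsymbol{\Omega},t)\;\approx\;\sum_{l=0}^{L}\frac{2l+1}{4\pi}\sum_{m=-l}^{l}Q_{lm}(\mathbf{r},t)\,Y_{lm}(\boldsymbol{\Omega}),
```

    where the source moments Q_lm play a role analogous to the angular flux moments.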

  11. Experimental benchmark of the NINJA code for application to the Linac4 H- ion source plasma

    Science.gov (United States)

    Briefi, S.; Mattei, S.; Rauner, D.; Lettry, J.; Tran, M. Q.; Fantz, U.

    2017-10-01

    For a dedicated performance optimization of negative hydrogen ion sources applied at particle accelerators, a detailed assessment of the plasma processes is required. Due to the compact design of these sources, diagnostic access is typically limited to optical emission spectroscopy yielding only line-of-sight integrated results. In order to allow for a spatially resolved investigation, the electromagnetic particle-in-cell Monte Carlo collision code NINJA has been developed for the Linac4 ion source at CERN. This code considers the RF field generated by the ICP coil as well as the external static magnetic fields and calculates self-consistently the resulting discharge properties. NINJA is benchmarked at the diagnostically well accessible lab experiment CHARLIE (Concept studies for Helicon Assisted RF Low pressure Ion sourcEs) at varying RF power and gas pressure. A good general agreement is observed between experiment and simulation although the simulated electron density trends for varying pressure and power as well as the absolute electron temperature values deviate slightly from the measured ones. This can be explained by the assumption of strong inductive coupling in NINJA, whereas the CHARLIE discharges show the characteristics of loosely coupled plasmas. For the Linac4 plasma, this assumption is valid. Accordingly, both the absolute values of the accessible plasma parameters and their trends for varying RF power agree well in measurement and simulation. At varying RF power, the H- current extracted from the Linac4 source peaks at 40 kW. For volume operation, this is perfectly reflected by assessing the processes in front of the extraction aperture based on the simulation results where the highest H- density is obtained for the same power level. In surface operation, the production of negative hydrogen ions at the converter surface can only be considered by specialized beam formation codes, which require plasma parameters as input. It has been demonstrated that

  12. Theory of the space-dependent fuel management computer code ''UAFCC''

    International Nuclear Information System (INIS)

    El-Meshad, Y.; Morsy, S.; El-Osery, I.A.

    1981-01-01

    This report presents the theory of the spatial burnup computer code ''UAFCC'', which has been constructed as part of an integrated reactor calculation scheme proposed at the Reactors Department of the ARE Atomic Energy Authority. ''UAFCC'' is a single-energy, one-dimensional diffusion burnup FORTRAN computer code for well moderated, multiregion, cylindrical thermal reactors. The effect of reactivity variation with burnup is introduced into the steady-state diffusion equation by a fictitious neutron source. The infinite multiplication factor, the total migration area, and the power density per unit thermal flux are calculated from the point-model burnup code ''UABUC'', fitted to polynomials of suitable degree in the flux-time, and then used as input data to the ''UAFCC'' code. The proposed spatial burnup model has been used to study different strategies of in-core fuel management schemes. The conclusions of this study will be presented in a future publication. (author)
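
    As a rough illustration of the role of the fictitious source, a generic one-group, steady-state diffusion balance of this kind can be written as follows (a textbook form for orientation only; the actual UAFCC formulation is not reproduced in the abstract):

```latex
\nabla\!\cdot\!\bigl(D(r)\,\nabla\phi(r)\bigr)\;-\;\Sigma_{a}(r)\,\phi(r)\;+\;\nu\Sigma_{f}(r)\,\phi(r)\;+\;S_{\mathrm{fict}}(r)\;=\;0,
```

    where S_fict(r) absorbs the burnup (flux-time) dependent change in the multiplication properties so that a steady-state equation can be retained at each burnup step.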

  13. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  14. Compact RGBY light sources with high luminance for laser display applications

    Science.gov (United States)

    Paschke, Katrin; Blume, Gunnar; Werner, Nils; Müller, André; Sumpf, Bernd; Pohl, Johannes; Feise, David; Ressel, Peter; Sahm, Alexander; Bege, Roland; Hofmann, Julian; Jedrzejczyk, Daniel; Tränkle, Günther

    2018-02-01

    Watt-class visible laser light with a high luminance can be created with high-power GaAs-based lasers either directly in the red spectral region or using single-pass second harmonic generation (SHG) for the colors in the blue-yellow spectral region. The concepts and results of red- and near infrared-emitting distributed Bragg reflector tapered lasers and master oscillator power amplifier systems as well as their application for SHG bench-top experiments and miniaturized modules are presented. Examples of these high-luminance light sources aiming at different applications such as flying spot display or holographic 3D cinema are discussed in more detail. The semiconductor material allows an easy adaptation of the wavelength allowing techniques such as six-primary color 3D projection or color space enhancement by adding a fourth yellow color.

  15. Evaluation of tactual displays for flight control

    Science.gov (United States)

    Levison, W. H.; Tanner, R. B.; Triggs, T. J.

    1973-01-01

    Manual tracking experiments were conducted to determine the suitability of tactual displays for presenting flight-control information in multitask situations. Although tracking error scores are considerably greater than scores obtained with a continuous visual display, preliminary results indicate that inter-task interference effects are substantially less with the tactual display in situations that impose high visual scanning workloads. The single-task performance degradation found with the tactual display appears to be a result of the coding scheme rather than the use of the tactual sensory mode per se. Analysis with the state-variable pilot/vehicle model shows that reliable predictions of tracking errors can be obtained for wide-band tracking systems once the pilot-related model parameters have been adjusted to reflect the pilot-display interaction.

  16. A Comparison of Source Code Plagiarism Detection Engines

    Science.gov (United States)

    Lancaster, Thomas; Culwin, Fintan

    2004-06-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.
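
    A minimal sketch of the "tokenise, then search pairs of submissions for long common substrings" idea mentioned above. This is an illustration of the paired structural metric, not the MOSS or JPlag algorithm; the tokeniser and keyword set are simplified assumptions.

```python
import re

KEYWORDS = {"if", "else", "for", "while", "return", "def", "class"}

def tokenize(source):
    """Crude tokeniser: normalise string/number literals and identifiers."""
    source = re.sub(r'"[^"]*"|\'[^\']*\'', "STR", source)
    source = re.sub(r"\b\d+\b", "NUM", source)
    tokens = re.findall(r"[A-Za-z_]\w*|[^\s\w]", source)
    return ["ID" if t not in KEYWORDS and re.match(r"[A-Za-z_]", t) else t
            for t in tokens]

def longest_common_run(a, b):
    """Length of the longest run of identical consecutive tokens, O(len(a)*len(b))."""
    best, prev = 0, [0] * (len(b) + 1)
    for x in a:
        cur = [0] * (len(b) + 1)
        for j, y in enumerate(b, 1):
            if x == y:
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best

# A long common token run between two submissions flags the pair for manual review.
```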

  17. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

    To evaluate the source term for low-level waste disposal, a comprehensive model has been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements have also been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors concluded that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are observed to be in general agreement with the experimental data, within acceptable limits of uncertainty.

  18. Handbook of Visual Display Technology

    CERN Document Server

    Cranton, Wayne; Fihn, Mark

    2012-01-01

    The Handbook of Visual Display Technology is a unique work offering a comprehensive description of the science, technology, economic and human interface factors associated with the displays industry. An invaluable compilation of information, the Handbook will serve as a single reference source with expert contributions from over 150 international display professionals and academic researchers. All classes of display device are covered including LCDs, reflective displays, flexible solutions and emissive devices such as OLEDs and plasma displays, with discussion of established principles, emergent technologies, and particular areas of application. The wide-ranging content also encompasses the fundamental science of light and vision, image manipulation, core materials and processing techniques, display driving and metrology.

  19. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104)

    International Nuclear Information System (INIS)

    Kress, T.S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource, along with the results of the BMI-2104 study by BCL and the QUEST study by SNL, to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  20. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    Science.gov (United States)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for the absorbing boundary condition. A hybrid-style programming using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documents in a public repository.

  1. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
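
    As a hedged illustration of the "first order release with transport" option, the release rate can be written as proportional to the remaining inventory, with the user-specified leach rate as the proportionality constant (a generic first-order form for orientation, not the exact RESRAD-OFFSITE equations):

```latex
R_i(t)\;=\;\lambda_{L}\,I_i(t),\qquad \frac{dI_i(t)}{dt}\;=\;-\bigl(\lambda_{L}+\lambda_{i}\bigr)\,I_i(t),
```

    where λ_L is the leach rate, λ_i is the radioactive decay constant of nuclide i, and I_i(t) is the inventory remaining in the primary contamination.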

  2. Microlaser-based displays

    Science.gov (United States)

    Bergstedt, Robert; Fink, Charles G.; Flint, Graham W.; Hargis, David E.; Peppler, Philipp W.

    1997-07-01

    Laser Power Corporation has developed a new type of projection display, based upon microlaser technology and a novel scan architecture, which provides the foundation for bright, extremely high resolution images. A review of projection technologies is presented along with the limitations of each and the difficulties they experience in trying to generate high resolution imagery. The design of the microlaser-based projector is discussed along with the advantages of this technology. High power red, green, and blue microlasers have been designed and developed specifically for use in projection displays. These sources, in combination with a high-resolution, high-contrast modulator, produce a 24-bit color gamut capable of supporting the full range of real-world colors. The new scan architecture, which reduces the modulation rate and scan speeds required, is described. This scan architecture, along with the inherent brightness of the laser, provides the fundamentals necessary to produce a 5120 by 4096 resolution display. The brightness and color uniformity of the display are excellent, allowing for tiling of the displays with far fewer artifacts than in a traditionally tiled display. Applications for the display include simulators, command and control centers, and electronic cinema.

  3. Code of practice for the control and safe handling of radioactive sources used for therapeutic purposes (1988)

    International Nuclear Information System (INIS)

    1988-01-01

    This Code is intended as a guide to safe practices in the use of sealed and unsealed radioactive sources and in the management of patients being treated with them. It covers the procedures for the handling, preparation and use of radioactive sources, precautions to be taken for patients undergoing treatment, storage and transport of radioactive sources within a hospital or clinic, and routine testing of sealed sources.

  4. Application and analysis of performance of dqpsk advanced modulation format in spectral amplitude coding ocdma

    International Nuclear Information System (INIS)

    Memon, A.

    2015-01-01

    SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually OOK (ON-Off-Keying) modulation format is used in this encoding scheme. To make SAC OCDMA network spectrally efficient, advanced modulation format of DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed, m-sequence code is encoded in the simulated setup. Performance regarding various lengths of m-sequence code is also analyzed and displayed in the pictorial form. The results of the simulation are evaluated with the help of electrical constellation diagram, eye diagram and bit error rate graph. All the graphs indicate better transmission quality in case of advanced modulation format of DQPSK used in SAC OCDMA network as compared with OOK. (author)

  5. Astronaut John Young displays drawing of Charlie Brown

    Science.gov (United States)

    1969-01-01

    Astronaut John W. Young, Apollo 10 command module pilot, displays drawing of Charlie Brown in this color reproduction taken from the fourth telecast made by the color television camera aboard the Apollo 10 spacecraft. When this picture was made the Apollo 10 spacecraft was about half-way to the moon, or approximately 112,000 nautical miles from the earth. Charlie Brown will be the code name of the Command Module (CM) during Apollo 10 operations when the Lunar Module and CM are separated (34075); Young displays drawing of Snoopy in this reproduction taken from a television transmission. Snoopy will be the code name of the Lunar Module (LM) during Apollo 10 operations when the LM and CM are separated (34076).

  6. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
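
    A toy sketch of what automatic generation of source code from a model description can look like (purely illustrative template expansion; this is not the metamodel proposed in the paper):

```python
CLASS_TEMPLATE = """class {name}:
    def __init__(self, {args}):
{assignments}
"""

def generate_class(name, attributes):
    """Emit a Python class skeleton from a minimal model description."""
    args = ", ".join(attributes)
    assignments = "\n".join(f"        self.{a} = {a}" for a in attributes)
    return CLASS_TEMPLATE.format(name=name, args=args, assignments=assignments)

print(generate_class("Customer", ["name", "email"]))
```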

  7. An compression algorithm for medical images and a display with the decoding function

    International Nuclear Information System (INIS)

    Gotoh, Toshiyuki; Nakagawa, Yukihiro; Shiohara, Morito; Yoshida, Masumi

    1990-01-01

    This paper describes an efficient image compression method for medical images and a high-speed display with a decoding function. In our method, an input image is divided into blocks, and either Discrete Cosine Transform coding (DCT) or Block Truncation Coding (BTC) is adaptively applied to each block to improve image quality. The display we developed receives the compressed data from the host computer and reconstructs images of good quality at high speed using four decoding microprocessors on which our algorithm is implemented in a pipeline. Experiments verified that our method and display are effective. (author)
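
    A compact sketch of adaptive per-block selection between DCT and BTC. The variance threshold, block size and coefficient truncation below are illustrative assumptions; the paper's actual decision rule and bit allocation are not specified in the abstract.

```python
import numpy as np
from scipy.fft import dctn  # 2-D type-II DCT

def encode_block(block, var_threshold=500.0):
    """Choose BTC for high-variance (edge-like) blocks, DCT otherwise."""
    block = block.astype(np.float64)
    if block.var() > var_threshold:
        mean, std = block.mean(), block.std()
        bitmap = block >= mean                 # one bit per pixel
        return ("BTC", mean, std, bitmap)      # decoder rebuilds two levels from mean/std
    coeffs = dctn(block, norm="ortho")
    coeffs[4:, :] = 0.0                        # keep only a low-frequency corner
    coeffs[:, 4:] = 0.0
    return ("DCT", coeffs)                     # decoder applies the inverse DCT
```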

  8. Chronos sickness: digital reality in Duncan Jones’s Source Code

    Directory of Open Access Journals (Sweden)

    Marcia Tiemy Morita Kawamoto

    2017-01-01

    Full Text Available http://dx.doi.org/10.5007/2175-8026.2017v70n1p249 The advent of digital technologies has unquestionably affected the cinema. The indexical relation and realistic effect with the photographed world much praised by André Bazin and Roland Barthes is just one of the affected aspects. This article discusses cinema in light of the new digital possibilities, reflecting on Steven Shaviro’s consideration of “how a nonindexical realism might be possible” (63) and how, in fact, a new kind of reality, a digital one, might emerge in the science fiction film Source Code (2013) by Duncan Jones.

  9. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  10. Coded aperture detector for high precision gamma-ray burst source locations

    International Nuclear Information System (INIS)

    Helmken, H.; Gorenstein, P.

    1977-01-01

    Coded aperture collimators in conjunction with position-sensitive detectors are very useful in the study of transient phenomena because they combine broad field of view, high sensitivity, and an ability for precise source locations. Since the preceding conference, a series of computer simulations of various detector designs have been carried out with the aid of a CDC 6400. Particular emphasis was placed on the development of a unit consisting of a one-dimensional random or periodic collimator in conjunction with a two-dimensional position-sensitive xenon proportional counter. A configuration involving four of these units has been incorporated into the preliminary design study of the Transient Explorer (ATREX) satellite and is applicable to any SAS or HEAO type satellite mission. Results of this study, including detector response, fields of view, and source location precision, will be presented.

  11. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    Science.gov (United States)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
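
    A minimal sketch of byte-level n-gram profiling with a simple profile-overlap score. The profile size, n-gram length and normalisation below are illustrative assumptions; the paper's exact simplified profile and similarity measure are defined in the full text.

```python
from collections import Counter

def byte_ngram_profile(source_bytes, n=3, profile_size=1500):
    """Return the set of the most frequent byte-level n-grams of a program."""
    grams = Counter(source_bytes[i:i + n] for i in range(len(source_bytes) - n + 1))
    return {g for g, _ in grams.most_common(profile_size)}

def similarity(author_profile, unknown_profile):
    """Illustrative overlap score: fraction of the unknown profile shared with the author."""
    return len(author_profile & unknown_profile) / max(1, len(unknown_profile))

# Attribute an unseen program to the candidate author whose profile gives the highest score.
```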

  12. Development of Coupled Interface System between the FADAS Code and a Source-term Evaluation Code XSOR for CANDU Reactors

    International Nuclear Information System (INIS)

    Son, Han Seong; Song, Deok Yong; Kim, Ma Woong; Shin, Hyeong Ki; Lee, Sang Kyu; Kim, Hyun Koon

    2006-01-01

    An accident prevention system is essential to the industrial security of the nuclear industry. Thus, a more effective accident prevention system will be helpful to promote safety culture as well as to acquire public acceptance for the nuclear power industry. The FADAS (Following Accident Dose Assessment System), which is part of the Computerized Advisory System for a Radiological Emergency (CARE) system in KINS, is used for prevention against nuclear accidents. In order to make the FADAS system more effective for CANDU reactors, it is necessary to develop various accident scenarios and a reliable database of source terms. This study introduces the construction of the coupled interface system between FADAS and the source-term evaluation code, aimed at improving the applicability of the CANDU Integrated Safety Analysis System (CISAS) for CANDU reactors.

  13. Survey of source code metrics for evaluating testability of object oriented systems

    OpenAIRE

    Shaheen , Muhammad Rabee; Du Bousquet , Lydie

    2010-01-01

    Software testing is costly in terms of time and funds. Testability is a software characteristic that aims at producing systems easy to test. Several metrics have been proposed to identify the testability weaknesses. But it is sometimes difficult to be convinced that those metrics are really related with testability. This article is a critical survey of the source-code based metrics proposed in the literature for object-oriented software testability. It underlines the necessity to provide test...

  14. NEACRP comparison of source term codes for the radiation protection assessment of transportation packages

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Locke, H.F.; Avery, A.F.

    1994-01-01

    The results for Problems 5 and 6 of the NEACRP code comparison as submitted by six participating countries are presented in summary. These problems concentrate on the prediction of the neutron and gamma-ray sources arising in fuel after a specified irradiation, the fuel being uranium oxide for problem 5 and a mixture of uranium and plutonium oxides for problem 6. In both problems the predicted neutron sources are in good agreement for all participants. For gamma rays, however, there are differences, largely due to the omission of bremsstrahlung in some calculations

  15. Application and Analysis of Performance of DQPSK Advanced Modulation Format in Spectral Amplitude Coding OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-04-01

    Full Text Available SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make the SAC OCDMA network spectrally efficient, the advanced modulation format of DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed. An m-sequence code is encoded in the simulated setup. Performance regarding various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of an electrical constellation diagram, an eye diagram and a bit error rate graph. All the graphs indicate better transmission quality for the advanced modulation format of DQPSK used in the SAC OCDMA network as compared with OOK

  16. Flat panel planar optic display

    Energy Technology Data Exchange (ETDEWEB)

    Veligdan, J.T. [Brookhaven National Lab., Upton, NY (United States). Dept. of Advanced Technology

    1994-11-01

    A prototype 10 inch flat panel Planar Optic Display (POD) screen has been constructed and tested. This display screen is comprised of hundreds of planar optic glass sheets bonded together with a cladding layer between each sheet, where each glass sheet represents a vertical line of resolution. The display is 9 inches wide by 5 inches high and approximately 1 inch thick. A 3 milliwatt HeNe laser is used as the illumination source and a vector scanning technique is employed.

  17. Optimal source coding, removable noise elimination, and natural coordinate system construction for general vector sources using replicator neural networks

    Science.gov (United States)

    Hecht-Nielsen, Robert

    1997-04-01

    A new universal one-chart smooth manifold model for vector information sources is introduced. Natural coordinates (a particular type of chart) for such data manifolds are then defined. Uniformly quantized natural coordinates form an optimal vector quantization code for a general vector source. Replicator neural networks (a specialized type of multilayer perceptron with three hidden layers) are then introduced. As properly configured examples of replicator networks approach minimum mean squared error (e.g., via training and architecture adjustment using randomly chosen vectors from the source), these networks automatically develop a mapping which, in the limit, produces natural coordinates for arbitrary source vectors. The new concept of removable noise (a noise model applicable to a wide variety of real-world noise processes) is then discussed. Replicator neural networks, when configured to approach minimum mean squared reconstruction error (e.g., via training and architecture adjustment on randomly chosen examples from a vector source, each with randomly chosen additive removable noise contamination), in the limit eliminate removable noise and produce natural coordinates for the data vector portions of the noise-corrupted source vectors. Considerations regarding the selection of the dimension of a data manifold source model and the training/configuration of replicator neural networks are discussed.
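
    A rough sketch of a replicator-style network as described above: a multilayer perceptron with three hidden layers trained to reproduce its input, whose narrow middle layer plays the role of the low-dimensional coordinates. Layer sizes, activations and the training loop are assumptions for illustration; the original formulation's specific architecture (e.g., its staircase activation) is omitted.

```python
import torch
import torch.nn as nn

class Replicator(nn.Module):
    """Autoencoder-like MLP with three hidden layers; the narrow middle layer
    yields a low-dimensional coordinate representation of the data manifold."""
    def __init__(self, dim_in=32, dim_hidden=64, dim_code=4):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(dim_in, dim_hidden), nn.Tanh(),
                                    nn.Linear(dim_hidden, dim_code), nn.Tanh())
        self.decode = nn.Sequential(nn.Linear(dim_code, dim_hidden), nn.Tanh(),
                                    nn.Linear(dim_hidden, dim_in))

    def forward(self, x):
        return self.decode(self.encode(x))

# Train by minimising mean squared reconstruction error (nn.MSELoss()) on vectors
# drawn from the source, optionally corrupted with additive "removable" noise.
```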

  18. Wired World-Wide Web Interactive Remote Event Display

    Energy Technology Data Exchange (ETDEWEB)

    De Groot, Nicolo

    2003-05-07

    WIRED (World-Wide Web Interactive Remote Event Display) is a framework, written in the Java(TM) language, for building High Energy Physics event displays. An event display based on the WIRED framework enables users of a HEP collaboration to visualize and analyze events remotely using ordinary WWW browsers, on any type of machine. In addition, event displays using WIRED may provide the general public with access to the research of high energy physics. The recent introduction of the object-oriented Java(TM) language enables the transfer of machine-independent code across the Internet, to be safely executed by a Java-enhanced WWW browser. We have employed this technology to create a remote event display in the WWW. The combined Java-WWW technology hence assures worldwide availability of such an event display, an always up-to-date program and a platform-independent implementation, which is easy to use and to install.

  19. Time-dependent anisotropic distributed source capability in transient 3-d transport code tort-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    The transient 3-D discrete ordinates transport code TORT-TD has been extended to account for time-dependent anisotropic distributed external sources. The extension aims at the simulation of the pulsed neutron source in the YALINA-Thermal subcritical assembly. Since feedback effects are not relevant in this zero-power configuration, this offers a unique opportunity to validate the time-dependent neutron kinetics of TORT-TD with experimental data. The extensions made in TORT-TD to incorporate a time-dependent anisotropic external source are described. The steady state of the YALINA-Thermal assembly and its response to an artificial square-wave source pulse sequence have been analysed with TORT-TD using pin-wise homogenised cross sections in 18 prompt energy groups with P1 scattering order and 8 delayed neutron groups. The results demonstrate the applicability of TORT-TD to subcritical problems with a time-dependent external source. (authors)

  20. Halftone display, particularly for a high resolution radioactivity distribution detection system

    International Nuclear Information System (INIS)

    Grenier, R.P.

    1977-01-01

    A device is described for presenting a halftone pictorial presentation composed of dot picture elements by selectively controlling the number of dot picture elements per unit area at locations on a display. In a high resolution radioactivity distribution detection system, the numbers of radioactive events detected at the XY locations of an array of sensing devices are fed to a computer and stored at corresponding address locations. The number of radioactive events detected at each address location is normalized into Gray-scale coded signals as a function of the greatest number of radioactive events detected at any one address location. The normalized Gray-scale coded signals are applied to a display for controlling the number of dot picture elements per unit area presented at the corresponding XY locations on the display. The numbers of radioactive events detected at the XY locations of the array are thus presented on the display as a halftone pictorial representation, with the greatest number of dot picture elements per unit area presented as the brightest image.
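
    A small software sketch of the normalisation and dot-density mapping described above (illustrative only; the patented device is hardware, and the number of gray levels and cell size here are assumptions):

```python
import numpy as np

def halftone(counts, levels=16, cell=4):
    """Map event counts to a dot-density (halftone) image.

    counts : 2-D array of detected events per XY location
    levels : number of Gray-scale steps, normalised to the maximum count
    cell   : each location becomes a cell x cell block of dots
    """
    counts = np.asarray(counts, dtype=float)
    gray = np.round((levels - 1) * counts / max(counts.max(), 1.0)).astype(int)
    order = np.random.default_rng(0).permutation(cell * cell)  # fixed dot fill order
    image = np.zeros((counts.shape[0] * cell, counts.shape[1] * cell), dtype=np.uint8)
    for (i, j), g in np.ndenumerate(gray):
        dots = (order < g * cell * cell // (levels - 1)).reshape(cell, cell)
        image[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell] = dots
    return image  # locations with more events receive more dots, i.e. appear brighter
```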

  1. Numerical modeling of the Linac4 negative ion source extraction region by 3D PIC-MCC code ONIX

    CERN Document Server

    Mochalskyy, S; Minea, T; Lifschitz, AF; Schmitzer, C; Midttun, O; Steyaert, D

    2013-01-01

    At CERN, a high performance negative ion (NI) source is required for the 160 MeV H- linear accelerator Linac4. The source is planned to produce 80 mA of H- with an emittance of 0.25 mm mrad N-RMS, which is technically and scientifically very challenging. The optimization of the NI source requires a deep understanding of the underlying physics concerning the production and extraction of the negative ions. The extraction mechanism from the negative ion source is complex, involving a magnetic filter in order to cool down the electron temperature. The ONIX (Orsay Negative Ion eXtraction) code is used to address this problem. ONIX is a self-consistent 3D electrostatic code using a Particle-in-Cell Monte Carlo Collisions (PIC-MCC) approach. It was written to handle the complex boundary conditions between plasma, source walls, and beam formation at the extraction hole. Both the positive extraction potential (25 kV) and the magnetic field map are taken from the experimental set-up, in construction at CERN. This contrib...

  2. EchoSeed Model 6733 Iodine-125 brachytherapy source: Improved dosimetric characterization using the MCNP5 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Mosleh-Shirazi, M. A.; Hadad, K.; Faghihi, R.; Baradaran-Ghahfarokhi, M.; Naghshnezhad, Z.; Meigooni, A. S. [Center for Research in Medical Physics and Biomedical Engineering and Physics Unit, Radiotherapy Department, Shiraz University of Medical Sciences, Shiraz 71936-13311 (Iran, Islamic Republic of); Radiation Research Center and Medical Radiation Department, School of Engineering, Shiraz University, Shiraz 71936-13311 (Iran, Islamic Republic of); Comprehensive Cancer Center of Nevada, Las Vegas, Nevada 89169 (United States)

    2012-08-15

    This study primarily aimed to obtain the dosimetric characteristics of the Model 6733 125I seed (EchoSeed) with improved precision and accuracy using a more up-to-date Monte Carlo code and data (MCNP5) compared to previously published results, including an uncertainty analysis. Its secondary aim was to compare the results obtained using the MCNP5, MCNP4c2, and PTRAN codes for simulation of this low-energy photon-emitting source. The EchoSeed geometry and chemical compositions, together with a published 125I spectrum, were used to perform dosimetric characterization of this source as per the updated AAPM TG-43 protocol. These simulations were performed in liquid water in order to obtain the clinically applicable dosimetric parameters for this source model. Dose rate constants in liquid water, derived from MCNP4c2 and MCNP5 simulations, were found to be 0.993 cGy h-1 U-1 (±1.73%) and 0.965 cGy h-1 U-1 (±1.68%), respectively. Overall, the MCNP5-derived radial dose and 2D anisotropy function results were generally closer to the measured data (within ±4%) than MCNP4c2 and the published data for the PTRAN code (Version 7.43), while the opposite was seen for the dose rate constant. The generally improved MCNP5 Monte Carlo simulation may be attributed to a more recent and accurate cross-section library. However, some of the data points in the results obtained from the above-mentioned Monte Carlo codes showed no statistically significant differences. Derived dosimetric characteristics in liquid water are provided for clinical applications of this source model.

  3. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  4. Laser illuminated flat panel display

    Energy Technology Data Exchange (ETDEWEB)

    Veligdan, J.T.

    1995-12-31

    A 10 inch laser illuminated flat panel Planar Optic Display (POD) screen has been constructed and tested. This POD screen technology is an entirely new concept in display technology. Although the initial display is flat and made of glass, this technology lends itself to applications where a plastic display might be wrapped around the viewer. The display screen is comprised of hundreds of planar optical waveguides, where each glass waveguide represents a vertical line of resolution. A black cladding layer, having a lower index of refraction, is placed between each waveguide layer. Since the cladding makes the screen surface black, the contrast is high. The prototype display is 9 inches wide by 5 inches high and approximately 1 inch thick. A 3 milliwatt HeNe laser is used as the illumination source and a vector scanning technique is employed.

  5. The European source-term evaluation code ASTEC: status and applications, including CANDU plant applications

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Giordano, P.; Kissane, M.P.; Montanelli, T.; Schwinges, B.; Ganju, S.; Dickson, L.

    2004-01-01

    Research on light-water reactor severe accidents (SA) is still required in a limited number of areas in order to confirm accident-management plans. Thus, 49 European organizations have linked their SA research in a durable way through SARNET (Severe Accident Research and management NETwork), part of the European 6th Framework Programme. One goal of SARNET is to consolidate the integral code ASTEC (Accident Source Term Evaluation Code, developed by IRSN and GRS) as the European reference tool for safety studies; SARNET efforts include extending the application scope to reactor types other than PWR (including VVER) such as BWR and CANDU. ASTEC is used in IRSN's Probabilistic Safety Analysis level 2 of 900 MWe French PWRs. An earlier version of ASTEC's SOPHAEROS module, including improvements by AECL, is being validated as the Canadian Industry Standard Toolset code for FP-transport analysis in the CANDU Heat Transport System. Work with ASTEC has also been performed by Bhabha Atomic Research Centre, Mumbai, on IPHWR containment thermal hydraulics. (author)

  6. PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.

    Science.gov (United States)

    Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred

    2018-01-01

    The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and the expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing the randomness, and therefore the diversity, of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences from base to residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage, both at the DNA base and translated protein residue level, which has been missing from toolsets and the literature. The open source program PuLSE is available in two formats: a C++ source code package for compilation and integration into existing bioinformatics pipelines, and precompiled binaries for ease of use.
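
    A bare-bones sketch of the kind of bookkeeping such a tool performs: counting unique sequences in the randomised window of each FASTQ read and comparing against an expected frequency derived from per-base rates. The window coordinates, function names and base-frequency table are hypothetical, not PuLSE's actual interface.

```python
from collections import Counter

def count_variable_regions(fastq_path, start=20, length=21):
    """Count unique DNA sequences in the randomised window of each FASTQ read."""
    counts = Counter()
    with open(fastq_path) as handle:
        for i, line in enumerate(handle):
            if i % 4 == 1:  # the sequence line of each 4-line FASTQ record
                counts[line.strip()[start:start + length]] += 1
    return counts

def expected_fraction(seq, base_freqs):
    """Expected occurrence fraction of a sequence given per-base frequencies."""
    frac = 1.0
    for base in seq:
        frac *= base_freqs.get(base, 0.0)
    return frac

# Comparing observed counts against expected fractions flags over/under-represented sequences.
```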

  7. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  8. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that come close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  9. SCRIC: a code dedicated to the detailed emission and absorption of heterogeneous NLTE plasmas; application to xenon EUV sources

    International Nuclear Information System (INIS)

    Gaufridy de Dortan, F. de

    2006-01-01

    Nearly all spectral opacity codes for LTE and NLTE plasmas rely on approximate configuration modelling or even supra-configuration modelling for mid-Z plasmas. In some cases, however, configuration interaction (both relativistic and non-relativistic) induces dramatic changes in spectral shapes. We propose here a new detailed emissivity code with configuration mixing to allow for a realistic description of complex mid-Z plasmas. A collisional-radiative calculation, based on precise HULLAC energies and cross sections, determines the populations. Detailed emissivities and opacities are then calculated and the radiative transfer equation is solved for wide inhomogeneous plasmas. This code is able to cope rapidly with very large amounts of atomic data. It is therefore possible to use complex hydrodynamic files even on personal computers in a very limited time. We used this code for comparison with xenon EUV sources within the framework of nano-lithography developments. It appears that configuration mixing strongly shifts satellite lines and must be included in the description of these sources to enhance their efficiency. (author)

  10. The source of display rules and their effects on primary health care professionals' well-being.

    Science.gov (United States)

    Martínez-Iñigo, David; Totterdell, Peter; Alcover, Carlos Maria; Holman, David

    2009-11-01

    Employees' perceptions of the emotional requirements of their work role are considered a necessary antecedent of emotion work. The impact of these requirements on the emotions employees display, their well-being, and their clients' satisfaction has been explored in previous research. Emotional requirements have been characterized as organizationally-based expectations (e.g., Brotheridge & Lee, 2003), formal and informal organizational rules (e.g., Cropanzano, Weiss & Elias, 2004), occupational norms (e.g., Rafaeli & Sutton, 1987; Smith & Kleinman, 1989) and job-based demands (Brotheridge & Lee, 2002). Although all these definitions assume some kind of shared source for perceptions of emotional requirements, it remains unclear to what extent these different sources contribute and to what extent the requirements are shared by different units, teams and individuals in the organization. The present study analyses the perception of emotional requirements from a survey of ninety-seven Primary Health Care teams composed of general practitioners, nurses and administrative staff (N = 1057). The relative contribution of different sources of variance (team, organizational, and occupational) to perceived emotional requirements and the effects on employees' job satisfaction and well being are examined. Results confirm the relevance of the source and show the contribution of emotional demands to prediction of emotional exhaustion and job satisfaction levels.

  11. Graphic display of spatially distributed binary-state experimental data

    International Nuclear Information System (INIS)

    Watson, B.L.

    1981-01-01

    Experimental data collected from a large number of transducers spatially distributed throughout a three-dimensional volume has typically posed a difficult interpretation task for the analyst. This paper describes one approach to alleviating this problem by presenting color graphic displays of experimental data; specifically, data representing the dynamic three-dimensional distribution of cooling fluid collected during the reflood and refill of simulated nuclear reactor vessels. Color-coded binary data (wet/dry) are integrated with a graphic representation of the reactor vessel and displayed on a high-resolution color CRT. The display is updated with successive data sets and made into 16-mm movies for distribution and analysis. Specific display formats are presented and extension to other applications discussed

  12. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, including both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines the cooperation gain and channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
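
    For reference, a small sketch of how a QC-LDPC parity-check matrix is expanded from an exponent matrix via the standard circulant-permutation construction. The specific base and exponent matrices designed in the paper are not reproduced here; the function and its arguments are illustrative.

```python
import numpy as np

def expand_qc_ldpc(exponents, z):
    """Expand an exponent matrix into a binary parity-check matrix H.

    exponents : 2-D array; entry -1 means an all-zero z x z block,
                entry s >= 0 means the z x z identity cyclically shifted by s columns.
    z         : circulant (lifting) size
    """
    exponents = np.asarray(exponents)
    rows, cols = exponents.shape
    H = np.zeros((rows * z, cols * z), dtype=np.uint8)
    for i in range(rows):
        for j in range(cols):
            s = exponents[i, j]
            if s >= 0:
                block = np.roll(np.eye(z, dtype=np.uint8), s, axis=1)
                H[i * z:(i + 1) * z, j * z:(j + 1) * z] = block
    return H
```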

  13. Source characteristics of the underwater knocking displays of a male Pacific walrus (Odobenus rosmarus divergens)

    DEFF Research Database (Denmark)

    Hughes, William R.; Reichmuth, Colleen; Mulsow, Jason L.

    2011-01-01

    "knocks"’ punctuated by occasional metallic "bells." The source characteristics of the knocking sounds that were regularly emitted by a male walrus raised in captivity were examined. Knocks were produced as single 20 ms pulses, or as doublets and triplets, and were typically repeated at rates of 0.8/s...... to 1.2/s. These were loud sounds with greater bandwidth than previously reported: mean source levels were 186 dB pk-pk re 1 Pa at 1 m (range 164-196) with maximum frequency >24 kHz. Production of each knock was associated with visible impulsive movement of the forehead. During rut, this walrus had...... difficulty inhibiting sound production and would often continue to emit knocks in air during haul-out and even while eating, suggesting an endogenous component to this behavior. A strong correlation between his seasonal testosterone levels and the persistence of knocking displays was confirmed. Captive...

  14. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer for Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  15. Modelling RF sources using 2-D PIC codes

    Energy Technology Data Exchange (ETDEWEB)

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross field devices (magnetrons, cross field amplifiers, etc.) and pencil beam devices (klystrons, gyrotrons, TWT'S, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field (''port approximation''). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation.

  16. Modelling RF sources using 2-D PIC codes

    Energy Technology Data Exchange (ETDEWEB)

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross field devices (magnetrons, cross field amplifiers, etc.) and pencil beam devices (klystrons, gyrotrons, TWT'S, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field (''port approximation''). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation.

  17. Modelling RF sources using 2-D PIC codes

    International Nuclear Information System (INIS)

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross field devices (magnetrons, cross field amplifiers, etc.) and pencil beam devices (klystrons, gyrotrons, TWT'S, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field (''port approximation''). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation

  18. Validation of the Open Source Code_Aster Software Used in the Modal Analysis of the Fluid-filled Cylindrical Shell

    Directory of Open Access Journals (Sweden)

    B D. Kashfutdinov

    2017-01-01

    Full Text Available The paper deals with a modal analysis of the elastic cylindrical shell with a clamped bottom partially filled with fluid in open source Code_Aster software using the finite element method. Natural frequencies and modes obtained in Code_Aster are compared to experimental and theoretical data. The aim of this paper is to prove that Code_Aster has all necessary tools for solving fluid structure interaction problems. Also, Code_Aster can be used in industrial projects as an alternative to commercial software. The available free pre- and post-processors with a graphical user interface that is compatible with Code_Aster allow creating complex models and processing the results. The paper presents new validation results of the open source Code_Aster software used to calculate the natural modes of small vibrations of the cylindrical shell partially filled with non-viscous compressible barotropic fluid under a gravity field. The displacement of the middle surface of the thin shell and the displacement of the fluid relative to the equilibrium position are described by a coupled hydro-elasticity problem. The fluid flow is considered to be potential. The finite element method (FEM) is used. The features of the computational model are described. The resulting system of equations has symmetric block matrices. To compare the results, the well-known modal analysis problem of a cylindrical shell with a flat, non-deformable bottom, filled with a compressible fluid, is discussed. The numerical parameters of the scheme were chosen in accordance with well-known experimental and analytical data. Three cases were taken into account: an empty, a partially filled and a fully filled cylindrical shell. The frequencies obtained with Code_Aster are in good agreement with those obtained in experiment, from the analytical solution, and by FEM in other software. The differences with respect to experiment and to the analytical solution are approximately the same for the different software packages. The obtained results extend a set of validation tests for
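
    As a minimal illustration of the kind of eigenvalue problem solved here (not Code_Aster syntax), the assembled FEM model of the coupled hydro-elastic system leads to a symmetric generalized eigenproblem K x = omega^2 M x. The sketch below solves such a problem for a toy three-degree-of-freedom stand-in using SciPy; the stiffness and mass values are arbitrary.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 3-DOF spring-mass chain standing in for the assembled shell/fluid model.
k, m = 1.0e4, 2.0                      # illustrative stiffness [N/m] and mass [kg]
K = k * np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])
M = m * np.eye(3)

# Symmetric generalized eigenproblem K x = omega^2 M x (same structure as the
# symmetric block system mentioned in the abstract, only much smaller).
eigvals, modes = eigh(K, M)
natural_freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)
print(natural_freqs_hz)                 # natural frequencies, lowest first
```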

  19. High dynamic range coding imaging system

    Science.gov (United States)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on the coded aperture technique. This scheme can help us obtain HDR images which have extended depth of field. We adopt a sparse coding algorithm to design the coded patterns. Then we utilize the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. We use existing algorithms to fuse those LDR images into an HDR image and display it. We build an optical simulation model and obtain simulation images to verify the novel system.
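
    The final fusion step can be illustrated with a generic multi-exposure merge (a stand-in for the "existing algorithms" mentioned above, not the paper's specific pipeline). The sketch assumes the LDR frames have already been reconstructed from their coded measurements and that the sensor response is linear.

```python
import numpy as np

def fuse_ldr_to_hdr(ldr_images, exposure_times):
    """Hat-weighted merge of linear LDR exposures into an HDR radiance map."""
    num = np.zeros_like(ldr_images[0])
    den = np.zeros_like(ldr_images[0])
    for z, t in zip(ldr_images, exposure_times):
        w = 1.0 - np.abs(2.0 * z - 1.0)        # down-weight under/over-exposed pixels
        num += w * z / t
        den += w
    return num / np.maximum(den, 1e-8)

# Synthetic check: three clipped exposures of the same radiance map.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 10.0, (4, 4))          # "true" radiance
times = [1 / 100, 1 / 25, 1 / 5]
shots = [np.clip(scene * t, 0.0, 1.0) for t in times]
print(fuse_ldr_to_hdr(shots, times).round(2))
```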

  20. Digital Display Integration Project Project Online 2.0

    International Nuclear Information System (INIS)

    Bardsley, J. N.

    1999-01-01

    The electronic display industry is changing in three important ways. First, the dominance of the cathode ray tube (CRT) is being challenged by the development of flat panel displays (FPDs). This will lead to the availability of displays of higher performance, albeit at greater cost. Secondly, the analog interfaces between displays that show data and the computers that generate the data are being replaced by digital connections. Finally, a high-resolution display is becoming the most expensive component in computer system for homes and small offices. It is therefore desirable that the useful lifetime of the display extend over several years and that the electronics allows the display to be used with many different image sources. Hopefully, the necessity of having three or four large CRTs in one office to accommodate different computer operating systems or communication protocols will soon disappear. Instead, we hope to see a set of flat panels that can be switched to show several independent images from multiple sources or a composite image from a single source. The more rapid rate of technological improvements and the higher cost of flat panel displays raise the incentive for greater planning and guidance in the acquisition and integration of high performance displays into large organizations, such as LLNL. The goal of the Digital Display Integration Project (DDIP) is to provide such support. This will be achieved through collaboration with leading suppliers of displays, communications equipment and image-processing products, and by greater exchange of information within the Laboratory. The project will start in October 1999. During the first two years (FY2000-1), the primary focus of the program will be upon: introducing displays with high information content (over 5M pixels); facilitating the transition from analog to digital interfaces; enabling data transfer from key computer platforms; incorporating optical communications to remove length restrictions on data

  1. GRHydro: a new open-source general-relativistic magnetohydrodynamics code for the Einstein toolkit

    International Nuclear Information System (INIS)

    Mösta, Philipp; Haas, Roland; Ott, Christian D; Reisswig, Christian; Mundim, Bruno C; Faber, Joshua A; Noble, Scott C; Bode, Tanja; Löffler, Frank; Schnetter, Erik

    2014-01-01

    We present the new general-relativistic magnetohydrodynamics (GRMHD) capabilities of the Einstein toolkit, an open-source community-driven numerical relativity and computational relativistic astrophysics code. The GRMHD extension of the toolkit builds upon previous releases and implements the evolution of relativistic magnetized fluids in the ideal MHD limit in fully dynamical spacetimes using the same shock-capturing techniques previously applied to hydrodynamical evolution. In order to maintain the divergence-free character of the magnetic field, the code implements both constrained transport and hyperbolic divergence cleaning schemes. We present test results for a number of MHD tests in Minkowski and curved spacetimes. Minkowski tests include aligned and oblique planar shocks, cylindrical explosions, magnetic rotors, Alfvén waves and advected loops, as well as a set of tests designed to study the response of the divergence cleaning scheme to numerically generated monopoles. We study the code’s performance in curved spacetimes with spherical accretion onto a black hole on a fixed background spacetime and in fully dynamical spacetimes by evolutions of a magnetized polytropic neutron star and of the collapse of a magnetized stellar core. Our results agree well with exact solutions where these are available and we demonstrate convergence. All code and input files used to generate the results are available on http://einsteintoolkit.org. This makes our work fully reproducible and provides new users with an introduction to applications of the code. (paper)
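
    A standard diagnostic for the divergence-free constraint mentioned above is to monitor a discrete div(B) on the evolution grid. The snippet below is a generic, central-difference version of that check (assuming a uniform grid with periodic boundaries); it is illustrative only and is not code from the Einstein Toolkit.

```python
import numpy as np

def div_b(bx, by, bz, dx, dy, dz):
    """Second-order, cell-centred estimate of div(B) on a uniform periodic grid."""
    dbx = (np.roll(bx, -1, axis=0) - np.roll(bx, 1, axis=0)) / (2.0 * dx)
    dby = (np.roll(by, -1, axis=1) - np.roll(by, 1, axis=1)) / (2.0 * dy)
    dbz = (np.roll(bz, -1, axis=2) - np.roll(bz, 1, axis=2)) / (2.0 * dz)
    return dbx + dby + dbz

# A uniform field is trivially divergence-free; constrained-transport and
# divergence-cleaning schemes aim to keep this near machine zero for real data.
n = 16
bx = np.ones((n, n, n))
by = np.zeros((n, n, n))
bz = np.zeros((n, n, n))
print(np.max(np.abs(div_b(bx, by, bz, 1.0, 1.0, 1.0))))   # -> 0.0
```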

  2. Neutron spallation source and the Dubna cascade code

    CERN Document Server

    Kumar, V; Goel, U; Barashenkov, V S

    2003-01-01

    Neutron multiplicity per incident proton, n/p, in collisions of a high energy proton beam with voluminous Pb and W targets has been estimated from the Dubna cascade code and compared with the available experimental data for the purpose of benchmarking the code. Contributions of various atomic and nuclear processes to heat production and to the isotopic yield of secondary nuclei are also estimated to assess the heat and radioactivity conditions of the targets. Results obtained from the code show excellent agreement with the experimental data at beam energies E < 1.2 GeV and differ by up to 25% at higher energies. (author)

  3. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  4. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
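
    For orientation, the kernel of the straight-line Gaussian plume model that ANEMOS builds on can be written in a few lines. The sketch below gives only the basic reflected-plume concentration formula with user-supplied dispersion parameters; the deposition, plume-rise, ingrowth and sector-averaging options described above are deliberately omitted, and all numbers in the example are arbitrary.

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Straight-line Gaussian plume concentration with ground reflection.

    q: source strength [Bq/s]; u: wind speed at release height [m/s];
    y: cross-wind distance [m]; z: height above ground [m];
    h: effective release height [m]; sigma_y, sigma_z: dispersion
    parameters at the downwind distance of interest [m].
    """
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2.0 * sigma_z**2)) +
                np.exp(-(z + h)**2 / (2.0 * sigma_z**2)))   # image source below ground
    return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration for an illustrative release.
print(gaussian_plume(q=1.0e6, u=3.0, y=0.0, z=0.0, h=50.0, sigma_y=80.0, sigma_z=40.0))
```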

  5. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    International Nuclear Information System (INIS)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C

  6. A statistical–mechanical view on source coding: physical compression and data compression

    International Nuclear Information System (INIS)

    Merhav, Neri

    2011-01-01

    We draw a certain analogy between the classical information-theoretic problem of lossy data compression (source coding) of memoryless information sources and the statistical–mechanical behavior of a certain model of a chain of connected particles (e.g. a polymer) that is subjected to a contracting force. The free energy difference pertaining to such a contraction turns out to be proportional to the rate-distortion function in the analogous data compression model, and the contracting force is proportional to the derivative of this function. Beyond the fact that this analogy may be interesting in its own right, it may provide a physical perspective on the behavior of optimum schemes for lossy data compression (and perhaps also an information-theoretic perspective on certain physical system models). Moreover, it triggers the derivation of lossy compression performance for systems with memory, using analysis tools and insights from statistical mechanics
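
    For reference, the rate-distortion function that the free energy difference is said to be proportional to is the standard single-letter quantity below; the proportionality constants (temperature, chain length) are not spelled out in this abstract and are therefore left implicit.

```latex
% Rate-distortion function of a memoryless source X with distortion measure d:
R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\;\mathbb{E}\,d(X,\hat{X}) \le D} I(X;\hat{X}),
\qquad
\Delta F \;\propto\; R(D),
\qquad
f \;\propto\; \frac{\mathrm{d}R(D)}{\mathrm{d}D}.
```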

  7. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    Energy Technology Data Exchange (ETDEWEB)

    Zehtabian, M; Zaker, N; Sina, S [Shiraz University, Shiraz, Fars (Iran, Islamic Republic of); Meigooni, A Soleimani [Comprehensive Cancer Center of Nevada, Las Vegas, Nevada (United States)

    2015-06-15

    Purpose: Different versions of the MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in the dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137, were calculated in a water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of the MCNP4C code was changed to ENDF/B-VI release 8, which is used in the MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code were compared with those of the other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However, for low energy sources like I-125 and Pd-103, large discrepancies are observed between the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6cm were found to be about 17% and 28% for I-125 and Pd-103 respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6cm. Conclusion: The results indicate that using the MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX lies in their cross section libraries.

  8. MPEG-compliant joint source/channel coding using discrete cosine transform and substream scheduling for visual communication over packet networks

    Science.gov (United States)

    Kim, Seong-Whan; Suthaharan, Shan; Lee, Heung-Kyu; Rao, K. R.

    2001-01-01

    Quality of Service (QoS) guarantees in real-time communication for multimedia applications are significantly important. An architectural framework for multimedia networks based on substreams or flows is effectively exploited for combining source and channel coding for multimedia data. But the existing frame-by-frame approach, which includes the Moving Picture Experts Group (MPEG) standards, cannot be neglected because it is a standard. In this paper, first, we designed an MPEG transcoder which converts an MPEG coded stream into variable rate packet sequences to be used for our joint source/channel coding (JSCC) scheme. Second, we designed a classification scheme to partition the packet stream into multiple substreams which have their own QoS requirements. Finally, we designed a management (reservation and scheduling) scheme for substreams to support better perceptual video quality, such as a bound on end-to-end jitter. We have shown that our JSCC scheme is better than two other popular techniques through simulation and real video experiments in a TCP/IP environment.
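
    The classification step can be pictured as routing packets into priority substreams. The sketch below uses the MPEG frame type alone as the classification key, which is only an illustrative stand-in for the paper's scheme (the actual classifier also attaches per-substream QoS requirements such as jitter bounds).

```python
def partition_substreams(packets):
    """Split an MPEG-coded packet stream into priority substreams.

    Each packet is a dict with at least a 'frame_type' key ('I', 'P' or 'B').
    Mapping I-frames to the highest-priority substream is an assumption made
    for illustration, not the paper's exact classification rule.
    """
    priority = {'I': 0, 'P': 1, 'B': 2}
    substreams = {0: [], 1: [], 2: []}
    for pkt in packets:
        substreams[priority[pkt['frame_type']]].append(pkt)
    return substreams

# A toy group of pictures: I B B P B B P B B I
stream = [{'frame_type': t, 'seq': i} for i, t in enumerate("IBBPBBPBBI")]
for level, pkts in partition_substreams(stream).items():
    print(level, [p['seq'] for p in pkts])
```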

  9. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....

  10. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  11. Improvements in data display

    International Nuclear Information System (INIS)

    Ellis, G.W.

    1979-01-01

    An analog signal processor is described in this patent for connecting a source of analog signals to a cathode ray tube display in order to extend the dynamic range of the display. This has important applications in the field of computerised X-ray tomography since significant medical information, such as tumours in soft tissue, is often represented by minimal level changes in image density. Cathode ray tube displays are limited to approximately 15 intensity levels. Thus if both strong and weak absorption of the X-rays occurs, the dynamic range of the transmitted signals will be too large to permit small variations to be examined directly on a cathode ray display. Present tomographic image reconstruction methods are capable of quantising X-ray absorption density measurements into 256 or more distinct levels and a description is given of the electronics which enables the upper and lower range of intensity levels to be independently set and continuously varied. (UK)
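
    The idea of independently setting the upper and lower bounds of the displayed range corresponds to what is now usually called window/level mapping. The snippet below is a digital illustration of that mapping from a 256-level reconstruction onto roughly 15 CRT grey levels; the patent itself describes an analog signal processor, so this is a conceptual sketch only.

```python
import numpy as np

def window_to_display(values, lower, upper, levels=15):
    """Map quantised absorption values onto a limited number of CRT grey levels.

    Values below `lower` clip to 0 and values above `upper` clip to the top
    level, so a narrow window can be swept through a 256-level reconstruction
    to make small density variations visible.
    """
    values = np.asarray(values, dtype=float)
    scaled = (values - lower) / max(upper - lower, 1e-9)
    return np.clip(np.round(scaled * (levels - 1)), 0, levels - 1).astype(int)

# A narrow window (118..138) placed on a 0..255 reconstruction.
recon = np.array([10, 120, 128, 131, 135, 240])
print(window_to_display(recon, lower=118, upper=138))   # -> [ 0  1  7  9 12 14]
```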

  12. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables import of the coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell-el Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the points' names coding) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of points' names coding or more complex attribute table creation).

  13. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    The French software house CISI and IKE of the University of Stuttgart have developed during 1990 and 1991 in the frame of the Shared Cost Action Reactor Safety the informatic structure of the European Source TERm Evaluation System (ESTER). Due to this work tools became available which allow to unify on an European basis both code development and code application in the area of severe core accident research. The behaviour of reactor cores is determined by thermal hydraulic conditions. Therefore for the development of ESTER it was important to investigate how to integrate thermal hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of ESTER tools in view of a possible coupling of the thermal hydraulic code system ATHLET and ESTER. Due to the work performed during this project the ESTER tools became the most modern informatic tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.) [de

  14. Calculation Of Fuel Burnup And Radionuclide Inventory In The Syrian Miniature Neutron Source Reactor Using The GETERA Code

    International Nuclear Information System (INIS)

    Khattab, K.; Dawahra, S.

    2011-01-01

    Calculations of the fuel burnup and radionuclide inventory in the Syrian Miniature Neutron Source Reactor (MNSR) after 10 years (the expected life of the reactor core) of reactor operation are presented in this paper using the GETERA code. The code is used to calculate the fuel group constants and the infinite multiplication factor versus the reactor operating time for 10, 20, and 30 kW operating power levels. The amounts of uranium burnt up and plutonium produced in the reactor core, the concentrations of the most important fission product and actinide radionuclides accumulated in the reactor core, and the total radioactivity of the reactor core were calculated using the GETERA code as well. It is found that the GETERA code is better suited than the WIMSD4 code for the fuel burnup calculation in the MNSR reactor since it is newer, has a larger isotope library, and is more accurate. (author)

  15. Combat vehicle crew helmet-mounted display: next generation high-resolution head-mounted display

    Science.gov (United States)

    Nelson, Scott A.

    1994-06-01

    The Combat Vehicle Crew Head-Mounted Display (CVC HMD) program is an ARPA-funded, US Army Natick Research, Development, and Engineering Center monitored effort to develop a high resolution, flat panel HMD for the M1 A2 Abrams main battle tank. CVC HMD is part of the ARPA High Definition Systems (HDS) thrust to develop and integrate small (24 micrometers square pels), high resolution (1280 X 1024 X 6-bit grey scale at 60 frame/sec) active matrix electroluminescent (AMEL) and active matrix liquid crystal displays (AMLCD) for head mounted and projection applications. The Honeywell designed CVC HMD is a next generation head-mounted display system that includes advanced flat panel image sources, advanced digital display driver electronics, high speed (> 1 Gbps) digital interconnect electronics, and light weight, high performance optical and mechanical designs. The resulting dramatic improvements in size, weight, power, and cost have already led to program spin offs for both military and commercial applications.

  16. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  17. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  18. Comparison of thick-target (alpha,n) yield calculation codes

    Directory of Open Access Journals (Sweden)

    Fernandes Ana C.

    2017-01-01

    Full Text Available Neutron production yields and energy distributions from (α,n) reactions in light elements were calculated using three different codes (SOURCES, NEDIS and USD) and compared with the existing experimental data in the 3.5-10 MeV alpha energy range. SOURCES and NEDIS display an agreement between calculated and measured yields in the decay series of 235U, 238U and 232Th within ±10% for most materials. The discrepancy increases with alpha energy, but an agreement of ±20% still applies to all materials with reliable elemental production yields (the few exceptions are identified). The calculated neutron energy distributions describe the experimental data, with NEDIS retrieving the detailed features very well. USD generally underestimates the measured yields, in particular for compounds with heavy elements and/or at high alpha energies. The energy distributions exhibit sharp peaks that do not match the observations. These findings may be caused by a poor accounting of the alpha particle energy loss by the code. A large variability was found among the calculated neutron production yields for alphas from Sm decay; the lack of yield measurements for low (~2 MeV) alphas does not allow conclusions on the codes' accuracy in this energy region.

  19. A low-cost system for graphical process monitoring with colour video symbol display units

    International Nuclear Information System (INIS)

    Grauer, H.; Jarsch, V.; Mueller, W.

    1977-01-01

    A system for computer controlled graphic process supervision, using color symbol video displays, is described. It has the following characteristics: a compact unit with no external memory for image storage; a problem-oriented, simple descriptive interface to the process program; no restriction on the graphical representation of process variables; and independence from the computer and the display, through the implementation of colours and parameterized code creation for the display. (WB) [de

  20. SCRIC: a code dedicated to the detailed emission and absorption of heterogeneous NLTE plasmas; application to xenon EUV sources; SCRIC: un code pour calculer l'absorption et l'emission detaillees de plasmas hors equilibre, inhomogenes et etendus; application aux sources EUV a base de xenon

    Energy Technology Data Exchange (ETDEWEB)

    Gaufridy de Dortan, F. de

    2006-07-01

    Nearly all spectral opacity codes for LTE and NLTE plasmas rely on approximate configuration modelling or even supra-configuration modelling for mid-Z plasmas. But in some cases, configuration interaction (both relativistic and non-relativistic) induces dramatic changes in spectral shapes. We propose here a new detailed emissivity code with configuration mixing to allow for a realistic description of complex mid-Z plasmas. A collisional-radiative calculation, based on precise HULLAC energies and cross sections, determines the populations. Detailed emissivities and opacities are then calculated and the radiative transfer equation is solved for wide inhomogeneous plasmas. This code is able to cope rapidly with very large amounts of atomic data. It is therefore possible to use complex hydrodynamic files, even on personal computers, in a very limited time. We used this code for comparison with Xenon EUV sources within the framework of nano-lithography developments. It appears that configuration mixing strongly shifts satellite lines and must be included in the description of these sources to enhance their efficiency. (author)

  1. Gaze strategies can reveal the impact of source code features on the cognitive load of novice programmers

    DEFF Research Database (Denmark)

    Wulff-Jensen, Andreas; Ruder, Kevin Vignola; Triantafyllou, Evangelia

    2018-01-01

    As shown by several studies, the readability of source code for programmers is influenced by its structural and textual features. In order to assess the importance of these features, we conducted an eye-tracking experiment with programming students. To assess the readability and comprehensibility of...

  2. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    Science.gov (United States)

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy.
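
    To see why a factored (matrix source coded) operator is attractive, it helps to write down the brute-force alternative. The sketch below implements direct space-varying filtering with a locally varying kernel, making the per-pixel cost explicit; the psf_at function and its slowly widening Gaussian are made-up stand-ins for a real stray-light kernel, and the paper's factorization itself is not reproduced here.

```python
import numpy as np

def space_varying_filter(image, psf_at):
    """Direct (brute-force) space-varying filtering.

    psf_at(i, j) returns the local kernel centred on output pixel (i, j).
    The nested loops make the O(rows * cols * K^2) cost explicit; this is the
    cost that matrix source coding attacks by factoring the dense operator
    into sparse transforms.
    """
    out = np.zeros_like(image, dtype=float)
    rows, cols = image.shape
    for i in range(rows):
        for j in range(cols):
            k = psf_at(i, j)
            kr, kc = k.shape[0] // 2, k.shape[1] // 2
            for di in range(-kr, kr + 1):
                for dj in range(-kc, kc + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        out[i, j] += k[di + kr, dj + kc] * image[ii, jj]
    return out

def psf_at(i, j):
    """Made-up kernel whose width grows slowly across the field (stray-light-like)."""
    width = 1.0 + 0.05 * j
    yy, xx = np.mgrid[-2:3, -2:3]
    g = np.exp(-(xx**2 + yy**2) / (2.0 * width**2))
    return g / g.sum()

print(space_varying_filter(np.eye(8), psf_at).round(2))
```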

  3. A rule-based expert system for generating control displays at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Coulter, K.J.

    1993-01-01

    The integration of a rule-based expert system for generating screen displays for controlling and monitoring instrumentation under the Experimental Physics and Industrial Control System (EPICS) is presented. The expert system is implemented using CLIPS, an expert system shell from the Software Technology Branch at Lyndon B. Johnson Space Center. The user selects the hardware input and output to be displayed and the expert system constructs a graphical control screen appropriate for the data. Such a system provides a method for implementing a common look and feel for displays created by several different users and reduces the amount of time required to create displays for new hardware configurations. Users are able to modify the displays as needed using the EPICS display editor tool

  4. A rule-based expert system for generating control displays at the advanced photon source

    International Nuclear Information System (INIS)

    Coulter, K.J.

    1994-01-01

    The integration of a rule-based expert system for generating screen displays for controlling and monitoring instrumentation under the Experimental Physics and Industrial Control System (EPICS) is presented. The expert system is implemented using CLIPS, an expert system shell from the Software Technology Branch at Lyndon B. Johnson Space Center. The user selects the hardware input and output to be displayed and the expert system constructs a graphical control screen appropriate for the data. Such a system provides a method for implementing a common look and feel for displays created by several different users and reduces the amount of time required to create displays for new hardware configurations. Users are able to modify the displays as needed using the EPICS display editor tool. ((orig.))

  5. Firework displays as sources of particles similar to gunshot residue.

    Science.gov (United States)

    Grima, Matthew; Butler, Mark; Hanson, Robert; Mohameden, Ahmed

    2012-03-01

    In light of past research being targeted to find specific particles which may be similar to gunshot residue (GSR), this project was formulated to detect any possible particulate by random particle fallout onto substrates at firework displays and to assess the impact this may have on GSR evidence. Firework residue was collected at a display site, from amongst spectators as well as from the author's hair 90min after the display. SEM-EDX analysis has detected such particulate in all three scenarios, with the firework particle population at large providing a solid ground for discrimination from GSR. Wind dispersal was found to decrease the particle population and subsequently, the latter's discriminatory power. Some particles, if treated individually were found to be indistinguishable from GSR. Findings also include residues which may mimic strontium based GSR as well as GSR which may be mixed with that from previous firings. The continuous changes made to primer and propellant compositions by manufacturers also call for greater consideration when classifying particles as originating from pyrotechnic devices. Furthermore, authorities such as police forces should be made more aware about the incidence of such particle transfer in firework related periods. Copyright © 2011 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  6. Pseudo real-time coded aperture imaging system with intensified vidicon cameras

    International Nuclear Information System (INIS)

    Han, K.S.; Berzins, G.J.

    1977-01-01

    A coded image displayed on a TV monitor was used to directly reconstruct a decoded image. Both the coded and the decoded images were viewed with intensified vidicon cameras. The coded aperture was a 15-element nonredundant pinhole array. The coding and decoding were accomplished simultaneously during the scanning of a single 16-msec TV frame

  7. Variable code gamma ray imaging system

    International Nuclear Information System (INIS)

    Macovski, A.; Rosenfeld, D.

    1979-01-01

    A gamma-ray source distribution in the body is imaged onto a detector using an array of apertures. The transmission of each aperture is modulated using a code such that the individual views of the source through each aperture can be decoded and separated. The codes are chosen to maximize the signal to noise ratio for each source distribution. These codes determine the photon collection efficiency of the aperture array. Planar arrays are used for volumetric reconstructions and circular arrays for cross-sectional reconstructions. 14 claims
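
    The encode/decode principle behind such aperture arrays can be simulated in a few lines: the detector records the source distribution convolved with the mask, and a correlation with the (sharply peaked autocorrelation) mask recovers the source positions. The random 0/1 mask below is only a stand-in for the modulated, SNR-optimized codes the patent describes, and periodic (FFT) boundaries are assumed.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def encode(obj, mask):
    """Detector image: each open mask element casts a shifted copy of the source."""
    return np.real(ifft2(fft2(obj) * fft2(mask)))            # circular convolution

def decode(detector, mask):
    """Correlate the detector image with the mask; the peaked mask
    autocorrelation turns this into an (imperfect) image of the source."""
    return np.real(ifft2(fft2(detector) * np.conj(fft2(mask))))

rng = np.random.default_rng(1)
mask = (rng.random((32, 32)) < 0.5).astype(float)             # stand-in aperture code
obj = np.zeros((32, 32))                                      # two point sources:
obj[8, 8] = 1.0
obj[20, 25] = 0.5
reconstruction = decode(encode(obj, mask), mask)
print(np.unravel_index(np.argmax(reconstruction), reconstruction.shape))  # -> (8, 8)
```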

  8. The IAEA code of conduct on the safety of radiation sources and the security of radioactive materials. A step forwards or backwards?

    International Nuclear Information System (INIS)

    Boustany, K.

    2001-01-01

    During the finalization of the Code of Conduct on the Safety and Security of Radioactive Sources, two distinct but interrelated subject areas were identified: the prevention of accidents involving radiation sources and the prevention of theft or any other unauthorized use of radioactive materials. What analysis reveals, rather, is that there are gaps in both the content of the Code and the processes relating to it. Nevertheless, new standards have been introduced as a result of this exercise and have thus, as an enactment of what constitutes appropriate behaviour in the field of the safety and security of radioactive sources, emerged into the arena of international relations. (N.C.)

  9. Combining Open-Source Packages for Planetary Exploration

    Science.gov (United States)

    Schmidt, Albrecht; Grieger, Björn; Völk, Stefan

    2015-04-01

    The science planning of the ESA Rosetta mission has presented challenges which were addressed by combining various open-source software packages, such as the SPICE toolkit, the Python language and the Web graphics library three.js. The challenge was to compute certain parameters from a pool of trajectories and (possible) attitudes to describe the behaviour of the spacecraft. To be able to do this declaratively and efficiently, a C library was implemented that allows the SPICE toolkit to be interfaced from the Python language for geometrical computations and that processes as much data as possible during one subroutine call. To minimise the lines of code one has to write, special care was taken to ensure that the bindings were idiomatic and thus integrate well into the Python language and ecosystem. When done well, this very much simplifies the structure of the code and facilitates the testing for correctness by automatic test suites and visual inspections. For rapid visualisation and confirmation of correctness of results, the geometries were visualised with the three.js library, a popular JavaScript library for displaying three-dimensional graphics in a Web browser. Programmatically, this was achieved by generating data files from SPICE sources that were included into templated HTML and displayed by a browser, thus made easily accessible to interested parties at large. As feedback came and new ideas were to be explored, the authors benefited greatly from the design of the Python-to-SPICE library, which allowed the expression of algorithms to be concise and easier to communicate. In summary, by combining several well-established open-source tools, we were able to put together a flexible computation and visualisation environment that helped communicate and build confidence in planning ideas.
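
    The authors' in-house Python-to-SPICE library is not shown here, but the flavour of such a geometry query can be sketched with the community binding spiceypy (an assumption of this sketch, not the library used in the paper). The meta-kernel file name and the target/observer names below are placeholders, not mission files.

```python
# Hedged sketch: geometry query via the spiceypy binding to the SPICE toolkit.
import spiceypy as spice

spice.furnsh("rosetta_metakernel.tm")          # placeholder meta-kernel listing LSK/SPK kernels
et = spice.str2et("2015-04-12T00:00:00")       # UTC string -> ephemeris time (seconds past J2000)

# Spacecraft position relative to the comet, J2000 frame, no aberration correction.
position, light_time = spice.spkpos("ROSETTA", et, "J2000", "NONE",
                                    "CHURYUMOV-GERASIMENKO")  # placeholder body names
print(position)                                 # [x, y, z] in km
spice.kclear()
```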

  10. SMILEI: A collaborative, open-source, multi-purpose PIC code for the next generation of super-computers

    Science.gov (United States)

    Grech, Mickael; Derouillat, J.; Beck, A.; Chiaramello, M.; Grassi, A.; Niel, F.; Perez, F.; Vinci, T.; Fle, M.; Aunai, N.; Dargent, J.; Plotnikov, I.; Bouchard, G.; Savoini, P.; Riconda, C.

    2016-10-01

    Over the last decades, Particle-In-Cell (PIC) codes have been central tools for plasma simulations. Today, new trends in High-Performance Computing (HPC) are emerging, dramatically changing HPC-relevant software design and putting some - if not most - legacy codes far behind the level of performance expected on the new and future massively-parallel supercomputers. SMILEI is a new open-source PIC code co-developed by both plasma physicists and HPC specialists, and applied to a wide range of physics-related studies: from laser-plasma interaction to astrophysical plasmas. It benefits from an innovative parallelization strategy that relies on a super-domain-decomposition allowing for enhanced cache-use and efficient dynamic load balancing. Beyond these HPC-related developments, SMILEI also benefits from additional physics modules for dealing with binary collisions, field and collisional ionization, and radiation back-reaction. This poster presents the SMILEI project and its HPC capabilities, and illustrates some of the physics problems tackled with SMILEI.

  11. Study of the source term of radiation of the CDTN GE-PET trace 8 cyclotron with the MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Benavente C, J. A.; Lacerda, M. A. S.; Fonseca, T. C. F.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R., E-mail: jhonnybenavente@gmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico)

    2015-10-15

    Full text: The knowledge of the neutron spectra in a PET cyclotron is important for the optimization of radiation protection of the workers and members of the public. The main objective of this work is to study the source term of radiation of the GE-PET trace 8 cyclotron of the Development Center of Nuclear Technology (CDTN/CNEN) using computer simulation by the Monte Carlo method. The MCNPX version 2.7 code was used to calculate the flux of neutrons produced from the interaction of the primary proton beam with the target body and other cyclotron components, during 18F production. The estimate of the source term and the corresponding radiation field was performed from the bombardment of a H{sub 2}{sup 18}O target with protons of 75 μA current and 16.5 MeV of energy. The values of the simulated fluxes were compared with those reported by the accelerator manufacturer (GE Healthcare Company). Results showed that the fluxes estimated with the MCNPX code were about 70% lower than those reported by the manufacturer. The mean energies of the neutrons were also different from those reported by GE Healthcare. It is recommended to investigate other cross-section data and the use of the physical models of the code itself for a complete characterization of the source term of radiation. (Author)

  12. Code-Mixing and Code Switchingin The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as to determine the factors influencing those forms of code switching and code mixing. The research takes the form of a descriptive qualitative case study which took place in Al Mawaddah Boarding School, Ponorogo. Based on the analysis and discussion stated in the previous chapters, the forms of code mixing and code switching in learning activities at Al Mawaddah Boarding School involve the use of the Javanese, Arabic, English and Indonesian languages, in the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. Factors determining code mixing in the learning process include: identification of the role, the desire to explain and interpret, material sourced from the original language and its variations, and material sourced from a foreign language. Factors determining code switching in the learning process include: the speaker (O1), the conversation partner (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in the Al Mawaddah boarding school, with regard to the rules and characteristic variation in the language of teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz / ustadzah and students in developing oral communication skills and the effectiveness of teaching and learning strategies in boarding schools.

  13. A new release of the S3M code

    International Nuclear Information System (INIS)

    Pavlovic, M.; Bokor, J.; Regodic, M.; Sagatova, A.

    2015-01-01

    This paper presents a new release of the code that contains some additional routines and advanced features for displaying the results. Special attention is paid to the processing of the SRIM range file, which was not included in the previous release of the code. Examples of distributions provided by the S3M code for implanted ions in thyroid and iron are presented. (authors)

  14. Analysis of source term aspects in the experiment Phebus FPT1 with the MELCOR and CFX codes

    Energy Technology Data Exchange (ETDEWEB)

    Martin-Fuertes, F. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain)]. E-mail: francisco.martinfuertes@upm.es; Barbero, R. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain); Martin-Valdepenas, J.M. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain); Jimenez, M.A. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain)

    2007-03-15

    Several aspects related to the source term in the Phebus FPT1 experiment have been analyzed with the help of MELCOR 1.8.5 and CFX 5.7 codes. Integral aspects covering circuit thermalhydraulics, fission product and structural material release, vapours and aerosol retention in the circuit and containment were studied with MELCOR, and the strong and weak points after comparison to experimental results are stated. Then, sensitivity calculations dealing with chemical speciation upon release, vertical line aerosol deposition and steam generator aerosol deposition were performed. Finally, detailed calculations concerning aerosol deposition in the steam generator tube are presented. They were obtained by means of an in-house code application, named COCOA, as well as with CFX computational fluid dynamics code, in which several models for aerosol deposition were implemented and tested, while the models themselves are discussed.

  15. Visual Attention to Radar Displays

    Science.gov (United States)

    Moray, N.; Richards, M.; Brophy, C.

    1984-01-01

    A model is described which predicts the allocation of attention to the features of a radar display. It uses the growth of uncertainty and the probability of near collision to call the eye to a feature of the display. The main source of uncertainty is forgetting following a fixation, which is modelled as a two dimensional diffusion process. The model was used to predict information overload in intercept controllers, and preliminary validation obtained by recording eye movements of intercept controllers in simulated and live (practice) interception.

  16. Data visualization for ONEDANT and TWODANT discrete ordinates codes

    International Nuclear Information System (INIS)

    Lee, C.L.

    1993-01-01

    Effective graphical display of code calculations allow for efficient analysis of results. This is especially true in the case of discrete ordinates transport codes, which can generate thousands of flux or reaction rate data points per calculation. For this reason, a package of portable interface programs called OTTUI (ONEDANT-TWODANT-Tecplot trademark Unix-Based Interface) has been developed at Los Alamos National Laboratory to permit rapid visualization of ONEDANT and TWODANT discrete ordinates results using the graphics package Tecplot. This paper describes the various uses of OTTUI for display of ONEDANT and TWODANT problem geometries and calculational results

  17. Mobile, hybrid Compton/coded aperture imaging for detection, identification and localization of gamma-ray sources at stand-off distances

    Science.gov (United States)

    Tornga, Shawn R.

    The Stand-off Radiation Detection System (SORDS) program is an Advanced Technology Demonstration (ATD) project through the Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) with the goal of detection, identification and localization of weak radiological sources in the presence of large dynamic backgrounds. The Raytheon-SORDS Tri-Modal Imager (TMI) is a mobile truck-based, hybrid gamma-ray imaging system able to quickly detect, identify and localize, radiation sources at standoff distances through improved sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of two imaging modalities; coded aperture and Compton scatter imaging. The TMI consists of 35 sodium iodide (NaI) crystals 5x5x2 in3 each, arranged in a random coded aperture mask array (CA), followed by 30 position sensitive NaI bars each 24x2.5x3 in3 called the detection array (DA). The CA array acts as both a coded aperture mask and scattering detector for Compton events. The large-area DA array acts as a collection detector for both Compton scattered events and coded aperture events. In this thesis, developed coded aperture, Compton and hybrid imaging algorithms will be described along with their performance. It will be shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than either alone. Since the TMI is a moving system, peripheral data, such as a Global Positioning System (GPS) and Inertial Navigation System (INS) must also be incorporated. A method of adapting static imaging algorithms to a moving platform has been developed. Also, algorithms were developed in parallel with detector hardware, through the use of extensive simulations performed with the Geometry and Tracking Toolkit v4 (GEANT4). Simulations have been well validated against measured data. Results of image reconstruction algorithms at various speeds and distances will be presented as well as
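
    To make the coded-aperture half of the reconstruction concrete, the sketch below builds a 1-D analogue in Python: the detector records the sky correlated with a random open/closed mask, and the image is recovered by correlating the recording with a +1/-1 decoding pattern derived from the mask. The mask, source position and array size are arbitrary illustrative choices, not the TMI configuration; real systems use designed mask families, 2-D geometry, and handle the Compton events separately.

        # 1-D coded-aperture encode/decode sketch (illustrative only).
        import numpy as np

        rng = np.random.default_rng(3)
        n = 64
        mask = rng.integers(0, 2, size=n)          # 1 = open element, 0 = opaque element
        sky = np.zeros(n); sky[20] = 100.0         # a single point source at element 20

        # Detector counts: circular correlation of the sky with the mask (ideal, noiseless).
        detector = np.array([np.dot(sky, np.roll(mask, -k)) for k in range(n)])

        # Decoding pattern G correlating with the mask to an approximate delta:
        # +1 for open elements, -1 for closed ones.
        G = 2.0 * mask - 1.0
        image = np.array([np.dot(detector, np.roll(G, -m)) for m in range(n)])
        print("reconstructed peak at element", int(np.argmax(image)), "(true source: 20)")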

  18. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  19. Visualizing Debugging Activity in Source Code Repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight in the

  20. PRIMUS: a computer code for the preparation of radionuclide ingrowth matrices from user-specified sources

    International Nuclear Information System (INIS)

    Hermann, O.W.; Baes, C.F. III; Miller, C.W.; Begovich, C.L.; Sjoreen, A.L.

    1984-10-01

    The computer program, PRIMUS, reads a library of radionuclide branching fractions and half-lives and constructs a decay-chain data library and a problem-specific decay-chain data file. PRIMUS reads the decay data compiled for 496 nuclides from the Evaluated Nuclear Structure Data File (ENSDF). The ease of adding radionuclides to the input library allows the CRRIS system to further expand its comprehensive data base. The decay-chain library produced is input to the ANEMOS code. Also, PRIMUS produces a data set reduced to only the decay chains required in a particular problem, for input to the SUMIT, TERRA, MLSOIL, and ANDROS codes. Air concentrations and deposition rates are used together with the PRIMUS decay-chain data file. Source term data may be entered directly to PRIMUS to be read by MLSOIL, TERRA, and ANDROS. The decay-chain data prepared by PRIMUS are needed for a matrix-operator method that computes time-dependent decay products either from an initial concentration or from a constant input source. This document describes the input requirements and the output obtained. Also, sections are included on methods, applications, subroutines, and sample cases. A short appendix indicates a method of utilizing PRIMUS and the associated decay subroutines from TERRA or ANDROS for applications to other decay problems. 18 references
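
    The matrix-operator idea mentioned above can be illustrated in a few lines of Python: write the chain's decay and ingrowth equations as dN/dt = A N and apply the matrix exponential of A to an initial inventory. The three-member chain, its half-lives and the initial inventory below are hypothetical placeholders, not data from the PRIMUS/ENSDF library.

        # Minimal matrix-operator sketch for decay-chain ingrowth (illustrative data).
        import numpy as np
        from scipy.linalg import expm

        half_lives = np.array([8.02, 2.3, 30.0])       # days, hypothetical chain members
        lam = np.log(2.0) / half_lives                 # decay constants (1/day)

        # dN/dt = A @ N: diagonal loss terms, sub-diagonal ingrowth from the parent.
        A = np.diag(-lam)
        for i in range(1, len(lam)):
            A[i, i - 1] = lam[i - 1]                   # 100 % branching assumed

        N0 = np.array([1.0e6, 0.0, 0.0])               # atoms of the parent only at t = 0
        for t in (1.0, 10.0, 100.0):                   # days
            N_t = expm(A * t) @ N0                     # matrix operator applied to N0
            print(f"t = {t:6.1f} d  ->  N = {N_t}")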

  1. Visualization of elastic wavefields computed with a finite difference code

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, S. [Lawrence Livermore National Lab., CA (United States); Harris, D.

    1994-11-15

    The authors have developed a finite difference elastic propagation model to simulate seismic wave propagation through geophysically complex regions. To facilitate debugging and to assist seismologists in interpreting the seismograms generated by the code, they have developed an X Windows interface that permits viewing of successive temporal snapshots of the (2D) wavefield as they are calculated. The authors present a brief video displaying the generation of seismic waves by an explosive source on a continent, which propagate to the edge of the continent then convert to two types of acoustic waves. This sample calculation was part of an effort to study the potential of offshore hydroacoustic systems to monitor seismic events occurring onshore.
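
    The snapshot mechanics described here are easy to mock up with a scalar (acoustic) stand-in for the elastic solver: a second-order finite-difference update driven by a point source, with each time step available for display. Grid spacing, velocity and the source wavelet below are arbitrary, and the boundaries are simply periodic; the real code solves the full elastic system with proper boundary treatment.

        # 2-D scalar-wave finite-difference sketch; each u_curr is one displayable snapshot.
        import numpy as np

        nx = nz = 101
        dx, dt, c = 10.0, 1.0e-3, 3000.0        # m, s, m/s (stable: c*dt/dx < 1/sqrt(2))
        u_prev = np.zeros((nz, nx)); u_curr = np.zeros((nz, nx))
        src_z, src_x = 50, 50

        for it in range(300):
            lap = (np.roll(u_curr, 1, 0) + np.roll(u_curr, -1, 0) +
                   np.roll(u_curr, 1, 1) + np.roll(u_curr, -1, 1) - 4.0 * u_curr) / dx**2
            u_next = 2.0 * u_curr - u_prev + (c * dt)**2 * lap
            t = it * dt                          # Ricker-like source injection (25 Hz)
            arg = (np.pi * 25.0 * (t - 0.04))**2
            u_next[src_z, src_x] += (1.0 - 2.0 * arg) * np.exp(-arg)
            u_prev, u_curr = u_curr, u_next
            if it % 100 == 0:
                print(f"step {it:3d}  max |u| = {np.abs(u_curr).max():.3e}")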

  2. GRAVE: An Interactive Geometry Construction and Visualization Software System for the TORT Nuclear Radiation Transport Code

    International Nuclear Information System (INIS)

    Blakeman, E.D.

    2000-01-01

    A software system, GRAVE (Geometry Rendering and Visual Editor), has been developed at the Oak Ridge National Laboratory (ORNL) to perform interactive visualization and development of models used as input to the TORT three-dimensional discrete ordinates radiation transport code. Three-dimensional and two-dimensional visualization displays are included. Display capabilities include image rotation, zoom, translation, wire-frame and translucent display, geometry cuts and slices, and display of individual component bodies and material zones. The geometry can be interactively edited and saved in TORT input file format. This system is an advancement over the current, non-interactive, two-dimensional display software. GRAVE is programmed in the Java programming language and can be implemented on a variety of computer platforms. Three-dimensional visualization is enabled through the Visualization Toolkit (VTK), a freeware C++ software library developed for geometric and data visual display. Future plans include an extension of the system to read inputs using binary zone maps and combinatorial geometry models containing curved surfaces, such as those used for Monte Carlo code inputs. Also GRAVE will be extended to geometry visualization/editing for the DORT two-dimensional transport code and will be integrated into a single GUI-based system for all of the ORNL discrete ordinates transport codes.

  3. BLT [Breach, Leach, and Transport]: A source term computer code for low-level waste shallow land burial

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1990-01-01

    This paper discusses the development of a source term model for low-level waste shallow land burial facilities and separates the problem into four individual compartments. These are water flow, corrosion and subsequent breaching of containers, leaching of the waste forms, and solute transport. For the first and the last compartments, we adopted the existing codes, FEMWATER and FEMWASTE, respectively. We wrote two new modules for the other two compartments in the form of two separate Fortran subroutines -- BREACH and LEACH. They were incorporated into a modified version of the transport code FEMWASTE. The resultant code, which contains all three modules of container breaching, waste form leaching, and solute transport, was renamed BLT (for Breach, Leach, and Transport). This paper summarizes the overall program structure and logistics, and presents two examples from the results of verification and sensitivity tests. 6 refs., 7 figs., 1 tab

  4. SNS online display technologies for EPICS

    International Nuclear Information System (INIS)

    Kasemir, K.U.; Chen, X.; Purcell, J.; Danilova, E.

    2012-01-01

    The ubiquity of web clients from personal computers to cell phones results in a growing demand for web-based access to control system data. At the Oak Ridge National Laboratory Spallation Neutron Source (SNS) we have investigated different technical approaches to provide read access to data in the Experimental Physics and Industrial Control System (EPICS) for a wide variety of web client devices. The core web technology, HTTP, is less than ideal for online control system displays. Appropriate use of Ajax, especially the Long Poll paradigm, can alleviate fundamental HTTP limitations. The SNS Status web uses basic Ajax technology to generate generic displays for a wide audience. The Dashboard uses Long Poll and more client-side JavaScript to offer more customization and faster updates for users that need specialized displays. The Web OPI uses RAP for web access to any BOY display, offering utmost flexibility because users can create their own BOY displays in CSS (Control System Studio). These three approaches complement each other. Users can access generic status displays with zero effort, invest time in creating their fully customized displays for the Web OPI, or use the Dashboard as an intermediate solution.
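
    The Long Poll pattern itself is simple enough to sketch from the client side: hold an HTTP request open until the server reports a change (or a timeout expires), process the update, and immediately re-issue the request. The endpoint URL and JSON field names below are hypothetical stand-ins, not the actual SNS Status or Dashboard API.

        # Long-poll client loop (hypothetical endpoint and payload layout).
        import time
        import requests

        URL = "http://example.org/pv/updates"    # hypothetical long-poll endpoint
        last_id = 0

        while True:
            try:
                # The server is assumed to block for up to ~30 s waiting for new data.
                resp = requests.get(URL, params={"since": last_id}, timeout=35)
                resp.raise_for_status()
                for update in resp.json().get("updates", []):
                    last_id = max(last_id, update["id"])
                    print(update["pv"], update["value"])
            except requests.RequestException:
                time.sleep(5)                     # back off briefly, then re-poll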

  5. Visualizing Debugging Activity in Source Code Repositories

    OpenAIRE

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight in the bug evolution in time.

  6. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful work station. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on the BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...

  7. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
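
    The document-oriented approach can be pictured with a tiny, invented XML fragment: the hierarchy of chapter, block, category and subcategory carries both the codes and their structural relations, and a standard XML parser is all that is needed to walk it. The codes and titles below are illustrative placeholders, not an excerpt of the official ICD-10.

        # Parse a made-up hierarchical classification fragment with the standard library.
        import xml.etree.ElementTree as ET

        xml_doc = """
        <chapter code="IV" title="Endocrine diseases">
          <block code="E00-E07" title="Disorders of thyroid gland">
            <category code="E03" title="Other hypothyroidism">
              <subcategory code="E03.9" title="Hypothyroidism, unspecified"/>
            </category>
          </block>
        </chapter>
        """

        root = ET.fromstring(xml_doc)
        for node in root.iter():
            print(f"{node.tag:12s} {node.get('code'):8s} {node.get('title')}")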

  8. Version 4.00 of the MINTEQ geochemical code

    Energy Technology Data Exchange (ETDEWEB)

    Eary, L.E.; Jenne, E.A.

    1992-09-01

    The MINTEQ code is a thermodynamic model that can be used to calculate solution equilibria for geochemical applications. Included in the MINTEQ code are formulations for ionic speciation, ion exchange, adsorption, solubility, redox, gas-phase equilibria, and the dissolution of finite amounts of specified solids. Since the initial development of the MINTEQ geochemical code, a number of undocumented versions of the source code and data files have come into use at the Pacific Northwest Laboratory (PNL). This report documents these changes, describes source code modifications made for the Aquifer Thermal Energy Storage (ATES) program, and provides comprehensive listings of the data files. A version number of 4.00 has been assigned to the MINTEQ source code and the individual data files described in this report.

  9. Version 4.00 of the MINTEQ geochemical code

    Energy Technology Data Exchange (ETDEWEB)

    Eary, L.E.; Jenne, E.A.

    1992-09-01

    The MINTEQ code is a thermodynamic model that can be used to calculate solution equilibria for geochemical applications. Included in the MINTEQ code are formulations for ionic speciation, ion exchange, adsorption, solubility, redox, gas-phase equilibria, and the dissolution of finite amounts of specified solids. Since the initial development of the MINTEQ geochemical code, a number of undocumented versions of the source code and data files have come into use at the Pacific Northwest Laboratory (PNL). This report documents these changes, describes source code modifications made for the Aquifer Thermal Energy Storage (ATES) program, and provides comprehensive listings of the data files. A version number of 4.00 has been assigned to the MINTEQ source code and the individual data files described in this report.

  10. Computer code FIT

    International Nuclear Information System (INIS)

    Rohmann, D.; Koehler, T.

    1987-02-01

    This is a description of the computer code FIT, written in FORTRAN-77 for a PDP 11/34. FIT is an interactive program to deduce position, width and intensity of lines of X-ray spectra (max. length of 4K channels). The lines (max. 30 lines per fit) may have Gauss- or Voigt-profile, as well as exponential tails. Spectrum and fit can be displayed on a Tektronix terminal. (orig.)

  11. Network coding for multi-resolution multicast

    DEFF Research Database (Denmark)

    2013-01-01

    A method, apparatus and computer program product for utilizing network coding for multi-resolution multicast is presented. A network source partitions source content into a base layer and one or more refinement layers. The network source receives a respective one or more push-back messages from one...... or more network destination receivers, the push-back messages identifying the one or more refinement layers suited for each one of the one or more network destination receivers. The network source computes a network code involving the base layer and the one or more refinement layers for at least one...... of the one or more network destination receivers, and transmits the network code to the one or more network destination receivers in accordance with the push-back messages....
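
    The algebra behind such a scheme can be demonstrated at toy scale with random linear network coding over GF(2): the source emits random XOR combinations of the base-layer and refinement-layer packets a given receiver has asked for, and the receiver recovers the originals by Gaussian elimination once it holds enough independent combinations. The packet sizes, layer counts and GF(2) arithmetic below are simplifications for illustration, not the construction described in the record above.

        # Random linear network coding of base + refinement packets over GF(2).
        import numpy as np

        rng = np.random.default_rng(0)

        def gf2_solve(A, B):
            """Solve A X = B over GF(2) by Gauss-Jordan elimination (A invertible)."""
            A = A.copy() % 2; B = B.copy() % 2
            n = A.shape[0]
            for col in range(n):
                pivot = next(r for r in range(col, n) if A[r, col])
                A[[col, pivot]] = A[[pivot, col]]; B[[col, pivot]] = B[[pivot, col]]
                for r in range(n):
                    if r != col and A[r, col]:
                        A[r] ^= A[col]; B[r] ^= B[col]
            return B

        # Source packets: 2 base-layer + 2 refinement-layer packets of 8 bits each.
        packets = rng.integers(0, 2, size=(4, 8), dtype=np.uint8)

        # A receiver whose push-back asked for base + refinement gets 4 coded packets,
        # each a random GF(2) combination of all 4 source packets.
        coeffs = rng.integers(0, 2, size=(4, 4), dtype=np.uint8)
        # The integer determinant mod 2 equals the GF(2) determinant, so this rejects
        # coefficient matrices that are singular over GF(2).
        while int(round(np.linalg.det(coeffs))) % 2 == 0:
            coeffs = rng.integers(0, 2, size=(4, 4), dtype=np.uint8)
        coded = (coeffs @ packets) % 2

        decoded = gf2_solve(coeffs, coded)
        assert np.array_equal(decoded, packets)
        print("receiver recovered base + refinement layers")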

  12. Pre-Test Analysis of the MEGAPIE Spallation Source Target Cooling Loop Using the TRAC/AAA Code

    International Nuclear Information System (INIS)

    Bubelis, Evaldas; Coddington, Paul; Leung, Waihung

    2006-01-01

    A pilot project is being undertaken at the Paul Scherrer Institute in Switzerland to test the feasibility of installing a Lead-Bismuth Eutectic (LBE) spallation target in the SINQ facility. Efforts are coordinated under the MEGAPIE project, the main objectives of which are to design, build, operate and decommission a 1 MW spallation neutron source. The technology and experience of building and operating a high power spallation target are of general interest in the design of an Accelerator Driven System (ADS) and in this context MEGAPIE is one of the key experiments. The target cooling is one of the important aspects of the target system design that needs to be studied in detail. Calculations were performed previously using the RELAP5/Mod 3.2.2 and ATHLET codes, but in order to verify the previous code results and to provide another capability to model LBE systems, a similar study of the MEGAPIE target cooling system has been conducted with the TRAC/AAA code. In this paper a comparison is presented for the steady-state results obtained using the above codes. Analysis of transients, such as unregulated cooling of the target, loss of heat sink, the main electro-magnetic pump trip of the LBE loop and unprotected proton beam trip, were studied with TRAC/AAA and compared to those obtained earlier using RELAP5/Mod 3.2.2. This work extends the existing validation data-base of TRAC/AAA to heavy liquid metal systems and comprises the first part of the TRAC/AAA code validation study for LBE systems based on data from the MEGAPIE test facility and corresponding inter-code comparisons. (authors)

  13. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of which coder is used for any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
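
    The threshold-driven, variable-block-size idea can be sketched in a few lines: code each block with a handful of DCT coefficients, measure the distortion, and split the block into four quadrants whenever the distortion exceeds a threshold. The number of kept coefficients, the threshold and the synthetic "image" below are arbitrary choices for the sketch, not the rates or data of the paper.

        # Toy mixture-block-coding recursion: coarse DCT code, subdivide if too distorted.
        import numpy as np
        from scipy.fft import dctn, idctn

        def code_block(block, keep=8):
            """Keep only the `keep` largest-magnitude DCT coefficients of a block."""
            coeffs = dctn(block, norm="ortho")
            flat = np.abs(coeffs).ravel()
            cut = np.sort(flat)[-keep] if keep < flat.size else 0.0
            coeffs[np.abs(coeffs) < cut] = 0.0
            return idctn(coeffs, norm="ortho")

        def mbc(block, max_mse=25.0, min_size=4):
            recon = code_block(block)
            mse = float(np.mean((block - recon) ** 2))
            if mse <= max_mse or block.shape[0] <= min_size:
                return recon
            h = block.shape[0] // 2                       # too distorted: recurse on quadrants
            out = np.empty_like(block)
            for i in (0, h):
                for j in (0, h):
                    out[i:i + h, j:j + h] = mbc(block[i:i + h, j:j + h], max_mse, min_size)
            return out

        rng = np.random.default_rng(1)
        tile = rng.normal(128.0, 20.0, size=(16, 16))     # stand-in for an image tile
        recon = mbc(tile)
        print("overall MSE:", float(np.mean((tile - recon) ** 2)))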

  14. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated on the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can easily be adapted to any other lattice code. The description of the code assumes a basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors arising from different sources of library data. (author)

  15. Theoretical Atomic Physics code development IV: LINES, A code for computing atomic line spectra

    International Nuclear Information System (INIS)

    Abdallah, J. Jr.; Clark, R.E.H.

    1988-12-01

    A new computer program, LINES, has been developed for simulating atomic line emission and absorption spectra using the accurate fine structure energy levels and transition strengths calculated by the (CATS) Cowan Atomic Structure code. Population distributions for the ion stages are obtained in LINES by using the Local Thermodynamic Equilibrium (LTE) model. LINES is also useful for displaying the pertinent atomic data generated by CATS. This report describes the use of LINES. Both CATS and LINES are part of the Theoretical Atomic PhysicS (TAPS) code development effort at Los Alamos. 11 refs., 9 figs., 1 tab

  16. Modelling and Display of the Ultraviolet Sky

    Science.gov (United States)

    Daniels, J.; Henry, R.; Murthy, J.; Allen, M.; McGlynn, T. A.; Scollick, K.

    1994-12-01

    A computer program is currently under development to model in 3D - one dimension of which is wavelength - all the known and major speculated sources of ultraviolet (900 A - 3100 A) radiation over the celestial sphere. The software is being written in Fortran 77 and IDL and currently operates under IRIX (the operating system of the Silicon Graphics Iris Machine); all output models are in FITS format. Models along with display software will become available to the astronomical community. The Ultraviolet Sky Model currently includes the Zodiacal Light, Point Sources of Emission, and the Diffuse Galactic Light. The Ultraviolet Sky Model is currently displayed using SkyView: a package under development at NASA/GSFC, which allows users to retrieve and display publicly available all-sky astronomical survey data (covering many wavebands) over the Internet. We present a demonstration of the SkyView display of the Ultraviolet Model. The modelling is a five year development project: the work illustrated here represents product output at the end of year one. Future work includes enhancements to the current models and incorporation of the following models: Galactic Molecular Hydrogen Fluorescence; Galactic Highly Ionized Atomic Line Emission; Integrated Extragalactic Light; and speculated sources in the intergalactic medium such as Ionized Plasma and radiation from Non-Baryonic Particle Decay. We also present a poster which summarizes the components of the Ultraviolet Sky Model and outlines a further package that will be used to display the Ultraviolet Model. This work is supported by United States Air Force Contract F19628-93-K-0004. Dr J. Daniels is supported with a post-doctoral Fellowship from the Leverhulme Foundation, London, United Kingdom. We are also grateful for the encouragement of Dr Stephen Price (Phillips Laboratory, Hanscom Air Force Base, MA).

  17. Full dynamic resolution low power DA-Converters for flat panel displays

    Directory of Open Access Journals (Sweden)

    C. Saas

    2006-01-01

    It has been shown that stepwise charging can reduce the power dissipated in the source drivers of a flat panel display. However, the solution presented provided only a dynamic resolution of 3 bits, which is not sufficient for obtaining a full color resolution display. In this work a further development of the basic idea is presented. The stepwise charging is increased to 4 bits and supplemented by a current source to provide an output signal which represents an 8 bit value with sufficient accuracy. Within this work the application is an AM-OLED flat panel display, but the concept can easily be applied to other display technologies like TFT-LCD as well.

  18. Flat panel planar optic display. Revision 4/95

    Energy Technology Data Exchange (ETDEWEB)

    Veligdan, J.T.

    1995-05-01

    A prototype 10-inch flat panel planar optic display (POD) screen has been constructed and tested. This display screen is composed of hundreds of planar optic glass sheets bonded together with a cladding layer between each sheet, where each glass sheet represents a vertical line of resolution. The display is 9 inches wide by 5 inches high and approximately 1 inch thick. A 3 milliwatt HeNe laser is used as the illumination source and a vector scanning technique is employed.

  19. NIST display colorimeter calibration facility

    Science.gov (United States)

    Brown, Steven W.; Ohno, Yoshihiro

    2003-07-01

    A facility has been developed at the National Institute of Standards and Technology (NIST) to provide calibration services for color-measuring instruments to address the need for improving and certifying the measurement uncertainties of this type of instrument. While NIST has active programs in photometry, flat panel display metrology, and color and appearance measurements, these are the first services offered by NIST tailored to color-measuring instruments for displays. An overview of the facility, the calibration approach, and associated uncertainties are presented. Details of a new tunable colorimetric source and the development of new transfer standard instruments are discussed.

  20. RIES - Rijnland Internet Election System: A Cursory Study of Published Source Code

    Science.gov (United States)

    Gonggrijp, Rop; Hengeveld, Willem-Jan; Hotting, Eelco; Schmidt, Sebastian; Weidemann, Frederik

    The Rijnland Internet Election System (RIES) is a system designed for voting in public elections over the internet. A rather cursory scan of the source code to RIES showed a significant lack of security-awareness among the programmers which - among other things - appears to have left RIES vulnerable to near-trivial attacks. If it had not been for independent studies finding problems, RIES would have been used in the 2008 Water Board elections, possibly handling a million votes or more. While RIES was more extensively studied to find cryptographic shortcomings, our work shows that more down-to-earth secure design practices can be at least as important, and the aspects need to be examined much sooner than right before an election.

  1. Use of WIMS-E lattice code for prediction of the transuranic source term for spent fuel dose estimation

    International Nuclear Information System (INIS)

    Schwinkendorf, K.N.

    1996-01-01

    A recent source term analysis has shown a discrepancy between ORIGEN2 transuranic isotopic production estimates and those produced with the WIMS-E lattice physics code. Excellent agreement between relevant experimental measurements and WIMS-E was shown, thus exposing an error in the cross section library used by ORIGEN2

  2. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked with some of these; the level of agreement necessary being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, the capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks are included in the source code and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients; large integral experiments; and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer

  3. Use of CITATION code for flux calculation in neutron activation analysis with voluminous sample using an Am-Be source

    International Nuclear Information System (INIS)

    Khelifi, R.; Idiri, Z.; Bode, P.

    2002-01-01

    The CITATION code based on neutron diffusion theory was used for flux calculations inside voluminous samples in prompt gamma activation analysis with an isotopic neutron source (Am-Be). The code uses specific parameters related to the energy spectrum source and irradiation system materials (shielding, reflector). The flux distribution (thermal and fast) was calculated in the three-dimensional geometry for the system: air, polyethylene and water cuboidal sample (50x50x50 cm). Thermal flux was calculated in a series of points inside the sample. The results agreed reasonably well with observed values. The maximum thermal flux was observed at a distance of 3.2 cm while CITATION gave 3.7 cm. Beyond a depth of 7.2 cm, the thermal flux to fast flux ratio increases up to twice and allows us to optimise the detection system position in the scope of in-situ PGAA
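
    The kind of flux profile discussed above can be reproduced qualitatively with a one-group, one-dimensional diffusion solve: -D d²φ/dx² + Σa φ = S(x), discretized by finite differences. The slab size, material constants and source profile below are placeholder numbers, not the Am-Be/polyethylene/water data of the study.

        # 1-D, one-group neutron diffusion with a localized source (illustrative constants).
        import numpy as np

        L, n = 50.0, 200                       # slab thickness (cm), number of nodes
        dx = L / (n - 1)
        D, sigma_a = 0.9, 0.02                 # diffusion coeff. (cm), absorption (1/cm)

        S = np.zeros(n)
        S[0:5] = 1.0e4                         # source near one face (n/cm3/s)

        # Tridiagonal system with phi = 0 on both boundaries.
        A = np.zeros((n, n))
        for i in range(1, n - 1):
            A[i, i - 1] = -D / dx**2
            A[i, i]     = 2 * D / dx**2 + sigma_a
            A[i, i + 1] = -D / dx**2
        A[0, 0] = A[-1, -1] = 1.0              # Dirichlet boundaries
        b = S.copy(); b[0] = b[-1] = 0.0

        phi = np.linalg.solve(A, b)
        print("depth of maximum flux: %.1f cm" % (np.argmax(phi) * dx))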

  4. LID: Computer code for identifying atomic and ionic lines below 3500 Angstroms

    International Nuclear Information System (INIS)

    Peek, J.M.; Dukart, R.J.

    1987-08-01

    An interactive computer code has been written to search a data base containing information useful for identifying lines in experimentally-observed spectra or for designing experiments. The data base was the basis for the Kelly and Palumbo critical review of well-resolved lines below 2000 Angstroms; it includes lines below 3500 Angstroms for atoms and ions of hydrogen through krypton, and was obtained from R.L. Kelly. This code allows the user to search the data base for a user-specified wavelength region, with the search either limited to atoms or ions of the user's choice or covering all atoms and ions contained in the data base. The line information found in the search is stored in a local file for later reference. A plotting capability is provided to graphically display the lines resulting from the search. Several options are available to control the nature of these graphs. It is also possible to bring in data from another source, such as an experimental spectrum, for display along with the lines from the data-base search. Options for manipulating the experimental spectrum's background intensity and wavelength scale are also available to the user. The intensities for the lines from each ion found in the data-base search can be scaled by a multiplicative constant to better simulate the observed spectrum
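
    The search operation itself amounts to filtering a line list by wavelength window and, optionally, by ion, as in the sketch below. The few entries shown are placeholders standing in for the compiled data base, and the search function is an invented illustration, not the LID interface.

        # Filter a tiny line list by wavelength region and ion, sorted by wavelength.
        from dataclasses import dataclass

        @dataclass
        class Line:
            ion: str
            wavelength: float   # Angstroms
            intensity: float    # arbitrary relative units

        DATABASE = [
            Line("C IV",  1548.2, 100.0),
            Line("C IV",  1550.8,  50.0),
            Line("He II", 1640.4,  80.0),
            Line("O III", 3132.8,  20.0),
        ]

        def search(lo, hi, ions=None):
            hits = [l for l in DATABASE
                    if lo <= l.wavelength <= hi and (ions is None or l.ion in ions)]
            return sorted(hits, key=lambda l: l.wavelength)

        for line in search(1500.0, 1700.0, ions={"C IV", "He II"}):
            print(f"{line.ion:6s} {line.wavelength:8.1f} A  {line.intensity:6.1f}")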

  5. T-38 Primary Flight Display Prototyping and HIVE Support Abstract & Summary

    Science.gov (United States)

    Boniface, Andrew

    2015-01-01

    Complete navigation, control, and function customization will be achievable once a display is fully developed. Other than the T-38 prototyping, I spent time learning how to design small circuits and write code for them to function. This was done by adding electronic circuit components to a breadboard and microcontroller, then writing code to speak to those components through the microcontroller. I went through an Arduino starter kit to build circuits and code software that allowed the hardware to act. This work was planned to assist in a lighting project this fall, but another solution was discovered for that project. Other tasks that I assisted with included hands-on work such as mock-up construction/removal, logic analyzer repairs, and soldering circuits. The unique opportunity to be involved in work with NASA has significantly changed my educational and career goals. This opportunity has only opened the door to my career in engineering. I have learned over the span of this internship that I am fascinated by the type of work that NASA does. My desire to work in the aerospace industry has increased immensely. I hope to return to NASA to be more involved in the advancement of science, engineering, and spaceflight. My interests for my future education and career lie in NASA's work - pioneering the future in space exploration, scientific discovery and aeronautics research.

  6. Microprocessor based beam intensity and efficiency display system for the Fermilab accelerator

    International Nuclear Information System (INIS)

    Biwer, R.

    1979-01-01

    The Main Accelerator display system for the Fermilab accelerator gathers charge data and displays it, including processed transfer efficiencies for each of the accelerators. To accomplish this, strategically located charge converters monitor the circulating internal beam of each of the Fermilab accelerators. Their outputs are processed via an asynchronously triggered, multiplexed analog-to-digital converter. The data are converted into a digital byte containing an address code and data, which is then stored in two 16-bit memories. One memory outputs the interleaved data as a data pulse train while the other interfaces directly to a local host computer for further analysis. The microprocessor-based display unit synchronizes displayed data during normal operation as well as in special storage modes. The display unit outputs data to the front panel in the form of a numeric value and also makes digital-to-analog conversions of displayed data for external peripheral devices. 5 refs

  7. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
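
    In its simplest form, stabilized linear inversion of the kind applied here is damped (Tikhonov-regularized) least squares: minimize ||G m - d||² + α² ||m||², solved through the damped normal equations. The forward matrix, data and damping parameter below are synthetic placeholders, not Nevada Test Site gravity data.

        # Damped least-squares inversion of a synthetic linear problem.
        import numpy as np

        rng = np.random.default_rng(2)
        n_data, n_model = 40, 25
        G = rng.normal(size=(n_data, n_model))            # forward (sensitivity) matrix
        m_true = np.zeros(n_model); m_true[10:15] = 1.0   # a simple "anomaly"
        d = G @ m_true + rng.normal(scale=0.05, size=n_data)

        alpha = 0.5                                       # damping (stabilization) parameter
        lhs = G.T @ G + alpha**2 * np.eye(n_model)        # damped normal equations
        m_est = np.linalg.solve(lhs, G.T @ d)

        print("model misfit:", float(np.linalg.norm(m_est - m_true)))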

  8. LED-driven backlights for automotive displays

    Science.gov (United States)

    Strauch, Frank

    2007-09-01

    As a light source the LED has some advantages over the traditionally used fluorescent tube, such as longer life and lower space consumption. Consequently, customers are asking for LED lighting designs in their products. We introduced white LED technology in a company-owned backlight. This step opens up access to the components in the display market. Instead of having a finalized display product which needs to be integrated into the head unit of a car, we assemble the backlight, the glass, our own electronics and the housing. A major advantage of this concept is better control of the heat flow generated by the LEDs towards the outside, because only a common housing is used for all the components. The requirement for slim products can also be fulfilled. As always, a new technology does not come with advantages only. An LED represents a point source compared to the well-known tube, thus requiring a mixing zone for the multiple point sources when they enter a light guide. This zone cannot be used for display because of the lack of homogeneity. It is a design goal to minimize this zone, which can be helped by the right choice of LED in terms of slimness. A step ahead is the implementation of RGB LEDs because of their higher color rendering abilities. This allows the chromaticity point to be controlled under temperature changes but, as a drawback, needs a larger mixing zone.

  9. BrachyTPS -Interactive point kernel code package for brachytherapy treatment planning of gynaecological cancers

    International Nuclear Information System (INIS)

    Thilagam, L.; Subbaiah, K.V.

    2008-01-01

    Brachytherapy treatment planning systems (TPS) are always recommended to account for the effect of the tissue, applicator and shielding material heterogeneities that exist in intracavitary brachytherapy (ICBT) applicators. Most commercially available brachytherapy TPS software estimates the absorbed dose at a point by taking into account only the contributions of the individual sources and the source distribution, neglecting the dose perturbations arising from the applicator design and construction. The doses they estimate are therefore not very accurate under realistic clinical conditions. In this regard, an interactive point kernel code (BrachyTPS) has been developed to perform independent dose calculations that take the effect of these heterogeneities into account, using the two-region build-up factors proposed by Kalos. As primary input data, the code takes the patient's planning data, including the source specifications, dwell positions and dwell times, and it computes the doses at reference points by dose point kernel formalisms, with multi-layer shield build-up factors accounting for the contributions from scattered radiation. In addition to performing dose distribution calculations, this code package is capable of displaying an isodose distribution curve on the patient anatomy images. The primary aim of this study is to validate the developed point kernel code, integrated with treatment planning systems, against the other tools available on the market. In the present work, three brachytherapy applicators commonly used in the treatment of uterine cervical carcinoma, namely a Board of Radiation and Isotope Technology (BRIT) made low dose rate (LDR) applicator, a Fletcher Green type LDR applicator and a Fletcher Williamson high dose rate (HDR) applicator, were studied to test the accuracy of the software.
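
    Stripped to its core, a point-kernel dose sum of the type described accumulates, over all dwell positions, a term proportional to dwell time times exp(-μr) times a build-up factor, divided by r². The sketch below uses a crude linear build-up factor and invented source, attenuation and dwell data purely for illustration; it is not the BrachyTPS implementation and must not be read as clinical dosimetry.

        # Point-kernel dose sum over dwell positions (all constants illustrative).
        import numpy as np

        MU = 0.11                  # effective attenuation coefficient (1/cm), placeholder
        S_K = 4.0e4                # source strength constant, arbitrary units

        def buildup(mu_r):
            return 1.0 + 0.9 * mu_r          # crude placeholder; Kalos two-region factors differ

        def dose_at(point, dwells):
            """dwells: iterable of (position xyz in cm, dwell time in s)."""
            total = 0.0
            for pos, t in dwells:
                r = np.linalg.norm(np.asarray(point) - np.asarray(pos))
                total += S_K * t * np.exp(-MU * r) * buildup(MU * r) / r**2
            return total

        dwells = [((0.0, 0.0, z), 10.0) for z in np.linspace(-1.0, 1.0, 5)]
        print("dose at a reference point:", dose_at((2.0, 0.0, 0.0), dwells))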

  10. Digital Display Integration Project Project Online 2.0

    CERN Document Server

    Bardsley, J N

    1999-01-01

    The electronic display industry is changing in three important ways. First, the dominance of the cathode ray tube (CRT) is being challenged by the development of flat panel displays (FPDs). This will lead to the availability of displays of higher performance, albeit at greater cost. Secondly, the analog interfaces between displays that show data and the computers that generate the data are being replaced by digital connections. Finally, a high-resolution display is becoming the most expensive component in a computer system for homes and small offices. It is therefore desirable that the useful lifetime of the display extend over several years and that the electronics allows the display to be used with many different image sources. Hopefully, the necessity of having three or four large CRTs in one office to accommodate different computer operating systems or communication protocols will soon disappear. Instead, we hope to see a set of flat panels that can be switched to show several independent images from multip...

  11. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Yidong Xia; Mitch Plummer; Robert Podgorney; Ahmad Ghassemi

    2016-02-01

    The performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that the heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes that are either closed-source or commercially available in this area, this new open-source code has demonstrated a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  12. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  13. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult.
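
    The structure of such an S-factor calculation reduces to a short sum: for each radiation emitted per decay, multiply yield, energy and the absorbed fraction from source to target, add the contributions, and divide by the target-organ mass. The emission data and absorbed fractions below are invented placeholders, and the unit constant is quoted from the standard MIRD-type formalism as the editor recalls it; none of this reproduces the SFACTOR tabulations.

        # Skeleton of an S-factor (rem per uCi-day) calculation with placeholder data.
        emissions = [                  # (yield per decay, mean energy in MeV, absorbed fraction)
            (0.85, 0.364, 0.30),       # a gamma line, partially absorbed in the target
            (0.90, 0.192, 1.00),       # a beta, assumed fully absorbed locally
        ]
        target_mass_g = 20.0           # hypothetical target-organ mass

        K = 51.2                       # g-rem/(uCi-day-MeV) at quality factor 1; treat as an assumption
        S = K * sum(y * E * af for y, E, af in emissions) / target_mass_g
        print(f"S = {S:.3e} rem per uCi-day of residence")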

  14. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult

  15. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

    Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control, but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods for an approximation of a review expert's performance evaluation function. Due to limitations in ...

  16. Recycling source terms for edge plasma fluid models and impact on convergence behaviour of the BRAAMS 'B2' code

    International Nuclear Information System (INIS)

    Maddison, G.P.; Reiter, D.

    1994-02-01

    Predictive simulations of tokamak edge plasmas require the most authentic description of neutral particle recycling sources, not merely the most expedient numerically. Employing a prototypical ITER divertor arrangement under conditions of high recycling, trial calculations with the 'B2' steady-state edge plasma transport code, plus varying approximations of recycling, reveal marked sensitivity of both the results and the convergence behaviour to the details of the sources incorporated. Comprehensive EIRENE Monte Carlo resolution of recycling is implemented by full and so-called 'shot' intermediate cycles between the plasma fluid and statistical neutral particle models. As generally for coupled differencing and stochastic procedures, though, overall convergence properties become more difficult to assess. A pragmatic criterion for the 'B2'/EIRENE code system is proposed to determine its success, proceeding from a stricter condition previously identified for one particular analytic approximation of recycling in 'B2'. Certain procedures that could potentially improve convergence further are also identified. (orig.)

  17. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 main frames as well as on a Micro VAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 x 10⁻⁶. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  18. FLP: a field line plotting code for bundle divertor design

    International Nuclear Information System (INIS)

    Ruchti, C.

    1981-01-01

    A computer code was developed to aid in the design of bundle divertors. The code can handle discrete toroidal field coils and various divertor coil configurations. All coils must be composed of straight line segments. The code runs on the PDP-10 and displays plots of the configuration, field lines, and field ripple. It automatically chooses the coil currents to connect the separatrix produced by the divertor to the outer edge of the plasma and calculates the required coil cross sections. Several divertor designs are illustrated to show how the code works
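
    The two basic operations of such a code are easy to show in miniature: evaluate B from coils made of straight segments (Biot-Savart, here by chopping each segment into short elements) and step a field line along the local field direction. The single square "coil", the current and the step sizes below are arbitrary, and the simple Euler stepping stands in for whatever integrator FLP actually uses.

        # Field from straight segments plus naive field-line tracing (illustrative only).
        import numpy as np

        MU0_4PI = 1.0e-7   # mu0 / (4 pi), SI units

        def b_field(point, segments, current=1.0e4, nsub=50):
            B = np.zeros(3)
            for a, b in segments:
                a, b = np.asarray(a, float), np.asarray(b, float)
                dl = (b - a) / nsub
                for s in range(nsub):
                    mid = a + (s + 0.5) * dl             # midpoint of this sub-element
                    r = point - mid
                    B += MU0_4PI * current * np.cross(dl, r) / np.linalg.norm(r) ** 3
            return B

        # One square coil in the z = 0 plane with corners at (+-1, +-1, 0).
        c = [(-1, -1, 0), (1, -1, 0), (1, 1, 0), (-1, 1, 0)]
        segments = [(c[i], c[(i + 1) % 4]) for i in range(4)]

        p = np.array([0.1, 0.0, 0.05])                   # start slightly off the coil axis
        for step in range(201):
            B = b_field(p, segments)
            p = p + 0.02 * B / np.linalg.norm(B)         # Euler step along the unit field vector
            if step % 50 == 0:
                print(f"step {step:3d}: point = {np.round(p, 3)}")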

  19. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
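
    As a concrete taste of the source-coding material, the snippet below builds the Huffman code for a small source and compares its average length with the source entropy, which Shannon's noiseless coding theorem bounds from below. The symbol probabilities are arbitrary.

        # Huffman code construction and comparison with the source entropy.
        import heapq
        from math import log2

        probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}

        # Heap items: (probability, tie-breaker, {symbol: code-so-far}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c1.items()}      # prepend 0 to the first subtree
            merged.update({s: "1" + c for s, c in c2.items()})  # and 1 to the second
            heapq.heappush(heap, (p1 + p2, counter, merged)); counter += 1

        code = heap[0][2]
        avg_len = sum(probs[s] * len(code[s]) for s in probs)
        entropy = -sum(p * log2(p) for p in probs.values())
        print(code)
        print(f"average length {avg_len:.2f} bits vs entropy {entropy:.2f} bits")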

  20. A device for displaying defects in concrete

    International Nuclear Information System (INIS)

    Zouboff, Vadim; Darnault, Claude; Leloup, J.-C.

    1973-01-01

    The device comprises a common gamma source located on one side of the concrete block to be examined and, on the opposite side, a detecting unit comprising a collimator and a photomultiplier detector connected to a display unit and moving along rails parallel to the concrete block face. The device is used for displaying concrete defects, in particular injection deficiencies in the pre-stressing sheaths of the concrete used in the building of bridges or tunnels.

  1. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. The for- ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x written as ...

  2. Active Fault Near-Source Zones Within and Bordering the State of California for the 1997 Uniform Building Code

    Science.gov (United States)

    Petersen, M.D.; Toppozada, Tousson R.; Cao, T.; Cramer, C.H.; Reichle, M.S.; Bryant, W.A.

    2000-01-01

    The fault sources in the Project 97 probabilistic seismic hazard maps for the state of California were used to construct maps for defining near-source seismic coefficients, Na and Nv, incorporated in the 1997 Uniform Building Code (ICBO 1997). The near-source factors are based on the distance from a known active fault that is classified as either Type A or Type B. To determine the near-source factor, four pieces of geologic information are required: (1) recognizing a fault and determining whether or not the fault has been active during the Holocene, (2) identifying the location of the fault at or beneath the ground surface, (3) estimating the slip rate of the fault, and (4) estimating the maximum earthquake magnitude for each fault segment. This paper describes the information used to produce the fault classifications and distances.

  3. Methods and apparatus for transparent display using scattering nanoparticles

    Science.gov (United States)

    Hsu, Chia Wei; Qiu, Wenjun; Zhen, Bo; Shapira, Ofer; Soljacic, Marin

    2016-05-10

    Transparent displays enable many useful applications, including heads-up displays for cars and aircraft as well as displays on eyeglasses and glass windows. Unfortunately, transparent displays made of organic light-emitting diodes are typically expensive and opaque. Heads-up displays often require fixed light sources and have limited viewing angles. And transparent displays that use frequency conversion are typically energy inefficient. Conversely, the present transparent displays operate by scattering visible light from resonant nanoparticles with narrowband scattering cross sections and small absorption cross sections. More specifically, projecting an image onto a transparent screen doped with nanoparticles that selectively scatter light at the image wavelength(s) yields an image on the screen visible to an observer. Because the nanoparticles scatter light at only certain wavelengths, the screen is practically transparent under ambient light. Exemplary transparent scattering displays can be simple, inexpensive, scalable to large sizes, viewable over wide angular ranges, energy efficient, and transparent simultaneously.

  4. The "Wow! signal" of the terrestrial genetic code

    Science.gov (United States)

    shCherbak, Vladimir I.; Makukov, Maxim A.

    2013-05-01

    It has been repeatedly proposed to expand the scope for SETI, and one of the suggested alternatives to radio is the biological media. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, but stronger in noise immunity is the genetic code. The code is a flexible mapping between codons and amino acids, and this flexibility allows modifying the code artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. Therefore it represents an exceptionally reliable storage for an intelligent signature, if that conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from being settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10⁻¹³). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantical symmetries. Besides, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin. Plausible ways of

  5. On-line data display

    Science.gov (United States)

    Lang, Sherman Y. T.; Brooks, Martin; Gauthier, Marc; Wein, Marceli

    1993-05-01

    A data display system for embedded realtime systems has been developed for use as an operator's user interface and debugging tool. The motivation for development of the On-Line Data Display (ODD) has come from several sources. In particular, the design reflects the needs of researchers developing an experimental mobile robot within our laboratory. A proliferation of specialized user interfaces revealed a need for a flexible communications and graphical data display system. At the same time the system had to be readily extensible for arbitrary graphical display formats which would be required for the data visualization needs of the researchers. The system defines a communication protocol transmitting 'datagrams' between tasks executing on the realtime system and virtual devices displaying the data in a meaningful way on a graphical workstation. The communication protocol multiplexes logical channels on a single data stream. The current implementation consists of a server for the Harmony realtime operating system and an application written for the Macintosh computer. Flexibility requirements resulted in a highly modular server design, and a layered modular object-oriented design for the Macintosh part of the system. Users assign data types to specific channels at run time. Then devices are instantiated by the user and connected to channels to receive datagrams. The current suite of device types does not provide enough functionality for most users' specialized needs. Instead the system design allows the creation of new device types with modest programming effort. The protocol, design and use of the system are discussed.
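
    The record does not reproduce the protocol itself, but the idea of multiplexing logical channels over one data stream is easy to sketch. The Python fragment below is a hypothetical illustration only: the header layout, field sizes and names are assumptions, not the ODD wire format. Each datagram carries a channel id so that a display device bound to that channel can pick it up.

        import struct

        # assumed datagram layout: 16-bit channel id, 16-bit type tag, 32-bit payload length
        HEADER = struct.Struct("!HHI")

        def pack_datagram(channel: int, dtype: int, payload: bytes) -> bytes:
            """Prefix a payload with the multiplexing header."""
            return HEADER.pack(channel, dtype, len(payload)) + payload

        def unpack_datagram(frame: bytes):
            """Recover (channel, type, payload) from a received frame."""
            channel, dtype, length = HEADER.unpack_from(frame)
            return channel, dtype, frame[HEADER.size:HEADER.size + length]

        frame = pack_datagram(channel=3, dtype=1, payload=b"\x00\x01\x02\x03")
        print(unpack_datagram(frame))   # a device subscribed to channel 3 would render this datagram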

  6. A new LED light source for display cases

    DEFF Research Database (Denmark)

    Dam-Hansen, Carsten; Petersen, Paul Michael

    We report a new LED light source suitable for illumination of gold objects. It has a variable correlated color temperature from 2760 K to 2200 K with a high color rendering index up to 97.

  7. Recent advances in coding theory for near error-free communications

    Science.gov (United States)

    Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.

    1991-01-01

    Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.

  8. Expression and surface display of Cellulomonas endoglucanase in the ethanologenic bacterium Zymobacter palmae

    Energy Technology Data Exchange (ETDEWEB)

    Kojima, Motoki; Akahoshi, Tomohiro; Okamoto, Kenji; Yanase, Hideshi [Tottori Univ. (Japan). Dept. of Chemistry and Biotechnology

    2012-11-15

    In order to reduce the cost of bioethanol production from lignocellulosic biomass, we developed a tool for cell surface display of cellulolytic enzymes on the ethanologenic bacterium Zymobacter palmae. Z. palmae is a novel ethanol-fermenting bacterium capable of utilizing a broad range of sugar substrates, but not cellulose. Therefore, to express and display heterologous cellulolytic enzymes on the Z. palmae cell surface, we utilized the cell-surface display motif of the Pseudomonas ice nucleation protein Ina. The gene encoding Ina from Pseudomonas syringae IFO3310 was cloned, and its product was comprised of three functional domains: an N-terminal domain, a central domain with repeated amino acid residues, and a C-terminal domain. The N-terminal domain of Ina was shown to function as the anchoring motif for a green fluorescence protein fusion protein in Escherichia coli. To express a heterologous cellulolytic enzyme extracellularly in Z. palmae, we fused the N-terminal coding sequence of Ina to the coding sequence of an N-terminal-truncated Cellulomonas endoglucanase. Z. palmae cells carrying the fusion endoglucanase gene were shown to degrade carboxymethyl cellulose. Although a portion of the expressed fusion endoglucanase was released from Z. palmae cells into the culture broth, we confirmed the display of the protein on the cell surface by immunofluorescence microscopy. The results indicate that the N-terminal anchoring motif of Ina from P. syringae enabled the translocation and display of the heterologous cellulase on the cell surface of Z. palmae. (orig.)

  9. Generic programming for deterministic neutron transport codes

    International Nuclear Information System (INIS)

    Plagne, L.; Poncot, A.

    2005-01-01

    This paper discusses the implementation of neutron transport codes via generic programming techniques. Two different Boltzmann equation approximations have been implemented, namely the Sn and SPn methods. This implementation experiment shows that generic programming allows us to improve maintainability and readability of source codes with no performance penalties compared to classical approaches. In the present implementation, matrices and vectors as well as linear algebra algorithms are treated separately from the rest of source code and gathered in a tool library called 'Generic Linear Algebra Solver System' (GLASS). Such a code architecture, based on a linear algebra library, allows us to separate the three different scientific fields involved in transport codes design: numerical analysis, reactor physics and computer science. Our library handles matrices with optional storage policies and thus applies both to Sn code, where the matrix elements are computed on the fly, and to SPn code where stored matrices are used. Thus, using GLASS allows us to share a large fraction of source code between Sn and SPn implementations. Moreover, the GLASS high level of abstraction allows the writing of numerical algorithms in a form which is very close to their textbook descriptions. Hence the GLASS algorithms collection, disconnected from computer science considerations (e.g. storage policy), is very easy to read, to maintain and to extend. (authors)
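
    The separation the abstract describes, generic algorithms written against a linear-algebra interface with the storage policy supplied separately, can be mimicked in a short sketch. The Python below is only an illustration of that design idea under invented names; it is not GLASS code and not the Sn/SPn solvers themselves.

        import numpy as np

        class StoredMatrix:
            """SPn-style policy: elements are precomputed and kept in memory."""
            def __init__(self, a):
                self.a = np.asarray(a, dtype=float)
            def matvec(self, x):
                return self.a @ x

        class OnTheFlyMatrix:
            """Sn-style policy: elements are generated when requested instead of stored."""
            def __init__(self, n, element):
                self.n, self.element = n, element
            def matvec(self, x):
                return np.array([sum(self.element(i, j) * x[j] for j in range(self.n))
                                 for i in range(self.n)])

        def jacobi(op, diag, b, iters=200):
            """Generic solver written only against the matvec interface (the algorithm layer)."""
            x = np.zeros_like(b)
            for _ in range(iters):
                x = x + (b - op.matvec(x)) / diag
            return x

        # the same algorithm runs unchanged with either storage policy
        a = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(jacobi(StoredMatrix(a), np.diag(a), b))                          # stored-matrix policy
        print(jacobi(OnTheFlyMatrix(2, lambda i, j: a[i, j]), np.diag(a), b))  # on-the-fly policy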

  10. Recent advances in the Poisson/superfish codes

    International Nuclear Information System (INIS)

    Ryne, R.; Barts, T.; Chan, K.C.D.; Cooper, R.; Deaven, H.; Merson, J.; Rodenz, G.

    1992-01-01

    We report on advances in the POISSON/SUPERFISH family of codes used in the design and analysis of magnets and rf cavities. The codes include preprocessors for mesh generation and postprocessors for graphical display of output and calculation of auxiliary quantities. Release 3 became available in January 1992; it contains many code corrections and physics enhancements, and it also includes support for PostScript, DISSPLA, GKS and PLOT10 graphical output. Release 4 will be available in September 1992; it is free of all bit packing, making the codes more portable and able to treat very large numbers of mesh points. Release 4 includes the preprocessor FRONT and a new menu-driven graphical postprocessor that runs on workstations under X-Windows and that is capable of producing arrow plots. We will present examples that illustrate the new capabilities of the codes. (author). 6 refs., 3 figs

  11. Low-Complexity Compression Algorithm for Hyperspectral Images Based on Distributed Source Coding

    Directory of Open Access Journals (Sweden)

    Yongjian Nian

    2013-01-01

    Full Text Available A low-complexity compression algorithm for hyperspectral images based on distributed source coding (DSC) is proposed in this paper. The proposed distributed compression algorithm can realize both lossless and lossy compression, which is implemented by performing a scalar quantization strategy on the original hyperspectral images followed by distributed lossless compression. A multilinear regression model is introduced for distributed lossless compression in order to improve the quality of the side information. The optimal quantization step is determined according to the restriction of correct DSC decoding, which allows the proposed algorithm to achieve near-lossless compression. Moreover, an effective rate-distortion algorithm is introduced for the proposed algorithm to achieve a low bit rate. Experimental results show that the compression performance of the proposed algorithm is competitive with that of the state-of-the-art compression algorithms for hyperspectral images.
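
    The two ingredients named in the abstract, scalar quantization and regression-based side information, can be shown with a toy numerical sketch. The Python below uses synthetic data and invented parameters; it illustrates the Slepian-Wolf intuition (the decoder only has to resolve the small mismatch between a quantized band and its predicted side information) rather than the authors' actual algorithm.

        import numpy as np

        rng = np.random.default_rng(0)
        # toy "hyperspectral" cube: four spectrally correlated bands of 32x32 pixels
        base = rng.normal(size=(32, 32))
        cube = np.stack([base * (1 + 0.05 * b) + 0.01 * rng.normal(size=(32, 32)) for b in range(4)])

        step = 0.05                                   # scalar quantization step
        q = np.round(cube / step).astype(int)         # quantized indices per band

        # side information for band 3 predicted from bands 0-2 by multilinear regression
        X = cube[:3].reshape(3, -1).T                 # one pixel per row, earlier bands as predictors
        y = cube[3].reshape(-1)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        side_info = (X @ coef).reshape(32, 32)

        # mismatch between the quantized band and its quantized side information
        residual = q[3] - np.round(side_info / step).astype(int)
        print("residual index range:", residual.min(), residual.max())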

  12. Living Up to the Code's Exhortations? Social Workers' Political Knowledge Sources, Expectations, and Behaviors.

    Science.gov (United States)

    Felderhoff, Brandi Jean; Hoefer, Richard; Watson, Larry Dan

    2016-01-01

    The National Association of Social Workers' (NASW's) Code of Ethics urges social workers to engage in political action. However, little recent research has been conducted to examine whether social workers support this admonition and the extent to which they actually engage in politics. The authors gathered data from a survey of social workers in Austin, Texas, to address three questions. First, because keeping informed about government and political news is an important basis for action, the authors asked what sources of knowledge social workers use. Second, they asked what the respondents believe are appropriate political behaviors for other social workers and NASW. Third, they asked for self-reports regarding respondents' own political behaviors. Results indicate that social workers use the Internet and traditional media services to stay informed; expect other social workers and NASW to be active; and are, overall, more active than the general public in many types of political activities. The comparisons made between expectations for others and their own behaviors are interesting in their complex outcomes. Social workers should strive for higher levels of adherence to the code's urgings on political activity. Implications for future work are discussed.

  13. Safety analysis code input automation using the Nuclear Plant Data Bank

    International Nuclear Information System (INIS)

    Kopp, H.; Leung, J.; Tajbakhsh, A.; Viles, F.

    1985-01-01

    The Nuclear Plant Data Bank (NPDB) is a computer-based system that organizes a nuclear power plant's technical data, providing mechanisms for data storage, retrieval, and computer-aided engineering analysis. Its specific objective is to describe thermohydraulic systems in order to support rapid information retrieval and display, and thermohydraulic analysis modeling. The NPDB system fully automates the storage and analysis based on these data. The system combines the benefits of a structured data base system and computer-aided modeling with links to large-scale codes for engineering analysis. Emphasis on a friendly and very graphically oriented user interface facilitates both initial use and longer-term efficiency. Specific features are: organization and storage of thermohydraulic data items, ease in locating specific data items, graphical and tabular display capabilities, interactive model construction, organization and display of model input parameters, input deck construction for TRAC and RELAP analysis programs, and traceability of plant data, user model assumptions, and codes used in the input deck construction process. The major accomplishments of this past year were the development of a RELAP model generation capability and the development of a CRAY version of the code.

  14. Whether and Where to Code in the Wireless Relay Channel

    DEFF Research Database (Denmark)

    Shi, Xiaomeng; Médard, Muriel; Roetter, Daniel Enrique Lucani

    2013-01-01

    The throughput benefits of random linear network codes have been studied extensively for wirelined and wireless erasure networks. It is often assumed that all nodes within a network perform coding operations. In energy-constrained systems, however, coding subgraphs should be chosen to control the number of coding nodes while maintaining throughput. In this paper, we explore the strategic use of network coding in the wireless packet erasure relay channel according to both throughput and energy metrics. In the relay channel, a single source communicates to a single sink through the aid of a half-duplex relay. The fluid flow model is used to describe the case where both the source and the relay are coding, and Markov chain models are proposed to describe packet evolution if only the source or only the relay is coding. In addition to transmission energy, we take into account coding and reception...

  15. Recent advances in neutral particle transport methods and codes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    An overview of ORNL's three-dimensional neutral particle transport code, TORT, is presented. Special features of the code that make it invaluable for large applications are summarized for the prospective user. Advanced capabilities currently under development and installation in the production release of TORT are discussed; they include: multitasking on Cray platforms running the UNICOS operating system; the Adjacent-cell Preconditioning acceleration scheme; and graphics codes for displaying computed quantities such as the flux. Further developments for TORT and its companion codes to enhance its present capabilities, as well as to expand its range of applications, are discussed. Speculation on the next generation of neutral particle transport codes at ORNL, especially regarding unstructured grids and high-order spatial approximations, is also mentioned.

  16. Panoramic, large-screen, 3-D flight display system design

    Science.gov (United States)

    Franklin, Henry; Larson, Brent; Johnson, Michael; Droessler, Justin; Reinhart, William F.

    1995-01-01

    The report documents and summarizes the results of the required evaluations specified in the SOW and the design specifications for the selected display system hardware. Also included are the proposed development plan and schedule as well as the estimated rough order of magnitude (ROM) cost to design, fabricate, and demonstrate a flyable prototype research flight display system. The thrust of the effort was development of a complete understanding of the user/system requirements for a panoramic, collimated, 3-D flyable avionic display system and the translation of the requirements into an acceptable system design for fabrication and demonstration of a prototype display in the early 1997 time frame. Eleven display system design concepts were presented to NASA LaRC during the program, one of which was down-selected to a preferred display system concept. A set of preliminary display requirements was formulated. The state of the art in image source technology, 3-D methods, collimation methods, and interaction methods for a panoramic, 3-D flight display system were reviewed in depth and evaluated. Display technology improvements and risk reductions associated with maturity of the technologies for the preferred display system design concept were identified.

  17. Network Distributed Data Acquisition, Storage, and Graphical Live Display Software for a Laser Ion Source at CERN

    CERN Document Server

    Rossel, Ralf Erik; Rothe, Sebastian

    2014-01-01

    This project documentation outlines the requirements and implementation details for the measurement data recording software currently in development for the Resonance Ionisation Laser Ion Source (RILIS) at CERN. The software is capable of acquiring data from multiple laser parameter monitoring devices and associating the gathered values to represent qualitative and quantitative measurements. The measurement data is displayed graphically within the program and recorded to files for later analysis. The main application of the software is the acquisition coordination and recording of measurement data during spectroscopy experiments performed by RILIS and collaborating experiments. This document describes the design concept and detailed program implementation status at the end of July 2014 and provides an outlook to future developments in RILIS spectroscopy data acquisition.

  18. Avian leukosis virus is a versatile eukaryotic platform for polypeptide display

    International Nuclear Information System (INIS)

    Khare, Pranay D.; Russell, Stephen J.; Federspiel, Mark J.

    2003-01-01

    Display technology refers to methods of generating libraries of modularly coded biomolecules and screening them for particular properties. Retroviruses are good candidates to be a eukaryotic viral platform for the display of polypeptides synthesized in eukaryotic cells. Here we demonstrate that avian leukosis virus (ALV) provides an ideal platform for display of nonviral polypeptides expressed in a eukaryotic cell substrate. Different sizes of polypeptides were genetically fused to the extreme N-terminus of the ALV envelope glycoprotein in an ALV infectious clone containing an alkaline phosphatase reporter gene. The chimeric envelope glycoproteins were efficiently incorporated into virions and were stably displayed on the surface of the virions through multiple virus replication cycles. The foreign polypeptides did not interfere with the attachment and entry functions of the underlying ALV envelope glycoproteins. The displayed polypeptides were fully functional and could efficiently mediate attachment of the recombinant viruses to their respective cognate receptors. This study demonstrates that ALV is an ideal display platform for the generation and selection of libraries of polypeptides where there is a need for expression, folding, and posttranslational modification in the endoplasmic reticulum of eukaryotic cells.

  19. Binary Systematic Network Coding for Progressive Packet Decoding

    OpenAIRE

    Jones, Andrew L.; Chatzigeorgiou, Ioannis; Tassi, Andrea

    2015-01-01

    We consider binary systematic network codes and investigate their capability of decoding a source message either in full or in part. We carry out a probability analysis, derive closed-form expressions for the decoding probability and show that systematic network coding outperforms conventional network coding. We also develop an algorithm based on Gaussian elimination that allows progressive decoding of source packets. Simulation results show that the proposed decoding algorithm can achieve ...
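
    As a rough illustration of the progressive Gaussian-elimination decoding mentioned above, the Python sketch below builds a tiny systematic code over GF(2) and reports which source packets are already recoverable after each arrival. Packet counts, the erasure pattern and all names are invented for the example; this is not the authors' algorithm or analysis.

        import numpy as np

        rng = np.random.default_rng(1)
        K, L = 4, 8                        # K source packets of L bits each
        source = rng.integers(0, 2, size=(K, L), dtype=np.uint8)

        def encode(coeffs):
            """XOR-combine the source packets selected by a binary coefficient vector."""
            return (coeffs @ source) % 2

        # systematic phase: the K source packets themselves, then random coded packets
        coeff_stream = [np.eye(K, dtype=np.uint8)[i] for i in range(K)]
        coeff_stream += [rng.integers(0, 2, size=K, dtype=np.uint8) for _ in range(4)]

        A = np.zeros((0, K), dtype=np.uint8)     # received coefficient rows
        B = np.zeros((0, L), dtype=np.uint8)     # received payload rows
        for i, c in enumerate(coeff_stream):
            if i in (1, 4):                      # simulate two erased transmissions
                continue
            A, B = np.vstack([A, c]), np.vstack([B, encode(c)])
            M = np.hstack([A, B]).copy()         # progressive Gaussian elimination over GF(2)
            r = 0
            for col in range(K):
                piv = next((j for j in range(r, M.shape[0]) if M[j, col]), None)
                if piv is None:
                    continue
                M[[r, piv]] = M[[piv, r]]
                for j in range(M.shape[0]):
                    if j != r and M[j, col]:
                        M[j] ^= M[r]
                r += 1
            decoded = [col for col in range(K)
                       if (M[:, :K] == np.eye(K, dtype=np.uint8)[col]).all(axis=1).any()]
            print(f"after packet {i}: recovered source packets {decoded}")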

  20. Bit-wise arithmetic coding for data compression

    Science.gov (United States)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
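
    The core idea, fixed-length codewords for the quantizer output with the codeword bits then entropy-coded as if they were independent, can be checked numerically. The Python below is a loose illustration under assumed parameters, not the article's coder: it compares the rate implied by independent per-bit coding with the true symbol entropy for a quantized Gaussian source.

        import numpy as np

        rng = np.random.default_rng(2)
        samples = rng.normal(size=100_000)                               # IID Gaussian source
        step = 0.5
        idx = np.clip(np.round(samples / step).astype(int) + 8, 0, 15)   # 4-bit fixed-length codewords

        bits = ((idx[:, None] >> np.arange(4)) & 1).astype(float)        # codeword bit planes

        def h(p):
            """Binary entropy in bits."""
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

        p_one = bits.mean(axis=0)                    # P(bit = 1) for each bit position
        rate_bitwise = h(p_one).sum()                # rate if every bit is coded independently
        pmf = np.bincount(idx, minlength=16) / idx.size
        entropy = -(pmf[pmf > 0] * np.log2(pmf[pmf > 0])).sum()
        print(f"bit-wise rate  : {rate_bitwise:.3f} bits/sample")
        print(f"symbol entropy : {entropy:.3f} bits/sample")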

  1. Microgravity computing codes. User's guide

    Science.gov (United States)

    1982-01-01

    Codes used in microgravity experiments to compute fluid parameters and to obtain data graphically are introduced. The computer programs are stored on two diskettes, compatible with the floppy disk drives of the Apple 2. Two versions of both disks are available (DOS-2 and DOS-3). The codes are written in BASIC and are structured as interactive programs. Interaction takes place through the keyboard of any Apple 2-48K standard system with single floppy disk drive. The programs are protected against wrong commands given by the operator. The programs are described step by step in the same order as the instructions displayed on the monitor. Most of these instructions are shown, with samples of computation and of graphics.

  2. Computing and Displaying Isosurfaces in R

    Directory of Open Access Journals (Sweden)

    Dai Feng

    2008-09-01

    Full Text Available This paper presents R utilities for computing and displaying isosurfaces, or three-dimensional contour surfaces, from a three-dimensional array of function values. A version of the marching cubes algorithm that takes into account face and internal ambiguities is used to compute the isosurfaces. Vectorization is used to ensure adequate performance using only R code. Examples are presented showing contours of theoretical densities, density estimates, and medical imaging data. Rendering can use the rgl package or standard or grid graphics, and a set of tools for representing and rendering surfaces using standard or grid graphics is presented.
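
    The record describes pure-R utilities; as a rough Python counterpart (an assumption about the reader's toolchain, not the package the paper presents), the same marching-cubes idea is available through scikit-image.

        import numpy as np
        from skimage import measure   # assumed to be installed; not the R code described above

        # sample a function on a 3-D grid and extract its 0.5-level isosurface
        x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
        vol = np.exp(-(x**2 + 2 * y**2 + 3 * z**2))       # anisotropic Gaussian "density"

        verts, faces, normals, values = measure.marching_cubes(vol, level=0.5)
        print(verts.shape, faces.shape)                   # triangle mesh approximating the contour surface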

  3. The JSpecView Project: an Open Source Java viewer and converter for JCAMP-DX, and XML spectral data files

    Directory of Open Access Journals (Sweden)

    Lancashire Robert J

    2007-12-01

    Full Text Available Abstract The JSpecView Open Source project began with the intention of providing both a teaching and research tool for the display of JCAMP-DX spectra. The development of the Java source code commenced under license in 2001 and was released as Open Source in March 2006. The scope was then broadened to take advantage of the XML initiative in Chemistry and routines to read and write AnIML and CMLspect documents were added. JSpecView has the ability to display the full range of JCAMP-DX formats and protocols and to display multiple spectra simultaneously. As an aid for the interpretation of spectra it was found useful to offer routines such that if any part of the spectral display is clicked, that region can be highlighted and the (x, y) coordinates returned. This is conveniently handled using calls from JavaScript and the feedback results can be used to initiate links to other applets like Jmol, to generate a peak table, or even to load audio clips providing helpful hints. Whilst the current user base is still small, there are a number of sites that already feature the applet. A tutorial video showing how to examine NMR spectra using JSpecView has appeared on YouTube and was formatted for replay on iPods and it has been incorporated into a chemistry search engine.

  4. Coding For Compression Of Low-Entropy Data

    Science.gov (United States)

    Yeh, Pen-Shu

    1994-01-01

    Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.

  5. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

    A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with the degrading probability of predicting the source of the qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, are equally exceptional qualities presented by the code in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  6. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  7. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  8. Office of Codes and Standards resource book. Section 1, Building energy codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Hattrup, M.P.

    1995-01-01

    The US Department of Energy's (DOE's) Office of Codes and Standards has developed this Resource Book to provide: a discussion of DOE involvement in building codes and standards; a current and accurate set of descriptions of residential, commercial, and Federal building codes and standards; information on State contacts, State code status, State building construction unit volume, and State needs; and a list of stakeholders in the building energy codes and standards arena. The Resource Book is considered an evolving document and will be updated occasionally. Users are requested to submit additional data (e.g., more current, widely accepted, and/or documented data) and suggested changes to the address listed below. Please provide sources for all data provided.

  9. Uniform LED illuminator for miniature displays

    Science.gov (United States)

    Medvedev, Vladimir; Pelka, David G.; Parkyn, William A.

    1998-10-01

    The Total Internally Reflecting (TIR) lens is a faceted structure composed of prismatic elements that collect a source's light over a much larger angular range than a conventional Fresnel lens. It has been successfully applied to the efficient collimation of light from incandescent and fluorescent lamps, and from light-emitting diodes (LEDs). A novel LED-powered collimating backlight is presented here, for uniformly illuminating 0.25'-diagonal miniature liquid-crystal displays, which are a burgeoning market for pagers, cellular phones, digital cameras, camcorders, and virtual-reality displays. The backlight lens consists of a central dual-asphere refracting section and an outer TIR section, properly curved with a curved exit face.

  10. Projection display technology for avionics applications

    Science.gov (United States)

    Kalmanash, Michael H.; Tompkins, Richard D.

    2000-08-01

    Avionics displays often require custom image sources tailored to demanding program needs. Flat panel devices are attractive for cockpit installations, however recent history has shown that it is not possible to sustain a business manufacturing custom flat panels in small volume specialty runs. As the number of suppliers willing to undertake this effort shrinks, avionics programs unable to utilize commercial-off-the-shelf (COTS) flat panels are placed in serious jeopardy. Rear projection technology offers a new paradigm, enabling compact systems to be tailored to specific platform needs while using a complement of COTS components. Projection displays enable improved performance, lower cost and shorter development cycles based on inter-program commonality and the wide use of commercial components. This paper reviews the promise and challenges of projection technology and provides an overview of Kaiser Electronics' efforts in developing advanced avionics displays using this approach.

  11. 3D display system using monocular multiview displays

    Science.gov (United States)

    Sakamoto, Kunio; Saruta, Kazuki; Takeda, Kazutoki

    2002-05-01

    A 3D head-mounted display (HMD) system is useful for constructing a virtual space. The authors have researched virtual-reality systems connected with computer networks for real-time remote control and have developed a low-priced real-time 3D display for building these systems. We developed a 3D HMD system using monocular multi-view displays. The 3D display technique of this monocular multi-view display is based on the concept of the super multi-view proposed by Kajiki at TAO (Telecommunications Advancement Organization of Japan) in 1996. Our 3D HMD has two monocular multi-view displays (used as a visual display unit) in order to display a picture to the left eye and the right eye. The left and right images are a pair of stereoscopic images for the left and right eyes; thus, stereoscopic 3D images are observed.

  12. MARE2DEM: a 2-D inversion code for controlled-source electromagnetic and magnetotelluric data

    Science.gov (United States)

    Key, Kerry

    2016-10-01

    This work presents MARE2DEM, a freely available code for 2-D anisotropic inversion of magnetotelluric (MT) data and frequency-domain controlled-source electromagnetic (CSEM) data from onshore and offshore surveys. MARE2DEM parametrizes the inverse model using a grid of arbitrarily shaped polygons, where unstructured triangular or quadrilateral grids are typically used due to their ease of construction. Unstructured grids provide significantly more geometric flexibility and parameter efficiency than the structured rectangular grids commonly used by most other inversion codes. Transmitter and receiver components located on topographic slopes can be tilted parallel to the boundary so that the simulated electromagnetic fields accurately reproduce the real survey geometry. The forward solution is implemented with a goal-oriented adaptive finite-element method that automatically generates and refines unstructured triangular element grids that conform to the inversion parameter grid, ensuring accurate responses as the model conductivity changes. This dual-grid approach is significantly more efficient than the conventional use of a single grid for both the forward and inverse meshes since the more detailed finite-element meshes required for accurate responses do not increase the memory requirements of the inverse problem. Forward solutions are computed in parallel with a highly efficient scaling by partitioning the data into smaller independent modeling tasks consisting of subsets of the input frequencies, transmitters and receivers. Non-linear inversion is carried out with a new Occam inversion approach that requires fewer forward calls. Dense matrix operations are optimized for memory and parallel scalability using the ScaLAPACK parallel library. Free parameters can be bounded using a new non-linear transformation that leaves the transformed parameters nearly the same as the original parameters within the bounds, thereby reducing non-linear smoothing effects. Data

  13. Circular displays: control/display arrangements and stereotype strength with eight different display locations.

    Science.gov (United States)

    Chan, Alan H S; Hoffmann, Errol R

    2015-01-01

    Two experiments are reported that were designed to investigate control/display arrangements having high stereotype strengths when using circular displays. Eight display locations relative to the operator and control were tested with rotational and translational controls situated on different planes according to the Frame of Reference Transformation Tool (FORT) model of Wickens et al. (2010). (Left. No, Right! Development of the Frame of Reference Transformation Tool (FORT), Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting, 54: 1022-1026). In many cases, there was little effect of display locations, indicating the importance of the Worringham and Beringer (1998. Directional stimulus-response compatibility: a test of three alternative principles. Ergonomics, 41(6), 864-880) Visual Field principle and an extension of this principle for rotary controls (Hoffmann and Chan (2013). The Worringham and Beringer 'visual field' principle for rotary controls. Ergonomics, 56(10), 1620-1624). The initial indicator position (12, 3, 6 and 9 o'clock) had a major effect on control/display stereotype strength for many of the six controls tested. Best display/control arrangements are listed for each of the different control types (rotational and translational) and for the planes on which they are mounted. Data have application where a circular display is used due to limited display panel space and applies to space-craft, robotics operators, hospital equipment and home appliances. Practitioner Summary: Circular displays are often used when there is limited space available on a control panel. Display/control arrangements having high stereotype strength are listed for four initial indicator positions. These arrangements are best for design purposes.

  14. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  15. Code portability and data management considerations in the SAS3D LMFBR accident-analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1981-01-01

    The SAS3D code was produced from a predecessor in order to reduce or eliminate interrelated problems in the areas of code portability, the large size of the code, inflexibility in the use of memory and the size of cases that can be run, code maintenance, and running speed. Many conventional solutions, such as variable dimensioning, disk storage, virtual memory, and existing code-maintenance utilities were not feasible or did not help in this case. A new data management scheme was developed, coding standards and procedures were adopted, special machine-dependent routines were written, and a portable source code processing code was written. The resulting code is quite portable, quite flexible in the use of memory and the size of cases that can be run, much easier to maintain, and faster running. SAS3D is still a large, long running code that only runs well if sufficient main memory is available

  16. SKEMA - A computer code to estimate atmospheric dispersion

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1985-01-01

    This computer code is a modified version of the DWNWND code, developed at Oak Ridge National Laboratory. The SKEMA code estimates the concentration in air of a material released into the atmosphere by a point source. (C.M.) [pt

  17. Electromagnetic reprogrammable coding-metasurface holograms.

    Science.gov (United States)

    Li, Lianlin; Jun Cui, Tie; Ji, Wei; Liu, Shuo; Ding, Jun; Wan, Xiang; Bo Li, Yun; Jiang, Menghua; Qiu, Cheng-Wei; Zhang, Shuang

    2017-08-04

    Metasurfaces have enabled a plethora of emerging functions within an ultrathin dimension, paving the way towards flat and highly integrated photonic devices. Despite the rapid progress in this area, simultaneous realization of reconfigurability, high efficiency, and full control over the phase and amplitude of scattered light is posing a great challenge. Here, we try to tackle this challenge by introducing the concept of a reprogrammable hologram based on 1-bit coding metasurfaces. The state of each unit cell of the coding metasurface can be switched between '1' and '0' by electrically controlling the loaded diodes. Our proof-of-concept experiments show that multiple desired holographic images can be realized in real time with only a single coding metasurface. The proposed reprogrammable hologram may be a key in enabling future intelligent devices with reconfigurable and programmable functionalities that may lead to advances in a variety of applications such as microscopy, display, security, data storage, and information processing. Realizing metasurfaces with reconfigurability, high efficiency, and control over phase and amplitude is a challenge. Here, Li et al. introduce a reprogrammable hologram based on a 1-bit coding metasurface, where the state of each unit cell of the coding metasurface can be switched electrically.

  18. High performance visual display for HENP detectors

    International Nuclear Information System (INIS)

    McGuigan, Michael; Smith, Gordon; Spiletic, John; Fine, Valeri; Nevski, Pavel

    2001-01-01

    A high-end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on the BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of the detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactive control, including the ability to slice, search and mark areas of the detector. We incorporate the ability to make a high-quality still image of a view of the detector and the ability to generate animations and a fly-through of the detector and output these to MPEG or VRML models. We develop data compression hardware and software so that remote interactive visualization will be possible among dispersed collaborators. We obtain real-time visual display for events accumulated during simulations.

  19. The Feasibility of Multidimensional CFD Applied to Calandria System in the Moderator of CANDU-6 PHWR Using Commercial and Open-Source Codes

    Directory of Open Access Journals (Sweden)

    Hyoung Tae Kim

    2016-01-01

    Full Text Available The moderator system of CANDU, a prototype of PHWR (pressurized heavy-water reactor), has been modeled in multidimension for computation based on the CFD (computational fluid dynamics) technique. Three CFD codes are tested in modeled hydrothermal systems of heavy-water reactors. The commercial codes COMSOL Multiphysics and ANSYS-CFX, together with OpenFOAM, an open-source code, are introduced for the various simplified and practical problems. All the implemented computational codes are tested for a benchmark problem of the STERN laboratory experiment with a precise modeling of tubes, and compared with each other as well as with the measured data and a porous model based on the experimental correlation of pressure drop. Also, the effect of the turbulence model is discussed for these low-Reynolds-number flows. As a result, they are shown to be successful for the analysis of three-dimensional numerical models related to the calandria system of CANDU reactors.

  20. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...

  1. Dominant, open nonverbal displays are attractive at zero-acquaintance.

    Science.gov (United States)

    Vacharkulksemsuk, Tanya; Reit, Emily; Khambatta, Poruz; Eastwick, Paul W; Finkel, Eli J; Carney, Dana R

    2016-04-12

    Across two field studies of romantic attraction, we demonstrate that postural expansiveness makes humans more romantically appealing. In a field study (n = 144 speed-dates), we coded nonverbal behaviors associated with liking, love, and dominance. Postural expansiveness-expanding the body in physical space-was most predictive of attraction, with each one-unit increase in coded behavior from the video recordings nearly doubling a person's odds of getting a "yes" response from one's speed-dating partner. In a subsequent field experiment (n = 3,000), we tested the causality of postural expansion (vs. contraction) on attraction using a popular Global Positioning System-based online-dating application. Mate-seekers rapidly flipped through photographs of potential sexual/date partners, selecting those they desired to meet for a date. Mate-seekers were significantly more likely to select partners displaying an expansive (vs. contractive) nonverbal posture. Mediation analyses demonstrate one plausible mechanism through which expansiveness is appealing: Expansiveness makes the dating candidate appear more dominant. In a dating world in which success sometimes is determined by a split-second decision rendered after a brief interaction or exposure to a static photograph, single persons have very little time to make a good impression. Our research suggests that a nonverbal dominance display increases a person's chances of being selected as a potential mate.

  2. Concurrent Codes: A Holographic-Type Encoding Robust against Noise and Loss.

    Directory of Open Access Journals (Sweden)

    David M Benton

    Full Text Available Concurrent coding is an encoding scheme with 'holographic' type properties that are shown here to be robust against a significant amount of noise and signal loss. This single encoding scheme is able to correct for random errors and burst errors simultaneously, but does not rely on cyclic codes. A simple and practical scheme has been tested that displays perfect decoding when the signal to noise ratio is of order -18dB. The same scheme also displays perfect reconstruction when a contiguous block of 40% of the transmission is missing. In addition this scheme is 50% more efficient in terms of transmitted power requirements than equivalent cyclic codes. A simple model is presented that describes the process of decoding and can determine the computational load that would be expected, as well as describing the critical levels of noise and missing data at which false messages begin to be generated.

  3. Upgrades to the WIMS-ANL code

    International Nuclear Information System (INIS)

    Woodruff, W. L.

    1998-01-01

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking and in control of the output. The output is now tabulated to fit the typical 80 column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. The new libraries are both based on ENDF/B-VI data with one having the original 69 energy group structure and the second with a 172 group structure. The common source code can be used with PCs using both Windows 95 and NT, with a Linux based operating system and with UNIX based workstations. Comparisons of this version of the code to earlier evaluations with ENDF/B-V are provided, as well as, comparisons with the new libraries

  4. Upgrades to the WIMS-ANL code

    International Nuclear Information System (INIS)

    Woodruff, W.L.; Leopando, L.S.

    1998-01-01

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking and in control of the output. The output is now tabulated to fit the typical 80 column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. The new libraries are both based on ENDF/B-VI data with one having the original 69 energy group structure and the second with a 172 group structure. The common source code can be used with PCs using both Windows 95 and NT, with a Linux based operating system and with UNIX based workstations. Comparisons of this version of the code to earlier evaluations with ENDF/B-V are provided, as well as, comparisons with the new libraries. (author)

  5. Stars with shell energy sources. Part 1. Special evolutionary code

    International Nuclear Information System (INIS)

    Rozyczka, M.

    1977-01-01

    A new version of the Henyey-type stellar evolution code is described and tested. It is shown, as a by-product of the tests, that the thermal time scale of the core of a red giant approaching the helium flash is of the order of the evolutionary time scale. The code itself appears to be a very efficient tool for investigations of the helium flash, carbon flash and the evolution of a white dwarf accreting mass. (author)

  6. Vocal individuality cues in the African penguin (Spheniscus demersus): a source-filter theory approach.

    Science.gov (United States)

    Favaro, Livio; Gamba, Marco; Alfieri, Chiara; Pessani, Daniela; McElligott, Alan G

    2015-11-25

    The African penguin is a nesting seabird endemic to southern Africa. In penguins of the genus Spheniscus vocalisations are important for social recognition. However, it is not clear which acoustic features of calls can encode individual identity information. We recorded contact calls and ecstatic display songs of 12 adult birds from a captive colony. For each vocalisation, we measured 31 spectral and temporal acoustic parameters related to both source and filter components of calls. For each parameter, we calculated the Potential of Individual Coding (PIC). The acoustic parameters showing PIC ≥ 1.1 were used to perform a stepwise cross-validated discriminant function analysis (DFA). The DFA correctly classified 66.1% of the contact calls and 62.5% of display songs to the correct individual. The DFA also resulted in the further selection of 10 acoustic features for contact calls and 9 for display songs that were important for vocal individuality. Our results suggest that studying the anatomical constraints that influence nesting penguin vocalisations from a source-filter perspective, can lead to a much better understanding of the acoustic cues of individuality contained in their calls. This approach could be further extended to study and understand vocal communication in other bird species.
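
    The Potential of Individual Coding is commonly computed as the ratio of the between-individual coefficient of variation to the mean within-individual coefficient of variation, with values above 1 suggesting that a parameter can carry identity information. The Python sketch below assumes that definition (and a standard small-sample correction); the paper's exact formulation may differ, and the data are invented.

        import numpy as np

        def cv(x):
            """Coefficient of variation with a small-sample correction."""
            x = np.asarray(x, dtype=float)
            return (1 + 1 / (4 * x.size)) * x.std(ddof=1) / x.mean()

        def pic(values_by_individual):
            """Assumed PIC: between-individual CV over the mean within-individual CV."""
            pooled = np.concatenate(values_by_individual)
            cv_within = np.mean([cv(v) for v in values_by_individual])
            return cv(pooled) / cv_within

        # toy data: a call parameter (e.g. fundamental frequency in Hz) for three birds
        calls = [np.array([415.0, 420.0, 418.0]),
                 np.array([460.0, 455.0, 462.0]),
                 np.array([390.0, 395.0, 388.0])]
        print(f"PIC = {pic(calls):.2f}")   # values above ~1.1 would flag candidate parameters for the DFA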

  7. Some optimizations of the animal code

    International Nuclear Information System (INIS)

    Fletcher, W.T.

    1975-01-01

    Optimizing techniques were performed on a version of the ANIMAL code (MALAD1B) at the source-code (FORTRAN) level. Sample optimizing techniques and operations used in MALADOP--the optimized version of the code--are presented, along with a critique of some standard CDC 7600 optimizing techniques. The statistical analysis of total CPU time required for MALADOP and MALAD1B shows a run-time saving of 174 msec (almost 3 percent) in the code MALADOP during one time step

  8. Adaptation of Control Center Software to Commerical Real-Time Display Applications

    Science.gov (United States)

    Collier, Mark D.

    1994-01-01

    NASA-Marshall Space Flight Center (MSFC) is currently developing an enhanced Huntsville Operation Support Center (HOSC) system designed to support multiple spacecraft missions. The Enhanced HOSC is based upon a distributed computing architecture using graphic workstation hardware and industry standard software including POSIX, X Windows, Motif, TCP/IP, and ANSI C. Southwest Research Institute (SwRI) is currently developing a prototype of the Display Services application for this system. Display Services provides the capability to generate and operate real-time data-driven graphic displays. This prototype is a highly functional application designed to allow system end users to easily generate complex data-driven displays. The prototype is easy to use, flexible, highly functional, and portable. Although this prototype is being developed for NASA-MSFC, the general-purpose real-time display capability can be reused in similar mission and process control environments. This includes any environment depending heavily upon real-time data acquisition and display. Reuse of the prototype will be a straightforward transition because the prototype is portable, is designed to add new display types easily, has a user interface which is separated from the application code, and is very independent of the specifics of NASA-MSFC's system. Reuse of this prototype in other environments is an excellent alternative to creation of a new custom application, or, for environments with a large number of users, to purchasing a COTS package.

  9. Runtime Detection of C-Style Errors in UPC Code

    Energy Technology Data Exchange (ETDEWEB)

    Pirkelbauer, P; Liao, C; Panas, T; Quinlan, D

    2011-09-29

    Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.

  10. Display Sharing: An Alternative Paradigm

    Science.gov (United States)

    Brown, Michael A.

    2010-01-01

    The current Johnson Space Center (JSC) Mission Control Center (MCC) Video Transport System (VTS) provides flight controllers and management the ability to meld raw video from various sources with telemetry to improve situational awareness. However, maintaining a separate infrastructure for video delivery and integration of video content with data adds significant complexity and cost to the system. When considering alternative architectures for a VTS, the current system's ability to share specific computer displays in their entirety to other locations, such as large projector systems, flight control rooms, and back supporting rooms throughout the facilities and centers, must be incorporated into any new architecture. Internet Protocol (IP)-based systems also support video delivery and integration. IP-based systems generally have an advantage in terms of cost and maintainability. Although IP-based systems are versatile, the task of sharing a computer display from one workstation to another can be time consuming for an end-user and inconvenient to administer at a system level. The objective of this paper is to present a prototype display sharing enterprise solution. Display sharing is a system which delivers image sharing across the LAN while simultaneously managing bandwidth, supporting encryption, enabling recovery and resynchronization following a loss of signal, and minimizing latency. Additional critical elements will include image scaling support, multi-sharing, ease of initial integration and configuration, integration with desktop window managers, collaboration tools, and host and recipient controls. The goal of this paper is to summarize the various elements of an IP-based display sharing system that can be used in today's control center environment.

  11. Transparent 3D display for augmented reality

    Science.gov (United States)

    Lee, Byoungho; Hong, Jisoo

    2012-11-01

    Two types of transparent three-dimensional display systems applicable to augmented reality are demonstrated. One of them is a head-mounted-display-type implementation which utilizes the principle of a system adopting a concave floating lens for virtual-mode integral imaging. Such a configuration has the advantage that the three-dimensional image can be displayed at a sufficiently far distance, resolving the accommodation conflict with the real-world scene. Incorporating a convex half mirror, which shows partial transparency, instead of the concave floating lens makes it possible to implement the transparent three-dimensional display system. The other type is the projection-type implementation, which is more appropriate for general use than the head-mounted-display-type implementation. Its imaging principle is based on the well-known reflection-type integral imaging. We realize the feature of transparent display by imposing partial transparency on the array of concave mirrors which is used as the screen of reflection-type integral imaging. Two types of configurations, relying on incoherent and coherent light sources, are both possible. For the incoherent configuration, we introduce the concave half-mirror array, whereas the coherent one adopts a holographic optical element which replicates the functionality of the lenslet array. Though the projection-type implementation is in principle more beneficial than the head-mounted display, the present state of the spatial light modulator technology still does not provide satisfactory visual quality of the displayed three-dimensional image. Hence we expect that the head-mounted-display-type and projection-type implementations will come up in the market in sequence.

  12. Simulation of droplet impact onto a deep pool for large Froude numbers in different open-source codes

    Science.gov (United States)

    Korchagova, V. N.; Kraposhin, M. V.; Marchevsky, I. K.; Smirnova, E. V.

    2017-11-01

    A droplet impact on a deep pool can induce macro-scale or micro-scale effects like a crown splash, a high-speed jet, formation of secondary droplets or thin liquid films, etc. These effects depend on the diameter and velocity of the droplet, the liquid properties, the effects of external forces and other factors, which a set of dimensionless criteria can account for. In the present research, we considered the droplet and the pool to consist of the same viscous incompressible liquid. We took surface tension into account but neglected gravity forces. We used two open-source codes (OpenFOAM and Gerris) for our computations. We review the possibility of using these codes for simulation of processes in free-surface flows that may take place after a droplet impact on the pool. Both codes simulated several modes of droplet impact. We estimated the effect of liquid properties with respect to the Reynolds number and Weber number. Numerical simulation enabled us to find boundaries between different modes of droplet impact on a deep pool and to plot corresponding mode maps. The ratio of liquid density to that of the surrounding gas induces several changes in mode maps. Increasing this density ratio suppresses the crown splash.
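
    The mode boundaries discussed above are governed by dimensionless groups formed from the droplet diameter, impact velocity and liquid properties. As a minimal illustrative sketch (not part of the cited study; the water-like property values are assumed), the Reynolds, Weber and Froude numbers of an impact can be computed as follows:

        # Illustrative only: dimensionless groups for a droplet impacting a deep pool.
        # Property values are assumed placeholders for a water-like liquid, not data
        # from the cited study; Fr is taken here as V^2/(g*D).
        def impact_numbers(d, v, rho, mu, sigma, g=9.81):
            """Return Reynolds, Weber and Froude numbers for a droplet impact."""
            re = rho * v * d / mu
            we = rho * v**2 * d / sigma
            fr = v**2 / (g * d)
            return re, we, fr

        # Assumed example: 3 mm water droplet hitting the pool at 5 m/s.
        re, we, fr = impact_numbers(3e-3, 5.0, 998.0, 1.0e-3, 0.072)
        print(f"Re = {re:.0f}, We = {we:.1f}, Fr = {fr:.0f}")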

  13. Using self-similarity compensation for improving inter-layer prediction in scalable 3D holoscopic video coding

    Science.gov (United States)

    Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.

    2013-09-01

    Holoscopic imaging, also known as integral imaging, has been recently attracting the attention of the research community, as a promising glassless 3D technology due to its ability to create a more realistic depth illusion than the current stereoscopic or multiview solutions. However, in order to gradually introduce this technology into the consumer market and to efficiently deliver 3D holoscopic content to end-users, backward compatibility with legacy displays is essential. Consequently, to enable 3D holoscopic content to be delivered and presented on legacy displays, a display scalable 3D holoscopic coding approach is required. Hence, this paper presents a display scalable architecture for 3D holoscopic video coding with a three-layer approach, where each layer represents a different level of display scalability: Layer 0 - a single 2D view; Layer 1 - 3D stereo or multiview; and Layer 2 - the full 3D holoscopic content. In this context, a prediction method is proposed, which combines inter-layer prediction, aiming to exploit the existing redundancy between the multiview and the 3D holoscopic layers, with self-similarity compensated prediction (previously proposed by the authors for non-scalable 3D holoscopic video coding), aiming to exploit the spatial redundancy inherent to the 3D holoscopic enhancement layer. Experimental results show that the proposed combined prediction can improve significantly the rate-distortion performance of scalable 3D holoscopic video coding with respect to the authors' previously proposed solutions, where only inter-layer or only self-similarity prediction is used.

  14. Use of color on airport moving maps and cockpit displays of traffic information (CDTIs)

    Science.gov (United States)

    2014-06-01

    Color can be an effective method for coding visual information, making it easier to find and identify symbols on a display (Christ, 1975). However, careful consideration should be given when applying color because excessive or inappropriate use of co...

  15. Gaze-based interaction with public displays using off-the-shelf components

    DEFF Research Database (Denmark)

    San Agustin, Javier; Hansen, John Paulin; Tall, Martin Henrik

    Eye gaze can be used to interact with high-density information presented on large displays. We have built a system employing off-the-shelf hardware components and open-source gaze tracking software that enables users to interact with an interface displayed on a 55” screen using their eye movement...

  16. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
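
    As a toy illustration of the source-coding material such a text covers (the four-symbol source below is an invented example, not taken from the book), the Shannon entropy of a discrete source can be compared with the cost of a fixed-length code:

        # Illustrative sketch: entropy of a discrete memoryless source versus a
        # fixed-length code; the probabilities are an assumed example.
        import math

        def entropy(probs):
            """Shannon entropy in bits per symbol."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        probs = [0.5, 0.25, 0.125, 0.125]             # assumed 4-symbol source
        fixed = math.ceil(math.log2(len(probs)))      # bits used by a fixed-length code
        print(f"H(S) = {entropy(probs):.3f} bits/symbol, fixed-length code: {fixed} bits/symbol")

    For this source an optimal prefix code (for instance a Huffman code with codeword lengths 1, 2, 3 and 3) reaches the entropy exactly, which is the kind of gap the book's source-coding chapters quantify.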

  17. The Real Time Interactive Display Environment (RTIDE), a display building tool developed by Space Shuttle flight controllers

    Science.gov (United States)

    Kalvelage, Thomas A.

    1989-01-01

    NASA's Mission Control Center, located at Johnson Space Center, is incrementally moving from a centralized architecture to a distributed architecture. Starting with STS-29, some host-driven console screens will be replaced with graphics terminals driven by workstations. These workstations will be supplied real-time data first by the Real Time Data System (RTDS), a system developed in-house, and then months later (in parallel with RTDS) by interim and subsequently operational versions of the Mission Control Center Upgrade (MCCU) software package. The Real Time Interactive Display Environment (RTIDE) was built by Space Shuttle flight controllers to support the rapid development of multiple new displays to support Shuttle flights. RTIDE is a display building tool that allows non-programmers to define object-oriented, event-driven, mouseable displays. Particular emphasis was placed on upward compatibility between RTIDE versions, the ability to acquire data from different data sources, real-time performance, the ability to modularly upgrade RTIDE, machine portability, and a clean, powerful user interface. The operational and organizational factors that drove RTIDE to its present form, the actual design itself, simulation and flight performance, and lessons learned in the process are discussed.

  18. Applications guide to the RSIC-distributed version of the MCNP code (coupled Monte Carlo neutron-photon Code)

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-09-01

    An overview of the RSIC-distributed version of the MCNP code (a coupled Monte Carlo neutron-photon code) is presented. All general features of the code, from machine hardware requirements to theoretical details, are discussed. The current nuclide cross-section and other libraries available in the standard code package are specified, and a realistic example of the flexible geometry input is given. Standard and nonstandard source, estimator, and variance-reduction procedures are outlined. Examples of correct usage and possible misuse of certain code features are presented graphically and in standard output listings. Finally, itemized summaries of sample problems, various MCNP code documentation, and future work are given.

  19. Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.

    Science.gov (United States)

    Tauber, J; Lahav, M

    1987-11-01

    A review of currently-available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Coding and Diagnosis (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant (ICD-9) diagnoses as database and can be retrieved later for display of patients' problems or analysis of clinical data.

  20. ON CODE REFACTORING OF THE DIALOG SUBSYSTEM OF CDSS PLATFORM FOR THE OPEN-SOURCE MIS OPENMRS

    Directory of Open Access Journals (Sweden)

    A. V. Semenets

    2016-08-01

    The open-source MIS OpenMRS developer tools and software API are reviewed. The results of code refactoring of the dialog subsystem of the CDSS platform, which is implemented as a module for the open-source MIS OpenMRS, are presented. The structure of the information model of the database of the CDSS dialog subsystem was updated in accordance with MIS OpenMRS requirements. The Model-View-Controller (MVC)-based approach to the CDSS dialog subsystem architecture was re-implemented in the Java programming language using the Spring and Hibernate frameworks. The MIS OpenMRS Encounter portlet form for the CDSS dialog subsystem integration is developed as an extension. The administrative module of the CDSS platform is recreated. The data exchange formats and methods for interaction between the OpenMRS CDSS dialog subsystem module and the DecisionTree GAE service are re-implemented with the help of AJAX technology via the jQuery library.

  1. The effect of display movement angle, indicator type and display location on control/display stereotype strength.

    Science.gov (United States)

    Hoffmann, Errol R; Chan, Alan H S

    2017-08-01

    Much research on stereotype strength relating display and control movements for displays moving in the vertical or horizontal directions has been reported. Here we report effects of display movement angle, where the display moves at angles (relative to the vertical) of between 0° and 180°. The experiment used six different controls, four display locations relative to the operator and three types of indicator. Indicator types were included because of the strong effects of the 'scale-side principle' that are variable with display angle. A directional indicator had higher stereotype strength than a neutral indicator, and showed an apparent reversal in control/display stereotype direction beyond an angle of 90°. However, with a neutral indicator this control reversal was not present. Practitioner Summary: The effects of display moving at angles other than the four cardinal directions, types of control, location of display and types of indicator are investigated. Indicator types (directional and neutral) have an effect on stereotype strength and may cause an apparent control reversal with change of display movement angle.

  2. Interactive and Animated Scalable Vector Graphics and R Data Displays

    Directory of Open Access Journals (Sweden)

    Deborah Nolan

    2012-01-01

    We describe an approach to creating interactive and animated graphical displays using R's graphics engine and Scalable Vector Graphics (SVG), an XML vocabulary for describing two-dimensional graphical displays. We use the svg() graphics device in R and then post-process the resulting XML documents. The post-processing identifies the elements in the SVG that correspond to the different components of the graphical display, e.g., points, axes, labels, lines. One can then annotate these elements to add interactivity and animation effects. One can also use JavaScript to provide dynamic interactive effects to the plot, enabling rich user interactions and compelling visualizations. The resulting SVG documents can be embedded within HTML documents and can involve JavaScript code that integrates the SVG and HTML objects. The functionality is provided via the SVGAnnotation package and makes static plots generated via R graphics functions available as stand-alone, interactive and animated plots for the Web and other venues.
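
    The SVGAnnotation workflow itself is R code, but the underlying idea of post-processing an SVG document to attach interactivity can be sketched with standard XML tooling; the sketch below is an independent Python illustration (the file names, labels and tooltip mechanism are assumptions, not the package's API):

        # Independent illustration (not the SVGAnnotation API): add hover tooltips to
        # the circles of an SVG plot by inserting <title> children into each <circle>.
        import xml.etree.ElementTree as ET

        SVG_NS = "http://www.w3.org/2000/svg"
        ET.register_namespace("", SVG_NS)

        def add_tooltips(in_path, out_path, labels):
            tree = ET.parse(in_path)
            for circle, label in zip(tree.getroot().iter(f"{{{SVG_NS}}}circle"), labels):
                title = ET.SubElement(circle, f"{{{SVG_NS}}}title")
                title.text = label                 # browsers show this text on hover
            tree.write(out_path, xml_declaration=True, encoding="utf-8")

        # add_tooltips("plot.svg", "plot_annotated.svg", ["point A", "point B", "point C"])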

  3. Code generation of RHIC accelerator device objects

    International Nuclear Information System (INIS)

    Olsen, R.H.; Hoff, L.; Clifford, T.

    1995-01-01

    A RHIC Accelerator Device Object is an abstraction which provides a software view of a collection of collider control points known as parameters. A grammar has been defined which allows these parameters, along with code describing methods for acquiring and modifying them, to be specified efficiently in compact definition files. These definition files are processed to produce C++ source code. This source code is compiled to produce an object file which can be loaded into a front end computer. Each loaded object serves as an Accelerator Device Object class definition. The collider will be controlled by applications which set and get the parameters in instances of these classes using a suite of interface routines. Significant features of the grammar are described with details about the generated C++ code
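
    The definition-file-to-source workflow described above can be conveyed with a tiny generator; the parameter names and the shape of the emitted class below are invented for illustration, and real ADO definition files and their generated C++ are far richer:

        # Tiny illustration of definition-driven code generation (parameter names and the
        # emitted class are invented; not the actual RHIC ADO grammar or output).
        params = {"beamCurrent": "double", "rfFrequency": "double", "statusWord": "int"}

        lines = ["class GeneratedDevice {", "public:"]
        for name, ctype in params.items():
            lines += [f"    {ctype} get_{name}() const {{ return {name}_; }}",
                      f"    void set_{name}({ctype} v) {{ {name}_ = v; }}"]
        lines += ["private:"]
        lines += [f"    {ctype} {name}_{{}};" for name, ctype in params.items()]
        lines += ["};"]
        print("\n".join(lines))        # emit C++ source for the device-object class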

  4. The Journey of a Source Line: How your Code is Translated into a Controlled Flow of Electrons

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    In this series we help you understand the bits and pieces that make your code command the underlying hardware. A multitude of layers translate and optimize source code, written in compiled and interpreted programming languages such as C++, Python or Java, to machine language. We explain the role and behavior of the layers in question in a typical usage scenario. While our main focus is on compilers and interpreters, we also talk about other facilities - such as the operating system, instruction sets and instruction decoders.   Biography: Andrzej Nowak runs TIK Services, a technology and innovation consultancy based in Geneva, Switzerland. In the recent past, he co-founded and sold an award-winning Fintech start-up focused on peer-to-peer lending. Earlier, Andrzej worked at Intel and in the CERN openlab. At openlab, he managed a lab collaborating with Intel and was part of the Chief Technology Office, which set up next-generation technology projects for CERN and the openlab partne...

  5. The Journey of a Source Line: How your Code is Translated into a Controlled Flow of Electrons

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    In this series we help you understand the bits and pieces that make your code command the underlying hardware. A multitude of layers translate and optimize source code, written in compiled and interpreted programming languages such as C++, Python or Java, to machine language. We explain the role and behavior of the layers in question in a typical usage scenario. While our main focus is on compilers and interpreters, we also talk about other facilities - such as the operating system, instruction sets and instruction decoders. Biography: Andrzej Nowak runs TIK Services, a technology and innovation consultancy based in Geneva, Switzerland. In the recent past, he co-founded and sold an award-winning Fintech start-up focused on peer-to-peer lending. Earlier, Andrzej worked at Intel and in the CERN openlab. At openlab, he managed a lab collaborating with Intel and was part of the Chief Technology Office, which set up next-generation technology projects for CERN and the openlab partners.

  6. Code system for fast reactor neutronics analysis

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Abe, Junji; Sato, Wakaei.

    1983-04-01

    A code system for analysis of fast reactor neutronics has been developed for the purpose of handy use and error reduction. The JOINT code produces the input data file to be used in the neutronics calculation code and also prepares the cross section library file with an assigned format. The effective cross sections are saved in the PDS file with an unified format. At the present stage, this code system includes the following codes; SLAROM, ESELEM5, EXPANDA-G for the production of effective cross sections and CITATION-FBR, ANISN-JR, TWOTRAN2, PHENIX, 3DB, MORSE, CIPER and SNPERT. In the course of the development, some utility programs and service programs have been additionaly developed. These are used for access of PDS file, edit of the cross sections and graphic display. Included in this report are a description of input data format of the JOINT and other programs, and of the function of each subroutine and utility programs. The usage of PDS file is also explained. In Appendix A, the input formats are described for the revised version of the CIPER code. (author)

  7. Coupled geochemical and solute transport code development

    International Nuclear Information System (INIS)

    Morrey, J.R.; Hostetler, C.J.

    1985-01-01

    A number of coupled geochemical hydrologic codes have been reported in the literature. Some of these codes have directly coupled the source-sink term to the solute transport equation. The current consensus seems to be that directly coupling hydrologic transport and chemical models through a series of interdependent differential equations is not feasible for multicomponent problems with complex geochemical processes (e.g., precipitation/dissolution reactions). A two-step process appears to be the required method of coupling codes for problems where a large suite of chemical reactions must be monitored. Two-step structure requires that the source-sink term in the transport equation is supplied by a geochemical code rather than by an analytical expression. We have developed a one-dimensional two-step coupled model designed to calculate relatively complex geochemical equilibria (CTM1D). Our geochemical module implements a Newton-Raphson algorithm to solve heterogeneous geochemical equilibria, involving up to 40 chemical components and 400 aqueous species. The geochemical module was designed to be efficient and compact. A revised version of the MINTEQ Code is used as a parent geochemical code

  8. Neutron and photon measurements through concrete from a 15 GeV electron beam on a target-comparison with models and calculations. [Intermediate energy source term, Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, T M [Stanford Linear Accelerator Center, CA (USA)

    1979-02-15

    Measurements of neutron and photon dose equivalents from a 15 GeV electron beam striking an iron target inside a scale model of a PEP IR hall are described, and compared with analytic-empirical calculations and with the Monte Carlo code, MORSE. The MORSE code is able to predict both absolute neutron and photon dose equivalents for geometries where the shield is relatively thin, but fails as the shield thickness is increased. An intermediate energy source term is postulated for analytic-empirical neutron shielding calculations to go along with the giant resonance and high energy terms, and a new source term due to neutron capture is postulated for analytic-empirical photon shielding calculations. The source strengths for each energy source term, and each type, are given from analysis of the measurements.

  9. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    Science.gov (United States)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size from 16 to 32, in the 1980s, and 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion. For the past thirty-plus years

  10. System for recording and displaying two-phase flow topographies

    International Nuclear Information System (INIS)

    Cary, C.N.; Block, J.A.

    1979-01-01

    A system of hardware and software has been developed and used to record and display in various forms details of the countercurrent flow topographies occurring in a scaled Pressurized Water Reactor downcomer annulus. An array of 288 conductivity sensors was mounted in a 1/15 scale PWR annulus. At each moment in time, the state of each probe indicates the presence or absence of water in this immediate vicinity. An electronic data acquisition system records the states of all probes 108 times per second on magnetic tape; software routines retrieve the data and reconstruct visual analogs of the flow topographies. The instantaneous two-phase state of the annulus at each instant can be displayed on a hard copy plotter or on a CRT screen. By synchronizing a camera drive with the CRT display, 16mm films have been made recreating the flow process at full speed and at various slow motion rates. All data obtained are stored in computer files in numerical form and can be subjected to various types of quantitative analysis to assist in advanced code development and verification

  11. Calculations of fuel burn-up and radionuclide inventory in the syrian miniature neutron source reactor using the WIMSD4 code

    International Nuclear Information System (INIS)

    Khattab, K.

    2005-01-01

    Calculations of the fuel burn up and radionuclide inventory in the Miniature Neutron Source Reactor after 10 years (the reactor core expected life) of the reactor operating time are presented in this paper. The WIMSD4 code is used to generate the fuel group constants and the infinite multiplication factor versus the reactor operating time for 10, 20, and 30 kW operating power levels. The amounts of uranium burnt up and plutonium produced in the reactor core, the concentrations and radioactivities of the most important fission product and actinide radionuclides accumulated in the reactor core, and the total radioactivity of the reactor core are calculated using the WIMSD4 code as well

  12. Four energy group neutron flux distribution in the Syrian miniature neutron source reactor using the WIMSD4 and CITATION code

    International Nuclear Information System (INIS)

    Khattab, K.; Omar, H.; Ghazi, N.

    2009-01-01

    A 3-D (R, θ , Z) neutronic model for the Miniature Neutron Source Reactor (MNSR) was developed earlier to conduct the reactor neutronic analysis. The group constants for all the reactor components were generated using the WIMSD4 code. The reactor excess reactivity and the four group neutron flux distributions were calculated using the CITATION code. This model is used in this paper to calculate the point wise four energy group neutron flux distributions in the MNSR versus the radius, angle and reactor axial directions. Good agreement is noticed between the measured and the calculated thermal neutron flux in the inner and the outer irradiation site with relative difference less than 7% and 5% respectively. (author)

  13. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    Science.gov (United States)

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is known to be an illness which brings forth marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the utilization of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. However, to date, there has been a paucity of M-health innovations in this area. In addition, most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics studies. The authors hope to explain how they tapped into and made use of an open source code repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially so if the application needs interactive features or features that can be personalized. Such repositories enable the rapid and cost-effective development of applications. Moreover, developers are also able to innovate further, as less time is spent in the iterative process.

  14. A New Monte Carlo Neutron Transport Code at UNIST

    International Nuclear Information System (INIS)

    Lee, Hyunsuk; Kong, Chidong; Lee, Deokjung

    2014-01-01

    A Monte Carlo neutron transport code named MCS is under development at UNIST for advanced reactor design and research purposes. This MC code can be used for fixed-source calculations and criticality calculations. Continuous-energy neutron cross section data and multi-group cross section data can be used for the MC calculation. This paper presents an overview of the developed MC code and its calculation results. The real-time fixed-source calculation capability is also tested in this paper. The calculation results show good agreement with a commercial code and with experiment. A new Monte Carlo neutron transport code is being developed at UNIST. The MC code is tested with several benchmark problems: ICSBEP, VENUS-2, and the Hoogenboom-Martin benchmark. These benchmarks cover pin geometry to 3-dimensional whole core, and the results show good agreement with reference results.
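
    While MCS itself handles continuous-energy physics and full 3-D geometry, the basic fixed-source Monte Carlo idea can be conveyed with a one-speed, one-dimensional slab sketch; the cross sections and slab thickness below are arbitrary assumptions, not MCS data:

        # Toy one-speed Monte Carlo sketch of neutron transmission through a 1-D slab
        # (illustrative only; cross sections and geometry are arbitrary assumptions).
        import math, random

        random.seed(0)
        sigma_t, sigma_s = 1.0, 0.7        # total / scattering cross sections (1/cm)
        thickness = 5.0                    # slab thickness (cm)
        histories = 50_000

        transmitted = 0
        for _ in range(histories):
            x, mu = 0.0, 1.0                                             # born on the left face, moving right
            while True:
                x += mu * (-math.log(1.0 - random.random()) / sigma_t)   # flight to next collision
                if x >= thickness:
                    transmitted += 1                                     # leaked out of the far face
                    break
                if x < 0.0:                                              # leaked back out of the near face
                    break
                if random.random() > sigma_s / sigma_t:                  # absorbed at the collision
                    break
                mu = 2.0 * random.random() - 1.0                         # isotropic scatter: new direction cosine
        print("estimated transmission probability:", transmitted / histories)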

  15. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    Energy Technology Data Exchange (ETDEWEB)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu [Planetary Sciences Group, Department of Physics, University of Central Florida, Orlando, FL 32816-2385 (United States)

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.

  16. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    International Nuclear Information System (INIS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver

    2016-01-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
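
    The Gibbs free-energy minimization that TEA performs can be conveyed, in a heavily simplified form, with a generic constrained minimizer; the sketch below is not TEA's iterative Lagrangian scheme, and the two-element, three-species system and its chemical potentials are invented toy values:

        # Toy sketch of Gibbs free-energy minimization (not TEA's algorithm; the species
        # set and dimensionless standard potentials are invented for illustration).
        import numpy as np
        from scipy.optimize import minimize

        mu0 = np.array([0.0, 0.0, -20.0])          # assumed potentials for H2, O2, H2O (in units of RT)
        A = np.array([[2, 0, 2],                   # H atoms per species
                      [0, 2, 1]])                  # O atoms per species
        b = np.array([2.0, 1.0])                   # total H and O available

        def gibbs(n):
            n = np.clip(n, 1e-12, None)            # keep the logarithms finite
            return float(np.sum(n * (mu0 + np.log(n / n.sum()))))

        res = minimize(gibbs, x0=np.array([0.5, 0.3, 0.5]),
                       bounds=[(1e-12, None)] * 3,
                       constraints={"type": "eq", "fun": lambda n: A @ n - b},
                       method="SLSQP")
        print("equilibrium moles (H2, O2, H2O):", res.x)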

  17. Open Source Subtitle Editor Software Study for Section 508 Close Caption Applications

    Science.gov (United States)

    Murphy, F. Brandon

    2013-01-01

    This paper will focus on a specific item within the NASA Electronic Information Accessibility Policy - Multimedia Presentations shall have synchronized captions; thus making information accessible to a person with a hearing impairment. This synchronized caption will assist a person with a hearing or cognitive disability to access the same information as everyone else. This paper focuses on the research and implementation of CC (subtitle option) support for video multimedia. The goal of this research is to identify the best available open-source (free) software to achieve the synchronized caption requirement and achieve savings, while meeting the security requirement for Government information integrity and assurance. CC and subtitling are processes that display text within a video to provide additional or interpretive information for those who may need it or those who choose it. Closed captions typically show the transcription of the audio portion of a program (video) as it occurs (either verbatim or in its edited form), sometimes including non-speech elements (such as sound effects). The transcript can be provided by a third party source or can be extracted word for word from the video. This feature can be made available for videos in two forms: either Soft-Coded or Hard-Coded. Soft-Coded is the optional version of CC, where you can choose to turn the captions on or off. Most of the time, when using the Soft-Coded option, the transcript is also provided to the viewer alongside the video. This option is subject to compromise, because the transcript is merely a text file that can be changed by anyone who has access to it. With this option the integrity of the CC is at the mercy of the user. Hard-Coded CC is a more permanent form of CC. A Hard-Coded CC transcript is embedded within a video, without the option of removal.
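
    Whatever editor is selected, a soft-coded caption ultimately reduces to a sidecar text file such as SubRip (.srt). As a minimal illustration (the cue text and timings are invented and not tied to any tool evaluated here), such a file can be generated as follows:

        # Minimal illustration: writing a soft-coded SubRip (.srt) caption file.
        # Cue text and timings are invented examples.
        def srt_time(ms):
            """Format milliseconds as the HH:MM:SS,mmm timestamp SRT expects."""
            h, rem = divmod(ms, 3_600_000)
            m, rem = divmod(rem, 60_000)
            s, ms = divmod(rem, 1_000)
            return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

        cues = [(0, 2500, "Ignition sequence start."),
                (2500, 6000, "[engines roaring]")]

        with open("captions.srt", "w", encoding="utf-8") as f:
            for i, (start, end, text) in enumerate(cues, 1):
                f.write(f"{i}\n{srt_time(start)} --> {srt_time(end)}\n{text}\n\n")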

  18. Digital Image Processing Overview For Helmet Mounted Displays

    Science.gov (United States)

    Parise, Michael J.

    1989-09-01

    Digital image processing provides a means to manipulate an image and presents a user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real time insets and static windows from secondary sensor sources, near real time 3-D imaging from a single sensor can be achieved, graphical information can be added, and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.

  19. Array display tool ADT reference manual. Version 1.2

    International Nuclear Information System (INIS)

    Evans, K. Jr.

    1995-12-01

    Array Display Tool (ADT) is a Motif program to display arrays of process variables from the Advanced Photon Source control system. A typical use is to display the horizontal and vertical monitor readings. The screen layout, apart from the menu bar, consists of two types of graphic areas in which the values for the arrays of process variables are shown: display areas, which display one or more arrays as a function of index, and a zoom area. In the zoom area, only specified arrays are displayed as a function of lattice position, along with symbols for the major elements of the lattice. There can be several display areas, but at most one zoom area. When the screen is resized these areas change size proportionally. There are a number of options in the View Menu to change the way the values are displayed. It is also possible via the Options Menu to: (1) Store the current values internally. (2) Store the values from a snapshot file internally. (3) Display one of the stored sets of values along with the current values. (4) Display the difference of the current values with one of the stored sets of values. (5) Write the current values to a snapshot file. There are several (currently 5) slots in which you can store values internally. In addition you can display the values with specified reference values subtracted.

  20. A dual-sided coded-aperture radiation detection system

    International Nuclear Information System (INIS)

    Penny, R.D.; Hood, W.E.; Polichar, R.M.; Cardone, F.H.; Chavez, L.G.; Grubbs, S.G.; Huntley, B.P.; Kuharski, R.A.; Shyffer, R.T.; Fabris, L.; Ziock, K.P.; Labov, S.E.; Nelson, K.

    2011-01-01

    We report the development of a large-area, mobile, coded-aperture radiation imaging system for localizing compact radioactive sources in three dimensions while rejecting distributed background. The 3D Stand-Off Radiation Detection System (SORDS-3D) has been tested at speeds up to 95 km/h and has detected and located sources in the millicurie range at distances of over 100 m. Radiation data are imaged to a geospatially mapped world grid with a nominal 1.25- to 2.5-m pixel pitch at distances out to 120 m on either side of the platform. Source elevation is also extracted. Imaged radiation alarms are superimposed on a side-facing video log that can be played back for direct localization of sources in buildings in urban environments. The system utilizes a 37-element array of 5×5×50 cm³ cesium-iodide (sodium) detectors. Scintillation light is collected by a pair of photomultiplier tubes placed at either end of each detector, with the detectors achieving an energy resolution of 6.15% FWHM (662 keV) and a position resolution along their length of 5 cm FWHM. The imaging system generates a dual-sided two-dimensional image allowing users to efficiently survey a large area. Imaged radiation data and raw spectra are forwarded to the RadioNuclide Analysis Kit (RNAK), developed by our collaborators, for isotope ID. An intuitive real-time display aids users in performing searches. Detector calibration is dynamically maintained by monitoring the potassium-40 peak and digitally adjusting individual detector gains. We have recently realized improvements, both in isotope identification and in distinguishing compact sources from background, through the installation of optimal-filter reconstruction kernels.
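
    The deconvolution step described above - recovering a source position from the counts behind a pseudorandom mask - can be sketched with a simple correlation decoder. The one-dimensional mask, source position and noise model below are assumptions chosen for illustration, not the SORDS-3D geometry or processing chain:

        # Illustrative 1-D coded-aperture decoding by correlation (not the SORDS-3D code).
        import numpy as np

        rng = np.random.default_rng(0)
        mask = rng.integers(0, 2, size=31)                # assumed pseudorandom open/closed pattern
        true_shift = 7                                    # assumed source position (pixels)

        # Detector counts: the mask pattern shifted by the source position, plus noise.
        counts = rng.poisson(np.roll(mask, true_shift) * 100.0 + 5.0).astype(float)

        # Correlate the counts with every cyclic shift of the mean-subtracted mask.
        decode = mask - mask.mean()
        image = np.array([np.dot(counts, np.roll(decode, s)) for s in range(mask.size)])
        print("reconstructed position:", int(np.argmax(image)), "| true position:", true_shift)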

  1. Radiation transport phenomena and modeling - part A: Codes

    International Nuclear Information System (INIS)

    Lorence, L.J.

    1997-01-01

    The need to understand how particle radiation (high-energy photons and electrons) from a variety of sources affects materials and electronics has motivated the development of sophisticated computer codes that describe how radiation with energies from 1.0 keV to 100.0 GeV propagates through matter. Predicting radiation transport is the necessary first step in predicting radiation effects. The radiation transport codes that are described here are general-purpose codes capable of analyzing a variety of radiation environments including those produced by nuclear weapons (x-rays, gamma rays, and neutrons), by sources in space (electrons and ions) and by accelerators (x-rays, gamma rays, and electrons). Applications of these codes include the study of radiation effects on electronics, nuclear medicine (imaging and cancer treatment), and industrial processes (food disinfestation, waste sterilization, manufacturing.) The primary focus will be on coupled electron-photon transport codes, with some brief discussion of proton transport. These codes model a radiation cascade in which electrons produce photons and vice versa. This coupling between particles of different types is important for radiation effects. For instance, in an x-ray environment, electrons are produced that drive the response in electronics. In an electron environment, dose due to bremsstrahlung photons can be significant once the source electrons have been stopped

  2. Method for coding low entrophy data

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1995-01-01

    A method of lossless data compression for efficient coding of an electronic signal of information sources of very low information rate is disclosed. In this method, S represents a non-negative source symbol set, (s_0, s_1, s_2, ..., s_(N-1)), of N symbols with s_i = i. The difference between binary digital data is mapped into symbol set S. Consecutive symbols in symbol set S are then paired into a new symbol set Gamma, which defines a non-negative symbol set containing the symbols (gamma_m) obtained as the extension of the original symbol set S. These pairs are then mapped into a comma code, which is defined as a coding scheme in which every codeword is terminated with the same comma pattern, such as a 1. This allows a direct coding and decoding of the n-bit positive integer digital data differences without the use of codebooks.
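
    A simplified reading of the scheme can be sketched as follows; the zig-zag mapping and the unary-style comma code used here illustrate the idea only (the pairing of consecutive symbols into the extension set Gamma is omitted), and this is not the patented encoder:

        # Simplified sketch of the idea above (not the patented encoder): map signed data
        # differences to non-negative symbols, then emit a comma code in which every
        # codeword ends with the same comma pattern, here a single '1'.
        def to_symbol(diff):
            """Zig-zag map: 0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ..."""
            return 2 * diff if diff >= 0 else -2 * diff - 1

        def comma_encode(symbols):
            """Each symbol n becomes n zeros followed by the comma bit '1'."""
            return "".join("0" * s + "1" for s in symbols)

        data = [10, 10, 11, 11, 10, 10, 10]                  # low-entropy sample values
        diffs = [b - a for a, b in zip(data, data[1:])]      # differences are mostly zero
        print(diffs, "->", comma_encode(to_symbol(d) for d in diffs))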

  3. Gaseous discharge display panel including pilot electrodes and radioactive wire

    International Nuclear Information System (INIS)

    Edwards, R.J.; Hairabedian, B.Z.; Poley, N.M.

    1975-01-01

    In a plasma display panel consisting of gas enclosed between adjacent insulating members, a light source is used to supply charged particles in the gas to permit firing of the gas when coordinate conductors identifying a site location are energized. The use of such pilot lamps facilitates ignition in firing with uniform selection and firing potentials within all sites of the display panel. To eliminate the difficulty in achieving firing during cold starts a radioactive source comprised of a copper wire electroplated with nickel 63 and overcoated with a protective coat of nickel is placed within the gas panel to provide a source of free electrons. The wire is held in place by friction against the inside walls of the panel. Since the wire emits only beta radiation, no radiation hazard exists externally to the panel

  4. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerated and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
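
    One of the properties GCAT tests, comma-freeness of a codon set, can be checked with a short brute-force routine; the sketch below is an independent illustration of the definition, not GCAT's implementation, and the example codon sets are arbitrary:

        # Independent illustration of the comma-free test for a trinucleotide code
        # (not GCAT's implementation). A code is comma-free if no codeword appears
        # out of frame inside the concatenation of any two codewords.
        def is_comma_free(codons):
            codons = set(codons)
            for u in codons:
                for v in codons:
                    uv = u + v
                    if any(uv[k:k + 3] in codons for k in (1, 2)):   # out-of-frame windows
                        return False
            return True

        print(is_comma_free({"ACG", "TAC"}))   # True for this arbitrary example
        print(is_comma_free({"AAA", "AAT"}))   # False: "AAA"+"AAT" contains "AAA" out of frame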

  5. It takes two-coincidence coding within the dual olfactory pathway of the honeybee.

    Science.gov (United States)

    Brill, Martin F; Meyer, Anneke; Rössler, Wolfgang

    2015-01-01

    To rapidly process biologically relevant stimuli, sensory systems have developed a broad variety of coding mechanisms like parallel processing and coincidence detection. Parallel processing (e.g., in the visual system), increases both computational capacity and processing speed by simultaneously coding different aspects of the same stimulus. Coincidence detection is an efficient way to integrate information from different sources. Coincidence has been shown to promote associative learning and memory or stimulus feature detection (e.g., in auditory delay lines). Within the dual olfactory pathway of the honeybee both of these mechanisms might be implemented by uniglomerular projection neurons (PNs) that transfer information from the primary olfactory centers, the antennal lobe (AL), to a multimodal integration center, the mushroom body (MB). PNs from anatomically distinct tracts respond to the same stimulus space, but have different physiological properties, characteristics that are prerequisites for parallel processing of different stimulus aspects. However, the PN pathways also display mirror-imaged like anatomical trajectories that resemble neuronal coincidence detectors as known from auditory delay lines. To investigate temporal processing of olfactory information, we recorded PN odor responses simultaneously from both tracts and measured coincident activity of PNs within and between tracts. Our results show that coincidence levels are different within each of the two tracts. Coincidence also occurs between tracts, but to a minor extent compared to coincidence within tracts. Taken together our findings support the relevance of spike timing in coding of olfactory information (temporal code).

  6. Parity-Check Network Coding for Multiple Access Relay Channel in Wireless Sensor Cooperative Communications

    Directory of Open Access Journals (Sweden)

    Du Bing

    2010-01-01

    A recently developed theory suggests that network coding is a generalization of source coding and channel coding and thus yields a significant performance improvement in terms of throughput and spatial diversity. This paper proposes a cooperative design of a parity-check network coding scheme in the context of a two-source multiple access relay channel (MARC) model, a common compact model in hierarchical wireless sensor networks (WSNs). The scheme uses Low-Density Parity-Check (LDPC) as the surrogate to build up a layered structure which encapsulates the multiple constituent LDPC codes in the source and relay nodes. Specifically, the relay node decodes the messages from the two sources, which are used to generate extra parity-check bits by a random network coding procedure to fill up the rate gap between the Source-Relay and Source-Destination transmissions. Then, we derived the key algebraic relationships among the multidimensional LDPC constituent codes as one of the constraints for code profile optimization. These extra check bits are sent to the destination to realize cooperative diversity as well as to approach the MARC decode-and-forward (DF) capacity.
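
    The core idea - the relay decoding both sources and forwarding only extra, network-coded parity - can be conveyed with a toy XOR example; this is a generic illustration, not the layered LDPC construction proposed in the paper:

        # Toy illustration of relay-generated parity in a two-source MARC
        # (generic XOR network coding, not the paper's layered LDPC construction).
        import numpy as np

        rng = np.random.default_rng(1)
        msg_a = rng.integers(0, 2, size=8)        # message decoded at the relay from source A
        msg_b = rng.integers(0, 2, size=8)        # message decoded at the relay from source B

        relay_parity = msg_a ^ msg_b              # relay forwards parity instead of repeating a message

        # If the destination already received A on the direct link, it recovers B from the parity.
        recovered_b = relay_parity ^ msg_a
        print("B recovered correctly:", bool(np.array_equal(recovered_b, msg_b)))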

  7. Development of an Evaluation Method for the Design Complexity of Computer-Based Displays

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Ju; Lee, Seung Woo; Kang, Hyun Gook; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Park, Jin Kyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    The importance of the design of human machine interfaces (HMIs) for human performance and the safety of process industries has been recognized for many decades. Especially, in the case of nuclear power plants (NPPs), HMIs have significant implications for the safety of the NPPs because poor HMIs can impair the decision making ability of human operators. In order to support and increase the decision making ability of human operators, advanced HMIs based on up-to-date computer technology are provided. Human operators in an advanced main control room (MCR) acquire information through video display units (VDUs) and a large display panel (LDP), which is required for the operation of NPPs. These computer-based displays contain a huge amount of information and present it with a variety of formats compared to those of a conventional MCR. For example, these displays contain more display elements such as abbreviations, labels, icons, symbols, coding, etc. As computer-based displays contain more information, the complexity of advanced displays becomes greater due to less distinctiveness of each display element. A greater understanding is emerging about the effectiveness of designs of computer-based displays, including how distinctively display elements should be designed. This study covers the early phase in the development of an evaluation method for the design complexity of computer-based displays. To this end, a series of existing studies were reviewed to suggest an appropriate concept that is serviceable to unravel this problem.

  8. Real-time display of flow-pressure-volume loops.

    Science.gov (United States)

    Morozoff, P E; Evans, R W

    1992-01-01

    Graphic display of respiratory waveforms can be valuable for monitoring the progress of ventilated patients. A system has been developed that can display flow-pressure-volume loops as derived from a patient's respiratory circuit in real time. It can also display, store, print, and retrieve ventilatory waveforms. Five loops can be displayed at once: current, previous, reference, "ideal," and previously saved. Two components, the data-display device (DDD) and the data-collection device (DCD), comprise the system. An IBM 286/386 computer with a graphics card (VGA) and bidirectional parallel port is used for the DDD; an eight-bit microprocessor card and an A/D convertor card make up the DCD. A real-time multitasking operating system was written to control the DDD, while the DCD operates from in-line assembly code. The DCD samples the pressure and flow sensors at 100 Hz and looks for a complete flow waveform pattern based on flow slope. These waveforms are then passed to the DDD via the mutual parallel port. Within the DDD a process integrates the flow to create a volume signal and performs a multilinear regression on the pressure, flow, and volume data to calculate the elastance, resistance, pressure offset, and coefficient of determination. Elastance, resistance, and offset are used to calculate Pr and Pc, where Pr[k] = P[k] - offset - (elastance · V[k]) and Pc[k] = P[k] - offset - (resistance · F[k]). Volume vs. Pc and flow vs. Pr can be displayed in real time. Patient data from previous clinical tests were loaded into the device to verify the software calculations. An analog waveform generator was used to simulate flow and pressure waveforms that validated the system.(ABSTRACT TRUNCATED AT 250 WORDS)
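
    The multilinear regression described above fits pressure as a linear combination of volume and flow plus an offset; a minimal sketch on synthetic data (the elastance, resistance and offset values are assumed, not patient data) looks like this:

        # Minimal sketch of the multilinear fit P = offset + E*V + R*F on synthetic data
        # (the "true" elastance, resistance and offset are assumed, not patient data).
        import numpy as np

        rng = np.random.default_rng(42)
        t = np.linspace(0.0, 3.0, 300)
        flow = np.sin(2.0 * np.pi * t / 3.0)                 # synthetic flow waveform (L/s)
        volume = np.cumsum(flow) * (t[1] - t[0])             # integrate flow to obtain volume (L)

        elastance, resistance, offset = 25.0, 8.0, 5.0       # assumed "true" parameters
        pressure = offset + elastance * volume + resistance * flow + rng.normal(0.0, 0.2, t.size)

        X = np.column_stack([np.ones_like(t), volume, flow])
        (offset_hat, e_hat, r_hat), *_ = np.linalg.lstsq(X, pressure, rcond=None)
        print(f"elastance = {e_hat:.1f}, resistance = {r_hat:.1f}, offset = {offset_hat:.1f}")

    The fitted values then give Pr[k] and Pc[k] exactly as in the expressions quoted above.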

  9. Factors influencing users' attitude towards display advertising on Facebook

    OpenAIRE

    Halalau, Ruxandra; Kornias, Gustaf

    2012-01-01

    Background: Researchers have investigated display advertising in the past several years from different perspectives but only in connection to traditional Web sites and not specifically for social networking sites. Facebook is the most prominent social networking site in terms of number of users and its main source of revenue is its online advertising business. Having display advertising in their virtual space is the reason why social networks are able to offer free service and as such the nee...

  10. Radioactive releases of nuclear power plants: the code ASTEC

    International Nuclear Information System (INIS)

    Sdouz, G.; Pachole, M.

    1999-11-01

    In order to adopt potential countermeasures to protect the population during the course of an accident in a nuclear power plant, a fast prediction of the radiation exposure is necessary. The basic input value for such a dispersion calculation is the source term, which is the description of the physical and chemical behavior of the released radioactive nuclides. Based on a source term database, a pilot system has been developed to determine a relevant source term and to generate the input file for the dispersion code TAMOS of the Zentralanstalt fuer Meteorologie und Geodynamik (ZAMG). This file can be sent directly as an attachment of an e-mail to the TAMOS user for further processing. The source terms for 56 European nuclear power plant units are included in the pilot version of the code ASTEC (Austrian Source Term Estimation Code). The use of the system is demonstrated in an example based on an accident in the unit TEMELIN-1. In order to calculate typical core inventories for the database, the international computer code ORIGEN 2.1 was installed and applied. The report has been completed with a discussion on optimal data transfer. (author)

  11. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  12. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  13. Calculation of source terms for NUREG-1150

    International Nuclear Information System (INIS)

    Breeding, R.J.; Williams, D.C.; Murfin, W.B.; Amos, C.N.; Helton, J.C.

    1987-10-01

    The source terms estimated for NUREG-1150 are generally based on the Source Term Code Package (STCP), but the actual source term calculations used in computing risk are performed by much smaller codes which are specific to each plant. This was done because the method of estimating the uncertainty in risk for NUREG-1150 requires hundreds of source term calculations for each accident sequence. This is clearly impossible with a large, detailed code like the STCP. The small plant-specific codes are based on simple algorithms and utilize adjustable parameters. The values of the parameters appearing in these codes are derived from the available STCP results. To determine the uncertainty in the estimation of the source terms, these parameters were varied as specified by an expert review group. This method was used to account for the uncertainties in the STCP results and the uncertainties in phenomena not considered by the STCP

  14. Fast-neutron, coded-aperture imager

    International Nuclear Information System (INIS)

    Woolf, Richard S.; Phlips, Bernard F.; Hutcheson, Anthony L.; Wulf, Eric A.

    2015-01-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with a 1.8-μCi and a 13-μCi 252Cf neutron/γ source at three standoff distances of 9, 15 and 26 m (the maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led

  15. Fast-neutron, coded-aperture imager

    Energy Technology Data Exchange (ETDEWEB)

    Woolf, Richard S., E-mail: richard.woolf@nrl.navy.mil; Phlips, Bernard F., E-mail: bernard.phlips@nrl.navy.mil; Hutcheson, Anthony L., E-mail: anthony.hutcheson@nrl.navy.mil; Wulf, Eric A., E-mail: eric.wulf@nrl.navy.mil

    2015-06-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with a 1.8-μCi and a 13-μCi 252Cf neutron/γ source at three standoff distances of 9, 15 and 26 m (the maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led

  16. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code, FEMAXI-7, has been developed as an extended version of the former FEMAXI-6 for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published as a separate JAEA-Data/Code report. The present manual, the counterpart of that description document, gives detailed explanations of the files and operation methods of the FEMAXI-7 code and its related codes, input/output methods, sample input/output, methods of source code modification, subroutine structure, and internal variables in a specific manner, in order to help users perform fuel analysis with FEMAXI-7. (author)

  17. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code, FEMAXI-7, has been developed as an extended version of the former FEMAXI-6 for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published as a separate JAEA-Data/Code report. The present manual, the counterpart of that description document, gives detailed explanations of the files and operation methods of the FEMAXI-7 code and its related codes, input/output methods, sample input/output, methods of source code modification, subroutine structure, and internal variables in a specific manner, in order to help users perform fuel analysis with FEMAXI-7. (author)

  18. Application of holographic elements in displays and planar illuminators

    Science.gov (United States)

    Putilin, Andrew; Gustomiasov, Igor

    2007-05-01

    Holographic optical elements (HOEs) on planar waveguides can be used to design planar optics for backlit units, color selectors or filters, and lenses for virtual-reality displays. Several schemes for HOE recording are proposed to obtain a planar stereo backlit unit and a light source for private-eye displays. It is shown in the paper that a specific light-transformation grating permits the construction of efficient backlit units for display holograms and LCDs. Several schemes of reflection/transmission backlit units and scattering films based on holographic optical elements are also proposed. The performance of the waveguide HOE can be optimized using the parameters of the recording scheme and the etching parameters. The schemes of HOE application are discussed and some experimental results are shown.

  19. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.

    Science.gov (United States)

    Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-03-31

    As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the

  20. Young Children's Analogical Problem Solving: Gaining Insights from Video Displays

    Science.gov (United States)

    Chen, Zhe; Siegler, Robert S.

    2013-01-01

    This study examined how toddlers gain insights from source video displays and use the insights to solve analogous problems. Two- to 2.5-year-olds viewed a source video illustrating a problem-solving strategy and then attempted to solve analogous problems. Older but not younger toddlers extracted the problem-solving strategy depicted in the video…

  1. Polarization diversity scheme on spectral polarization coding optical code-division multiple-access network

    Science.gov (United States)

    Yen, Chih-Ta; Huang, Jen-Fa; Chang, Yao-Tang; Chen, Bo-Hau

    2010-12-01

    We present an experiment demonstrating the spectral-polarization coding optical code-division multiple-access system introduced with a nonideal state of polarization (SOP) matching conditions. In the proposed system, the encoding and double balanced-detection processes are implemented using a polarization-diversity scheme. Because of the quasiorthogonality of Hadamard codes combining with array waveguide grating routers and a polarization beam splitter, the proposed codec pair can encode-decode multiple code words of Hadamard code while retaining the ability for multiple-access interference cancellation. The experimental results demonstrate that when the system is maintained with an orthogonal SOP for each user, an effective reduction in the phase-induced intensity noise is obtained. The analytical SNR values are found to overstate the experimental results by around 2 dB when the received effective power is large. This is mainly limited by insertion losses of components and a nonflattened optical light source. Furthermore, the matching conditions can be improved by decreasing nonideal influences.
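
    The multiple-access interference cancellation attributed to the quasi-orthogonality of Hadamard codes can be seen in a few lines of code. The sketch below is not the reported experimental codec (no AWG routers, polarization diversity or noise are modelled); it only shows that balanced detection of unipolar Hadamard code words gives a nonzero decision statistic for active users and approximately zero for idle ones. The number of users and power levels are assumed.

    ```python
    # Illustrative sketch (not the experimental codec): MAI cancellation with
    # unipolar Hadamard codes and balanced detection in spectral-amplitude coding.
    import numpy as np
    from scipy.linalg import hadamard

    H = hadamard(8)
    codes = (H[1:] + 1) // 2          # unipolar code words (rows 1..7), weight 4

    def balanced_detect(received, code):
        """Correlate with the code and with its complement, then subtract."""
        return received @ code - received @ (1 - code)

    # Three users transmit simultaneously with (assumed) unit power each.
    active = [0, 2, 5]
    received = sum(codes[u] for u in active).astype(float)

    for u in range(len(codes)):
        print(f"user {u}: decision statistic = {balanced_detect(received, codes[u]):+.1f}")
    # Active users yield a nonzero statistic; idle users yield ~0 (MAI cancelled).
    ```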

  2. Computation of the bounce-average code

    International Nuclear Information System (INIS)

    Cutler, T.A.; Pearlstein, L.D.; Rensink, M.E.

    1977-01-01

    The bounce-average computer code simulates the two-dimensional velocity transport of ions in a mirror machine. The code evaluates and bounce-averages the collision operator and sources along the field line. A self-consistent equilibrium magnetic field is also computed using the long-thin approximation. Optionally included are terms that maintain μ, J invariance as the magnetic field changes in time. The assumptions and analysis that form the foundation of the bounce-average code are described. When references can be cited, the required results are merely stated and explained briefly. A listing of the code is appended
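
    As a simple numerical illustration of the bounce-averaging operation itself (not of the actual code, whose field model, collision operator and sources are far more elaborate), the bounce average of a quantity Q along a field line is ∮ Q ds/v∥ divided by ∮ ds/v∥, with v∥ fixed by conservation of energy and magnetic moment μ. The field profile and particle parameters below are invented.

    ```python
    # Toy sketch (assumed model, not the actual code): bounce average of a
    # quantity Q(s) along a mirror field line, <Q> = ∮ Q ds/v_par / ∮ ds/v_par.
    import numpy as np

    def bounce_average(Q, B, s, mu, energy, mass=1.0):
        """Average Q(s) over a bounce orbit with magnetic moment mu conserved."""
        v_par_sq = 2.0 * (energy - mu * B) / mass
        trapped = v_par_sq > 0.0                  # region between turning points
        w = 1.0 / np.sqrt(v_par_sq[trapped])      # time spent per unit length
        return np.trapz(Q[trapped] * w, s[trapped]) / np.trapz(w, s[trapped])

    s = np.linspace(-1.0, 1.0, 2001)
    B = 1.0 + 2.0 * s**2                          # simple parabolic mirror field
    Q = np.cos(np.pi * s / 2.0)                   # any quantity defined along the line
    print("bounce-averaged Q:", bounce_average(Q, B, s, mu=0.5, energy=1.2))
    ```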

  3. Code system BCG for gamma-ray skyshine calculation

    International Nuclear Information System (INIS)

    Ryufuku, Hiroshi; Numakunai, Takao; Miyasaka, Shun-ichi; Minami, Kazuyoshi.

    1979-03-01

    A code system BCG has been developed for conveniently and efficiently calculating gamma-ray skyshine doses using the transport calculation codes ANISN and DOT and the point-kernel calculation codes G-33 and SPAN. To simplify the input to the system, the input forms for these codes are unified, twelve geometric patterns are introduced to define material regions, and standard data are available as a library. To treat complex arrangements of sources and shields, the codes can further be used in succession, so that the results from one code may be used as input data to the same or another code. (author)
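
    For orientation, the point-kernel part of such a calculation (the role played by codes like G-33 and SPAN) reduces, for a single point source, to an attenuated 1/4πr² flux multiplied by a buildup factor. The sketch below is not the BCG system; the source strength, attenuation coefficient and the simple linear buildup fit are all assumed.

    ```python
    # Hedged sketch of a point-kernel gamma-ray flux estimate; the attenuation
    # coefficient, buildup approximation and source strength are assumed values.
    import numpy as np

    def point_kernel_flux(S, r, mu, buildup=lambda mu_r: 1.0 + mu_r):
        """Uncollided flux times a buildup factor at distance r from a point source.

        S  : source emission rate [photons/s]
        r  : distance [cm]
        mu : linear attenuation coefficient of the intervening material [1/cm]
        """
        mu_r = mu * r
        return S * buildup(mu_r) * np.exp(-mu_r) / (4.0 * np.pi * r**2)

    S = 3.7e10                 # assumed 1 Ci source emitting one photon per decay
    mu = 0.06                  # assumed attenuation coefficient [1/cm]
    for r in (100.0, 500.0, 1000.0):
        print(f"r = {r:6.0f} cm : flux = {point_kernel_flux(S, r, mu):.3e} photons/cm^2/s")
    ```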

  4. A deviation display method for visualising data in mobile gamma-ray spectrometry.

    OpenAIRE

    Kock, Peder; Finck, Robert R; Nilsson, Jonas M C; Östlund, Karl; Samuelsson, Christer

    2010-01-01

    A real-time visualisation method, to be used in mobile gamma-spectrometric search operations with standard detector systems, is presented. The new method, called the deviation display, uses a modified waterfall display to present relative changes in spectral data over energy and time. Using unshielded (137)Cs and (241)Am point sources and different natural background environments, the behaviour of the deviation displays is demonstrated and analysed for two standard detector types (NaI(Tl) and HPG...
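
    A deviation-style display of this kind can be approximated by comparing each new spectrum with a trailing background estimate, channel by channel, and stacking the resulting rows over time. The sketch below is only an illustration of that idea with invented count rates and an assumed significance normalisation; the published method may normalise differently.

    ```python
    # Sketch only (the published method may differ): relative deviation of each
    # energy channel from a trailing background estimate, as waterfall rows.
    import numpy as np

    def deviation_rows(spectra, n_background=30):
        """spectra: (n_time, n_channels) counts per 1-s spectrum."""
        rows = []
        for t in range(n_background, spectra.shape[0]):
            bg = spectra[t - n_background:t].mean(axis=0)
            # significance of the excess in each channel (simple Poisson estimate)
            rows.append((spectra[t] - bg) / np.sqrt(np.maximum(bg, 1.0)))
        return np.array(rows)

    rng = np.random.default_rng(1)
    spectra = rng.poisson(50, size=(120, 256)).astype(float)
    spectra[100:, 40] += 60          # an (assumed) 137Cs-like line appearing at t = 100 s
    dev = deviation_rows(spectra)
    print("max deviation per time step (last 5):", dev.max(axis=1)[-5:].round(1))
    ```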

  5. IMDISP - INTERACTIVE IMAGE DISPLAY PROGRAM

    Science.gov (United States)

    Martin, M. D.

    1994-01-01

    screen at once, the image can be "subsampled." For example, if the image were subsampled by a factor of 2, every other pixel from every other line would be displayed, starting from the upper left corner of the image. Any positive integer may be used for subsampling. The user may produce a histogram of an image file, which is a graph showing the number of pixels per DN value, or per range of DN values, for the entire image. IMDISP can also plot the DN value versus pixels along a line between two points on the image. The user can "stretch" or increase the contrast of an image by specifying low and high DN values; all pixels with values lower than the specified "low" will then become black, and all pixels higher than the specified "high" value will become white. Pixels between the low and high values will be evenly shaded between black and white. IMDISP is written in a modular form to make it easy to change it to work with different display devices or on other computers. The code can also be adapted for use in other application programs. There are device dependent image display modules, general image display subroutines, image I/O routines, and image label and command line parsing routines. The IMDISP system is written in C-language (94%) and Assembler (6%). It was implemented on an IBM PC with the MS DOS 3.21 operating system. IMDISP has a memory requirement of about 142k bytes. IMDISP was developed in 1989 and is a copyrighted work with all copyright vested in NASA. Additional planetary images can be obtained from the National Space Science Data Center at (301) 286-6695.
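
    The subsampling and contrast-stretch operations described above are easy to express with modern array tools. The following sketch is not the IMDISP C code, just the same two operations applied to an invented 12-bit image.

    ```python
    # Sketch of two operations the abstract describes (not the IMDISP C code):
    # integer subsampling and a linear contrast "stretch" between low/high DN values.
    import numpy as np

    def subsample(image, factor):
        """Keep every `factor`-th pixel of every `factor`-th line."""
        return image[::factor, ::factor]

    def stretch(image, low, high):
        """Map DN <= low to black (0), DN >= high to white (255), linear in between."""
        out = (image.astype(float) - low) / (high - low)
        return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)

    img = np.random.default_rng(2).integers(0, 4096, size=(512, 512))   # 12-bit DN values
    small = subsample(img, 2)            # every other pixel from every other line
    enhanced = stretch(small, low=1000, high=3000)
    print(small.shape, enhanced.min(), enhanced.max())
    ```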

  6. Adaptive Combined Source and Channel Decoding with Modulation ...

    African Journals Online (AJOL)

    In this paper, an adaptive system employing combined source and channel decoding with modulation is proposed for slow Rayleigh fading channels. Huffman code is used as the source code and Convolutional code is used for error control. The adaptive scheme employs a family of Convolutional codes of different rates ...
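
    For reference, the source-coding half of such a scheme is the textbook Huffman construction sketched below (in Python rather than the authors' implementation); the convolutional channel code and the rate-adaptation logic over the fading channel are not shown.

    ```python
    # Textbook Huffman code construction (the source-coding half of the scheme);
    # the convolutional channel code and rate adaptation are not sketched here.
    import heapq
    from collections import Counter

    def huffman_code(text):
        heap = [[freq, i, [sym, ""]] for i, (sym, freq) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            lo = heapq.heappop(heap)
            hi = heapq.heappop(heap)
            for pair in lo[2:]:
                pair[1] = "0" + pair[1]           # prefix the lighter subtree with 0
            for pair in hi[2:]:
                pair[1] = "1" + pair[1]           # and the heavier subtree with 1
            heapq.heappush(heap, [lo[0] + hi[0], next_id, *lo[2:], *hi[2:]])
            next_id += 1
        return dict(heapq.heappop(heap)[2:])

    code = huffman_code("in this paper an adaptive system is proposed")
    print({sym: bits for sym, bits in sorted(code.items(), key=lambda kv: len(kv[1]))})
    ```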

  7. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Liu, T; Lin, H; Xu, X [Rensselaer Polytechnic Institute, Troy, NY (United States); Su, L [John Hopkins University, Baltimore, MD (United States); Shi, C [Saint Vincent Medical Center, Bridgeport, CT (United States); Tang, X [Memorial Sloan Kettering Cancer Center, West Harrison, NY (United States); Bednarz, B [University of Wisconsin, Madison, WI (United States)

    2016-06-15

    Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore the software micro-optimization methods. Methods: The patient-specific source of Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA’s database, partial geometry information of the jaw and MLC as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, and was focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double-precision, the total wall time of the multithreaded CPU code on a X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam, while on 3 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC and GPU-based Monte Carlo dose engine to Tomotherapy and Truebeam dose calculations.

  8. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    International Nuclear Information System (INIS)

    Liu, T; Lin, H; Xu, X; Su, L; Shi, C; Tang, X; Bednarz, B

    2016-01-01

    Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore the software micro-optimization methods. Methods: The patient-specific source of Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA’s database, partial geometry information of the jaw and MLC as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, and was focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double-precision, the total wall time of the multithreaded CPU code on a X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam, while on 3 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC and GPU-based Monte Carlo dose engine to Tomotherapy and Truebeam dose calculations.

  9. It Takes Two – Coincidence coding within the dual olfactory pathway of the honeybee

    Directory of Open Access Journals (Sweden)

    Martin F. Brill

    2015-07-01

    Full Text Available To rapidly process biologically relevant stimuli, sensory systems have developed a broad variety of coding mechanisms like parallel processing and coincidence detection. Parallel processing (e.g., in the visual system) increases both computational capacity and processing speed by simultaneously coding different aspects of the same stimulus. Coincidence detection is an efficient way to integrate information from different sources. Coincidence has been shown to promote associative learning and memory or stimulus feature detection (e.g., in auditory delay lines). Within the dual olfactory pathway of the honeybee both of these mechanisms might be implemented by uniglomerular projection neurons (PNs) that transfer information from the primary olfactory centers, the antennal lobe (AL), to a multimodal integration center, the mushroom body (MB). PNs from anatomically distinct tracts respond to the same stimulus space, but have different physiological properties, characteristics that are prerequisites for parallel processing of different stimulus aspects. However, the PN pathways also display mirror-image-like anatomical trajectories that resemble neuronal coincidence detectors as known from auditory delay lines. To investigate temporal processing of olfactory information, we recorded PN odor responses simultaneously from both tracts and measured coincident activity of PNs within and between tracts. Our results show that coincidence levels are different within each of the two tracts. Coincidence also occurs between tracts, but to a minor extent compared to coincidence within tracts. Taken together our findings support the relevance of spike timing in coding of olfactory information (temporal code).
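
    Coincident activity between two simultaneously recorded projection neurons is typically quantified by counting spikes that fall within a short window of each other. The sketch below is only an illustration with invented spike trains and an assumed 5-ms window; it is not the analysis used in the study.

    ```python
    # Illustrative sketch (parameters assumed): count spikes of one projection
    # neuron that have a spike of the other within a coincidence window.
    import numpy as np

    def coincident_spikes(train_a, train_b, window=0.005):
        """Spikes in train_a with at least one spike of train_b within ±window (s)."""
        train_b = np.sort(train_b)
        idx = np.searchsorted(train_b, train_a)
        left = np.abs(train_a - train_b[np.clip(idx - 1, 0, len(train_b) - 1)])
        right = np.abs(train_b[np.clip(idx, 0, len(train_b) - 1)] - train_a)
        return int(np.sum(np.minimum(left, right) <= window))

    rng = np.random.default_rng(3)
    stimulus_locked = rng.uniform(1.00, 1.05, 40)            # odor-response burst
    pn1 = np.sort(np.concatenate([rng.uniform(0, 2, 60), stimulus_locked]))
    pn2 = np.sort(np.concatenate([rng.uniform(0, 2, 60),
                                  stimulus_locked + rng.normal(0, 0.002, 40)]))
    print("coincidences within 5 ms:", coincident_spikes(pn1, pn2))
    ```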

  10. Web- and system-code based, interactive, nuclear power plant simulators

    International Nuclear Information System (INIS)

    Kim, K. D.; Jain, P.; Rizwan, U.

    2006-01-01

    Using two different approaches, on-line, web- and system-code based graphical user interfaces have been developed for reactor system analysis. Both are LabVIEW (graphical programming language developed by National Instruments) based systems that allow local users, as well as those at remote sites, to run, interact with and view the results of the system code in a web browser. In the first approach, only the data written by the system code in a tab-separated ASCII output file are accessed and displayed graphically. In the second approach, LabVIEW virtual instruments are coupled with the system code as dynamic link libraries (DLL). RELAP5 is used as the system code to demonstrate the capabilities of these approaches. From collaborative projects between teams in geographically remote locations to providing system code experience to distance education students, these tools can be very beneficial in many areas of teaching and R and D. (authors)
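
    The first approach, reading the tab-separated ASCII output of the system code and presenting it graphically, is straightforward to prototype outside LabVIEW as well. The file name and column layout in the sketch below are made up for the example; it is not the tool described in the record.

    ```python
    # Rough analogue of the first approach, in Python rather than LabVIEW: read a
    # tab-separated ASCII file of the kind a system code writes and show its trends.
    import csv

    sample = "time\tpressure\ttemperature\n0.0\t15.5\t565.0\n1.0\t15.4\t566.2\n2.0\t15.2\t567.9\n"
    with open("system_code_output.tsv", "w") as f:
        f.write(sample)                          # stand-in for the code's output file

    def read_output(path):
        with open(path, newline="") as f:
            rows = list(csv.reader(f, delimiter="\t"))
        header, values = rows[0], [[float(x) for x in r] for r in rows[1:] if r]
        return dict(zip(header, zip(*values)))   # column name -> tuple of values

    data = read_output("system_code_output.tsv")
    for name in ("pressure", "temperature"):
        print(name, "trend:", data[name])
    ```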

  11. Subjective assessment of impairment in scale-space-coded images

    NARCIS (Netherlands)

    Ridder, de H.; Majoor, G.M.M.

    1988-01-01

    Direct category scaling and a scaling procedure in accordance with Functional Measurement Theory (Anderson, 1982) have been used to assess impairment in scale-space-coded images, displayed on a black-and-white TV monitor. The image of a complex scene was passed through a Gaussian filter of limited

  12. Methods and apparatus for graphical display and editing of flight plans

    Science.gov (United States)

    Gibbs, Michael J. (Inventor); Adams, Jr., Mike B. (Inventor); Chase, Karl L. (Inventor); Lewis, Daniel E. (Inventor); McCrobie, Daniel E. (Inventor); Omen, Debi Van (Inventor)

    2002-01-01

    Systems and methods are provided for an integrated graphical user interface which facilitates the display and editing of aircraft flight-plan data. A user (e.g., a pilot) located within the aircraft provides input to a processor through a cursor control device and receives visual feedback via a display produced by a monitor. The display includes various graphical elements associated with the lateral position, vertical position, flight-plan and/or other indicia of the aircraft's operational state as determined from avionics data and/or various data sources. Through use of the cursor control device, the user may modify the flight-plan and/or other such indicia graphically in accordance with feedback provided by the display. In one embodiment, the display includes a lateral view, a vertical profile view, and a hot-map view configured to simplify the display and editing of the aircraft's flight-plan data.

  13. Studies on hand-held visual communication device for the deaf and speech-impaired I. Visual display window size.

    Science.gov (United States)

    Thurlow, W R

    1980-01-01

    Messages were presented which moved from right to left along an electronic alphabetic display which was varied in "window" size from 4 through 32 letter spaces. Deaf subjects signed the messages they perceived. Relatively few errors were made even at the highest rate of presentation, which corresponded to a typing rate of 60 words/min. It is concluded that many deaf persons can make effective use of a small visual display. A reduced cost is then possible for visual communication instruments for these people through reduced display size. Deaf subjects who can profit from a small display can be located by a sentence test administered by tape recorder which drives the display of the communication device by means of the standard code of the deaf teletype network.

  14. Coding conventions and principles for a National Land-Change Modeling Framework

    Science.gov (United States)

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  15. A Monte Carlo code for ion beam therapy

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    Initially developed for applications in detector and accelerator physics, the modern Fluka Monte Carlo code is now used in many different areas of nuclear science. Over the last 25 years, the code has evolved to include new features, such as ion beam simulations. Given the growing use of these beams in cancer treatment, Fluka simulations are being used to design treatment plans in several hadron-therapy centres in Europe.   Fluka calculates the dose distribution for a patient treated at CNAO with proton beams. The colour-bar displays the normalized dose values. Fluka is a Monte Carlo code that very accurately simulates electromagnetic and nuclear interactions in matter. In the 1990s, in collaboration with NASA, the code was developed to predict potential radiation hazards received by space crews during possible future trips to Mars. Over the years, it has become the standard tool to investigate beam-machine interactions, radiation damage and radioprotection issues in the CERN accelerator com...

  16. Development of a coupling code for PWR reactor cavity radiation streaming calculation

    International Nuclear Information System (INIS)

    Zheng, Z.; Wu, H.; Cao, L.; Zheng, Y.; Zhang, H.; Wang, M.

    2012-01-01

    PWR reactor cavity radiation streaming is important for the safety of personnel and equipment; thus, calculations have to be performed to evaluate the neutron flux distribution around the reactor. For this calculation, deterministic codes have difficulties in fine geometrical modeling and need huge computer resources, while Monte Carlo codes require very long sampling times to obtain results with acceptable precision. Therefore, a coupling method has been developed to eliminate these two problems of each code. In this study, we develop a coupling code named DORT2MCNP to link the Sn code DORT and the Monte Carlo code MCNP. DORT2MCNP is used to produce a combined surface source containing top, bottom and side surfaces simultaneously. Because the SDEF card is unsuitable for the combined surface source, we modify the SOURCE subroutine of MCNP and recompile MCNP for this application. Numerical results demonstrate the correctness of the coupling code DORT2MCNP and show reasonable agreement between the coupling method and the other two codes (DORT and MCNP). (authors)

  17. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2009-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such a way that one can correlate them and

  18. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2008-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such a way that one can correlate them and

  19. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    Science.gov (United States)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been stressed in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows: Programming Language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising the efficiency. Parallel Frameworks Supported: the development of OFF has also been targeted to maximize the computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, Maintenance and Enhancement: in order to improve the usability, maintenance and enhancement of the code, the documentation has also been carefully taken into account; the documentation is built upon comprehensive comments placed directly into the source files (no external documentation files needed): these comments are parsed by means of the doxygen free software, producing high-quality html and latex documentation pages; the distributed versioning system referred to as git
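
    The finite-volume flux-difference update that such solvers generalize (with WENO reconstruction, multi-fluid closures and complex geometry) can be illustrated in its simplest first-order, one-dimensional form. The sketch below is only that toy case and is unrelated to the OFF source itself; the grid, wave speed and time step are assumed.

    ```python
    # Minimal first-order finite-volume update for 1-D linear advection, to show
    # the flux-difference form that codes like OFF generalize with WENO schemes.
    import numpy as np

    def advect(u, a, dx, dt, steps):
        """First-order upwind finite-volume scheme for u_t + a u_x = 0 (a > 0)."""
        u = u.copy()
        for _ in range(steps):
            flux = a * u                              # flux evaluated in each cell
            flux_left = np.roll(flux, 1)              # upwind interface flux (periodic)
            u -= dt / dx * (flux - flux_left)         # cell update from flux difference
        return u

    n, a = 200, 1.0
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    u0 = np.exp(-200 * (x - 0.3) ** 2)
    u1 = advect(u0, a, dx=1.0 / n, dt=0.4 / (n * a), steps=250)   # CFL = 0.4
    print("peak moved from x =", x[np.argmax(u0)], "to x =", x[np.argmax(u1)])
    ```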

  20. Human factors guidelines for large-screen displays

    International Nuclear Information System (INIS)

    Collier, Steve

    2005-09-01

    Any control-room project (including upgrades or evolutionary improvements to existing control-rooms) is well advised at the outset first to gather and update related background material for the design. This information-gathering exercise should also take into account experience from similar projects and operating experience. For these reasons, we decided to use our research, and experience in large-screen display design with several clients to update human factors guidance for large-screen displays, to take into account new ergonomics guidelines, operating experience, and work from similar projects. To write the updated guidelines, we drew on much of our experience across several departments at IFE, including research funded by the HRP programme, and experience with individual clients. Guidance here is accordingly focused mainly on recent areas of technical and human innovations in the man-machine interface. One particular area of focus was on the increasing use of large-screen display systems in modern control-rooms, and on how guidelines could be adapted and supplemented for their design. Guidance or reference to recommended sources is also given for control suite arrangement and layout, control-room layout, workstation layout, design of displays and controls, and design of the work environment, especially insofar as these ergonomic issues interact with the effectiveness of modern displays, in particular large screen displays. The work shows that there can be synergy between HRP research and bilateral activities: the one side offers a capability to develop tools and guidelines, while the other side gives an opportunity to test and refine these in practice, to the benefit of both parties. (Author)

  1. Human factors guidelines for large-screen displays

    Energy Technology Data Exchange (ETDEWEB)

    Collier, Steve

    2005-09-15

    Any control-room project (including upgrades or evolutionary improvements to existing control-rooms) is well advised at the outset first to gather and update related background material for the design. This information-gathering exercise should also take into account experience from similar projects and operating experience. For these reasons, we decided to use our research, and experience in large-screen display design with several clients to update human factors guidance for large-screen displays, to take into account new ergonomics guidelines, operating experience, and work from similar projects. To write the updated guidelines, we drew on much of our experience across several departments at IFE, including research funded by the HRP programme, and experience with individual clients. Guidance here is accordingly focused mainly on recent areas of technical and human innovations in the man-machine interface. One particular area of focus was on the increasing use of large-screen display systems in modern control-rooms, and on how guidelines could be adapted and supplemented for their design. Guidance or reference to recommended sources is also given for control suite arrangement and layout, control-room layout, workstation layout, design of displays and controls, and design of the work environment, especially insofar as these ergonomic issues interact with the effectiveness of modern displays, in particular large screen displays. The work shows that there can be synergy between HRP research and bilateral activities: the one side offers a capability to develop tools and guidelines, while the other side gives an opportunity to test and refine these in practice, to the benefit of both parties. (Author)

  2. Application of RASCAL code for multiunit accident in domestic nuclear sites

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Hyun; Jeong, Seung Young [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    All domestic nuclear power plant sites are multiunit sites (at least 5-6 reactors are operating at each), so this capability has to be secured quickly by nuclear licensees and the institutes responsible for nuclear emergency response. In this study, the source term and offsite dose from a multiunit event were assessed using the RASCAL computer code. An emergency exercise scenario was chosen to verify the applicability of the code to a domestic nuclear site accident. Employing tools and new features of the code, such as merging two or more individual source terms and source term estimation for long-term accident progression, release estimates and dose projections were performed from the main parameters and information in the scenario. Radiological releases and offsite doses from the multiunit accident were calculated using RASCAL. A scenario in which three reactors were damaged coincidentally by a great natural disaster was considered. Surrogate plants were chosen for the code calculation. Source terms of each damaged unit were calculated individually first, and then the total source term and integrated offsite dose assessment data were acquired using the source term merge function in the code. A comparison between the LTSBO and LOCA source term estimate options was also performed. Differences in offsite doses were caused by release characteristics. With the LTSBO option, iodines were released in much larger amounts than with LOCA. The LTSBO source term release was also delayed and its duration was longer than that of LOCA. This option would be useful for accidents that progress over a much longer time frame than a LOCA. RASCAL can be a useful tool for radiological consequence assessment in domestic nuclear site accidents.

  3. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback, on a timescale corresponding to 5-100 Hz, and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. The set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data etc. The Matlab environment provides a flexible system for graphical output.

  4. The OpenMOC method of characteristics neutral particle transport code

    International Nuclear Information System (INIS)

    Boyd, William; Shaner, Samuel; Li, Lulu; Forget, Benoit; Smith, Kord

    2014-01-01

    Highlights: • An open source method of characteristics neutron transport code has been developed. • OpenMOC shows nearly perfect scaling on CPUs and 30× speedup on GPUs. • Nonlinear acceleration techniques demonstrate a 40× reduction in source iterations. • OpenMOC uses modern software design principles within a C++ and Python framework. • Validation with respect to the C5G7 and LRA benchmarks is presented. - Abstract: The method of characteristics (MOC) is a numerical integration technique for partial differential equations, and has seen widespread use for reactor physics lattice calculations. The exponential growth in computing power has finally brought the possibility for high-fidelity full core MOC calculations within reach. The OpenMOC code is being developed at the Massachusetts Institute of Technology to investigate algorithmic acceleration techniques and parallel algorithms for MOC. OpenMOC is a free, open source code written using modern software languages such as C/C++ and CUDA with an emphasis on extensible design principles for code developers and an easy to use Python interface for code users. The present work describes the OpenMOC code and illustrates its ability to model large problems accurately and efficiently

  5. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, E.J.; McNeilly, G.S.

    1994-03-01

    The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg model uses legacy NCAR graphics (even including emulated CALCOMP subroutines) to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.

  6. Common display performance requirements for military and commercial aircraft product lines

    Science.gov (United States)

    Hoener, Steven J.; Behrens, Arthur J.; Flint, John R.; Jacobsen, Alan R.

    2001-09-01

    Obtaining high-quality active matrix liquid crystal display (AMLCD) glass to meet the needs of the commercial and military aerospace business is a major challenge, at best. With the demise of all domestic sources of AMLCD substrate glass, the industry is now focused on overseas sources, which are primarily producing glass for consumer electronics. Previous experience with ruggedizing commercial glass leads to the expectation that the aerospace industry can leverage off the commercial market. The problem remains that, while the commercial industry is continually changing and improving its products, the commercial and military aerospace industries require stable and affordable supplies of AMLCD glass for upwards of 20 years to support production and maintenance operations. The Boeing Engineering and Supplier Management Process Councils have chartered a group of displays experts from multiple aircraft product divisions within the Boeing Company, the Displays Process Action Team (DPAT), to address this situation from an overall corporate perspective. The DPAT has formulated a set of Common Displays Performance Requirements for use across the corporate line of commercial and military aircraft products. Though focused on the AMLCD problem, the proposed common requirements are largely independent of display technology. This paper describes the strategy being pursued within the Boeing Company to address the AMLCD supply problem and details the proposed implementation process, centered on common requirements for both commercial and military aircraft displays. Highlighted in this paper are proposed common, or standard, display sizes and the other major requirements established by the DPAT, along with the rationale for these requirements.

  7. Verification and Validation of the k-kL Turbulence Model in FUN3D and CFL3D Codes

    Science.gov (United States)

    Abdol-Hamid, Khaled S.; Carlson, Jan-Renee; Rumsey, Christopher L.

    2015-01-01

    The implementation of the k-kL turbulence model using multiple computational fluid dynamics (CFD) codes is reported herein. The k-kL model is a two-equation turbulence model based on Abdol-Hamid's closure and Menter's modification to Rotta's two-equation model. Rotta shows that a reliable transport equation can be formed from the turbulent length scale L, and the turbulent kinetic energy k. Rotta's equation is well suited for term-by-term modeling and displays useful features compared to other two-equation models. An important difference is that this formulation leads to the inclusion of higher-order velocity derivatives in the source terms of the scale equations. This can enhance the ability of the Reynolds-averaged Navier-Stokes (RANS) solvers to simulate unsteady flows. The present report documents the formulation of the model as implemented in the CFD codes FUN3D and CFL3D. Methodology, verification and validation examples are shown. Attached and separated flow cases are documented and compared with experimental data. The results show generally very good comparisons with canonical and experimental data, as well as matching results code-to-code.

  8. Noncoherent Spectral Optical CDMA System Using 1D Active Weight Two-Code Keying Codes

    Directory of Open Access Journals (Sweden)

    Bih-Chyun Yeh

    2016-01-01

    Full Text Available We propose a new family of one-dimensional (1D) active weight two-code keying (TCK) in spectral amplitude coding (SAC) optical code division multiple access (OCDMA) networks. We use encoding and decoding transfer functions to operate the 1D active weight TCK. The proposed structure includes an optical line terminal (OLT) and optical network units (ONUs) to produce the encoding and decoding codes of the proposed OLT and ONUs, respectively. The proposed ONU uses the modified cross-correlation to remove interference from other simultaneous users, that is, the multiuser interference (MUI). When the phase-induced intensity noise (PIIN) is the most important noise, the modified cross-correlation suppresses the PIIN. In the numerical results, we find that the bit error rate (BER) for the proposed system using the 1D active weight TCK codes outperforms that for two other systems using the 1D M-Seq codes and 1D balanced incomplete block design (BIBD) codes. The effective source power for the proposed system can achieve −10 dBm, which is less than that required by the other systems.

  9. Characterizing the reflectivity of handheld display devices.

    Science.gov (United States)

    Liu, Peter; Badano, Aldo

    2014-08-01

    , both luminance and illuminance increased as the size of the display window decreased. The TG18 method does not account for this variability. The authors conclude that the method requires a definitive description of the back panel used in the light source setup. The methods described in the TG18 document may need to be improved to provide consistent comparisons of desktop monitors, phones, and tablets.

  10. Bring out your codes! Bring out your codes! (Increasing Software Visibility and Re-use)

    Science.gov (United States)

    Allen, A.; Berriman, B.; Brunner, R.; Burger, D.; DuPrie, K.; Hanisch, R. J.; Mann, R.; Mink, J.; Sandin, C.; Shortridge, K.; Teuben, P.

    2013-10-01

    Progress is being made in code discoverability and preservation, but as discussed at ADASS XXI, many codes still remain hidden from public view. With the Astrophysics Source Code Library (ASCL) now indexed by the SAO/NASA Astrophysics Data System (ADS), the introduction of a new journal, Astronomy & Computing, focused on astrophysics software, and the increasing success of education efforts such as Software Carpentry and SciCoder, the community has the opportunity to set a higher standard for its science by encouraging the release of software for examination and possible reuse. We assembled representatives of the community to present issues inhibiting code release and sought suggestions for tackling these factors. The session began with brief statements by panelists; the floor was then opened for discussion and ideas. Comments covered a diverse range of related topics and points of view, with apparent support for the propositions that algorithms should be readily available, code used to produce published scientific results should be made available, and there should be discovery mechanisms to allow these to be found easily. With increased use of resources such as GitHub (for code availability), ASCL (for code discovery), and a stated strong preference from the new journal Astronomy & Computing for code release, we expect to see additional progress over the next few years.

  11. Applications guide to the MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-08-01

    A practical guide for the implementation of the MORSE-CG Monte Carlo radiation transport computer code system is presented. The various versions of the MORSE code are compared and contrasted, and the many references dealing explicitly with the MORSE-CG code are reviewed. The treatment of angular scattering is discussed, and procedures for obtaining increased differentiality of results in terms of reaction types and nuclides from a multigroup Monte Carlo code are explained in terms of cross-section and geometry data manipulation. Examples of standard cross-section data input and output are shown. Many other features of the code system are also reviewed, including (1) the concept of primary and secondary particles, (2) fission neutron generation, (3) albedo data capability, (4) DOMINO coupling, (5) history file use for post-processing of results, (6) adjoint mode operation, (7) variance reduction, and (8) input/output. In addition, examples of the combinatorial geometry are given, and the new array of arrays geometry feature (MARS) and its three-dimensional plotting code (JUNEBUG) are presented. Realistic examples of user routines for source, estimation, path-length stretching, and cross-section data manipulation are given. A detailed explanation of the coupling between the random walk and estimation procedure is given in terms of both code parameters and physical analogies. The operation of the code in the adjoint mode is covered extensively. The basic concepts of adjoint theory and dimensionality are discussed and examples of adjoint source and estimator user routines are given for all common situations. Adjoint source normalization is explained, a few sample problems are given, and the concept of obtaining forward differential results from adjoint calculations is covered. Finally, the documentation of the standard MORSE-CG sample problem package is reviewed and on-going and future work is discussed

  12. Application of the Decomposition Method to the Design Complexity of Computer-based Display

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Ju; Lee, Seung Woo; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Jin Kyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    The importance of the design of human machine interfaces (HMIs) for human performance and safety has long been recognized in process industries. In the case of nuclear power plants (NPPs), HMIs have significant implications for safety since poor implementation of HMIs can impair the operators' information searching ability, which is considered one of the important aspects of human behavior. To support and increase the efficiency of the operators' information searching behavior, advanced HMIs based on computer technology are provided. Operators in an advanced main control room (MCR) acquire information through the video display units (VDUs) and the large display panel (LDP) required for the operation of NPPs. These computer-based displays contain a very large quantity of information and present it in a greater variety of formats than the conventional MCR. For example, these displays contain more elements such as abbreviations, labels, icons, symbols, coding, and highlighting than conventional ones. As computer-based displays contain more information, the complexity of the elements becomes greater due to less distinctiveness of each element. A greater understanding is emerging about the effectiveness of designs of computer-based displays, including how distinctively display elements should be designed. According to Gestalt theory, people tend to group similar elements based on attributes such as shape, color or pattern (the principle of similarity). Therefore, it is necessary to consider not only the human operator's perception but also the number of elements that make up a computer-based display

  13. Application of the Decomposition Method to the Design Complexity of Computer-based Display

    International Nuclear Information System (INIS)

    Kim, Hyoung Ju; Lee, Seung Woo; Seong, Poong Hyun; Park, Jin Kyun

    2012-01-01

    The importance of the design of human machine interfaces (HMIs) for human performance and safety has long been recognized in process industries. In the case of nuclear power plants (NPPs), HMIs have significant implications for safety since poor implementation of HMIs can impair the operators' information searching ability, which is considered one of the important aspects of human behavior. To support and increase the efficiency of the operators' information searching behavior, advanced HMIs based on computer technology are provided. Operators in an advanced main control room (MCR) acquire information through the video display units (VDUs) and the large display panel (LDP) required for the operation of NPPs. These computer-based displays contain a very large quantity of information and present it in a greater variety of formats than the conventional MCR. For example, these displays contain more elements such as abbreviations, labels, icons, symbols, coding, and highlighting than conventional ones. As computer-based displays contain more information, the complexity of the elements becomes greater due to less distinctiveness of each element. A greater understanding is emerging about the effectiveness of designs of computer-based displays, including how distinctively display elements should be designed. According to Gestalt theory, people tend to group similar elements based on attributes such as shape, color or pattern (the principle of similarity). Therefore, it is necessary to consider not only the human operator's perception but also the number of elements that make up a computer-based display

  14. Documentation for TRACE: an interactive beam-transport code

    International Nuclear Information System (INIS)

    Crandall, K.R.; Rusthoi, D.P.

    1985-01-01

    TRACE is an interactive, first-order, beam-dynamics computer program. TRACE includes space-charge forces and mathematical models for a number of beamline elements not commonly found in beam-transport codes, such as permanent-magnet quadrupoles, rf quadrupoles, rf gaps, accelerator columns, and accelerator tanks. TRACE provides an immediate graphic display of calculated results, has a powerful and easy-to-use command procedure, includes eight different types of beam-matching or -fitting capabilities, and contains its own internal HELP package. This report describes the models and equations used for each of the transport elements, the fitting procedures, and the space-charge/emittance calculations, and provides detailed instructions for using the code
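
    The first-order matrix formalism that such a program is built on propagates a beam sigma matrix as σ' = R σ Rᵀ through each element. The sketch below shows that mechanism for a drift and a thin-lens quadrupole only; space charge, rf gaps and the fitting machinery of TRACE are not represented, and the beam parameters are assumed.

    ```python
    # Sketch of first-order sigma-matrix transport through two simple elements;
    # this is a generic illustration, not TRACE, and the beam values are assumed.
    import numpy as np

    def drift(L):
        return np.array([[1.0, L], [0.0, 1.0]])

    def thin_quad(f):
        return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

    def propagate(sigma, *elements):
        for R in elements:
            sigma = R @ sigma @ R.T          # sigma' = R sigma R^T for each element
        return sigma

    emittance, beta = 1e-6, 2.0              # assumed rms emittance [m rad] and beta [m]
    sigma0 = emittance * np.array([[beta, 0.0], [0.0, 1.0 / beta]])
    sigma1 = propagate(sigma0, drift(0.5), thin_quad(0.8), drift(0.5))
    print("rms size before/after [mm]:",
          1e3 * np.sqrt(sigma0[0, 0]), 1e3 * np.sqrt(sigma1[0, 0]))
    ```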

  15. Notes on nuclear reactor core analysis code: CITATION

    International Nuclear Information System (INIS)

    Cepraga, D.G.

    1980-01-01

    The method which has evolved over the years for making power reactor calculations is the multigroup diffusion method. The CITATION code is designed to solve multigroup neutronics problems with application of the finite-difference diffusion theory approximation to neutron transport in up to three-dimensional geometry. The first part of this paper presents information about the mathematical equations programmed along with background material and certain displays to convey the nature of some of the formulations. The results obtained with the CITATION code regarding the neutron and burnup core analysis for a typical PWR reactor are presented in the second part of this paper. (author)
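
    A toy version of the finite-difference diffusion eigenvalue problem (one group, one-dimensional slab, zero-flux boundaries, power iteration on the fission source) conveys the numerical core of the method. The cross sections below are invented and the sketch is unrelated to the CITATION source, which works in multigroup, up-to-3-D form.

    ```python
    # Toy one-group, 1-D slab finite-difference diffusion eigenvalue problem,
    # solved by power iteration; all cross sections are invented values.
    import numpy as np

    def keff_slab(D, sigma_a, nu_sigma_f, width, n=200, iterations=200):
        h = width / n
        # -D d2phi/dx2 + sigma_a phi = (1/k) nu_sigma_f phi, zero flux at both edges
        A = (np.diag(np.full(n, 2.0 * D / h**2 + sigma_a))
             + np.diag(np.full(n - 1, -D / h**2), 1)
             + np.diag(np.full(n - 1, -D / h**2), -1))
        phi, k = np.ones(n), 1.0
        for _ in range(iterations):               # power iteration on the fission source
            fission = nu_sigma_f * phi
            phi = np.linalg.solve(A, fission / k)
            k *= (nu_sigma_f * phi).sum() / fission.sum()
            phi /= np.linalg.norm(phi)
        return k

    print("k_eff =", round(keff_slab(D=1.0, sigma_a=0.07, nu_sigma_f=0.08, width=100.0), 5))
    ```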

  16. A MORET tool to assist code bias estimation

    International Nuclear Information System (INIS)

    Fernex, F.; Richet, Y.; Letang, E.

    2003-01-01

    This new graphical user interface (GUI), developed in JAVA, is one of the post-processing tools for the MORET4 code. It aims to help users estimate the importance of the keff bias due to the code, in order to better define the upper safety limit. Moreover, it allows visualizing the distance between an actual configuration case and evaluated critical experiments. This tool depends on a validated experiments database, on sets of physical parameters and on various statistical tools allowing interpolation of the calculation bias over the database or display of the projections of experiments on a reduced base of parameters. The development of this tool is still in progress. (author)

  17. A color display device recording X ray spectra, especially intended for medical radiography

    International Nuclear Information System (INIS)

    Boulch, J.-M.

    1975-01-01

    Said invention relates to a color display recording device for X ray spectra intended for medical radiography. The video signal of the X ray camera receiving the radiation having passed through the patient is amplified and transformed into a color coding according to the energy spectrum received by the camera. In a first version, the energy spectrum from the camera gives directly an image on the color tube. In a second version the energy spectrum, after having been transformed into digital signals, is first sent into a memory, then into a computer used as a spectrum analyzer, and finally into the color display device [fr

  18. Draft decree on the licensing and declaration system for nuclear activities and their control, and making various modifications to the public health code and the labour code

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This decree concerns the control of high-level sealed radioactive sources and orphan sources. Its objectives are to introduce administrative simplification, notably in the licensing and declaration system for radiation sources, to reinforce the control measures provided for by the public health code and the labour code, and to clarify and complement several already existing provisions. (N.C.)

  19. A Unique Perspective on Data Coding and Decoding

    Directory of Open Access Journals (Sweden)

    Wen-Yan Wang

    2010-12-01

    Full Text Available The concept of a lossless data compression coding method is proposed, and a detailed description of each of its steps follows. Using the Calgary Corpus and Wikipedia data as the experimental samples and comparing with existing algorithms, like PAQ or PPMstr, the new coding method could not only compress the source data, but also further re-compress the data produced by the other compression algorithms. The final files are smaller, and by comparison with the original compression ratio, at least 1% of redundancy could be eliminated. The new method is simple and easy to realize. Its theoretical foundation is currently under study. The corresponding Matlab source code is provided in the Appendix.

  20. 3D video coding: an overview of present and upcoming standards

    Science.gov (United States)

    Merkle, Philipp; Müller, Karsten; Wiegand, Thomas

    2010-07-01

    An overview of existing and upcoming 3D video coding standards is given. Various different 3D video formats are available, each with individual pros and cons. The 3D video formats can be separated into two classes: video-only formats (such as stereo and multiview video) and depth-enhanced formats (such as video plus depth and multiview video plus depth). Since all these formats consist of at least two video sequences and possibly additional depth data, efficient compression is essential for the success of 3D video applications and technologies. For the video-only formats the H.264 family of coding standards already provides efficient and widely established compression algorithms: H.264/AVC simulcast, the H.264/AVC stereo SEI message, and H.264/MVC. For the depth-enhanced formats standardized coding algorithms are currently being developed. New and specially adapted coding approaches are necessary, as the depth or disparity information included in these formats has significantly different characteristics than video and is not displayed directly, but used for rendering. Motivated by evolving market needs, MPEG has started an activity to develop a generic 3D video standard within the 3DVC ad-hoc group. Key features of the standard are efficient and flexible compression of depth-enhanced 3D video representations and decoupling of content creation and display requirements.

  1. Burnup calculation code system COMRAD96

    International Nuclear Information System (INIS)

    Suyama, Kenya; Masukawa, Fumihiro; Ido, Masaru; Enomoto, Masaki; Takyu, Shuiti; Hara, Toshiharu.

    1997-06-01

    COMRAD was one of the burnup code systems developed by JAERI. COMRAD96 is a version of COMRAD transferred to an Engineering Work Station. It is divided into several functional modules: 'Cross Section Treatment', 'Generation and Depletion Calculation', and 'Post Process'. It enables the analysis of a burnup problem taking the change of the neutron spectrum into account using UNITBURN, and it can display the γ spectrum on a terminal. This report is the general description and user's manual of COMRAD96. (author)

  2. Effect of spatial coherence of LED sources on image resolution in holographic displays

    NARCIS (Netherlands)

    Pourreza Ghoushchi, Vahid; Aas, Mehdi; Ulusoy, Erdem; Ürey, Hakan

    2017-01-01

    Holographic Displays (HDs) provide 3D images with all natural depth cues via computer generated holograms (CGHs) implemented on spatial light modulators (SLMs). HDs are coherent light processing systems based on interference and diffraction, thus they generally use laser light. However, laser

  3. Sound localization with head movement: implications for 3-d audio displays.

    Directory of Open Access Journals (Sweden)

    Ken Ian McAnally

    2014-08-01

    Full Text Available Previous studies have shown that the accuracy of sound localization is improved if listeners are allowed to move their heads during signal presentation. This study describes the function relating localization accuracy to the extent of head movement in azimuth. Sounds that are difficult to localize were presented in the free field from sources at a wide range of azimuths and elevations. Sounds remained active until the participants' heads had rotated through windows 2°, 4°, 8°, 16°, 32°, or 64° of azimuth in width. Error in determining sound-source elevation and the rate of front/back confusion were found to decrease with increases in azimuth window width. Error in determining sound-source lateral angle was not found to vary with azimuth window width. Implications for 3-d audio displays: the utility of a 3-d audio display for imparting spatial information is likely to be improved if operators are able to move their heads during signal presentation. Head movement may compensate in part for a paucity of spectral cues to sound-source location resulting from limitations in either the audio signals presented or the directional filters (i.e., head-related transfer functions) used to generate a display. However, head movements of a moderate size (i.e., through around 32° of azimuth) may be required to ensure that spatial information is conveyed with high accuracy.

  4. Effect of display location on control-display stereotype strength for translational and rotational controls with linear displays.

    Science.gov (United States)

    Chan, Alan H S; Hoffmann, Errol R

    2015-01-01

    Experiments were designed to investigate the effects of control type and display location, relative to the operator, on the strength of control/display stereotypes. The Worringham and Beringer Visual Field principle and an extension of this principle for rotary controls (Hoffmann, E.R., and Chan, A.H.S. 2013. "The Worringham and Beringer 'Visual Field' Principle for Rotary Controls." Ergonomics 56 (10): 1620-1624) indicated that, for a number of different control types (rotary and lever) on different planes, there should be no significant effect of the display location relative to the seated operator. Past data were surveyed and stereotype strengths listed. Experiments filled gaps where data were not available. Six different control types and seven display locations were used, as in the Frame of Reference Transformation Tool (FORT) model of Wickens et al. (Wickens, C.D., Keller, J.W., and Small, R.L. 2010. "Left. No, Right! Development of the Frame of Reference Transformation Tool (FORT)." Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting, 54: 1022-1026). Control/display arrangements with high stereotype strengths were evaluated, yielding data for designers of complex control/display arrangements where the control and display are in different planes and where the operator is moving. It was found possible to predict display/control arrangements with high stereotype strength based on past data. Practitioner Summary: Controls and displays in complex arrangements need to have high compatibility. These experiments provide arrangements for six different controls (rotary and translational) and seven different display locations relative to the operator.

  5. Improved algorithm for surface display from volumetric data

    International Nuclear Information System (INIS)

    Lobregt, S.; Schaars, H.W.G.K.; OpdeBeek, J.C.A.; Zonneveld, F.W.

    1988-01-01

    A high-resolution surface display is produced from three-dimensional datasets (computed tomography or magnetic resonance imaging). Unlike other voxel-based methods, this algorithm does not show a cuberille surface structure, because the surface orientation is calculated from original gray values. The applied surface shading is a function of local orientation and position of the surface and of a virtual light source, giving a realistic impression of the surface of bone and soft tissue. The projection and shading are table driven, combining variable viewpoint and illumination conditions with speed. Other options are cutplane gray-level display and surface transparency. Combined with volume scanning, this algorithm offers powerful application possibilities
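    The gradient-based shading described above can be sketched in a few lines. The following Python fragment is a minimal illustration of the idea (estimating a surface normal from the local gray-value gradient and applying Lambertian shading from a virtual light source); the synthetic volume, threshold and light direction are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def gradient_shaded_surface(volume, threshold, light_dir):
    """Render a simple shaded surface image from a 3D gray-value dataset.

    volume    : 3D array of gray values (e.g. CT numbers), indexed [z, y, x]
    threshold : gray value defining the surface of interest
    light_dir : 3-vector pointing towards the virtual light source
    """
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)

    # Surface normals are estimated from the gray-value gradient rather
    # than from binary voxel faces, which avoids the "cuberille" look.
    gz, gy, gx = np.gradient(volume.astype(float))

    _, ny, nx = volume.shape
    image = np.zeros((ny, nx))
    for y in range(ny):
        for x in range(nx):
            # Cast a ray along z and stop at the first voxel above threshold.
            hits = np.nonzero(volume[:, y, x] >= threshold)[0]
            if hits.size == 0:
                continue
            z = hits[0]
            normal = np.array([gx[z, y, x], gy[z, y, x], gz[z, y, x]])
            length = np.linalg.norm(normal)
            if length == 0.0:
                continue
            # Lambertian shading: brightness depends on the angle between
            # the local surface orientation and the light direction.
            image[y, x] = abs(np.dot(normal / length, light))
    return image

# Example: shade a synthetic sphere embedded in an empty volume.
coords = np.mgrid[0:64, 0:64, 0:64] - 32
phantom = (np.sqrt((coords ** 2).sum(axis=0)) < 20).astype(float) * 100
print(gradient_shaded_surface(phantom, 50.0, light_dir=(0.5, 0.5, 1.0)).max())
```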

  6. Orbit Display's Use of the Physics Application Framework

    International Nuclear Information System (INIS)

    Zelazny, Michael

    2009-01-01

    At the SLAC National Accelerator Laboratory (SLAC) the Controls Department (CD) is developing a physics application framework based on the Java(tm) programming language developed by Sun Microsystems. This paper will discuss the first application developed using this approach: a new Orbit Display. The software is being developed by several individuals in reusable Java packages. It relies on the Experimental Physics and Industrial Control System (EPICS) toolkit for data collection and on XAL - A Java based Hierarchy for Application Programming - for model parameters. The Orbit Display tracks and displays electron paths through the Linac Coherent Light Source (LCLS) in both a graphical, beam-line plot and a tabular format. It contains many features that may be unique to SLAC and is meant to be used both in the control room and by individuals in their offices or at home. Unique features include Beam Synchronous Acquisition (BSA), Orbit Fitting, and Buffered Acquisition.

  7. Displays in scintigraphy

    International Nuclear Information System (INIS)

    Todd-Pokropek, A.E.; Pizer, S.M.

    1977-01-01

    Displays have several functions: to transmit images, to permit interaction, to quantitate features and to provide records. The main characteristics of displays used for image transmission are their resolution, dynamic range, signal-to-noise ratio and uniformity. Considerations of visual acuity suggest that the display element size should be much less than the data element size, and in current practice at least 256×256 for a gamma camera image. The dynamic range for image transmission should be such that at least 64 levels of grey (or equivalent) are displayed. Scanner displays are also considered, and in particular, the requirements of a whole-body camera are examined. A number of display systems and devices are presented including a 'new' heated object colour display system. Interaction with displays is considered, including background subtraction, contrast enhancement, position indication and region-of-interest generation. Such systems lead to methods of quantitation, which imply knowledge of the expected distributions. Methods for intercomparing displays are considered. Polaroid displays, which have for so long dominated the field, are in the process of being replaced by stored image displays, now that large cheap memories exist which give an equivalent image quality. The impact of this in nuclear medicine is yet to be seen, but a major effect will be to enable true quantitation. (author)

  8. Mathematical models and illustrative results for the RINGBEARER II monopole/dipole beam-propagation code

    International Nuclear Information System (INIS)

    Chambers, F.W.; Masamitsu, J.A.; Lee, E.P.

    1982-01-01

    RINGBEARER II is a linearized monopole/dipole particle simulation code for studying intense relativistic electron beam propagation in gas. In this report the mathematical models utilized for beam particle dynamics and pinch field computation are delineated. Difficulties encountered in code operations and some remedies are discussed. Sample output is presented detailing the diagnostics and the methods of display and analysis utilized

  9. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  10. Development of NPTC-11 intelligence control instrument with digital display

    International Nuclear Information System (INIS)

    Wang Chengming; Pu Li; Yu Jiang; Xue Yuping; Zhang Bo; Chen Yong

    2007-01-01

    The accuracy of the process control gauge has a direct influence on the safe operation of nuclear power plants; it is therefore necessary to accumulate experience for the domestic development of this instrument. In this paper, the NPTC-11 intelligent control instrument with digital display is developed based on the design code for nuclear instruments, considering the actual application requirements and technical redundancy. Its application in a nuclear power plant for almost one year indicates that the instrument satisfies the development purpose and requirements. (authors)

  11. Micro Computer Feedback Report for the Strategic Leader Development Inventory; Source Code

    Science.gov (United States)

    1994-03-01

    [The abstract field of this record contains only an OCR-garbled fragment of the report's assembly-language source listing: a file-selection menu routine with labels SEL2-SEL5, calls to display the select screen and read the configuration file, and comments on reading the selected data file into memory and releasing memory on exit. No prose abstract is available.]

  12. Statistical analysis and data display an intermediate course with examples in R

    CERN Document Server

    Heiberger, Richard M

    2015-01-01

    This contemporary presentation of statistical methods features extensive use of graphical displays for exploring data and for displaying the analysis. The authors demonstrate how to analyze data—showing code, graphics, and accompanying tabular listings—for all the methods they cover. They emphasize how to construct and interpret graphs. They discuss principles of graphical design. They identify situations where visual impressions from graphs may need confirmation from traditional tabular results. All chapters have exercises. The authors provide and discuss R functions for all the new graphical display formats. All graphs and tabular output in the book were constructed using these functions. Complete R scripts for all examples and figures are provided for readers to use as models for their own analyses. This book can serve as a standalone text for statistics majors at the master’s level and for other quantitatively oriented disciplines at the doctoral level, and as a reference book for researchers. In-de...

  13. The RETRAN-03 computer code

    International Nuclear Information System (INIS)

    Paulsen, M.P.; McFadden, J.H.; Peterson, C.E.; McClure, J.A.; Gose, G.C.; Jensen, P.J.

    1991-01-01

    The RETRAN-03 code development effort is designed to overcome the major theoretical and practical limitations associated with the RETRAN-02 computer code. The major objectives of the development program are to extend the range of analyses that can be performed with RETRAN, to make the code more dependable and faster running, and to have a more transportable code. The first two objectives are accomplished by developing new models and adding other models to the RETRAN-02 base code. The major model additions for RETRAN-03 are as follows: implicit solution methods for the steady-state and transient forms of the field equations; additional options for the velocity difference equation; a new steady-state initialization option for computing low-power steam generator initial conditions; models for nonequilibrium thermodynamic conditions; and several special-purpose models. The source code and the environmental library for RETRAN-03 are written in standard FORTRAN 77, which allows the last objective to be fulfilled. Some models in RETRAN-02 have been deleted in RETRAN-03. In this paper the changes between RETRAN-02 and RETRAN-03 are reviewed

  14. LFSC - Linac Feedback Simulation Code

    International Nuclear Information System (INIS)

    Ivanov, Valentin; Fermilab

    2008-01-01

    The computer program LFSC is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback, on timescales corresponding to 5-100 Hz, and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data etc. The Matlab environment provides a flexible system for graphical output.

  15. A deviation display method for visualising data in mobile gamma-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Kock, Peder, E-mail: Peder.Kock@med.lu.s [Department of Medical Radiation Physics, Clinical Sciences, Lund University, University Hospital, SE-221 85 Lund (Sweden); Finck, Robert R. [Swedish Radiation Protection Authority, SE-171 16 Stockholm (Sweden); Nilsson, Jonas M.C.; Ostlund, Karl; Samuelsson, Christer [Department of Medical Radiation Physics, Clinical Sciences, Lund University, University Hospital, SE-221 85 Lund (Sweden)

    2010-09-15

    A real time visualisation method, to be used in mobile gamma-spectrometric search operations using standard detector systems is presented. The new method, called deviation display, uses a modified waterfall display to present relative changes in spectral data over energy and time. Using unshielded ¹³⁷Cs and ²⁴¹Am point sources and different natural background environments, the behaviour of the deviation displays is demonstrated and analysed for two standard detector types (NaI(Tl) and HPGe). The deviation display enhances positive significant changes while suppressing the natural background fluctuations. After an initialisation time of about 10 min this technique leads to a homogeneous display dominated by the background colour, where even small changes in spectral data are easy to discover. As this paper shows, the deviation display method works well for all tested gamma energies and natural background radiation levels and with both tested detector systems.

  16. A deviation display method for visualising data in mobile gamma-ray spectrometry

    International Nuclear Information System (INIS)

    Kock, Peder; Finck, Robert R.; Nilsson, Jonas M.C.; Ostlund, Karl; Samuelsson, Christer

    2010-01-01

    A real time visualisation method, to be used in mobile gamma-spectrometric search operations using standard detector systems is presented. The new method, called deviation display, uses a modified waterfall display to present relative changes in spectral data over energy and time. Using unshielded ¹³⁷Cs and ²⁴¹Am point sources and different natural background environments, the behaviour of the deviation displays is demonstrated and analysed for two standard detector types (NaI(Tl) and HPGe). The deviation display enhances positive significant changes while suppressing the natural background fluctuations. After an initialisation time of about 10 min this technique leads to a homogeneous display dominated by the background colour, where even small changes in spectral data are easy to discover. As this paper shows, the deviation display method works well for all tested gamma energies and natural background radiation levels and with both tested detector systems.
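    As a rough illustration of the deviation-display idea just described (relative spectral changes over energy and time, shown as a modified waterfall), the following Python sketch computes clipped, background-normalised deviations from a time series of spectra. The channel count, initialisation length and synthetic data are assumptions for demonstration, not the authors' detector parameters.

```python
import numpy as np

def deviation_waterfall(spectra, n_init=60):
    """Turn a time series of gamma spectra into relative-deviation rows.

    spectra : 2D array, one measured spectrum (counts per channel) per row
    n_init  : number of initial spectra used to estimate the background
    Returns an array in which positive values mark channels whose counts
    exceed the background estimate, in units of its Poisson standard
    deviation; negative fluctuations are clipped so a waterfall plot of
    the result stays dominated by the background colour.
    """
    spectra = np.asarray(spectra, dtype=float)
    background = spectra[:n_init].mean(axis=0)
    sigma = np.sqrt(np.maximum(background, 1.0))   # Poisson uncertainty
    deviation = (spectra[n_init:] - background) / sigma
    return np.clip(deviation, 0.0, None)

# Synthetic example: flat background, a weak line appears halfway through.
rng = np.random.default_rng(1)
counts = rng.poisson(20.0, size=(200, 512)).astype(float)
counts[120:, 330] += 15.0              # new peak in channel 330
rows = deviation_waterfall(counts, n_init=60)
print(rows.shape)
print("signal channel:", round(float(rows[60:, 330].mean()), 2),
      "background channel:", round(float(rows[60:, 100].mean()), 2))
```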

  17. Remote Software Application and Display Development

    Science.gov (United States)

    Sanders, Brandon T.

    2014-01-01

    The era of the shuttle program has come to an end, but only to give rise to newer and more exciting projects. Now is the time of the Orion spacecraft, a work of art designed to exceed all previous endeavors of man. NASA is exiting the time of exploration and is entering a new period, a period of pioneering. With this new mission, many of NASA's organizations must undergo a great deal of change and development to support the Orion missions. The Spaceport Command and Control System (SCCS) is the new system that will provide NASA the ability to launch rockets into orbit and thus control Orion and other spacecraft as the goal of populating Mars becomes ever more tangible. Since the previous control system, the Launch Processing System (LPS), was primarily designed to launch the shuttles, SCCS was needed as Kennedy Space Center (KSC) reorganized into a multiuser spaceport for commercial flights, providing more versatile control over rockets. Within SCCS is the Launch Control System (LCS), which is the remote software behind the command and monitoring of flight and ground system hardware. This internship at KSC has involved two main components in LCS: Remote Software Application and Display development. The display environment provides a graphical user interface for an operator to view and see if any cautions are raised, while the remote applications are the backbone that communicates with hardware and then relays the data back to the displays. These elements go hand in hand as they provide monitoring and control over hardware and software alike from the safety of the Launch Control Center. The remote software applications are written in Application Control Language (ACL), which must undergo unit testing to ensure data integrity. This paper describes both the implementation and writing of unit tests in ACL code for remote software applications, as well as the building of remote displays to be used in the Launch Control Center (LCC).

  18. Status of the ASTEC integral code

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Jacq, F.; Allelein, H.J.

    2000-01-01

    The ASTEC (Accident Source Term Evaluation Code) integrated code has been developed since 1997 in close collaboration between IPSN and GRS to predict an entire LWR severe accident sequence from the initiating event up to Fission Product (FP) release out of the containment. The applications of such a code are source term determination studies, scenario evaluations, accident management studies and Probabilistic Safety Assessment level 2 (PSA-2) studies. The version V0 of ASTEC is based on the RCS modules of the ESCADRE integrated code (IPSN) and on the upgraded RALOC and FIPLOC codes (GRS) for containment thermal-hydraulics and aerosol behaviour. The latest version V0.2 includes the general feedback from the overall validation performed in 1998 (25 separate-effect experiments, the PHEBUS.FP FPT1 integrated experiment), some modelling improvements (e.g. silver-iodine reactions in the containment sump), and the implementation of the main safety systems for Severe Accident Management. Several reactor applications are under way on French and German PWRs, and on VVER-1000, all with a multi-compartment configuration of the containment. The total IPSN-GRS manpower involved in the ASTEC project is today about 20 man-years per year. The main evolution of the next version V1, foreseen for the end of 2001, concerns the integration of the front-end phase and the improvement of the in-vessel degradation late-phase modelling. (author)

  19. Monocular display unit for 3D display with correct depth perception

    Science.gov (United States)

    Sakamoto, Kunio; Hosomi, Takashi

    2009-11-01

    Research on virtual-reality systems has become popular, and the technology has been applied to medical engineering, educational engineering, CAD/CAM systems and so on. 3D imaging display systems come in two types: systems using special glasses and monitor systems requiring no special glasses. Liquid crystal displays (LCDs) have recently come into common use; such a display unit can provide a display area the same size as the image screen on the panel. A display system requiring no special glasses is useful for a 3D TV monitor, but it has the drawback that the size of the monitor restricts the visual field for displaying images. Thus a conventional display can show only one screen, and its display area cannot be enlarged, for example to twice the size. To enlarge the display area, the authors have developed an enlarging method using a mirror. This extension method enables the observer to see a virtual image plane and enlarges the screen area twofold. In the developed display unit, we made use of an image-separating technique using polarized glasses, a parallax barrier or a lenticular lens screen for 3D imaging. The mirror generates the virtual image plane and enlarges the screen area twofold. Meanwhile, the 3D display system using special glasses can also display virtual images over a wide area. In this paper, we present a monocular 3D vision system with an accommodation mechanism, which is a useful function for perceiving depth.

  20. Scanning laser beam displays based on a 2D MEMS

    Science.gov (United States)

    Niesten, Maarten; Masood, Taha; Miller, Josh; Tauscher, Jason

    2010-05-01

    The combination of laser light sources and MEMS technology enables a range of display systems such as ultra small projectors for mobile devices, head-up displays for vehicles, wearable near-eye displays and projection systems for 3D imaging. Images are created by scanning red, green and blue lasers horizontally and vertically with a single two-dimensional MEMS. Due to the excellent beam quality of laser beams, the optical designs are efficient and compact. In addition, the laser illumination enables saturated display colors that are desirable for augmented reality applications where a virtual image is used. With this technology, the smallest projector engine for high volume manufacturing to date has been developed. This projector module has a height of 7 mm and a volume of 5 cc. The resolution of this projector is WVGA. No additional projection optics is required, resulting in an infinite focus depth. Unlike with micro-display projection displays, an increase in resolution will not lead to an increase in size or a decrease in efficiency. Therefore future projectors can be developed that combine a higher resolution in an even smaller and thinner form factor with increased efficiencies that will lead to lower power consumption.

  1. TRIPOLI-4: Monte Carlo transport code functionalities and applications

    Energy Technology Data Exchange (ETDEWEB)

    Both, J P; Lee, Y K; Mazzolo, A; Peneliau, Y; Petit, O; Roesslinger, B [CEA Saclay, Dir. de l' Energie Nucleaire (DEN), Service d' Etudes de Reacteurs et de Modelisation Avancee, 91 - Gif sur Yvette (France)

    2003-07-01

    Tripoli-4 is a three-dimensional calculation code using the Monte Carlo method to simulate the transport of neutrons, photons, electrons and positrons. The code is used in four application fields: radiation protection studies, criticality studies, core studies and instrumentation studies. Geometry, cross sections, description of sources and the principle of the code are presented. (N.C.)

  2. HTML 5 Displays for On-Board Flight Systems

    Science.gov (United States)

    Silva, Chandika

    2016-01-01

    During my Internship at NASA in the summer of 2016, I was assigned to a project which dealt with developing a web-server that would display telemetry and other system data using HTML 5, JavaScript, and CSS. By doing this, it would be possible to view the data across a variety of screen sizes, and establish a standard that could be used to simplify communication and software development between NASA and other countries. Utilizing a web-based approach allowed us to add in more functionality, as well as make the displays more aesthetically pleasing for the users. When I was assigned to this project my main task was to first establish communication with the current display server. This display server would output data from the on-board systems in XML format. Once communication was established I was then asked to create a dynamic telemetry table web page that would update its header and change as new information came in. After this was completed, certain minor functionalities were added to the table, such as hide-column and filter-by-system options. This was mainly for the purpose of making the table more useful for the users, as they can now filter and view relevant data. Finally my last task was to create a graphical system display for all the systems on the spacecraft. This was by far the most challenging part of my internship, as finding a JavaScript library that was both free and contained useful functions to assist me in my task was difficult. In the end I was able to use the JointJs library and accomplish the task. With the help of my mentor and the HIVE lab team, we were able to establish stable communication with the display server. We also succeeded in creating a fully dynamic telemetry table and in developing a graphical system display for the advanced modular power system. Working at JSC for this internship has taught me a lot about coding in JavaScript and HTML 5. I was also introduced to the concept of developing software as a team, and exposed to the different

  3. SPIDERMAN: an open-source code to model phase curves and secondary eclipses

    Science.gov (United States)

    Louden, Tom; Kreidberg, Laura

    2018-03-01

    We present SPIDERMAN (Secondary eclipse and Phase curve Integrator for 2D tempERature MAppiNg), a fast code for calculating exoplanet phase curves and secondary eclipses with arbitrary surface brightness distributions in two dimensions. Using a geometrical algorithm, the code solves exactly the area of sections of the disc of the planet that are occulted by the star. The code is written in C with a user-friendly Python interface, and is optimised to run quickly, with no loss in numerical precision. Approximately 1000 models can be generated per second in typical use, making Markov Chain Monte Carlo analyses practicable. The modular nature of the code allows easy comparison of the effect of multiple different brightness distributions for the dataset. As a test case we apply the code to archival data on the phase curve of WASP-43b using a physically motivated analytical model for the two dimensional brightness map. The model provides a good fit to the data; however, it overpredicts the temperature of the nightside. We speculate that this could be due to the presence of clouds on the nightside of the planet, or additional reflected light from the dayside. When testing a simple cloud model we find that the best fitting model has a geometric albedo of 0.32 ± 0.02 and does not require a hot nightside. We also test for variation of the map parameters as a function of wavelength and find no statistically significant correlations. SPIDERMAN is available for download at https://github.com/tomlouden/spiderman.

  4. SPIDERMAN: an open-source code to model phase curves and secondary eclipses

    Science.gov (United States)

    Louden, Tom; Kreidberg, Laura

    2018-06-01

    We present SPIDERMAN (Secondary eclipse and Phase curve Integrator for 2D tempERature MAppiNg), a fast code for calculating exoplanet phase curves and secondary eclipses with arbitrary surface brightness distributions in two dimensions. Using a geometrical algorithm, the code solves exactly the area of sections of the disc of the planet that are occulted by the star. The code is written in C with a user-friendly Python interface, and is optimized to run quickly, with no loss in numerical precision. Approximately 1000 models can be generated per second in typical use, making Markov Chain Monte Carlo analyses practicable. The modular nature of the code allows easy comparison of the effect of multiple different brightness distributions for the data set. As a test case, we apply the code to archival data on the phase curve of WASP-43b using a physically motivated analytical model for the two-dimensional brightness map. The model provides a good fit to the data; however, it overpredicts the temperature of the nightside. We speculate that this could be due to the presence of clouds on the nightside of the planet, or additional reflected light from the dayside. When testing a simple cloud model, we find that the best-fitting model has a geometric albedo of 0.32 ± 0.02 and does not require a hot nightside. We also test for variation of the map parameters as a function of wavelength and find no statistically significant correlations. SPIDERMAN is available for download at https://github.com/tomlouden/spiderman.
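    SPIDERMAN itself solves the occulted areas exactly with a geometric algorithm; the short Python sketch below is only a schematic toy of the observable it models, generating a phase curve from an assumed sinusoidal day-night brightness contrast with a box-shaped secondary eclipse. None of the parameter values or function names belong to the package.

```python
import numpy as np

def toy_phase_curve(phase, f_night, f_day, eclipse_half_width=0.05):
    """Schematic thermal phase curve in units of the stellar flux.

    phase   : orbital phase in [0, 1), with 0.5 at superior conjunction
    f_night : planet-to-star flux ratio of the night side
    f_day   : planet-to-star flux ratio of the day side
    """
    phase = np.asarray(phase, dtype=float)
    # The visible day-side fraction varies sinusoidally with phase.
    planet = f_night + (f_day - f_night) * 0.5 * (1.0 - np.cos(2 * np.pi * phase))
    # During secondary eclipse the planet is hidden behind the star.
    planet[np.abs(phase - 0.5) < eclipse_half_width] = 0.0
    return 1.0 + planet

phases = np.linspace(0.0, 1.0, 500)
curve = toy_phase_curve(phases, f_night=1e-4, f_day=6e-4)
print(curve.min(), curve.max())   # 1.0 during eclipse, 1.0006 at phase 0.5
```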

  5. European inter-comparison of Monte Carlo codes users for the uncertainty calculation of the kerma in air beside a caesium-137 source

    Energy Technology Data Exchange (ETDEWEB)

    De Carlan, L.; Bordy, J.M.; Gouriou, J. [CEA Saclay, LIST, Laboratoire National Henri Becquerel, Laboratoire de Metrologie de la Dose 91 - Gif-sur-Yvette (France)

    2010-07-01

    Within the frame of the CONRAD European project (Coordination Network for Radiation Dosimetry), and more precisely within a work group paying attention to uncertainty assessment in computational dosimetry and aiming at comparing different approaches, the authors report the simulation of an irradiator containing a caesium-137 source to calculate the kerma in air as well as its uncertainty due to different parameters. They present the problem geometry, recall the studied issues (kerma uncertainty, influence of the source capsule, influence of the collimator, influence of the air volume surrounding the source), indicate the codes which have been used (MCNP, Fluka, Penelope, etc.), and discuss the results obtained for the first issue.

  6. Realization of diverse displays for multiple color patterns on metal surfaces

    International Nuclear Information System (INIS)

    Li, Guoqiang; Li, Jiawen; Hu, Yanlei; Zhang, Chenchu; Li, Xiaohong; Chu, Jiaru; Huang, Wenhao

    2014-01-01

    Highlights: • We demonstrate the combined influence of the incident white-light angle and the ripple orientation on the diversity of structural colors. • Multiple patterns composed of ripples with different orientations can be precisely designed on metal surfaces. • The desired patterns can be selectively displayed by finely varying the incident light angle and the sample rotation angle. - Abstract: Enhanced colors are formed when white light illuminates the surface ripples induced by a femtosecond laser. In this paper, we demonstrate the ability to display diverse colors by simultaneously adjusting the incident white-light angle and the ripple orientation. Furthermore, our investigation reveals that multiple patterns composed of ripples with different orientations can be designed on metal surfaces, and the desired ones can be selectively displayed by finely varying the incident light angle and the sample rotation angle. More interestingly, it is found that, although the same pattern can be displayed under different conditions, its colors may differ. These findings provide a novel method for carrying and identifying large quantities of information, with potential applications in the fields of information storage, identification codes and anti-counterfeiting patterns.

  7. ArraySolver: an algorithm for colour-coded graphical display and Wilcoxon signed-rank statistics for comparing microarray gene expression data.

    Science.gov (United States)

    Khan, Haseeb Ahmad

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretations from microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that can be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann-Whitney U test, have been reported for comparing microarray data, whereas the Wilcoxon signed-rank test, an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. Software validation showed similar outputs from ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, a convenient report format, accurate statistics and the familiar Excel platform.
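    For readers who want the underlying two-group statistic outside ArraySolver, the sketch below shows a Wilcoxon signed-rank comparison of paired expression values with SciPy, followed by a simple up/down/unchanged colour code in the spirit of ArraySolver's display. The expression values and significance threshold are invented for illustration, and ArraySolver's Excel output is not reproduced.

```python
import numpy as np
from scipy.stats import wilcoxon

# Paired expression values for one gene (treated vs. control measurements
# from the same arrays).  The numbers are made up for illustration.
treated = np.array([2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 3.0, 2.2])
control = np.array([1.7, 2.9, 1.9, 2.1, 2.6, 2.0, 2.4, 1.8])

# Wilcoxon signed-rank test: a non-parametric test for paired samples,
# appropriate when expression differences are not normally distributed.
statistic, p_value = wilcoxon(treated, control)
print(f"W = {statistic:.1f}, p = {p_value:.4f}")

# A simple colour code in the spirit of ArraySolver's display:
# up-regulated (red), down-regulated (green) or unchanged (grey).
if p_value < 0.05:
    colour = "red" if np.median(treated - control) > 0 else "green"
else:
    colour = "grey"
print("colour code:", colour)
```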

  8. Noise Residual Learning for Noise Modeling in Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Forchhammer, Søren

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The noise model is one of the inherently difficult challenges in DVC. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes...

  9. Optical characterization of display screens by speckle patterns

    Science.gov (United States)

    Pozo, Antonio M.; Castro, José J.; Rubiño, Manuel

    2013-10-01

    In recent years, flat-panel display (FPD) technology has undergone great development, and now FPDs appear in many devices. A significant element in FPD manufacturing is the display front surface. Manufacturers sell FPDs with different types of front surfaces, which can be matte (also called anti-glare) or glossy screens. Users who prefer glossy screens consider these displays to show more vivid colors compared with matte-screen displays. However, on the glossy screens, external light sources may cause unpleasant reflections that can be reduced by a matte treatment in the front surface. In this work, we present a method to characterize FPD screens using laser-speckle patterns. We characterize three FPDs: a Samsung XL2370 LCD monitor of 23 in. with matte screen, a Toshiba Satellite A100 LCD laptop of 15.4 in. with glossy screen, and a Grammata Papyre 6.1 electronic book reader of 6 in. with ePaper screen (E-ink technology). The results show great differences in speckle-contrast values for the three screens characterized and, therefore, this work shows the feasibility of this method for characterizing and comparing FPDs that have different types of front surfaces.
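    Speckle contrast is commonly quantified as the ratio of the standard deviation to the mean of the recorded intensity, C = σ/⟨I⟩. The Python sketch below computes this quantity and contrasts synthetic fully developed speckle with averaged (matte-like) speckle; it is a generic formulation for illustration, not the specific processing pipeline used in the paper.

```python
import numpy as np

def speckle_contrast(intensity):
    """Speckle contrast C = sigma_I / <I> of a recorded intensity pattern.

    C approaches 1 for fully developed speckle (e.g. a glossy screen under
    coherent illumination) and decreases as the speckle is suppressed
    (e.g. by a matte, anti-glare front surface).
    """
    intensity = np.asarray(intensity, dtype=float)
    return intensity.std() / intensity.mean()

# Synthetic comparison: fully developed speckle has an exponential intensity
# distribution; a "matte" surface effectively averages several patterns.
rng = np.random.default_rng(0)
glossy = rng.exponential(1.0, size=(256, 256))
matte = np.mean([rng.exponential(1.0, size=(256, 256)) for _ in range(9)], axis=0)
print("glossy C =", round(speckle_contrast(glossy), 2))   # close to 1.0
print("matte  C =", round(speckle_contrast(matte), 2))    # close to 0.33
```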

  10. Investigation of some possible changes in Am-Be neutron source configuration in order to increase the thermal neutron flux using Monte Carlo code

    Science.gov (United States)

    Basiri, H.; Tavakoli-Anbaran, H.

    2018-01-01

    The Am-Be neutron source is based on the (α, n) reaction and generates neutrons in the energy range of 0-11 MeV. Since thermal neutrons are widely used in different fields, in this work we investigate how to improve the source configuration in order to increase the thermal flux. The suggested changes include a spherical moderator instead of the common cylindrical geometry, a reflector layer and an appropriate selection of materials in order to achieve the maximum thermal flux. All calculations were done using the MCNP Monte Carlo code. Our final results indicated that a spherical paraffin moderator and a beryllium reflector layer can efficiently increase the thermal neutron flux of the Am-Be source.

  11. Code of practice : safe use of ionizing radiation

    International Nuclear Information System (INIS)

    1988-07-01

    Ionizing radiation is used extensively in the field of scientific research, and the risk of uncontrolled exposure of both the worker and the environment is ever present. The purpose of this Code is to set out practices considered by the CSIRO Health and Safety Committee to be appropriate for CSIRO staff; if followed, they will result in appropriate protection for research staff and the environment. The Code does not cover sources of non-ionizing radiation such as microwave ovens, RF generators and laser sources.

  12. Project of decree relative to the licensing and statement system of nuclear activities and to their control and bearing various modifications of the public health code and working code

    International Nuclear Information System (INIS)

    2005-01-01

    This decree concerns the control of high-level sealed radioactive sources and orphan sources. Its objectives are to introduce administrative simplification, in particular of the licensing and declaration system for radiation sources, to reinforce the control measures provided for by the public health code and the employment code, and to clarify and supplement several existing provisions. (N.C.)

  13. Burnup calculation code system COMRAD96

    Energy Technology Data Exchange (ETDEWEB)

    Suyama, Kenya [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Masukawa, Fumihiro; Ido, Masaru; Enomoto, Masaki; Takyu, Shuiti; Hara, Toshiharu

    1997-06-01

    COMRAD was one of the burnup code systems developed by JAERI. COMRAD96 is a version of COMRAD transferred to an Engineering Work Station. It is divided into several functional modules: 'Cross Section Treatment', 'Generation and Depletion Calculation', and 'Post Process'. It enables the analysis of a burnup problem taking the change of the neutron spectrum into account using UNITBURN, and it can display the γ spectrum on a terminal. This report is the general description and user's manual of COMRAD96. (author)

  14. An improvement of estimation method of source term to the environment for interfacing system LOCA for typical PWR using MELCOR code

    Energy Technology Data Exchange (ETDEWEB)

    Han, Seok Jung; Kim, Tae Woon; Ahn, Kwang Il [Risk and Environmental Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    The interfacing-system loss-of-coolant accident (ISLOCA) has been identified as the most hazardous accident scenario in typical PWR plants. The present study, as an effort to improve knowledge of the source term to the environment during an ISLOCA, focuses on improving the estimation method. The improvement takes into account the effect of the broken pipeline and the auxiliary building structures relevant to an ISLOCA. The source term to the environment was estimated for the OPR-1000 plants with the MELCOR code, version 1.8.6. The key features of the source term showed that a massive amount of fission products was released between the beginning of core degradation and the vessel breach. The released amount of fission products may be affected by the broken pipeline and the auxiliary building structure associated with the release pathway.

  15. The intercomparison of aerosol codes

    International Nuclear Information System (INIS)

    Dunbar, I.H.; Fermandjian, J.; Gauvain, J.

    1988-01-01

    The behavior of aerosols in a reactor containment vessel following a severe accident could be an important determinant of the accident source term to the environment. Various processes result in the deposition of aerosols onto surfaces within the containment, from which they are much less likely to be released. Some of these processes are very sensitive to particle size, so it is important to model the aerosol growth processes: agglomeration and condensation. A number of computer codes have been written to model growth and deposition processes. They have been tested against each other in a series of code comparison exercises. These exercises have investigated sensitivities to physical and numerical assumptions and have also proved a useful means of quality control for the codes. Various exercises in which code predictions are compared with experimental results are now under way.

  16. MARMER, a flexible point-kernel shielding code

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1990-01-01

    A point-kernel shielding code entitled MARMER is described. It has several options with respect to geometry input, source description and detector point description which extend the flexibility and usefulness of the code, and which are especially useful in spent fuel shielding. MARMER has been validated using the TN12 spent fuel shipping cask benchmark. (author)

  17. MARMER, a flexible point-kernel shielding code

    Energy Technology Data Exchange (ETDEWEB)

    Kloosterman, J.L.; Hoogenboom, J.E. (Interuniversitair Reactor Inst., Delft (Netherlands))

    1990-01-01

    A point-kernel shielding code entitled MARMER is described. It has several options with respect to geometry input, source description and detector point description which extend the flexibility and usefulness of the code, and which are especially useful in spent fuel shielding. MARMER has been validated using the TN12 spent fuel shipping cask benchmark. (author).
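    The point-kernel method used by MARMER can be summarised compactly: the flux from each source point falls off geometrically, is attenuated exponentially along the line of sight, and is multiplied by a buildup factor for scattered radiation. The Python sketch below illustrates this for a single isotropic point source with an assumed linear buildup factor; it is a generic illustration, not MARMER's geometry handling or data.

```python
import numpy as np

def point_kernel_flux(source_strength, mu, distance, buildup_a=1.0):
    """Photon flux from an isotropic point source behind a shield.

    source_strength : photons emitted per second
    mu              : linear attenuation coefficient of the shield (1/cm)
    distance        : source-to-detector distance through the shield (cm)
    buildup_a       : coefficient of an assumed linear buildup factor
                      B(mu*r) = 1 + a*mu*r, standing in for tabulated data
    """
    mfp = mu * distance                       # number of mean free paths
    buildup = 1.0 + buildup_a * mfp           # accounts for scattered photons
    geometry = 1.0 / (4.0 * np.pi * distance ** 2)
    return source_strength * geometry * np.exp(-mfp) * buildup

# Example: 1e9 photons/s seen through 30 cm of material with mu = 0.05 /cm.
print(f"{point_kernel_flux(1e9, 0.05, 30.0):.3e} photons / (cm^2 s)")
```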

  18. Codes, Ciphers, and Cryptography--An Honors Colloquium

    Science.gov (United States)

    Karls, Michael A.

    2010-01-01

    At the suggestion of a colleague, I read "The Code Book", [32], by Simon Singh to get a basic introduction to the RSA encryption scheme. Inspired by Singh's book, I designed a Ball State University Honors Colloquium in Mathematics for both majors and non-majors, with material coming from "The Code Book" and many other sources. This course became…
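    As a flavour of the kind of material such a colloquium covers, here is a purely illustrative, textbook-sized RSA example in Python following the standard scheme described in introductory treatments such as Singh's; the tiny primes make it trivially breakable and it is in no way secure.

```python
# Toy RSA with tiny textbook primes, for classroom illustration only.
p, q = 61, 53
n = p * q                       # public modulus, 3233
phi = (p - 1) * (q - 1)         # Euler's totient of n, 3120
e = 17                          # public exponent, coprime to phi
d = pow(e, -1, phi)             # private exponent: modular inverse, 2753

message = 42
ciphertext = pow(message, e, n)     # encryption: c = m^e mod n
recovered = pow(ciphertext, d, n)   # decryption: m = c^d mod n

print(f"public key (n={n}, e={e}), private exponent d={d}")
print(f"message {message} -> ciphertext {ciphertext} -> recovered {recovered}")
```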

  19. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center. [Radiation transport codes

    Energy Technology Data Exchange (ETDEWEB)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term "code package" is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a "code package" consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data) and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not he wishes to request the code package. (RWR)

  20. Scintillator Based Coded-Aperture Imaging for Neutron Detection

    International Nuclear Information System (INIS)

    Hayes, Sean-C.; Gamage, Kelum-A-A.

    2013-06-01

    In this paper we assess the variation of neutron images using a series of Monte Carlo simulations. We study neutron images of the same neutron source with different source locations, using a scintillator-based coded-aperture system. The Monte Carlo simulations have been conducted making use of the EJ-426 neutron scintillator detector. This type of detector has a low sensitivity to gamma rays and is therefore of particular use in a system with a source that emits a mixed radiation field. From the use of different source locations, several neutron images have been produced and compared both qualitatively and quantitatively for each case. This allows conclusions to be drawn on how suited the scintillator-based coded-aperture neutron imaging system is to detecting various neutron source locations. This type of neutron imaging system can be easily used to identify and locate nuclear materials precisely. (authors)
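    The principle behind coded-aperture imaging is that the detector record is the source distribution convolved with the mask pattern, and the image is recovered by correlating the record with a matched decoding array. The one-dimensional Python sketch below illustrates this with a random binary mask; it is a conceptual illustration only and does not reproduce the EJ-426 detector response or the mask design used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 128

# Source distribution along one axis: two point-like neutron sources.
source = np.zeros(n)
source[30] = 1.0
source[90] = 1.0

# Random binary mask (1 = open element, 0 = opaque), roughly half open.
mask = (rng.random(n) < 0.5).astype(float)

# Detector record: every source point casts a shifted copy of the mask,
# i.e. the record is the (circular) convolution of source and mask.
detector = np.real(np.fft.ifft(np.fft.fft(source) * np.fft.fft(mask)))

# Decoding: correlate the record with a balanced decoding array whose
# cross-correlation with the mask approximates a delta function.
decoding = 2.0 * mask - 1.0
image = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(decoding))))

# The two largest pixels should fall at the true source positions (30, 90).
print(sorted(np.argsort(image)[-2:]))
```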

  1. Balanced distributed coding of omnidirectional images

    Science.gov (United States)

    Thirumalai, Vijayaraghavan; Tosic, Ivana; Frossard, Pascal

    2008-01-01

    This paper presents a distributed coding scheme for the representation of 3D scenes captured by stereo omni-directional cameras. We consider a scenario where images captured from two different viewpoints are encoded independently, with a balanced rate distribution among the different cameras. The distributed coding is built on multiresolution representation and partitioning of the visual information in each camera. The encoder transmits one partition after entropy coding, as well as the syndrome bits resulting from the channel encoding of the other partition. The decoder exploits the intra-view correlation and attempts to reconstruct the source image by combination of the entropy-coded partition and the syndrome information. At the same time, it exploits the inter-view correlation using motion estimation between images from different cameras. Experiments demonstrate that the distributed coding solution performs better than a scheme where images are handled independently, and that the coding rate stays balanced between encoders.

  2. Evaluation of the FRAPCON-3 Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Jernkvist, Lars Olof; Massih, Ali [Quantum Technologies AB, Uppsala (Sweden)

    2002-03-01

    The FRAPCON-3 computer code has been evaluated with respect to its applicability, modeling capability, user friendliness, source code structure and supporting experimental database. The code is intended for thermo-mechanical analyses of light water reactor nuclear fuel rods under steady-state operational conditions and moderate power excursions. It is applicable to both boiling- and pressurized water reactor fuel rods with UO{sub 2} fuel, ranging up to about 65 MWd/kg U in rod average burnup. The models and numerical methods in FRAPCON-3 are relatively simple, which makes the code transparent and also fairly easy to modify and extend for the user. The fundamental equations for heat transfer, structural analysis and fuel fission gas release are solved in one-dimensional (radial) and stationary (time-independent) form, and interaction between axial segments of the rod is confined to calculations of coolant axial flow and rod internal gas pressure. The code is fairly easy to use; fuel rod design data and time histories of fuel rod power and coolant inlet conditions are input via a single text file, and the corresponding calculated variation with time of important fuel rod parameters are printed to a single output file in textual form. The results can also be presented in graphical form through an interface to the general graphics program xmgr. FRAPCON-3 also provides the possibility to export calculated results to the transient fuel rod analysis code FRAPTRAN, where the data can be used as burnup-dependent initial conditions to a postulated transient. Most of the source code to FRAPCON-3 is written in Fortran-IV, which is an archaic, non-standard dialect of the Fortran programming language. Since Fortran-IV is not accepted by all compilers for the latest standard of the language, Fortran-95, there is a risk that the source code must be partly rewritten in the future. Documentation of the code comprises (i) a general code description, which briefly presents models

  3. Evaluation of the FRAPCON-3 Computer Code

    International Nuclear Information System (INIS)

    Jernkvist, Lars Olof; Massih, Ali

    2002-03-01

    The FRAPCON-3 computer code has been evaluated with respect to its applicability, modeling capability, user friendliness, source code structure and supporting experimental database. The code is intended for thermo-mechanical analyses of light water reactor nuclear fuel rods under steady-state operational conditions and moderate power excursions. It is applicable to both boiling- and pressurized water reactor fuel rods with UO 2 fuel, ranging up to about 65 MWd/kg U in rod average burnup. The models and numerical methods in FRAPCON-3 are relatively simple, which makes the code transparent and also fairly easy to modify and extend for the user. The fundamental equations for heat transfer, structural analysis and fuel fission gas release are solved in one-dimensional (radial) and stationary (time-independent) form, and interaction between axial segments of the rod is confined to calculations of coolant axial flow and rod internal gas pressure. The code is fairly easy to use; fuel rod design data and time histories of fuel rod power and coolant inlet conditions are input via a single text file, and the corresponding calculated variation with time of important fuel rod parameters are printed to a single output file in textual form. The results can also be presented in graphical form through an interface to the general graphics program xmgr. FRAPCON-3 also provides the possibility to export calculated results to the transient fuel rod analysis code FRAPTRAN, where the data can be used as burnup-dependent initial conditions to a postulated transient. Most of the source code to FRAPCON-3 is written in Fortran-IV, which is an archaic, non-standard dialect of the Fortran programming language. Since Fortran-IV is not accepted by all compilers for the latest standard of the language, Fortran-95, there is a risk that the source code must be partly rewritten in the future. Documentation of the code comprises (i) a general code description, which briefly presents models

  4. The Los Alamos accelerator code group

    International Nuclear Information System (INIS)

    Krawczyk, F.L.; Billen, J.H.; Ryne, R.D.; Takeda, Harunori; Young, L.M.

    1995-01-01

    The Los Alamos Accelerator Code Group (LAACG) is a national resource for members of the accelerator community who use and/or develop software for the design and analysis of particle accelerators, beam transport systems, light sources, storage rings, and components of these systems. Below the authors describe the LAACG's activities in high performance computing, maintenance and enhancement of POISSON/SUPERFISH and related codes and the dissemination of information on the INTERNET

  5. In vitro evolution and affinity-maturation with Coliphage qβ display.

    Directory of Open Access Journals (Sweden)

    Claudia Skamel

    Full Text Available The Escherichia coli bacteriophage Qβ (Coliphage Qβ) offers a favorable alternative to M13 for in vitro evolution of displayed peptides and proteins due to high mutagenesis rates in Qβ RNA replication that better simulate the affinity maturation processes of the immune response. We describe a benchtop in vitro evolution system using Qβ display of the VP1 G-H loop peptide of foot-and-mouth disease virus (FMDV). DNA encoding the G-H loop was fused to the A1 minor coat protein of Qβ, resulting in a replication-competent hybrid phage that efficiently displayed the FMDV peptide. The surface-localized FMDV VP1 G-H loop cross-reacted with the anti-FMDV monoclonal antibody (mAb) SD6 and was found to decorate the corners of the Qβ icosahedral shell by electron microscopy. Evolution of Qβ-displayed peptides, starting from fully degenerate coding sequences corresponding to the immunodominant region of VP1, allowed rapid in vitro affinity maturation to SD6 mAb. Qβ selected under evolutionary pressure revealed a non-canonical, but essential epitope for mAb SD6 recognition consisting of an Arg-Gly tandem pair. Finally, the selected hybrid phages induced polyclonal antibodies in guinea pigs with good affinity to both FMDV and the hybrid Qβ-G-H loop, validating the requirement of the tandem pair epitope. Qβ-display emerges as a novel framework for rapid in vitro evolution with affinity-maturation to molecular targets.

  6. Instant website optimization for retina displays how-to

    CERN Document Server

    Larson, Kyle J

    2013-01-01

    Written in an accessible and practical manner that quickly imparts the knowledge you want. As a how-to book it uses applied examples to teach you to optimize websites for retina displays. This book is for web designers and developers who are familiar with HTML, CSS, and editing graphics and who would like to improve their existing website or their next web project with high-resolution images. You'll need a high-definition device to be able to test the examples in this book, and a server to upload your code to if you're not developing it on that device.

  7. Scalable-to-lossless transform domain distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Veselov, Anton

    2010-01-01

    Distributed video coding (DVC) is a novel approach providing new features such as low-complexity encoding by mainly exploiting the source statistics at the decoder, based on the availability of decoder side information. In this paper, scalable-to-lossless DVC is presented based on extending a lossy Tran...... codec provides frame by frame encoding. Comparing the lossless coding efficiency, the proposed scalable-to-lossless TDWZ video codec can save up to 5%-13% of bits compared to JPEG LS and H.264 Intra frame lossless coding, and does so as a scalable-to-lossless coding....

  8. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm

  9. Modules of the SUMMA system for data readout to the oscillograph, digital display devices and digital printing

    International Nuclear Information System (INIS)

    Bushnin, Yu.B.; Denisenko, A.A.; Dunajtsev, A.F.; Rybakov, V.G.; Sytin, A.N.

    1975-01-01

    The modules of the ''Summa'' system are described which allow outputting of information to an oscilloscope, a digital display panel, and a digital printing mechanism; they are: a digital-to-analog converter, a converter from binary code to binary-coded decimal, a digital display module, a block for output to a digital printing mechanism, and a block for specifying the programs during information output. The block diagrams of the modules and the block diagram of the information-output programs are presented
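
    For readers unfamiliar with the binary to binary-coded-decimal conversion such a readout block performs, the following minimal Python sketch (a software illustration only, not the ''Summa'' hardware logic) implements the classic shift-and-add-3 (double-dabble) algorithm:

        def binary_to_bcd(value, digits=4):
            """Convert an unsigned integer to packed BCD using shift-and-add-3."""
            if value >= 10 ** digits:
                raise ValueError("value does not fit in the requested number of BCD digits")
            bcd = 0
            for bit in range(value.bit_length() - 1, -1, -1):
                # Add 3 to every BCD digit that is 5 or more before shifting.
                for d in range(digits):
                    if ((bcd >> (4 * d)) & 0xF) >= 5:
                        bcd += 3 << (4 * d)
                # Shift the next binary bit into the BCD register.
                bcd = (bcd << 1) | ((value >> bit) & 1)
            return bcd

        # Example: 255 decimal -> BCD nibbles 2, 5, 5 -> 0x255
        assert binary_to_bcd(255) == 0x255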

  10. Human engineering guidelines for the evaluation and assessment of Video Display Units

    International Nuclear Information System (INIS)

    Gilmore, W.E.

    1985-07-01

    This report provides the Nuclear Regulatory Commission with a single source that documents known guidelines for conducting formal Human Factors evaluations of Video Display Units (VDUs). The handbook is a ''cookbook'' of acceptance guidelines for the reviewer faced with the task of evaluating VDUs already designed or planned for service in the control room. The areas addressed are video displays, controls, control/display integration, and workplace layout. Guidelines relevant to each of those areas are presented. The existence of supporting research is also indicated for each guideline. A Comment section and Method for Assessment section are provided for each set of guidelines

  11. An efficient CDMA decoder for correlated information sources

    International Nuclear Information System (INIS)

    Efraim, Hadar; Yacov, Nadav; Kanter, Ido; Shental, Ori

    2009-01-01

    We consider the detection of correlated information sources in the ubiquitous code-division multiple-access (CDMA) scheme. We propose a message-passing based scheme for detecting correlated sources directly, with no need for source coding. The detection is done simultaneously over a block of transmitted binary symbols (word). Simulation results are provided, demonstrating a substantial improvement in bit error rate in comparison with the unmodified detector and the alternative of source compression. The robustness of the error-performance improvement is shown under practical model settings, including wrong estimation of the generating Markov transition matrix and finite-length spreading codes
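
    As a rough software illustration of the idea of exploiting source correlation directly in the detector, the hedged Python sketch below uses a Markov source, random spreading codes, and a matched filter whose decision is biased by a log-prior from the previous decision. This is a simplified stand-in for the paper's message-passing detector; the parameters (4 users, spreading length 16, stay probability 0.9) are arbitrary choices for the demonstration:

        import numpy as np

        rng = np.random.default_rng(0)
        K, N, T = 4, 16, 2000      # users, spreading length, word length
        p_stay = 0.9               # Markov probability that a bit repeats its predecessor
        snr_db = 2.0

        # Markov-correlated +/-1 sources, one row per user
        bits = np.empty((K, T))
        bits[:, 0] = rng.choice([-1.0, 1.0], K)
        for t in range(1, T):
            stay = rng.random(K) < p_stay
            bits[:, t] = np.where(stay, bits[:, t - 1], -bits[:, t - 1])

        # Random binary spreading codes (unit norm) and AWGN channel
        codes = rng.choice([-1.0, 1.0], (K, N)) / np.sqrt(N)
        sigma = 10 ** (-snr_db / 20)
        received = codes.T @ bits + sigma * rng.normal(size=(N, T))

        # Conventional detector: matched filter with hard decisions
        mf = codes @ received                     # K x T soft outputs
        hard = np.sign(mf)

        # Correlation-aware detector: add a log-prior from the previous decision
        llr_prior = np.log(p_stay / (1 - p_stay))
        aware = np.empty_like(hard)
        aware[:, 0] = hard[:, 0]
        for t in range(1, T):
            soft = 2 * mf[:, t] / sigma ** 2 + llr_prior * aware[:, t - 1]
            aware[:, t] = np.sign(soft)

        for name, est in [("matched filter", hard), ("correlation-aware", aware)]:
            print(name, "BER:", np.mean(est != bits))

    With these settings the correlation-aware decisions typically show a noticeably lower bit error rate, which is the qualitative effect the paper obtains, far more effectively, with message passing.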

  12. A joint multi-view plus depth image coding scheme based on 3D-warping

    DEFF Research Database (Denmark)

    Zamarin, Marco; Zanuttigh, Pietro; Milani, Simone

    2011-01-01

    Free viewpoint video applications and autostereoscopic displays require the transmission of multiple views of a scene together with depth maps. Current compression and transmission solutions just handle these two data streams as separate entities. However, depth maps contain key information on the scene structure that can be effectively exploited to improve the performance of multi-view coding schemes. In this paper we introduce a novel coding architecture that replaces the inter-view motion prediction operation with a 3D warping approach based on depth information to improve the coding......
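
    The core of such a depth-based prediction is forward-warping the pixels of one view to a neighbouring viewpoint using per-pixel depth. The Python sketch below shows this for the simplest parallel-camera, horizontal-disparity case; occlusion handling and hole filling, which a real multi-view codec needs, are omitted, and all camera parameters are made-up values:

        import numpy as np

        def warp_view(image, depth, focal, baseline):
            """Forward-warp 'image' to a camera shifted horizontally by 'baseline',
            using per-pixel depth (parallel cameras, horizontal disparity only)."""
            h, w = image.shape
            warped = np.zeros_like(image)
            filled = np.zeros((h, w), dtype=bool)
            disparity = focal * baseline / depth          # pixel shift per pixel
            cols = np.arange(w)
            for y in range(h):
                new_x = np.round(cols + disparity[y]).astype(int)
                ok = (new_x >= 0) & (new_x < w)
                # nearest-wins occlusion handling is omitted in this sketch
                warped[y, new_x[ok]] = image[y, cols[ok]]
                filled[y, new_x[ok]] = True
            return warped, filled      # 'filled' marks which pixels were predicted

        # toy example: a constant-depth plane shifts rigidly by half a pixel
        img = np.tile(np.arange(8, dtype=float), (4, 1))
        dep = np.full((4, 8), 100.0)
        out, mask = warp_view(img, dep, focal=1000.0, baseline=0.05)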

  13. Screen-Display-Induced Photoresponse Mapping for Large-Area Photovoltaics

    DEFF Research Database (Denmark)

    Gupta, Ritu; Kiruthika, S.; Rao, K. D. M.

    2013-01-01

    As solar cell modules are becoming larger, it is important to pay attention to defects originating from the fabrication process and to degradation during operation in ambient conditions. In this article, a simple method of using a computer screen display as a light source to map the photoresponse of the solar...

  14. Fast Computation of Pulse Height Spectra Using SGRD Code

    Directory of Open Access Journals (Sweden)

    Humbert Philippe

    2017-01-01

    SGRD (Spectroscopy, Gamma rays, Rapid, Deterministic) is a code used for fast calculation of the gamma-ray spectrum produced by a spherical shielded source and measured by a detector. The photon source lines originate from the radioactive decay of the unstable isotopes. The emission rate and spectrum of these primary sources are calculated using the DARWIN code. The leakage spectrum is separated into two parts: the uncollided component is transported by ray tracing and the scattered component is calculated using a multigroup discrete ordinates method. The pulse height spectrum is then simulated by folding the leakage spectrum with the detector response functions, which are pre-calculated using the MCNP5 code for each considered detector type. An application to the simulation of the gamma spectrum produced by a natural uranium ball coated with plexiglass and measured using a NaI detector is presented.
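
    The final folding step described above amounts to multiplying the leakage spectrum by a pre-computed detector response matrix. A hedged Python sketch with a toy Gaussian-photopeak response (the real response functions would come from MCNP5 and would include Compton continua and escape peaks) is:

        import numpy as np

        def gaussian_response_matrix(energies, fwhm_frac=0.07):
            """Toy detector response: each incident energy is broadened into a
            Gaussian photopeak (no escape peaks or Compton continuum)."""
            n = len(energies)
            resp = np.zeros((n, n))
            for j, e in enumerate(energies):
                sigma = fwhm_frac * e / 2.355
                resp[:, j] = np.exp(-0.5 * ((energies - e) / sigma) ** 2)
                resp[:, j] /= resp[:, j].sum()        # conserve counts
            return resp

        # toy leakage spectrum: two gamma lines on a small continuum
        energies = np.linspace(0.05, 2.0, 400)        # MeV
        leakage = 1e-3 * np.ones_like(energies)
        leakage[np.argmin(abs(energies - 0.609))] += 5.0
        leakage[np.argmin(abs(energies - 1.001))] += 2.0

        pulse_height = gaussian_response_matrix(energies) @ leakage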

  15. HELIOS: An Open-source, GPU-accelerated Radiative Transfer Code for Self-consistent Exoplanetary Atmospheres

    Science.gov (United States)

    Malik, Matej; Grosheintz, Luc; Mendonça, João M.; Grimm, Simon L.; Lavie, Baptiste; Kitzmann, Daniel; Tsai, Shang-Min; Burrows, Adam; Kreidberg, Laura; Bedell, Megan; Bean, Jacob L.; Stevenson, Kevin B.; Heng, Kevin

    2017-02-01

    We present the open-source radiative transfer code named HELIOS, which is constructed for studying exoplanetary atmospheres. In its initial version, the model atmospheres of HELIOS are one-dimensional and plane-parallel, and the equation of radiative transfer is solved in the two-stream approximation with nonisotropic scattering. A small set of the main infrared absorbers is employed, computed with the opacity calculator HELIOS-K and combined using a correlated-k approximation. The molecular abundances originate from validated analytical formulae for equilibrium chemistry. We compare HELIOS with the work of Miller-Ricci & Fortney using a model of GJ 1214b, and perform several tests, where we find: model atmospheres with single-temperature layers struggle to converge to radiative equilibrium; k-distribution tables constructed with ≳ 0.01 cm⁻¹ resolution in the opacity function (≲ 10³ points per wavenumber bin) may result in errors ≳ 1%-10% in the synthetic spectra; and a diffusivity factor of 2 approximates well the exact radiative transfer solution in the limit of pure absorption. We construct “null-hypothesis” models (chemical equilibrium, radiative equilibrium, and solar elemental abundances) for six hot Jupiters. We find that the dayside emission spectra of HD 189733b and WASP-43b are consistent with the null hypothesis, while the latter consistently underpredicts the observed fluxes of WASP-8b, WASP-12b, WASP-14b, and WASP-33b. We demonstrate that our results are somewhat insensitive to the choice of stellar models (blackbody, Kurucz, or PHOENIX) and metallicity, but are strongly affected by higher carbon-to-oxygen ratios. The code is publicly available as part of the Exoclimes Simulation Platform (exoclime.net).
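
    The diffusivity-factor remark can be checked numerically against the textbook pure-absorption benchmark, the angle-integrated slab transmittance 2·E3(τ). The short Python sketch below compares it with exp(−Dτ) for two common choices of D; this is a generic check under that standard benchmark, not a reproduction of the HELIOS comparison:

        import numpy as np
        from scipy.special import expn

        tau = np.array([0.01, 0.1, 0.5, 1.0, 2.0])       # optical depths to test
        exact = 2.0 * expn(3, tau)                       # flux transmittance of an isotropic beam
        for D in (1.66, 2.0):
            approx = np.exp(-D * tau)
            err = np.max(np.abs(approx - exact) / exact)
            print(f"D = {D}: max relative error {err:.3f}")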

  16. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  17. Radiation therapy sources, equipment and installations

    International Nuclear Information System (INIS)

    2011-03-01

    The safety code for Telegamma Therapy Equipment and Installations, (AERB/SC/MED-1) and safety code for Brachytherapy Sources, Equipment and Installations, (AERB/SC/MED-3) were issued by AERB in 1986 and 1988 respectively. These codes specified mandatory requirements for radiation therapy facilities, covering the entire spectrum of operations ranging from the setting up of a facility to its ultimate decommissioning, including procedures to be followed during emergency situations. The codes also stipulated requirements of personnel and their responsibilities. With the advent of new techniques and equipment such as 3D-conformal radiation therapy, intensity modulated radiation therapy, image guided radiation therapy, treatment planning system, stereotactic radiosurgery, stereotactic radiotherapy, portal imaging, integrated brachytherapy and endovascular brachytherapy during the last two decades, AERB desires that these codes be revised and merged into a single code titled Radiation Therapy Sources, Equipment, and Installations

  18. Invisible Display in Aluminum

    DEFF Research Database (Denmark)

    Prichystal, Jan Phuklin; Hansen, Hans Nørgaard; Bladt, Henrik Henriksen

    2005-01-01

    Bang & Olufsen a/s has been working with ideas for invisible integration of displays in metal surfaces. Invisible integration of information displays traditionally has been possible by placing displays behind transparent or semitransparent materials such as plastic or glass. The wish for an integrated display in a metal surface is often ruled by design and functionality of a product. The integration of displays in metal surfaces requires metal removal in order to clear the area of the display to some extent. The idea behind an invisible display in Aluminum concerns the processing of a metal...

  19. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) addition of JENDL-2 version of data library, (2) a direct treatment of doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the users manual which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries and sample I/O. (author)

  20. Pseudo color ghost coding imaging with pseudo thermal light

    Science.gov (United States)

    Duan, De-yang; Xia, Yun-jie

    2018-04-01

    We present a new pseudo color imaging scheme, named pseudo color ghost coding imaging, based on ghost imaging but with a multiwavelength source modulated by a spatial light modulator. In conventional pseudo color imaging there are no nondegenerate-wavelength spatial correlations, resulting in extra monochromatic images; in this scheme, the degenerate-wavelength and nondegenerate-wavelength spatial correlations between the idler beam and signal beam can be obtained simultaneously. This scheme can obtain a more colorful image with higher quality than conventional pseudo color coding techniques. More importantly, a significant advantage of the scheme compared to conventional pseudo color coding imaging techniques is that the image with different colors can be obtained without changing the light source and spatial filter.
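
    For orientation, the sketch below shows a standard single-wavelength computational ghost imaging reconstruction, correlating modulated illumination patterns with bucket-detector signals; the pseudo color scheme of the paper would, in essence, repeat such a reconstruction per wavelength channel. The pattern statistics, object, and sizes are arbitrary toy choices:

        import numpy as np

        rng = np.random.default_rng(1)
        size, n_patterns = 32, 4000

        # unknown object (toy transmission mask)
        obj = np.zeros((size, size))
        obj[8:24, 12:20] = 1.0

        # random SLM patterns and the corresponding bucket-detector measurements
        patterns = rng.random((n_patterns, size, size))
        bucket = np.einsum('nij,ij->n', patterns, obj)

        # second-order correlation reconstruction: <I*B> - <I><B>
        ghost = (np.einsum('n,nij->ij', bucket, patterns) / n_patterns
                 - bucket.mean() * patterns.mean(axis=0))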

  1. Design and Implementation of a New Run-time Life-cycle for Interactive Public Display Applications

    OpenAIRE

    Cardoso, Jorge C. S.; Perpétua, Alice

    2015-01-01

    Public display systems are becoming increasingly complex. They are moving from passive closed systems to open interactive systems that are able to accommodate applications from several independent sources. This shift needs to be accompanied by a more flexible and powerful application management. In this paper, we propose a run-time life-cycle model for interactive public display applications that addresses several shortcomings of current display systems. Our mo...

  2. The Premar Code for the Monte Carlo Simulation of Radiation Transport In the Atmosphere

    International Nuclear Information System (INIS)

    Cupini, E.; Borgia, M.G.; Premuda, M.

    1997-03-01

    The Monte Carlo code PREMAR is described, which allows the user to simulate radiation transport in the atmosphere in the ultraviolet-infrared frequency interval. A plane multilayer geometry is at present foreseen by the code, with an albedo option at the lower boundary surface. For a given monochromatic point source, the main quantities computed by the code are the spatial distributions of absorption by aerosols and molecules, together with the related atmospheric transmittances. Moreover, simulation of Lidar experiments is foreseen by the code, the source and telescope fields of view being assigned. To build up the appropriate probability distributions, an input data library is read by the code. For this purpose the radiance-transmittance LOWTRAN-7 code has been conveniently adapted as a source of the library so as to exploit the richness of information of that code for a large variety of atmospheric simulations. Results of applications of the PREMAR code are finally presented, with special reference to simulations of Lidar system and radiometer experiments carried out at the Brasimone ENEA Centre by the Environment Department
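
    The heart of such a Monte Carlo transport code is the sampling loop over photon free paths and collision outcomes. The hedged Python sketch below does this for a single homogeneous plane-parallel layer with isotropic scattering and no surface albedo, far simpler than PREMAR's multilayer, LOWTRAN-7-driven treatment; all parameter values are illustrative:

        import numpy as np

        rng = np.random.default_rng(2)

        def run_photons(n, thickness, sigma_t, omega0):
            """Plane-parallel slab, isotropic scattering.
            sigma_t: total extinction [1/km], omega0: single-scattering albedo."""
            absorbed = transmitted = reflected = 0
            for _ in range(n):
                z, mu = 0.0, 1.0                       # start at top, heading down
                while True:
                    z += mu * rng.exponential(1.0 / sigma_t)   # sample a free path
                    if z < 0.0:
                        reflected += 1                 # escaped through the top
                        break
                    if z > thickness:
                        transmitted += 1               # escaped through the bottom
                        break
                    if rng.random() > omega0:          # absorption vs. scattering
                        absorbed += 1
                        break
                    mu = rng.uniform(-1.0, 1.0)        # isotropic new direction cosine
            return absorbed, transmitted, reflected

        print(run_photons(20000, thickness=2.0, sigma_t=1.0, omega0=0.8))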

  3. Optical display for radar sensing

    Science.gov (United States)

    Szu, Harold; Hsu, Charles; Willey, Jefferson; Landa, Joseph; Hsieh, Minder; Larsen, Louis V.; Krzywicki, Alan T.; Tran, Binh Q.; Hoekstra, Philip; Dillard, John T.; Krapels, Keith A.; Wardlaw, Michael; Chu, Kai-Dee

    2015-05-01

    Boltzmann's headstone relation S = kB Log W turns out to be the Rosetta stone for translating the microwave sensing hieroglyphics into an optical display. The LHS is the molecular entropy S, measuring the degree of uniformity of scattering off the sensing cross sections. The RHS is the inverse relationship (equation) predicting the Planck radiation spectral distribution parameterized by the Kelvin temperature T. Use is made of energy conservation: the reservoir (RV) heat change T ΔS = -ΔE is the negative of the internal-energy change of the black box (bb) subsystem. Moreover, irreversible thermodynamics, ΔS > 0 for collision mixing toward ever larger uniformity (heat death), was asserted by Boltzmann and led to the so-called Maxwell-Boltzmann canonical probability. Given the zero-boundary-condition black box, Planck solved for discrete standing-wave eigenstates (equation). Together with the canonical partition function (equation), an ensemble average over all possible internal energies yielded the celebrated Planck radiation spectrum (equation), where the density of states is (equation). In summary, given the multispectral sensing data (equation), we applied the Lagrange Constraint Neural Network (LCNN) to solve the Blind Source Separation (BSS) problem for a set of equivalent bb target temperatures. From the measured specific values, slopes and shapes we can fit a set of Kelvin temperatures T for each bb target. As a result, we could apply analytical continuation for each entropy source along the temperature-unique Planck spectral curves toward the RGB color temperature display, for any sensing probe frequency.

  4. Visualization of RELAP5-3D best estimate code

    International Nuclear Information System (INIS)

    Mesina, G.L.

    2004-01-01

    The Idaho National Engineering Laboratory has developed a number of nuclear plant analysis codes such as RELAP5-3D, SCDAP/RELAP5-3D, and FLUENT/RELAP5-3D that have multi-dimensional modeling capability. The output of these codes is very difficult to analyze without the aid of visualization tools. The RELAP5-3D Graphical User Interface (RGUI) displays these calculations on plant images, functional diagrams, graphs, and by other means. These representations of the data enhance the analysts' ability to recognize plant behavior visually and reduce the difficulty of analyzing complex three-dimensional models. This paper describes the Graphical User Interface system for the RELAP5-3D suite of Best Estimate codes. The uses of the Graphical User Interface are illustrated. Examples of user problems solved by use of this interface are given. (author)

  5. A multi-level code for metallurgical effects in metal-forming processes

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, P.A.; Silling, S.A. [Sandia National Labs., Albuquerque, NM (United States). Computational Physics and Mechanics Dept.; Hughes, D.A.; Bammann, D.J.; Chiesa, M.L. [Sandia National Labs., Livermore, CA (United States)

    1997-08-01

    The authors present the final report on a Laboratory-Directed Research and Development (LDRD) project, A Multi-level Code for Metallurgical Effects in Metal-Forming Processes, performed during the fiscal years 1995 and 1996. The project focused on the development of new modeling capabilities for simulating forging and extrusion processes that typically display phenomenology occurring on two different length scales. In support of model fitting and code validation, ring compression and extrusion experiments were performed on 304L stainless steel, a material of interest in DOE nuclear weapons applications.

  6. Virtual environment display for a 3D audio room simulation

    Science.gov (United States)

    Chapin, William L.; Foster, Scott

    1992-06-01

    Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with 4 audio Convolvotrons™ by Crystal River Engineering and coupled to the listener with a Polhemus Isotrak™, tracking the listener's head position and orientation, and stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted wide-angle, stereo-optic display, separate head and pointer electro-magnetic position trackers, a heterogeneous parallel graphics processing system, and object oriented C++ program code.

  7. Depth-enhanced three-dimensional-two-dimensional convertible display based on modified integral imaging.

    Science.gov (United States)

    Park, Jae-Hyeung; Kim, Hak-Rin; Kim, Yunhee; Kim, Joohwan; Hong, Jisoo; Lee, Sin-Doo; Lee, Byoungho

    2004-12-01

    A depth-enhanced three-dimensional-two-dimensional convertible display that uses a polymer-dispersed liquid crystal based on the principle of integral imaging is proposed. In the proposed method, a lens array is located behind a transmission-type display panel to form an array of point-light sources, and a polymer-dispersed liquid crystal is electrically controlled to pass or to scatter light coming from these point-light sources. Therefore, three-dimensional-two-dimensional conversion is accomplished electrically without any mechanical movement. Moreover, the nonimaging structure of the proposed method increases the expressible depth range considerably. We explain the method of operation and present experimental results.

  8. A novel approach to correct the coded aperture misalignment for fast neutron imaging

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, F. N.; Hu, H. S., E-mail: huasi-hu@mail.xjtu.edu.cn; Wang, D. M.; Jia, J. [School of Energy and Power Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); Zhang, T. K. [Laser Fusion Research Center, CAEP, Mianyang, 621900 Sichuan (China); Jia, Q. G. [Institute of Applied Physics and Computational Mathematics, Beijing 100094 (China)

    2015-12-15

    Aperture alignment is crucial for the diagnosis of neutron imaging because it has a significant impact on the coded imaging and on the understanding of the neutron source. In our previous studies on the neutron imaging system with a coded aperture for a large field of view, a “residual watermark,” certain extra information that overlies the reconstructed image and has nothing to do with the source, is discovered if peak normalization is employed in the genetic algorithm (GA) used to reconstruct the source image. Some studies on the basic properties of the residual watermark indicate that it can characterize the coded aperture and can thus be used to determine the location of the coded aperture relative to the system axis. In this paper, we have further analyzed the essential conditions for the existence of the residual watermark and the requirements of the reconstruction algorithm for its emergence. A gamma coded imaging experiment has been performed to verify the existence of the residual watermark. Based on the residual watermark, a correction method for the aperture misalignment has been studied. A multiple linear regression model relating the position of the coded aperture axis to the position of the residual watermark center and the gray barycenter of the neutron source has been set up with twenty training samples. Using the regression model and verification samples, we have found the position of the coded aperture axis relative to the system axis with an accuracy of approximately 20 μm. In conclusion, a novel approach has been established to correct the coded aperture misalignment for fast neutron coded imaging.
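
    The last step described above is an ordinary multiple linear regression. A hedged Python sketch of that fitting step is given below; the feature layout, units and coefficients are invented placeholders, not values from the experiment:

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical training set: 20 samples of
        # [watermark_center_x, watermark_center_y, barycenter_x, barycenter_y]
        X = rng.normal(size=(20, 4))
        # Aperture-axis offsets (x, y) that produced them, in micrometres (made up)
        true_coeff = np.array([[35.0, -2.0], [4.0, 30.0], [1.5, 0.5], [-0.7, 2.1]])
        Y = X @ true_coeff + rng.normal(scale=5.0, size=(20, 2))

        # Fit the multiple linear regression (with intercept) by least squares
        A = np.hstack([X, np.ones((20, 1))])
        coeff, *_ = np.linalg.lstsq(A, Y, rcond=None)

        # Predict the misalignment for a new measurement
        x_new = np.array([0.2, -0.1, 0.05, 0.3])
        offset = np.append(x_new, 1.0) @ coeff
        print("estimated aperture offset (x, y):", offset)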

  9. The Los Alamos accelerator code group

    Energy Technology Data Exchange (ETDEWEB)

    Krawczyk, F.L.; Billen, J.H.; Ryne, R.D.; Takeda, Harunori; Young, L.M.

    1995-05-01

    The Los Alamos Accelerator Code Group (LAACG) is a national resource for members of the accelerator community who use and/or develop software for the design and analysis of particle accelerators, beam transport systems, light sources, storage rings, and components of these systems. Below the authors describe the LAACG's activities in high performance computing, maintenance and enhancement of POISSON/SUPERFISH and related codes and the dissemination of information on the INTERNET.

  10. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  11. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  12. Fast-neutron, coded-aperture imager

    Science.gov (United States)

    Woolf, Richard S.; Phlips, Bernard F.; Hutcheson, Anthony L.; Wulf, Eric A.

    2015-06-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to separate the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with 1.8-μCi and 13-μCi 252Cf neutron/γ sources at three standoff distances of 9, 15 and 26 m (the maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led
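
    The deconvolution mentioned above is commonly performed by correlating the recorded shadowgram with a balanced decoding array derived from the mask. The Python sketch below illustrates this for an idealized far-field 12×12 pseudorandom aperture with cyclic (FFT-based) operations; a real instrument must also handle PSD, background subtraction and near-field geometry, so this is only a conceptual sketch:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 12                                           # 12 x 12 mask, as in the paper
        mask = (rng.random((n, n)) < 0.5).astype(float)  # pseudorandom open/closed pattern
        decoder = 2 * mask - 1                           # balanced decoding array

        def cyclic_convolve(a, b):
            return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

        def cyclic_correlate(a, b):
            return np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))

        # toy far-field point source casting a shifted mask shadow on the detector array
        source = np.zeros((n, n))
        source[3, 7] = 100.0
        shadowgram = cyclic_convolve(source, mask)       # idealized counts per detector
        shadowgram += rng.poisson(2.0, (n, n))           # uniform background counts

        image = cyclic_correlate(shadowgram, decoder)    # decoded source image
        print("reconstructed source pixel:", np.unravel_index(image.argmax(), image.shape))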

  13. Distributed video coding with multiple side information

    DEFF Research Database (Denmark)

    Huang, Xin; Brites, C.; Ascenso, J.

    2009-01-01

    Distributed Video Coding (DVC) is a new video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of some decoder side information. The quality of the side information has a major impact on the DVC rate-distortion (RD) performance in the same way...... the quality of the predictions had a major impact in predictive video coding. In this paper, a DVC solution exploiting multiple side information is proposed; the multiple side information is generated by frame interpolation and frame extrapolation targeting to improve the side information of a single...
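
    As a minimal illustration of how interpolation- and extrapolation-based side information candidates can be formed, the Python sketch below averages the adjacent key frames and linearly extrapolates from the two previous frames; practical DVC codecs use motion-compensated versions of both, which are omitted here:

        import numpy as np

        def side_information(prev_key, next_key, prev2_key=None):
            """Return candidate side-information frames for the missing WZ frame."""
            interp = 0.5 * (prev_key.astype(float) + next_key.astype(float))
            candidates = {"interpolation": interp}
            if prev2_key is not None:
                extrap = 2.0 * prev_key.astype(float) - prev2_key.astype(float)
                candidates["extrapolation"] = np.clip(extrap, 0, 255)
            return candidates

        # toy 1-D "frames": an edge moving one pixel per frame
        f_prev2 = np.zeros((1, 8)); f_prev2[0, :2] = 255
        f0 = np.zeros((1, 8)); f0[0, :3] = 255
        f1 = np.zeros((1, 8)); f1[0, :4] = 255           # true (unknown) WZ frame
        f2 = np.zeros((1, 8)); f2[0, :5] = 255

        cands = side_information(prev_key=f0, next_key=f2, prev2_key=f_prev2)
        for name, si in cands.items():
            print(name, "MSE vs. true frame:", np.mean((si - f1) ** 2))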

  14. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  15. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    Science.gov (United States)

    2012-01-01

    Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics

  16. New procedures to evaluate visually lossless compression for display systems

    Science.gov (United States)

    Stolitzka, Dale F.; Schelkens, Peter; Bruylants, Tim

    2017-09-01

    Visually lossless image coding in isochronous display streaming or plesiochronous networks reduces link complexity and power consumption and increases available link bandwidth. A new set of codecs developed within the last four years promises a new level of coding quality, but requires new techniques that are sufficiently sensitive to the small artifacts or color variations induced by this new breed of codecs. This paper begins with a summary of the new ISO/IEC 29170-2, a procedure for evaluation of lossless coding, and reports the new work by JPEG to extend the procedure in two important ways: for HDR content and for evaluating the differences between still images, panning images and image sequences. ISO/IEC 29170-2 relies on processing test images through a well-defined process chain for subjective, forced-choice psychophysical experiments. The procedure sets an acceptable quality level equal to one just noticeable difference. Traditional image and video coding evaluation techniques, such as those used for television evaluation, have not proven sufficiently sensitive to the small artifacts that may be induced by this breed of codecs. In 2015, JPEG received new requirements to expand evaluation of visually lossless coding for high dynamic range images, slowly moving images, i.e., panning, and image sequences. These requirements are the basis for new amendments of the ISO/IEC 29170-2 procedures described in this paper. These amendments promise to be highly useful for the new content in television and cinema mezzanine networks. The amendments passed the final ballot in April 2017 and are on track to be published in 2018.
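
    At its core, the forced-choice protocol asks whether observers identify the coded image more often than chance. The hedged Python sketch below shows the underlying one-sided binomial check; the trial counts and the 0.05 threshold are illustrative and do not reproduce the acceptance criteria defined in ISO/IEC 29170-2:

        from scipy.stats import binom

        def above_chance_p_value(correct, trials, chance=0.5):
            """One-sided exact binomial p-value that observers detect the artifact."""
            return binom.sf(correct - 1, trials, chance)

        # e.g. 19 correct identifications out of 30 forced-choice trials
        p = above_chance_p_value(19, 30)
        print(f"p-value = {p:.3f} ->",
              "artifact detectable" if p < 0.05 else "consistent with visually lossless")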

  17. Advanced Colorimetry of Display Systems: Tetra-Chroma3 Display Unit

    Directory of Open Access Journals (Sweden)

    J. Kaiser

    2005-06-01

    High-fidelity color image reproduction is one of the key issues in visual telecommunication systems, for electronic commerce, telemedicine, digital museums and so on. All colorimetric standards of display systems are up to the present day trichromatic. But, from the shape of the horseshoe area of all existing colors in the CIE xy chromaticity diagram it follows that with three real reproductive lights, the stated area in the CIE xy chromaticity diagram cannot be overlaid. The expansion of the color gamut of a display device is possible in a few ways. In this paper, the way of increasing the number of primaries is studied. The fourth cyan primary is added to three conventional ones to enlarge the color gamut of reproduction towards cyans and yellow-oranges. The original method of color management for this new display unit is introduced. In addition, the color gamut of the designed additive-based display is successfully compared with the color gamut of a modern subtractive-based system. A display with more than three primary colors is called a multiprimary color display. The very advantageous property of such a display is the possibility to display metameric colors.
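
    A quick way to quantify the benefit of a fourth primary is to compare the areas of the chromaticity polygons spanned by three and by four primaries. The Python sketch below does this with the shoelace formula; the xy coordinates are illustrative sRGB-like values plus a hypothetical cyan, not the actual Tetra-Chroma3 primaries:

        def polygon_area(points):
            """Shoelace formula; vertices must be ordered around the polygon."""
            area = 0.0
            for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
                area += x1 * y2 - x2 * y1
            return abs(area) / 2.0

        # illustrative CIE xy chromaticities (not the actual Tetra-Chroma3 primaries)
        red, green, blue, cyan = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06), (0.07, 0.40)

        rgb_gamut = polygon_area([red, green, blue])
        rgbc_gamut = polygon_area([red, green, cyan, blue])   # vertices kept in hue order
        print(f"4-primary vs. 3-primary gamut area ratio: {rgbc_gamut / rgb_gamut:.2f}")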

  18. Fuel rod modelling during transients: The TOUTATIS code

    International Nuclear Information System (INIS)

    Bentejac, F.; Bourreau, S.; Brochard, J.; Hourdequin, N.; Lansiart, S.

    2001-01-01

    The TOUTATIS code is devoted to the simulation of local pellet-cladding interaction (PCI) phenomena, in correlation with the METEOR code for the global behaviour of the fuel rod. More specifically, the TOUTATIS objective is to evaluate the mechanical constraints on the cladding during a power transient and thus to predict its behaviour in terms of stress corrosion cracking. Based upon the finite element computation code CASTEM 2000, TOUTATIS is a set of modules written in a macro language. The aim of this paper is to present both code modules: the axisymmetric two-dimensional module, modeling a single-block pellet, and the three-dimensional module, modeling a radially fragmented pellet. Having shown the boundary conditions and the algorithms used, the application will be illustrated by a short presentation of the performance of the two-dimensional axisymmetric modeling as well as its limits; the enhancement due to the three-dimensional modeling will be displayed by sensitivity studies on the geometry, in this case the pellet height/diameter ratio. Finally, we will show the ease of development inherent to the CASTEM 2000 system by depicting the process of a modeling enhancement: adding the possibility of axial (horizontal) cracking of the pellet. As conclusion, the future improvements planned for the code are depicted. (author)

  19. Color display and encryption with a plasmonic polarizing metamirror

    Directory of Open Access Journals (Sweden)

    Song Maowen

    2018-01-01

    Structural colors emerge when a particular wavelength range is filtered out from a broadband light source. It is regarded as a valuable platform for color display and digital imaging due to the benefits of environmental friendliness, higher visibility, and durability. However, current devices capable of generating colors are all based on direct transmission or reflection. Material loss, thick configuration, and the lack of tunability hinder their transition to practical applications. In this paper, a novel mechanism that generates high-purity colors by photon spin restoration on an ultrashallow plasmonic grating is proposed. We fabricated the sample by interference lithography and experimentally observed full color display, tunable color logo imaging, and chromatic sensing. The unique combination of high efficiency, high-purity colors, tunable chromatic display, ultrathin structure, and friendliness for fabrication makes this design an easy way to bridge the gap between theoretical investigations and daily-life applications.

  20. The evolution of the mitochondrial genetic code in arthropods revisited.

    Science.gov (United States)

    Abascal, Federico; Posada, David; Zardoya, Rafael

    2012-04-01

    A variant of the invertebrate mitochondrial genetic code was previously identified in arthropods (Abascal et al. 2006a, PLoS Biol 4:e127) in which, instead of translating the AGG codon as serine, as in other invertebrates, some arthropods translate AGG as lysine. Here, we revisit the evolution of the genetic code in arthropods taking into account that (1) the number of arthropod mitochondrial genomes sequenced has tripled since the original findings were published; (2) the phylogeny of arthropods has been recently resolved with confidence for many groups; and (3) sophisticated probabilistic methods can be applied to analyze the evolution of the genetic code in arthropod mitochondria. According to our analyses, evolutionary shifts in the genetic code have been more common than previously inferred, with many taxonomic groups displaying two alternative codes. Ancestral character-state reconstruction using probabilistic methods confirmed that the arthropod ancestor most likely translated AGG as lysine. Point mutations at tRNA-Lys and tRNA-Ser correlated with the meaning of the AGG codon. In addition, we identified three variables (GC content, number of AGG codons, and taxonomic information) that best explain the use of each of the two alternative genetic codes.
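
    To make the codon reassignment concrete, the Python sketch below translates a short toy sequence under a minimal invertebrate mitochondrial table (AGG = Ser) and under the arthropod variant discussed here (AGG = Lys); only the codons needed for the example are included, so this is not a complete genetic code table:

        # Minimal codon tables: only the codons used in the example sequence.
        INVERT_MITO = {"ATG": "M", "AGG": "S", "AGA": "S", "AAA": "K", "GGC": "G"}
        ARTHROPOD_VARIANT = dict(INVERT_MITO, AGG="K")   # AGG reassigned Ser -> Lys

        def translate(seq, table):
            codons = (seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3))
            return "".join(table.get(c, "X") for c in codons)

        seq = "ATGAGGAAAGGC"
        print("invertebrate mito code :", translate(seq, INVERT_MITO))        # MSKG
        print("arthropod AGG->Lys code:", translate(seq, ARTHROPOD_VARIANT))  # MKKG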