WorldWideScience

Sample records for platform independent code

  1. Data exchange between zero dimensional code and physics platform in the CFETR integrated system code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Guoliang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Shi, Nan [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Zhou, Yifu; Mao, Shifeng [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Jian, Xiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, School of Electrical and Electronics Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Jiale [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Liu, Li; Chan, Vincent [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Ye, Minyou, E-mail: yemy@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China)

    2016-11-01

    Highlights: • The workflow of the zero dimensional code and the multi-dimensional physics platform of the CFETR integrated system code is introduced. • The iteration process among the codes in the physics platform is described. • The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and the justification of performance parameters, is presented. - Abstract: The China Fusion Engineering Test Reactor (CFETR) integrated system code contains three parts: a zero dimensional code, a physics platform and an engineering platform. We use the zero dimensional code to identify a set of preliminary physics and engineering parameters for CFETR, which is used as input to initiate multi-dimensional studies using the physics and engineering platforms for design, verification and validation. Effective data exchange between the zero dimensional code and the physics platform is critical for the optimization of the CFETR design. For example, in evaluating the impact of impurity radiation on core performance, an open field line code is used to calculate the impurity transport from the first-wall boundary to the pedestal. The impurity particles in the pedestal are used as boundary conditions in a transport code for calculating impurity transport in the core plasma and the impact of core radiation on core performance. Comparison of the results from the multi-dimensional study with those from the zero dimensional code is used to further refine the controlled radiation model. The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and the justification of performance parameters, will be presented in this paper.

  2. Multimedia distribution using network coding on the iphone platform

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Pedersen, Morten Videbæk; Fitzek, Frank

    2010-01-01

    This paper looks into the implementation details of random linear network coding on the Apple iPhone and iPod Touch mobile platforms for multimedia distribution. Previous implementations of network coding on this platform failed to achieve a throughput which is sufficient to saturate the WLAN...
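In random linear network coding, each transmitted packet carries a random linear combination of the source packets plus its coefficient vector, and a receiver decodes by Gaussian elimination once it has collected enough independent combinations. A minimal sketch over GF(2) (implementations like the one in this paper typically work over GF(2^8); the function names and the bitmask coefficient representation are illustrative):

```python
import random

def rlnc_encode(packets, rng):
    """One coded packet: a random nonzero GF(2) combination (XOR) of equal-length
    source packets, returned with its coefficient vector as a bitmask."""
    n = len(packets)
    mask = 0
    while not mask:                      # avoid the useless all-zero combination
        mask = rng.getrandbits(n)
    combined = bytearray(len(packets[0]))
    for i in range(n):
        if mask >> i & 1:
            for j, b in enumerate(packets[i]):
                combined[j] ^= b
    return mask, bytes(combined)

def rlnc_decode(coded, n):
    """Gaussian elimination over GF(2); returns the n source packets
    once the received combinations reach full rank, else None."""
    basis = {}                           # pivot bit -> [mask, payload]
    for mask, payload in coded:
        payload = bytearray(payload)
        while mask:
            pivot = mask & -mask         # lowest set coefficient bit
            if pivot not in basis:
                basis[pivot] = [mask, payload]
                break
            bmask, bpay = basis[pivot]   # eliminate against the stored row
            mask ^= bmask
            for j in range(len(payload)):
                payload[j] ^= bpay[j]
    if len(basis) < n:
        return None                      # not yet decodable
    # back-substitution, highest pivot first, leaves each row with a single bit
    for pivot in sorted(basis, reverse=True):
        row = basis[pivot]
        rest = row[0] ^ pivot
        while rest:
            q = rest & -rest
            qpay = basis[q][1]
            for j in range(len(row[1])):
                row[1][j] ^= qpay[j]
            rest ^= q
        row[0] = pivot
    return [bytes(basis[1 << i][1]) for i in range(n)]
```

In practice a sender keeps emitting fresh combinations until the receiver signals full rank, which is what makes the scheme robust to loss.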

  3. A platform independent framework for Statecharts code generation

    International Nuclear Information System (INIS)

    Andolfato, L.; Chiozzi, G.; Migliorini, N.; Morales, C.

    2012-01-01

    Control systems for telescopes and their instruments are reactive systems, very well suited to being modelled using the Statecharts formalism. The World Wide Web Consortium is working on a new standard called SCXML that specifies an XML notation to describe Statecharts and provides a well-defined operational semantics for run-time interpretation of SCXML models. This paper presents a generic application framework for reactive non-real-time systems based on interpreted Statecharts. The framework consists of a model-to-text transformation tool and an SCXML interpreter. From UML state machine models, the tool generates the SCXML representation of the state machines as well as the application skeletons for the supported software platforms. An abstraction layer propagates events from the middleware to the SCXML interpreter, facilitating support for different software platforms. This project benefits from the positive experience gained in several years of development of coordination and monitoring applications in the telescope control software domain using Model Driven Development technologies. (authors)
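The central idea, interpreting a declaratively described state machine at run time instead of generating state-handling code, can be illustrated with a deliberately minimal flat interpreter. This is a sketch only: real SCXML adds hierarchical and parallel states, a datamodel, and executable content, and the class and state names below are invented for illustration.

```python
class StateMachine:
    """Minimal flat-statechart interpreter in the spirit of SCXML run-time
    interpretation (no hierarchy, no parallel regions, no datamodel)."""

    def __init__(self, initial, transitions, on_entry=None):
        self.transitions = transitions    # (state, event) -> target state
        self.on_entry = on_entry or {}    # state -> entry action callback
        self.state = initial
        self._enter(initial)

    def _enter(self, state):
        action = self.on_entry.get(state)
        if action:
            action()

    def send(self, event):
        """Dispatch an event; unmatched events are ignored, as in SCXML."""
        target = self.transitions.get((self.state, event))
        if target is not None:
            self.state = target
            self._enter(target)
        return self.state
```

A telescope-control flavoured machine (OFFLINE/STANDBY/OBSERVING) can then be defined purely as data, which is exactly what makes the model portable across platforms: only the thin event-propagation layer is platform-specific.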

  4. Porting of serial molecular dynamics code on MIMD platforms

    International Nuclear Information System (INIS)

    Celino, M.

    1995-05-01

    A molecular dynamics (MD) code, used for the study of atomistic models of metallic systems, has been parallelized for MIMD (Multiple Instructions Multiple Data) parallel platforms by means of the Parallel Virtual Machine (PVM) message passing library. Since the parallelization implies modifications of the sequential algorithms, these are described from the point of view of statistical mechanics theory. Furthermore, the techniques and parallelization strategies employed and the parallel MD code are described in detail. Benchmarks on several MIMD platforms (IBM SP1 and SP2, Cray T3D, cluster of workstations) allow evaluation of the code's performance against the different characteristics of the parallel platforms
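One common way to parallelize an MD force loop under a message-passing model (the abstract does not state which decomposition this code used, so this is only an illustration) is to distribute the pair list across workers and combine the partial force arrays in a global reduction, which is the step PVM's message passing would perform. A serial Python sketch of that decomposition for a toy 1-D Lennard-Jones system; the function names are invented:

```python
def lj_pair_forces(positions, pairs):
    """Accumulate 1-D Lennard-Jones pair forces for a subset of pairs:
    the work assigned to one worker in a pair-decomposition scheme."""
    forces = [0.0] * len(positions)
    for i, j in pairs:
        r = positions[j] - positions[i]
        inv6 = 1.0 / r**6
        # f = -dU/dr for U(r) = 4 (r^-12 - r^-6), reduced units, no cutoff
        f = 24.0 * (2.0 * inv6 * inv6 - inv6) / r
        forces[i] -= f
        forces[j] += f
    return forces

def parallel_forces(positions, n_workers):
    """Split the pair list among n_workers 'tasks' and sum the partial
    force arrays (the reduction PVM messages would implement)."""
    n = len(positions)
    all_pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    chunks = [all_pairs[k::n_workers] for k in range(n_workers)]
    partials = [lj_pair_forces(positions, chunk) for chunk in chunks]
    return [sum(p[i] for p in partials) for i in range(n)]
```

In the real code each chunk would run as a separate PVM task and the final summation would be a message exchange rather than a local loop; the decomposition logic is unchanged.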

  5. Designing platform independent mobile apps and services

    CERN Document Server

    Heckman, Rocky

    2016-01-01

    This book explains how to create an innovative and future-proof architecture for mobile apps by introducing practical approaches that increase the value and flexibility of their service layers and reduce their delivery time. Designing Platform Independent Mobile Apps and Services begins by describing the mobile computing landscape and previous attempts at cross-platform development. Platform-independent mobile technologies and development strategies are described in chapters two and three. Communication protocols, details of a recommended five-layer architecture, service layers, and the data abstraction layer are also introduced in these chapters. Cross-platform languages and multi-client development tools for the User Interface (UI) layer, as well as message processing patterns and message routing of the Service Interface (SI) layer, are explained in chapters four and five. Ways to design the service layer for mobile computing, using Command Query Responsibility Segregation (CQRS) and the Data Abstraction La...

  6. A platform independent communication library for distributed computing

    NARCIS (Netherlands)

    Groen, D.; Rieder, S.; Grosso, P.; de Laat, C.; Portegies Zwart, S.

    2010-01-01

    We present MPWide, a platform independent communication library for performing message passing between supercomputers. Our library couples several local MPI applications through a long distance network using, for example, optical links. The implementation is deliberately kept lightweight, platform...

  7. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  8. Independent peer review of nuclear safety computer codes

    International Nuclear Information System (INIS)

    Boyack, B.E.; Jenks, R.P.

    1993-01-01

    A structured, independent computer code peer-review process has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper describes a structured process of independent code peer review, the benefits associated with a code-independent peer review, and the authors' recent peer-review experience. The NRC adheres to the principle that the safety of plant design, construction, and operation is the responsibility of the licensee. Nevertheless, NRC staff must have the ability to independently assess plant designs and safety analyses submitted by license applicants. According to Ref. 1, "this requires that a sound understanding be obtained of the important physical phenomena that may occur during transients in operating power plants." The NRC concluded that computer codes are the principal products to "understand and predict plant response to deviations from normal operating conditions" and has developed several codes for that purpose. However, codes cannot be used blindly; they must be assessed and found adequate for the purposes for which they are intended. A key part of the qualification process can be accomplished through code peer reviews; this approach has been adopted by the NRC

  9. Platform-independent method for computer aided schematic drawings

    Science.gov (United States)

    Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  10. Cross-Platform JavaScript Coding: Shifting Sand Dunes and Shimmering Mirages.

    Science.gov (United States)

    Merchant, David

    1999-01-01

    Most libraries don't have the resources to cross-platform and cross-version test all of their JavaScript coding. Many turn to WYSIWYG; however, WYSIWYG editors don't generally produce optimized coding. Web developers should: test their coding on at least one 3.0 browser, code by hand using tools to help speed that process up, and include a simple…

  11. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    Science.gov (United States)

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  12. Porting of a serial molecular dynamics code on MIMD platforms

    Energy Technology Data Exchange (ETDEWEB)

    Celino, M. [ENEA Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). HPCN Project

    1999-07-01

    A molecular dynamics (MD) code, used for the study of atomistic models of metallic systems, has been parallelized for MIMD (multiple instructions multiple data) parallel platforms by means of the parallel virtual machine (PVM) message passing library. Since the parallelization implies modifications of the sequential algorithms, these are described from the point of view of statistical mechanics theory. Furthermore, the techniques and parallelization strategies employed and the parallel MD code are described in detail. Benchmarks on several MIMD platforms (IBM SP1, SP2, Cray T3D, cluster of workstations) allow evaluation of the code's performance against the different characteristics of the parallel platforms.

  13. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss the hardware systems on which the codes run and present code performance comparisons for various machines

  14. Independent rate and temporal coding in hippocampal pyramidal cells.

    Science.gov (United States)

    Huxter, John; Burgess, Neil; O'Keefe, John

    2003-10-23

    In the brain, hippocampal pyramidal cells use temporal as well as rate coding to signal spatial aspects of the animal's environment or behaviour. The temporal code takes the form of a phase relationship to the concurrent cycle of the hippocampal electroencephalogram theta rhythm. These two codes could each represent a different variable. However, this requires the rate and phase to vary independently, in contrast to recent suggestions that they are tightly coupled, both reflecting the amplitude of the cell's input. Here we show that the time of firing and firing rate are dissociable, and can represent two independent variables: respectively the animal's location within the place field, and its speed of movement through the field. Independent encoding of location together with actions and stimuli occurring there may help to explain the dual roles of the hippocampus in spatial and episodic memory, or may indicate a more general role of the hippocampus in relational/declarative memory.

  15. Modular turbine airfoil and platform assembly with independent root teeth

    Science.gov (United States)

    Campbell, Christian X; Davies, Daniel O; Eng, Darryl

    2013-07-30

    A turbine airfoil (22E-H) extends from a shank (23E-H). A platform (30E-H) brackets or surrounds a first portion of the shank (23E-H). Opposed teeth (33, 35) extend laterally from the platform (30E-H) to engage respective slots (50) in a disk. Opposed teeth (25, 27) extend laterally from a second portion of the shank (29) that extends below the platform (30E-H) to engage other slots (52) in the disk. Thus the platform (30E-H) and the shank (23E-H) independently support their own centrifugal loads via their respective teeth. The platform may be formed in two portions (32E-H, 34E-H), that are bonded to each other at matching end-walls (37) and/or via pins (36G) passing through the shank (23E-H). Coolant channels (41, 43) may pass through the shank beside the pins (36G).

  16. An Efficient Platform for the Automatic Extraction of Patterns in Native Code

    Directory of Open Access Journals (Sweden)

    Javier Escalada

    2017-01-01

    Different software tools, such as decompilers, code quality analyzers, recognizers of packed executable files, authorship analyzers, and malware detectors, search for patterns in binary code. The use of machine learning algorithms, trained with programs taken from the huge number of applications in existing open source code repositories, allows finding patterns not detected with the manual approach. To this end, we have created a versatile platform for the automatic extraction of patterns from native code, capable of processing big binary files. Its implementation has been parallelized, providing important runtime performance benefits on multicore architectures. Compared to single-processor execution, the average performance improvement obtained with the best configuration is a 3.5-fold speedup out of the theoretical maximum of 4.

  17. Evidence for modality-independent order coding in working memory.

    Science.gov (United States)

    Depoorter, Ann; Vandierendonck, André

    2009-03-01

    The aim of the present study was to investigate the representation of serial order in working memory, more specifically whether serial order is coded by means of a modality-dependent or a modality-independent order code. This was investigated by means of a series of four experiments based on a dual-task methodology in which one short-term memory task was embedded between the presentation and recall of another short-term memory task. Two aspects were varied in these memory tasks--namely, the modality of the stimulus materials (verbal or visuo-spatial) and the presence of an order component in the task (an order or an item memory task). The results of this study showed impaired primary-task recognition performance when both the primary and the embedded task included an order component, irrespective of the modality of the stimulus materials. If one or both of the tasks did not contain an order component, less interference was found. The results of this study support the existence of a modality-independent order code.

  18. Practical Salesforce.com development without code customizing salesforce on the Force.com platform

    CERN Document Server

    Weinmeister, Philip

    2014-01-01

    Are you facing a challenging Salesforce.com problem-say, relating to customization, configuration, reporting, dashboards, or formulation-that you can't quite crack? Or maybe you are hoping to infuse some creativity into your solution design strategy to solve problems faster or make solutions more efficient? Practical Salesforce.com Development Without Code shows you how to unlock the power of the Force.com platform to solve real business problems-and all without writing a line of code. Adhering to Salesforce.com's "Clicks, not code" mantra, Salesforce.com expert Phil Weinmeister walks you through...

  19. PAC++: Object-oriented platform for accelerator codes

    International Nuclear Information System (INIS)

    Malitsky, N.; Reshetov, A.; Bourianoff, G.

    1994-06-01

    Software packages in accelerator physics have relatively long life cycles. They have been developed and used for a wide range of accelerators in the past as well as for current projects. For example, the basic algorithms written for the first accelerator program, TRANSPORT, are still relevant for the design of most magnet systems. Most of these packages were implemented in Fortran, but this language is rather inconvenient as a basis for large integrated projects that could include real-time data acquisition, database access, graphical user interface (GUI) modules, and other features. Some later accelerator programs have been based on object-oriented tools (primarily the C++ language). These range from systems for advanced theoretical studies to control system software. For new generations of accelerators it would be desirable to have an integrated platform in which all simulation and control tasks are considered from one point of view. In this report the basic principles of an object-oriented platform for accelerator research software (PAC++) are suggested and analyzed. The primary objectives of this work are to enable an efficient, self-explanatory realization of accelerator concepts and to provide an integrated environment for updating and developing the code

  20. Performance evaluations of advanced massively parallel platforms based on gyrokinetic toroidal five-dimensional Eulerian code GT5D

    International Nuclear Information System (INIS)

    Idomura, Yasuhiro; Jolliet, Sebastien

    2010-01-01

    A gyrokinetic toroidal five-dimensional Eulerian code, GT5D, is ported to six advanced massively parallel platforms and comprehensive benchmark tests are performed. A parallelisation technique based on physical properties of the gyrokinetic equation is presented. By extending the parallelisation technique with a hybrid parallel model, the scalability of the code is improved on platforms with multi-core processors. In the benchmark tests, good scalability is confirmed up to several thousand cores on every platform, and a maximum sustained performance of ∼18.6 Tflops is achieved using 16384 cores of BX900. (author)

  1. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability for simulating actual hydrologic conditions. The testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. The three tests ranged in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions

  2. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    Science.gov (United States)

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing both the control and the data state of an agent, a novel approach. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.
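The zero-operand instruction format mentioned above means that every operand comes from the operand stack, so agent code reduces to a compact, relocatable token list. A toy interpreter illustrating the idea (the opcode names are invented, and the real platform adds token queues, code morphing and migration on top of this):

```python
def run(program, stack=None):
    """Tiny zero-operand stack machine: literals push themselves, every other
    instruction takes its operands from the stack. Returns the final stack."""
    stack = list(stack or [])
    pc = 0
    while pc < len(program):
        op = program[pc]
        pc += 1
        if isinstance(op, int):              # literal: push immediate
            stack.append(op)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "dup":
            stack.append(stack[-1])
        elif op == "jz":                     # pop target, pop flag; jump if 0
            target, flag = stack.pop(), stack.pop()
            if flag == 0:
                pc = target
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return stack
```

Because the program is pure data with no embedded addresses other than stack-supplied jump targets, it can be shipped between nodes and re-executed anywhere, which is the property the agent-migration scheme relies on.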

  3. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    Directory of Open Access Journals (Sweden)

    Stefan Bosse

    2015-02-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing both the control and the data state of an agent, a novel approach. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  4. NASA Glenn Steady-State Heat Pipe Code GLENHP: Compilation for 64- and 32-Bit Windows Platforms

    Science.gov (United States)

    Tower, Leonard K.; Geng, Steven M.

    2016-01-01

    A new version of the NASA Glenn Steady State Heat Pipe Code, designated "GLENHP," is introduced here. This represents an update to the disk operating system (DOS) version LERCHP reported in NASA/TM-2000-209807. The new code operates on 32- and 64-bit Windows-based platforms from within the 32-bit command prompt window. An additional evaporator boundary condition and other features are provided.

  5. A platform-independent method for detecting errors in metagenomic sequencing data: DRISEE.

    Directory of Open Access Journals (Sweden)

    Kevin P Keegan

    We provide a novel method, DRISEE (duplicate read inferred sequencing error estimation), to assess sequencing quality (alternatively referred to as "noise" or "error") within and/or between sequencing samples. DRISEE provides positional error estimates that can be used to inform read trimming within a sample. It also provides global (whole-sample) error estimates that can be used to identify samples with high or varying levels of sequencing error that may confound downstream analyses, particularly in the case of studies that utilize data from multiple sequencing samples. For shotgun metagenomic data, we believe that DRISEE provides estimates of sequencing error that are more accurate and less constrained by technical limitations than existing methods that rely on reference genomes or the use of scores (e.g., Phred). Here, DRISEE is applied to (non-amplicon) data sets from both the 454 and Illumina platforms. The DRISEE error estimate is obtained by analyzing sets of artifactual duplicate reads (ADRs), a known by-product of both sequencing platforms. We present DRISEE as an open-source, platform-independent method to assess sequencing error in shotgun metagenomic data, and utilize it to discover previously uncharacterized error in de novo sequence data from the 454 and Illumina sequencing platforms.
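The duplicate-read idea can be sketched as follows: reads sharing a sufficiently long identical prefix are treated as artifactual duplicates of one template, so any disagreement with their per-position consensus is attributed to sequencing error. A deliberately simplified illustration (the real DRISEE bins and scores reads differently and also handles indels; the function name is invented):

```python
from collections import Counter, defaultdict

def positional_error(reads, prefix_len=5):
    """Rough DRISEE-flavoured per-position error estimate: within each group
    of reads sharing an identical prefix, count disagreements with the
    per-position consensus base. Returns one error rate per position."""
    bins = defaultdict(list)
    for r in reads:
        if len(r) > prefix_len:
            bins[r[:prefix_len]].append(r)
    length = max(len(r) for r in reads)
    errors = [0] * length
    totals = [0] * length
    for group in bins.values():
        if len(group) < 2:
            continue                      # need duplicates to infer error
        for pos in range(prefix_len, min(len(r) for r in group)):
            column = [r[pos] for r in group]
            consensus, _ = Counter(column).most_common(1)[0]
            errors[pos] += sum(c != consensus for c in column)
            totals[pos] += len(column)
    return [e / t if t else 0.0 for e, t in zip(errors, totals)]
```

The appeal of this style of estimate, as the abstract notes, is that it needs neither a reference genome nor the platform's own quality scores.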

  6. Benchmark testing and independent verification of the VS2DT computer code

    International Nuclear Information System (INIS)

    McCord, J.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation

  7. FASTQSim: platform-independent data characterization and in silico read generation for NGS datasets.

    Science.gov (United States)

    Shcherbina, Anna

    2014-08-15

    High-throughput next generation sequencing technologies have enabled rapid characterization of clinical and environmental samples. Consequently, the largest bottleneck to actionable data has become sample processing and bioinformatics analysis, creating a need for accurate and rapid algorithms to process genetic data. Perfectly characterized in silico datasets are a useful tool for evaluating the performance of such algorithms. Background contaminating organisms are observed in sequenced mixtures of organisms; in silico samples provide exact ground truth. To create the best value for evaluating algorithms, in silico data should mimic actual sequencer data as closely as possible. FASTQSim is a tool that provides the dual functionality of NGS dataset characterization and metagenomic data generation. FASTQSim is sequencing platform-independent, and computes distributions of read length, quality scores, indel rates, single point mutation rates, indel size, and similar statistics for any sequencing platform. To create training or testing datasets, FASTQSim has the ability to convert target sequences into in silico reads with specific error profiles obtained in the characterization step. FASTQSim enables users to assess the quality of NGS datasets. The tool provides information about read length, read quality, repetitive and non-repetitive indel profiles, and single base pair substitutions. FASTQSim allows the user to simulate individual read datasets that can be used as standardized test scenarios for planning sequencing projects or for benchmarking metagenomic software. In this regard, in silico datasets generated with the FASTQSim tool hold several advantages over natural datasets: they are sequencing platform independent, extremely well characterized, and less expensive to generate. Such datasets are valuable in a number of applications, including the training of assemblers for multiple platforms, benchmarking bioinformatics algorithm performance, and creating challenge...
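The read-generation step can be approximated by sampling substrings of a reference and injecting substitutions at a characterized rate. A stripped-down sketch of that one piece (FASTQSim additionally models indels and the read-length and quality-score distributions inferred from real data; the function name is invented):

```python
import random

def simulate_reads(reference, n_reads, read_len, sub_rate, rng):
    """Generate in silico reads: sample a random start in the reference,
    then substitute each base independently with probability sub_rate."""
    bases = "ACGT"
    reads = []
    for _ in range(n_reads):
        start = rng.randrange(len(reference) - read_len + 1)
        read = list(reference[start:start + read_len])
        for i, b in enumerate(read):
            if rng.random() < sub_rate:
                # replace with a *different* base, as a substitution must
                read[i] = rng.choice([x for x in bases if x != b])
        reads.append("".join(read))
    return reads
```

Because the true source position and every injected error are known, such simulated reads give the exact ground truth the abstract argues natural datasets cannot provide.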

  8. Application of the SALOME platform to the loose coupling of the CATHENA and ELOCA codes

    International Nuclear Information System (INIS)

    Zhuchkova, A.

    2012-01-01

    Use of coupled codes for the safety analysis of nuclear power plants is highly desirable, as it permits multi-disciplinary studies of complex reactor behaviors and, in particular, accident simulations. The present work demonstrates the potential of the SALOME platform as an interface for creating integrated, multi-disciplinary simulations of reactor scenarios. For this purpose two codes currently in use within the Canadian nuclear industry, CATHENA and ELOCA, were coupled by means of SALOME. The coupled codes were used to model the Power Burst Facility (PBF)-CANDU Test, which tested the thermal-mechanical behavior of PHWR (pressurized heavy water reactor) fuel during a simulated Large Loss-Of-Coolant Accident (LLOCA). The results of the SALOME-coupled simulations are compared with a previous analysis in which the two codes were coupled using a package of scripts. (author)
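
    The loose-coupling pattern described here can be sketched generically: two solvers exchange data and iterate until the exchanged value stabilizes. The toy solver functions below are hypothetical stand-ins, not the SALOME, CATHENA, or ELOCA APIs:

```python
def couple(solver_a, solver_b, x0, tol=1e-8, max_iter=100):
    """Fixed-point loose coupling: each solver updates its field from the
    other's latest output until the exchanged value stops changing."""
    x = x0
    for it in range(max_iter):
        a = solver_a(x)      # e.g. a thermal-hydraulics step
        b = solver_b(a)      # e.g. a fuel-mechanics step
        if abs(b - x) < tol:
            return b, it + 1
        x = b
    raise RuntimeError("coupling did not converge")

# Toy stand-in solvers: contraction mappings, so the loop converges (to 2.0)
value, iters = couple(lambda t: 0.5 * t + 1.0, lambda q: 0.5 * q + 1.0, x0=0.0)
```

    In a platform such as SALOME, the same loop would be orchestrated by the platform's supervision layer rather than hand-written scripts.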

  9. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed by evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies
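
    The relative root mean square measure used in the quantitative comparisons can be sketched as follows (an assumed formulation for illustration; the report's exact definition may differ):

```python
import math

def relative_rms(numerical, analytical):
    """Relative root-mean-square difference between a numerical
    solution and its analytical reference."""
    num = sum((n - a) ** 2 for n, a in zip(numerical, analytical))
    den = sum(a ** 2 for a in analytical)
    return math.sqrt(num / den)

# A numerical solution within ~1% of the analytical values
err = relative_rms([1.01, 1.99, 3.02], [1.0, 2.0, 3.0])
```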

  10. A Cross-Platform Tactile Capabilities Interface for Humanoid Robots

    Directory of Open Access Journals (Sweden)

    Jie eMa

    2016-04-01

    Full Text Available This article presents the core elements of a cross-platform tactile capabilities interface (TCI for humanoid arms. The aim of the interface is to reduce the cost of developing humanoid robot capabilities by supporting reuse through cross-platform deployment. The article presents a comparative analysis of existing robot middleware frameworks, as well as the technical details of the TCI framework that builds on the existing YARP platform. The TCI framework currently includes robot arm actuators with robot skin sensors. It presents such hardware in a platform-independent manner, making it possible to write robot control software that can be executed on different robots through the TCI framework. The TCI framework supports multiple humanoid platforms and this article also presents a case study of a cross-platform implementation of a set of tactile protective withdrawal reflexes that have been realised on both the Nao and iCub humanoid robot platforms using the same high-level source code.
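
    The platform-independence idea, the same high-level reflex code running against per-robot adapters, can be sketched as follows; the interface and method names here are hypothetical illustrations, not the actual TCI API:

```python
from abc import ABC, abstractmethod

class TactileArm(ABC):
    """Platform-independent view of an arm with skin sensors; concrete
    classes would wrap a specific robot's middleware (names hypothetical)."""
    @abstractmethod
    def read_skin(self) -> list: ...
    @abstractmethod
    def set_joint_velocities(self, vels: list) -> None: ...

class SimulatedArm(TactileArm):
    """Stand-in implementation used here instead of a Nao or iCub wrapper."""
    def __init__(self):
        self.cmd = None
    def read_skin(self):
        return [0.0, 0.2, 0.9]          # fake taxel pressures
    def set_joint_velocities(self, vels):
        self.cmd = vels

def withdrawal_reflex(arm: TactileArm, threshold=0.5):
    """The same high-level reflex runs on any TactileArm implementation."""
    if max(arm.read_skin()) > threshold:
        arm.set_joint_velocities([-0.1, -0.1])   # retract
        return True
    return False

arm = SimulatedArm()
triggered = withdrawal_reflex(arm)
```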

  11. Hooke: an open software platform for force spectroscopy.

    Science.gov (United States)

    Sandal, Massimo; Benedetti, Fabrizio; Brucale, Marco; Gomez-Casado, Alberto; Samorì, Bruno

    2009-06-01

    Hooke is an open source, extensible software intended for analysis of atomic force microscope (AFM)-based single molecule force spectroscopy (SMFS) data. We propose it as a platform on which published and new algorithms for SMFS analysis can be integrated in a standard, open fashion, as a general solution to the current lack of standard software for SMFS data analysis. Specific features and support for file formats are coded as independent plugins. Any user can code new plugins, extending the software capabilities. Basic automated dataset filtering and semi-automatic analysis facilities are included. Software and documentation are available at (http://code.google.com/p/hooke). Hooke is a free software under the GNU Lesser General Public License.
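
    The independent-plugin idea can be sketched with a minimal registry; this is illustrative only, not Hooke's actual plugin API:

```python
# Minimal plugin registry in the spirit of independently coded plugins.
PLUGINS = {}

def plugin(name):
    """Decorator: register a force-curve analysis function under a name."""
    def register(func):
        PLUGINS[name] = func
        return func
    return register

@plugin("max_force")
def max_force(curve):
    return max(curve)

@plugin("rupture_index")
def rupture_index(curve):
    return curve.index(max(curve))

# Every registered plugin is applied to the same (toy) force curve
curve = [0.1, 0.4, 1.7, 0.2]
results = {name: fn(curve) for name, fn in PLUGINS.items()}
```

    New analyses are added by defining another decorated function; no change to the core loop is needed, which is the point of plugin architectures.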

  12. Synchronized Multimedia Streaming on the iPhone Platform with Network Coding

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Fitzek, Frank; Pedersen, Morten Videbæk

    2011-01-01

    This work presents the implementation of synchronized multimedia streaming for the Apple iPhone platform. The idea is to stream multimedia content from a single source to multiple receivers with direct or multihop connections to the source. First we look into existing solutions for video streaming on the iPhone that use point-to-point architectures. After acknowledging their limitations, we propose a solution based on network coding to efficiently and reliably deliver the multimedia content to many devices in a synchronized manner. Then we introduce an application that implements this technique on the iPhone. We also present our testbed, which consists of 16 iPod Touch devices to showcase the capabilities of our application.
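
    The network-coding approach can be illustrated with a minimal random linear network coding sketch over GF(2), where each coded packet is an XOR combination of the originals and a receiver decodes once it has collected enough linearly independent combinations. The paper's actual coding scheme and field size are not specified here; this is a generic illustration:

```python
import random

def encode(packets, rng):
    """A coded packet: a random nonzero GF(2) combination (XOR) of the
    originals, returned as an augmented row [c1..cn | payload]."""
    while True:
        coeffs = [rng.randint(0, 1) for _ in packets]
        if any(coeffs):
            break
    payload = 0
    for c, p in zip(coeffs, packets):
        if c:
            payload ^= p
    return coeffs + [payload]

def try_decode(rows, n):
    """Gauss-Jordan elimination over GF(2); returns the original payloads,
    or None while the collected rows have rank < n."""
    rows = [r[:] for r in rows]
    for col in range(n):
        idx = next((i for i in range(col, len(rows)) if rows[i][col]), None)
        if idx is None:
            return None                      # not enough independent packets yet
        rows[col], rows[idx] = rows[idx], rows[col]
        pivot = rows[col]
        for r in rows:
            if r is not pivot and r[col]:
                for i in range(n + 1):
                    r[i] ^= pivot[i]
    return [rows[i][n] for i in range(n)]

rng = random.Random(7)
packets = [0x41, 0x42, 0x43]                 # three one-byte "payloads"
rows, decoded = [], None
while decoded is None:                       # keep collecting coded packets
    rows.append(encode(packets, rng))
    if len(rows) >= len(packets):
        decoded = try_decode(rows, len(packets))
```

    The receiver never needs any particular packet, only enough independent combinations, which is what makes the scheme robust for multi-receiver synchronized delivery.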

  13. Validation of tumor protein marker quantification by two independent automated immunofluorescence image analysis platforms

    Science.gov (United States)

    Peck, Amy R; Girondo, Melanie A; Liu, Chengbao; Kovatich, Albert J; Hooke, Jeffrey A; Shriver, Craig D; Hu, Hai; Mitchell, Edith P; Freydin, Boris; Hyslop, Terry; Chervoneva, Inna; Rui, Hallgeir

    2016-01-01

    Protein marker levels in formalin-fixed, paraffin-embedded tissue sections traditionally have been assayed by chromogenic immunohistochemistry and evaluated visually by pathologists. Pathologist scoring of chromogen staining intensity is subjective and generates low-resolution ordinal or nominal data rather than continuous data. Emerging digital pathology platforms now allow quantification of chromogen or fluorescence signals by computer-assisted image analysis, providing continuous immunohistochemistry values. Fluorescence immunohistochemistry offers greater dynamic signal range than chromogen immunohistochemistry, and combined with image analysis holds the promise of enhanced sensitivity and analytic resolution, and consequently more robust quantification. However, commercial fluorescence scanners and image analysis software differ in features and capabilities, and claims of objective quantitative immunohistochemistry are difficult to validate as pathologist scoring is subjective and there is no accepted gold standard. Here we provide the first side-by-side validation of two technologically distinct commercial fluorescence immunohistochemistry analysis platforms. We document highly consistent results by (1) concordance analysis of fluorescence immunohistochemistry values and (2) agreement in outcome predictions both for objective, data-driven cutpoint dichotomization with Kaplan–Meier analyses or employment of continuous marker values to compute receiver operating characteristic curves. The two platforms examined rely on distinct fluorescence immunohistochemistry imaging hardware, microscopy vs line scanning, and functionally distinct image analysis software. Fluorescence immunohistochemistry values for nuclear-localized and tyrosine-phosphorylated Stat5a/b computed by each platform on a cohort of 323 breast cancer cases revealed high concordance after linear calibration, a finding confirmed on an independent 382 case cohort, with concordance correlation coefficients >0
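
    Agreement between two platforms' continuous measurements, as in the concordance analysis above, is commonly quantified with Lin's concordance correlation coefficient. A minimal sketch using population moments (not the platforms' software):

```python
def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two sets of
    paired continuous measurements (population variances/covariance)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

ccc_identical = concordance_ccc([1, 2, 3, 4], [1, 2, 3, 4])   # perfect agreement
ccc_shifted = concordance_ccc([1, 2, 3, 4], [2, 3, 4, 5])     # penalized offset
```

    Unlike Pearson correlation, the coefficient penalizes systematic shifts between platforms, which is why linear calibration precedes concordance assessment.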

  14. Cross-Platform Technologies

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2017-04-01

    Full Text Available Cross-platform is a concept that has become increasingly used in recent years, especially in the development of mobile apps, but it has also been applied consistently over time in the development of conventional desktop applications. The notion of cross-platform software (multi-platform or platform-independent) refers to a software application that can run on more than one operating system or computing architecture. Thus, a cross-platform application can operate independently of the software or hardware platform on which it is executed. As this generic definition admits a wide range of meanings, for the purposes of this paper we narrow it as follows: a cross-platform application is a software application that can run on more than one operating system (desktop or mobile) in an identical or similar way.

  15. Morphology Independent Learning in Modular Robots

    DEFF Research Database (Denmark)

    Christensen, David Johan; Bordignon, Mirko; Schultz, Ulrik Pagh

    2009-01-01

    Hand-coding locomotion controllers for modular robots is difficult due to their polymorphic nature. Instead, we propose to use a simple and distributed reinforcement learning strategy. ATRON modules with identical controllers can be assembled in any configuration. To optimize the robot’s locomotion speed, its modules independently and in parallel adjust their behavior based on a single global reward signal. In simulation, we study the learning strategy’s performance on different robot configurations. On the physical platform, we perform learning experiments with ATRON robots learning to move as fast
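
    The learning strategy, identical and independent learners driven by one global reward, can be sketched as follows; the action set, reward function, and parameters are illustrative assumptions, not the ATRON implementation:

```python
import random

def learn(n_modules, n_actions, reward_fn, episodes=500, eps=0.1, alpha=0.2, seed=0):
    """Independent learners: each module keeps its own action-value table and
    updates it, in parallel with the others, from one shared global reward."""
    rng = random.Random(seed)
    q = [[0.0] * n_actions for _ in range(n_modules)]
    for _ in range(episodes):
        acts = []
        for qm in q:                      # each module picks eps-greedily
            if rng.random() < eps:
                acts.append(rng.randrange(n_actions))
            else:
                acts.append(max(range(n_actions), key=lambda a: qm[a]))
        r = reward_fn(acts)               # one scalar reward for the whole robot
        for qm, a in zip(q, acts):        # every module updates independently
            qm[a] += alpha * (r - qm[a])
    return q

# Toy "locomotion" reward: speed is the fraction of modules choosing gait 1
speed = lambda acts: sum(a == 1 for a in acts) / len(acts)
q = learn(n_modules=4, n_actions=2, reward_fn=speed)
```

    Because every module sees the same reward, no module needs to know the robot's overall configuration, which is what makes the approach morphology independent.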

  16. DC Brushless Motor Control Design and Preliminary Testing for Independent 4-Wheel Drive Rev-11 Robotic Platform

    Directory of Open Access Journals (Sweden)

    Roni Permana Saputra

    2012-03-01

    Full Text Available This paper discusses the design of a control system for a brushless DC motor using the microcontroller ATMega 16, to be applied to the independent 4-wheel drive Mobile Robot LIPI version 2 (REV-11). The control system consists of two parts: a brushless DC motor control module and a supervisory control module that coordinates the desired commands to the motor control modules. To control the REV-11 platform, the supervisory control transmits reference speed and direction data to control the speed and direction of each actuator on the REV-11 platform. From the test results it is concluded that the designed control system works properly to coordinate and control the speed and direction of motion of the actuator motors of the REV-11 platform.

  17. RTE - Compliance with the code of good practices and Independence of RTE. 2013 Annual Report

    International Nuclear Information System (INIS)

    2013-01-01

    RTE Reseau de Transport d'Electricite (Electricity Transmission System Operator) is referred to in Article L111-40 of the French Energy Code as the company in charge of managing France's public electricity transmission grid. For this purpose, RTE must comply with all the rules and obligations that apply to transmission grid management companies as defined by the Energy Code. More particularly, the articles concerning the Transmission System Operators (TSOs) belonging to a Vertically Integrated Undertaking (VIU) apply to RTE, a wholly-owned subsidiary of Electricite de France. The purpose of these provisions is to establish and maintain over time the independence of the transmission grid operator vis-a-vis the VIU. The Commission de Regulation de l'Energie (CRE - Energy Regulation Board) certified RTE in its deliberation of January 26, 2012. To maintain this certification, RTE is required to comply with the commitments made within the framework of the certification process and maintain the conditions of independence that were approved by the CRE. Among the obligations that RTE must meet as an independent transmission manager is the need to bring together 'in a code of good practices approved by the Energy Regulation Board, the organisational measures taken to prevent any risks of discriminatory practices in terms of access to the grid' (Article L111-22). RTE is also required to put in place 'a Compliance Officer in charge of ensuring [...] the conformity of its methods with the obligations of independence incumbent on it with regard to other companies belonging to the VIU', 'to verify the application [...] of the commitments appearing in the code of good practices' and to draw up 'an annual report [...] which it sends on to the Energy Regulating Board' on the subject (Article L111-34). This document is the report of the RTE Compliance Officer regarding compliance with the code of good practices for 2013. It is addressed to the CRE and is intended to

  18. Growth platform-dependent and -independent phenotypic and metabolic responses of Arabidopsis and its halophytic relative, Eutrema salsugineum, to salt stress.

    Science.gov (United States)

    Kazachkova, Yana; Batushansky, Albert; Cisneros, Aroldo; Tel-Zur, Noemi; Fait, Aaron; Barak, Simon

    2013-07-01

    Comparative studies of the stress-tolerant Arabidopsis (Arabidopsis thaliana) halophytic relative, Eutrema salsugineum, have proven a fruitful approach to understanding natural stress tolerance. Here, we performed comparative phenotyping of Arabidopsis and E. salsugineum vegetative development under control and salt-stress conditions, and then compared the metabolic responses of the two species on different growth platforms in a defined leaf developmental stage. Our results reveal both growth platform-dependent and -independent phenotypes and metabolic responses. Leaf emergence was affected in a similar way in both species grown in vitro but the effects observed in Arabidopsis occurred at higher salt concentrations in E. salsugineum. No differences in leaf emergence were observed on soil. A new effect of a salt-mediated reduction in E. salsugineum leaf area was unmasked. On soil, leaf area reduction in E. salsugineum was mainly due to a fall in cell number, whereas both cell number and cell size contributed to the decrease in Arabidopsis leaf area. Common growth platform-independent leaf metabolic signatures such as high raffinose and malate, and low fumarate contents that could reflect core stress tolerance mechanisms, as well as growth platform-dependent metabolic responses were identified. In particular, the in vitro growth platform led to repression of accumulation of many metabolites including sugars, sugar phosphates, and amino acids in E. salsugineum compared with the soil system where these same metabolites accumulated to higher levels in E. salsugineum than in Arabidopsis. The observation that E. salsugineum maintains salt tolerance despite growth platform-specific phenotypes and metabolic responses suggests a considerable degree of phenotypic and metabolic adaptive plasticity in this extremophile.

  19. An integrated development framework for rapid development of platform-independent and reusable satellite on-board software

    Science.gov (United States)

    Ziemke, Claas; Kuwahara, Toshinori; Kossev, Ivan

    2011-09-01

    Even in the field of small satellites, the on-board data handling subsystem has become complex and powerful. With the introduction of powerful CPUs and the availability of considerable amounts of memory on board a small satellite, it has become possible to utilize the flexibility and power of contemporary platform-independent real-time operating systems. Especially the non-commercial sector, such as university institutes and community projects such as AMSAT or SSETI, is characterized by an inherent lack of financial as well as manpower resources. The opportunity to utilize such real-time operating systems will contribute significantly to achieving a successful mission. Nevertheless the on-board software of a satellite is much more than just an operating system. It has to fulfill a multitude of functional requirements such as: telecommand interpretation and execution, execution of control loops, generation of telemetry data and frames, failure detection isolation and recovery, communication with peripherals, and so on. Most of the aforementioned tasks are of generic nature and have to be conducted on any satellite with only minor modifications. A general set of functional requirements, as well as a protocol for communication, is defined in the ESA ECSS-E-70-41A standard "Telemetry and telecommand packet utilization". This standard not only defines the communication protocol of the satellite-ground link but also defines a set of so-called services which have to be available on board every compliant satellite and which are of generic nature. In this paper, a platform-independent and reusable framework is described which implements not only the ECSS-E-70-41A standard but also functionalities for interprocess communication, scheduling and a multitude of tasks commonly performed on board a satellite. By making use of the capabilities of the high-level programming language C/C++, the powerful open source library BOOST, the real-time operating system RTEMS and
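
    The packet structure such standards build on can be sketched for the 6-byte CCSDS primary header plus a minimal service type/subtype field. Real ECSS-E-70-41A packets carry further fields (acknowledgement flags, source ID, error control), so this is a simplified illustration, not a compliant implementation:

```python
import struct

def pack_tc(apid, seq_count, service, subservice, data):
    """Sketch of a PUS-style telecommand: CCSDS primary header (6 bytes)
    followed by a minimal data-field header (service type/subtype) and data."""
    version, pkt_type, sec_hdr = 0, 1, 1          # telecommand with sec. header
    word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (0b11 << 14) | (seq_count & 0x3FFF)   # sequence flags: unsegmented
    field = bytes([service, subservice]) + data
    length = len(field) - 1                       # CCSDS length: octets minus one
    return struct.pack(">HHH", word1, word2, length) + field

def unpack_tc(packet):
    word1, word2, length = struct.unpack(">HHH", packet[:6])
    apid = word1 & 0x7FF
    service, subservice = packet[6], packet[7]
    data = packet[8:6 + length + 1]
    return apid, word2 & 0x3FFF, service, subservice, data

# Service (17,1) is the PUS connection test ("ping")
pkt = pack_tc(apid=0x42, seq_count=7, service=17, subservice=1, data=b"")
```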

  20. Latest improvements on TRACPWR six-equations thermohydraulic code

    International Nuclear Information System (INIS)

    Rivero, N.; Batuecas, T.; Martinez, R.; Munoz, J.; Lenhardt, G.; Serrano, P.

    1999-01-01

    The paper presents the latest improvements on TRACPWR aimed at adapting the code to present trends in computer platforms, architectures and training requirements, as well as extending the scope of the code itself and its applicability to technologies other than the Westinghouse PWR. Firstly, the major features of TRACPWR as a best-estimate and real-time simulation code are summarized; then the areas where TRACPWR is being improved are presented. These areas comprise: (1) Architecture: integrating the TRACPWR and RELAP5 codes, (2) Code scope enhancement: modelling Mid-Loop operation, (3) Code speed-up: applying parallelization techniques, (4) Code platform downsizing: porting to the Windows NT platform, (5) On-line performance: allowing simulation initialisation from a Plant Process Computer, and (6) Code scope extension: using the code for modelling VVER and PHWR technology. (author)

  1. A PLC platform-independent structural analysis on FBD programs for digital reactor protection systems

    International Nuclear Information System (INIS)

    Jung, Sejin; Yoo, Junbeom; Lee, Young-Jun

    2017-01-01

    Highlights: • FBD has been widely used to implement safety-critical software for PLC-based systems. • Such safety-critical software should be developed strictly according to safety programming guidelines. • There are no agreed rules that link FBD structures to the higher-level guidelines of NUREG/CR-6463 in a PLC platform-independent way. • This paper proposes a set of rules on the structure of FBD programs with specific links to the higher-level guidelines. • This paper also provides the CASE tool ‘FBD Checker’ for analyzing the structure of FBD programs. - Abstract: FBD (function block diagram) has been widely used to implement safety-critical software for PLC (programmable logic controller)-based digital nuclear reactor protection systems. Such software should be developed strictly in accordance with safety programming guidelines such as NUREG/CR-6463. Software engineering tools of PLC vendors enable structural analyses of FBD programs, but the specific rules pertaining to the guidelines are enclosed within the commercial tools, and their links to the guidelines are not clearly communicated. This paper proposes a set of rules on the structure of FBD programs in accordance with the guidelines, and we develop an automatic analysis tool for FBD programs written in the PLCopen TC6 format. With the proposed tool, any FBD program that is transformed into the open format can be analyzed in a PLC platform-independent way. We consider a case study on FBD programs obtained from a preliminary version of a Korean nuclear power plant, and we demonstrate the effectiveness and potential of the proposed rules and analysis tool.
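
    A structural rule check over PLCopen TC6-style XML can be sketched as follows. The XML snippet and the rule are simplified illustrations (real TC6 files are namespaced and far richer), not the FBD Checker tool itself:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a PLCopen TC6 FBD body; element and attribute
# names follow the TC6 style but the schema here is assumed.
FBD_XML = """
<FBD>
  <block localId="1" typeName="AND"/>
  <block localId="2" typeName="MOVE"/>
  <block localId="3" typeName="SEL"/>
</FBD>
"""

def check_rules(xml_text, disallowed):
    """Structural rule check: report blocks whose type violates a guideline."""
    root = ET.fromstring(xml_text)
    violations = []
    for blk in root.iter("block"):
        if blk.get("typeName") in disallowed:
            violations.append((blk.get("localId"), blk.get("typeName")))
    return violations

# Hypothetical rule for illustration: flag 'SEL' blocks for manual review
issues = check_rules(FBD_XML, disallowed={"SEL"})
```

    Because the check operates on the open exchange format rather than a vendor's project file, the same rules apply regardless of the target PLC platform.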

  2. ON CODE REFACTORING OF THE DIALOG SUBSYSTEM OF CDSS PLATFORM FOR THE OPEN-SOURCE MIS OPENMRS

    Directory of Open Access Journals (Sweden)

    A. V. Semenets

    2016-08-01

    The open-source MIS OpenMRS developer tools and software APIs are reviewed. The results of code refactoring of the dialog subsystem of the CDSS platform, implemented as a module for the open-source MIS OpenMRS, are presented. The information model of the database of the CDSS dialog subsystem was updated in accordance with MIS OpenMRS requirements. The Model-View-Controller (MVC) based architecture of the CDSS dialog subsystem was re-implemented in the Java programming language using the Spring and Hibernate frameworks. The MIS OpenMRS Encounter portlet form for CDSS dialog subsystem integration was developed as an extension. The administrative module of the CDSS platform was recreated. The data exchange formats and methods for interaction between the OpenMRS CDSS dialog subsystem module and the DecisionTree GAE service were re-implemented with AJAX via the jQuery library.

  3. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  4. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  5. On the notion of abstract platform in MDA development

    NARCIS (Netherlands)

    Andrade Almeida, João; Dijkman, R.M.; van Sinderen, Marten J.; Ferreira Pires, Luis

    2004-01-01

    Although platform-independence is a central property in MDA models, the study of platform-independence has been largely overlooked in MDA. As a consequence, there is a lack of guidelines to select abstraction criteria and modelling concepts for platform-independent design. In addition, there is

  6. Network Coding Applications and Implementations on Mobile Devices

    DEFF Research Database (Denmark)

    Fitzek, Frank; Pedersen, Morten Videbæk; Heide, Janus

    2010-01-01

    Network coding has attracted a lot of attention lately. The goal of this paper is to demonstrate that the implementation of network coding is feasible on mobile platforms. The paper will guide the reader through some examples and demonstrate uses for network coding. Furthermore, the paper will also show that the implementation of network coding is feasible today on commercial mobile platforms.

  7. Implementation of Online Veterinary Hospital on Cloud Platform.

    Science.gov (United States)

    Chen, Tzer-Shyong; Chen, Tzer-Long; Chung, Yu-Fang; Huang, Yao-Min; Chen, Tao-Chieh; Wang, Huihui; Wei, Wei

    2016-06-01

    Pet markets hold great commercial potential, which has boosted the thriving development of veterinary hospital businesses. The service faces intense competition and a diversified channel environment. Information technology is integrated to develop the veterinary hospital cloud service platform. The platform covers not only pet medical services but also veterinary hospital management and services. In the study, QR Code and cloud technology are applied to establish the veterinary hospital cloud service platform for pet search by labeling a pet's identification with a QR Code. This technology removes the restriction of veterinary hospital inspection to a single area and allows veterinary hospitals to receive medical records and information through the exclusive QR Code for more effective inspection. As an interactive platform, the veterinary hospital cloud service platform allows pet owners to gain knowledge of pet diseases and healthcare. Moreover, pet owners can make enquiries and communicate with veterinarians through the platform. Also, veterinary hospitals can periodically send reminders of relevant points and introduce exclusive marketing information through the platform for promoting service items and establishing individualized marketing. Consequently, veterinary hospitals can increase their profits through information sharing and create the best solution in such a competitive veterinary market with industry alliances.

  8. Independent assessment of TRAC and RELAP5 codes through separate effects tests

    International Nuclear Information System (INIS)

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.; Pu, J.

    1983-01-01

    Independent assessment of TRAC-PF1 (Version 7.0), TRAC-BD1 (Version 12.0) and RELAP5/MOD1 (Cycle 14) that was initiated at BNL in FY 1982, has been completed in FY 1983. As in the previous years, emphasis at Brookhaven has been in simulating various separate-effects tests with these advanced codes and identifying the areas where further thermal-hydraulic modeling improvements are needed. The following six categories of tests were simulated with the above codes: (1) critical flow tests (Moby-Dick nitrogen-water, BNL flashing flow, Marviken Test 24); (2) Counter-Current Flow Limiting (CCFL) tests (University of Houston, Dartmouth College single and parallel tube tests); (3) level swell tests (G.E. large vessel test); (4) steam generator tests (B and W 19-tube model S.G. tests, FLECHT-SEASET U-tube S.G. tests); (5) natural circulation tests (FRIGG loop tests); and (6) post-CHF tests (Oak Ridge steady-state test)

  9. Design and Implementation of a Malicious Code Detection Platform for the iOS System

    Institute of Scientific and Technical Information of China (English)

    田庆宜

    2013-01-01

    With the increasing popularity of the Apple iPhone, Apple terminals have become important targets for hackers. Crimes in which hackers use malicious code to steal personal information and money emerge in an endless stream, yet law enforcement departments currently lack a platform for detecting malicious code on the Apple iOS mobile platform. Based on the iOS platform security model, and on an analysis of attack methods reported at home and abroad, this paper designs a malicious code detection framework for Apple devices and, on this basis, builds a malicious code detection platform for iOS app applications. Actual testing shows that the platform is capable of detecting the communication traffic and file changes produced by malicious code on the iOS system. The overall cost of the platform is relatively low, which favors equipping basic-level law enforcement departments.

  10. Integration of the TNXYZ computer program inside the platform Salome

    International Nuclear Information System (INIS)

    Chaparro V, F. J.

    2014-01-01

    The present work shows the procedure carried out to integrate the TNXYZ code as a calculation tool into the graphical simulation platform Salome. The TNXYZ code proposes a numerical solution of the neutron transport equation, in several energy groups, steady state and three-dimensional geometry. In order to discretize the variables of the transport equation, the code uses the method of discrete ordinates for the angular variable, and a nodal method for the spatial dependence. The Salome platform is a graphical environment designed for building, editing and simulating mechanical models, mainly focused on industry, and, unlike other software, is able to integrate and control an external source code so as to form a complete scheme of pre- and post-processing of information. Before the integration into the Salome platform, the TNXYZ code was upgraded. TNXYZ was programmed in the 90s using a Fortran 77 compiler; for this reason the code was adapted to the characteristics of current Fortran compilers. In addition, with the intention of extracting partial results over the process sequence, the original structure of the program underwent a modularization process, i.e. the main program was divided into sections where the code performs major operations. This procedure is controlled by the information module (YACS) of the Salome platform, and it could be useful for a subsequent coupling with thermal-hydraulics codes. Finally, with the help of the Monte Carlo code Serpent, several study cases were defined in order to check the integration; the verification consisted in comparing the results obtained with the code executed stand-alone against the code as modernized, integrated and controlled by the Salome platform. (Author)
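
    The discrete-ordinates approach named above can be illustrated in a deliberately reduced setting: a 1-D, one-group slab with S2 quadrature, diamond differencing, and source iteration. TNXYZ itself is three-dimensional, multigroup, and nodal in space, so this sketch only shows the numerical idea:

```python
def sn_slab(nx=50, width=10.0, sigma_t=1.0, sigma_s=0.5, source=1.0,
            tol=1e-8, max_iter=500):
    """1-D, one-group discrete-ordinates (S2) transport solve on a slab with
    vacuum boundaries: source iteration with diamond differencing."""
    dx = width / nx
    mus = [-0.5773502691896257, 0.5773502691896257]   # S2 Gauss points
    wts = [1.0, 1.0]                                  # weights sum to 2
    phi = [0.0] * nx                                  # scalar flux guess
    for _ in range(max_iter):
        # isotropic source per unit direction from the previous flux iterate
        q = [(sigma_s * phi[i] + source) / 2.0 for i in range(nx)]
        phi_new = [0.0] * nx
        for mu, w in zip(mus, wts):
            coef = abs(mu) / dx
            psi_in = 0.0                              # vacuum boundary
            cells = range(nx) if mu > 0 else range(nx - 1, -1, -1)
            for i in cells:                           # transport sweep
                psi_out = ((coef - sigma_t / 2.0) * psi_in + q[i]) \
                          / (coef + sigma_t / 2.0)
                phi_new[i] += w * (psi_in + psi_out) / 2.0
                psi_in = psi_out
        if max(abs(x - y) for x, y in zip(phi, phi_new)) < tol:
            return phi_new
        phi = phi_new
    raise RuntimeError("source iteration did not converge")

flux = sn_slab()
```

    With a scattering ratio of 0.5 the iteration converges quickly; the flux is symmetric, peaks at mid-slab, and stays below the infinite-medium value source/(sigma_t - sigma_s) = 2.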

  11. LeARN: a platform for detecting, clustering and annotating non-coding RNAs

    Directory of Open Access Journals (Sweden)

    Schiex Thomas

    2008-01-01

    Full Text Available Abstract Background In the last decade, sequencing projects have led to the development of a number of annotation systems dedicated to the structural and functional annotation of protein-coding genes. These annotation systems manage the annotation of non-protein-coding genes (ncRNAs) in a very crude way, allowing neither the editing of secondary structures nor the clustering of ncRNA genes into families, both of which are crucial for appropriate annotation of these molecules. Results LeARN is a flexible software package which handles the complete process of ncRNA annotation by integrating the layers of automatic detection and human curation. Conclusion This software provides the infrastructure to deal properly with ncRNAs in the framework of any annotation project. It fills the gap between existing prediction software, which detects independent ncRNA occurrences, and public ncRNA repositories, which do not offer the flexibility and interactivity required for annotation projects. The software is freely available from the download section of the website http://bioinfo.genopole-toulouse.prd.fr/LeARN

  12. Omnidirectional holonomic platforms

    International Nuclear Information System (INIS)

    Pin, F.G.; Killough, S.M.

    1994-01-01

    This paper presents the concepts for a new family of wheeled platforms which feature full omnidirectionality with simultaneous and independently controlled rotational and translational motion capabilities. The authors first present the orthogonal-wheels concept and the two major wheel assemblies on which these platforms are based. They then describe how a combination of these assemblies with appropriate control can be used to generate an omnidirectional capability for mobile robot platforms. The design and control of two prototype platforms are then presented and their respective characteristics with respect to rotational and translational motion control are discussed

  13. Error floor behavior study of LDPC codes for concatenated codes design

    Science.gov (United States)

    Chen, Weigang; Yin, Liuguo; Lu, Jianhua

    2007-11-01

    Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small using quantized sum-product (SP) algorithm. Therefore, LDPC code may serve as the inner code in a concatenated coding system with a high code rate outer code and thus an ultra low error floor can be achieved. This conclusion is also verified by the experimental results.

  14. YARP: Yet Another Robot Platform

    Directory of Open Access Journals (Sweden)

    Lorenzo Natale

    2008-11-01

    Full Text Available We describe YARP, Yet Another Robot Platform, an open-source project that encapsulates lessons from our experience in building humanoid robots. The goal of YARP is to minimize the effort devoted to infrastructure-level software development by facilitating code reuse, modularity and so maximize research-level development and collaboration. Humanoid robotics is a "bleeding edge" field of research, with constant flux in sensors, actuators, and processors. Code reuse and maintenance is therefore a significant challenge. We describe the main problems we faced and the solutions we adopted. In short, the main features of YARP include support for inter-process communication, image processing as well as a class hierarchy to ease code reuse across different hardware platforms. YARP is currently used and tested on Windows, Linux and QNX6 which are common operating systems used in robotics.

  15. Targeting multiple heterogeneous hardware platforms with OpenCL

    Science.gov (United States)

    Fox, Paul A.; Kozacik, Stephen T.; Humphrey, John R.; Paolini, Aaron; Kuller, Aryeh; Kelmelis, Eric J.

    2014-06-01

    The OpenCL API allows for the abstract expression of parallel, heterogeneous computing, but hardware implementations have substantial implementation differences. The abstractions provided by the OpenCL API are often insufficiently high-level to conceal differences in hardware architecture. Additionally, implementations often do not take advantage of potential performance gains from certain features due to hardware limitations and other factors. These factors make it challenging to produce code that is portable in practice, resulting in much OpenCL code being duplicated for each hardware platform being targeted. This duplication of effort offsets the principal advantage of OpenCL: portability. The use of certain coding practices can mitigate this problem, allowing a common code base to be adapted to perform well across a wide range of hardware platforms. To this end, we explore some general practices for producing performant code that are effective across platforms. Additionally, we explore some ways of modularizing code to enable optional optimizations that take advantage of hardware-specific characteristics. The minimum requirement for portability implies avoiding the use of OpenCL features that are optional, not widely implemented, poorly implemented, or missing in major implementations. Exposing multiple levels of parallelism allows hardware to take advantage of the types of parallelism it supports, from the task level down to explicit vector operations. Static optimizations and branch elimination in device code help the platform compiler to effectively optimize programs. Modularization of some code is important to allow operations to be chosen for performance on target hardware. Optional subroutines exploiting explicit memory locality allow for different memory hierarchies to be exploited for maximum performance. The C preprocessor and JIT compilation using the OpenCL runtime can be used to enable some of these techniques, as well as to factor in hardware
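The preprocessor/JIT technique mentioned in the abstract can be sketched without an OpenCL runtime: constants are baked into the kernel source as `#define` lines before JIT compilation, so the device compiler sees fixed values, can unroll the `TILE` loop, and can eliminate the disabled branch entirely. The kernel template and helper name below are illustrative, not from the paper.

```python
KERNEL_TEMPLATE = """
__kernel void scale(__global float *data) {
    int gid = get_global_id(0);
#if USE_LOCAL_MEM
    /* optional path exploiting explicit local memory, enabled per platform */
#endif
    for (int i = 0; i < TILE; ++i)    /* TILE becomes a compile-time constant */
        data[gid * TILE + i] *= FACTOR;
}
"""

def specialize(template, **constants):
    """Prepend #define lines so the OpenCL JIT compiler sees fixed constants."""
    header = "\n".join(f"#define {name} {value}" for name, value in constants.items())
    return header + "\n" + template

# The specialized source would be handed to clCreateProgramWithSource / clBuildProgram.
src = specialize(KERNEL_TEMPLATE, TILE=16, FACTOR="2.0f", USE_LOCAL_MEM=0)
```

The same template can thus target a GPU (with local memory enabled) and a CPU (without) from one code base, which is the modularity the abstract argues for.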

  16. Coupling of the neutron-kinetic core model DYN3D with the thermal hydraulic code FLICA-4 within the NURESIM platform

    International Nuclear Information System (INIS)

    Gommlich, A.; Kliem, S.; Rohde, U.; Gomez, A.; Sanchez, V.

    2010-01-01

    Within the FP7 Collaborative Project NURISP (NUclear Reactor Integrated Simulation Project), new and significant steps are being taken towards a European Reference Simulation Platform for applications relevant to present PWRs and BWRs and to future reactors. The first step towards this target was made during the FP6 NURESIM Integrated Project, where the common and well-proven NURESIM informatics platform was developed. This platform is based on the open source software SALOME. The 3D neutron kinetic core model DYN3D developed at Forschungszentrum Dresden-Rossendorf is part of the NURESIM platform. Within the NURESIM project, a SALOME-based pre-processor for the creation of DYN3D input data sets via a GUI was developed. DYN3D was implemented into SALOME as a black box, which allowed independent execution. A converter of the DYN3D result file into SALOME format was developed, which opened the possibility of using SALOME tools to visualize DYN3D results. (orig.)

  17. Evaluation of Independent Audit and Corporate Governance Practices in Turkey Under the Turkish Commercial Code No. 6102: A Qualitative Research

    Directory of Open Access Journals (Sweden)

    Yasin Karadeniz

    2015-12-01

    Full Text Available The purpose of this study is to explain the new dimension that corporate governance practices, which have been troubled for years in Turkey, have acquired with the Turkish Commercial Code, and, in doing so, to reveal the importance of independent auditing, which has not yet become fully functional and has likewise gone through many problems in practice in Turkey, both at present and in the future, in the light of the relation between the Turkish Commercial Code and corporate governance. Interviews, as a qualitative research method, were conducted face to face with at least one chief auditor (mostly CPAs) working in independent auditing firms in the cities of İzmir and Çanakkale. The interviews with the auditors revealed the view that the Turkish Commercial Code and corporate governance in Turkey will contribute positively to the development of independent auditing.

  18. AZTLAN: Mexican platform for analysis and design of nuclear reactors - 15493

    International Nuclear Information System (INIS)

    Gomez Torres, A.M.; Puente Espel, F.; Valle Gallegos, E. del; Francois, J.L.; Martin-del-Campo, C.; Espinosa-Paredes, G.

    2015-01-01

    The AZTLAN platform is presented in this paper. This project aims at modernizing, improving and incorporating neutron transport codes such as AZTLAN, AZKIND and AZNHEX, thermal-hydraulics codes such as AZTHECA, and thermo-mechanical codes, developed in Mexican institutions of higher education as well as in the Mexican nuclear research institute, into an integrated platform, established and maintained for the benefit of Mexican nuclear knowledge. An important part of the project is to develop a coupling methodology between the neutron transport codes and the thermal-hydraulics codes in order to obtain an accurate three-dimensional simulation of a reactor core
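A coupling methodology between neutronics and thermal-hydraulics codes typically amounts to a fixed-point (Picard) iteration: each solver consumes the other's latest field until both agree. The sketch below uses deliberately toy scalar feedback models; the coefficients and function name are illustrative, not AZTLAN's.

```python
def picard_coupling(tol=1e-10, max_iter=200):
    """Iterate toy neutronics (power from fuel temperature) against
    toy thermal-hydraulics (temperature from power) until both agree."""
    temp, power = 300.0, 1.0
    for iteration in range(1, max_iter + 1):
        new_power = 1.0 / (1.0 + 1e-4 * (temp - 300.0))  # Doppler-like negative feedback
        new_temp = 300.0 + 50.0 * new_power               # steady-state heat balance
        if abs(new_power - power) < tol and abs(new_temp - temp) < tol:
            return new_power, new_temp, iteration
        power, temp = new_power, new_temp
    raise RuntimeError("coupling did not converge")
```

In a real platform each update would be a full 3D solve exchanging power and temperature fields, but the convergence logic has the same shape.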

  19. prfectBLAST: a platform-independent portable front end for the command terminal BLAST+ stand-alone suite.

    Science.gov (United States)

    Santiago-Sotelo, Perfecto; Ramirez-Prado, Jorge Humberto

    2012-11-01

    prfectBLAST is a multiplatform graphical user interface (GUI) for the stand-alone BLAST+ suite of applications. It allows researchers to do nucleotide or amino acid sequence similarity searches against public (or user-customized) databases that are locally stored. It does not require any dependencies or installation and can be used from a portable flash drive. prfectBLAST is implemented in Java version 6 (SUN) and runs on all platforms that support Java and for which National Center for Biotechnology Information has made available stand-alone BLAST executables, including MS Windows, Mac OS X, and Linux. It is free and open source software, made available under the GNU General Public License version 3 (GPLv3) and can be downloaded at www.cicy.mx/sitios/jramirez or http://code.google.com/p/prfectblast/.
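A GUI front end like prfectBLAST ultimately assembles and launches a stand-alone BLAST+ command line. The sketch below only builds the argument list, so it does not require `blastn` to be installed; the flags shown (`-query`, `-db`, `-outfmt`) are standard BLAST+ options, while the helper names are illustrative.

```python
import subprocess

BLAST_PROGRAMS = {"blastn", "blastp", "blastx", "tblastn", "tblastx"}

def build_blast_command(program, query_path, db_path, outfmt=6):
    """Assemble a stand-alone BLAST+ invocation (tabular output by default)."""
    if program not in BLAST_PROGRAMS:
        raise ValueError(f"unknown BLAST program: {program}")
    return [program, "-query", query_path, "-db", db_path, "-outfmt", str(outfmt)]

def run_blast(cmd):
    """Launch the command; a GUI front end would capture stdout for display."""
    return subprocess.run(cmd, capture_output=True, text=True)
```

Wrapping the executables this way is what lets one Java (or Python) front end stay portable across every platform for which NCBI ships BLAST binaries.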

  20. Users and Programmers Guide for HPC Platforms in CIEMAT

    International Nuclear Information System (INIS)

    Munoz Roldan, A.

    2003-01-01

    This Technical Report presents a description of the High Performance Computing platforms available to researchers in CIEMAT and dedicated mainly to scientific computing. It targets users and programmers, and aims to help in the processes of developing new code and porting existing code across platforms. A brief review of the historical evolution of HPC, i.e., of programming paradigms and underlying architectures, is also presented. (Author) 32 refs

  1. Development of a platform-independent receiver control system for SISIFOS

    Science.gov (United States)

    Lemke, Roland; Olberg, Michael

    1998-05-01

    Up to now, receiver control software has been a time-consuming development, usually written by receiver engineers who had mainly the hardware in mind. We present a low-cost and very flexible system which uses a minimal interface to the real hardware and which is easy to adapt to new receivers. Our system uses Tcl/Tk as a graphical user interface (GUI), SpecTcl as a GUI builder, Pgplot as plotting software, a simple query language (SQL) database for information storage and retrieval, Ethernet socket-to-socket communication, and SCPI as a command control language. The complete system is in principle platform independent, but for cost-saving reasons we are actually running it on a PC486 under Linux 2.0.30, which is a copylefted Unix. The only hardware-dependent parts are the digital input/output boards and the analog-to-digital and digital-to-analog converters. In the case of the Linux PC we use a device driver development kit to integrate the boards fully into the kernel of the operating system, which indeed makes them look like ordinary devices. The advantages of this system are, firstly, the low price and, secondly, the clear separation between the different software components, which are available for many operating systems. If it is not possible, due to CPU performance limitations, to run all the software on a single machine, the SQL database or the graphical user interface could be installed on separate computers.
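The socket-plus-SCPI control path described above can be sketched with a loopback mock instrument. `*IDN?` is the standard SCPI identification query; the mock's reply string and the helper names are invented for the example.

```python
import socket
import threading

def run_mock_instrument(host="127.0.0.1"):
    """Start a one-shot mock instrument that answers the SCPI *IDN? query."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, 0))            # port 0: let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    def serve():
        conn, _ = server.accept()
        with conn, conn.makefile("r") as reader:
            command = reader.readline().strip()
            if command == "*IDN?":
                conn.sendall(b"SISIFOS,mock-receiver,0,1.0\n")
            else:
                conn.sendall(b"ERROR\n")
        server.close()

    threading.Thread(target=serve, daemon=True).start()
    return port

def query(port, command, host="127.0.0.1"):
    """Send one SCPI command and return the instrument's reply line."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall((command + "\n").encode())
        return sock.makefile("r").readline().strip()
```

Keeping the wire protocol to newline-terminated SCPI strings over plain sockets is what makes the GUI, the database and the hardware server freely relocatable across machines.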

  2. Press touch code: A finger press based screen size independent authentication scheme for smart devices.

    Science.gov (United States)

    Ranak, M S A Noman; Azad, Saiful; Nor, Nur Nadiah Hanim Binti Mohd; Zamli, Kamal Z

    2017-01-01

    Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing rapidly. In parallel, security-related threats and attacks on these devices are increasing at an even greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full-sized keyboards. In this paper, we propose a new screen size independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT)-a.k.a., Force Touch in Apple's MacBook, Apple Watch, ZTE's Axon 7 phone; 3D Touch in iPhone 6 and 7; and so on-is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme.
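Conceptually, a press-touch code quantizes each press force into one of a few discrete levels, so the same secret can be entered on any screen size. The thresholds, symbols and helper names below are illustrative, not the paper's actual encoding.

```python
def to_press_touch_code(pressures, thresholds=(0.33, 0.66)):
    """Map normalized press forces (0..1) to Light/Medium/Hard symbols."""
    code = []
    for p in pressures:
        if p < thresholds[0]:
            code.append("L")
        elif p < thresholds[1]:
            code.append("M")
        else:
            code.append("H")
    return "".join(code)

def verify(entered_pressures, stored_code):
    """Authentication succeeds when the quantized entry matches the stored code."""
    return to_press_touch_code(entered_pressures) == stored_code
```

Because the secret lives in the force dimension rather than in on-screen positions, an observer watching the screen (shoulder surfing) or the smudge trail learns little about the code.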

  3. Press touch code: A finger press based screen size independent authentication scheme for smart devices

    Science.gov (United States)

    Ranak, M. S. A. Noman; Nor, Nur Nadiah Hanim Binti Mohd; Zamli, Kamal Z.

    2017-01-01

    Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing rapidly. In parallel, security-related threats and attacks on these devices are increasing at an even greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full-sized keyboards. In this paper, we propose a new screen size independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT)—a.k.a., Force Touch in Apple’s MacBook, Apple Watch, ZTE’s Axon 7 phone; 3D Touch in iPhone 6 and 7; and so on—is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme. PMID:29084262

  4. Press touch code: A finger press based screen size independent authentication scheme for smart devices.

    Directory of Open Access Journals (Sweden)

    M S A Noman Ranak

    Full Text Available Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing rapidly. In parallel, security-related threats and attacks on these devices are increasing at an even greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full-sized keyboards. In this paper, we propose a new screen size independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT), a.k.a. Force Touch in Apple's MacBook, Apple Watch, ZTE's Axon 7 phone, 3D Touch in iPhone 6 and 7, and so on, is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme.

  5. The Definitive Guide to NetBeans Platform

    CERN Document Server

    Bock, Heiko

    2009-01-01

    The Definitive Guide to NetBeans™ Platform is a thorough and definitive introduction to the NetBeans Platform, covering all its major APIs in detail, with relevant code examples used throughout. The original German book on which this title is based was well received. The NetBeans Platform Community has put together this English translation, which author Heiko Bock updated to cover the latest NetBeans Platform 6.5 APIs. With an introduction by known NetBeans Platform experts Jaroslav Tulach, Tim Boudreau, and Geertjan Wielenga, this is the most up-to-date book on this topic at the moment. All

  6. Advances in the development of the Mexican platform for analysis and design of nuclear reactors: AZTLAN Platform

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Puente E, F.; Del Valle G, E.; Francois L, J. L.; Espinosa P, G.

    2017-09-01

    The AZTLAN platform project: development of a Mexican platform for the analysis and design of nuclear reactors, financed by the SENER-CONACYT Energy Sustainability Fund, was approved in early 2014 and formally began at the end of that year. It is a national project led by the Instituto Nacional de Investigaciones Nucleares (ININ), with the collaboration of the Instituto Politecnico Nacional (IPN), the Universidad Autonoma Metropolitana (UAM) and the Universidad Nacional Autonoma de Mexico (UNAM) as part of the development team, and with the participation of the Laguna Verde Nuclear Power Plant, the National Commission of Nuclear Safety and Safeguards, the Ministry of Energy and the Karlsruhe Institute of Technology (KIT, Germany) as part of the user group. The general objective of the project is to modernize, improve and integrate the neutronic, thermal-hydraulic and thermo-mechanical codes developed in Mexican institutions into an integrated platform, developed and maintained by Mexican experts for the benefit of Mexican institutions. Two years into the project, important steps have been taken that have consolidated the platform. The main results of these first two years have been presented in different national and international forums. In this congress, some of the most recent results that have been implemented in the platform codes are shown in more detail. The current status of the platform, from a more executive viewpoint, is summarized in this paper. (Author)

  7. Wireless sensor platform

    Science.gov (United States)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    2017-08-08

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to a central computer(s) for use in monitoring an area associated with the sensors.
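The spread-spectrum encoding mentioned above can be sketched as binary direct-sequence spreading: each data bit is XORed with a chip sequence, and the receiver despreads by majority vote, which tolerates a few corrupted chips. The chip sequence and helper names below are illustrative, not the platform's actual code sequence.

```python
def spread(bits, chip_seq):
    """Spread each data bit: XOR it with every chip of the sequence."""
    out = []
    for b in bits:
        out.extend(c ^ b for c in chip_seq)
    return out

def despread(chips, chip_seq):
    """Recover each bit by majority vote over its chip block."""
    n = len(chip_seq)
    bits = []
    for i in range(0, len(chips), n):
        block = chips[i:i + n]
        votes = sum(c ^ s for c, s in zip(block, chip_seq))
        bits.append(1 if votes > n // 2 else 0)
    return bits
```

The redundancy of the chip sequence is what lets low-power printed sensors get readable data through a noisy channel to the central computer.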

  8. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2010-01-01

    The Azure Services Platform is a brand-new cloud-computing technology from Microsoft. It is composed of four core components-Windows Azure, .NET Services, SQL Services, and Live Services-each with a unique role in the functioning of your cloud service. It is the goal of this book to show you how to use these components, both separately and together, to build flawless cloud services. At its heart Windows Azure Platform is a down-to-earth, code-centric book. This book aims to show you precisely how the components are employed and to demonstrate the techniques and best practices you need to know

  9. FCJ-128 A Programmable Platform? Drupal, Modularity, and the Future of the Web

    Directory of Open Access Journals (Sweden)

    Fenwick McKelvey

    2011-10-01

    Full Text Available Sent as a walking advertisement of Canada’s technology sector, I arrived in Argentina to help a women’s rights organization develop a new website. I began using the Drupal content management platform to construct the site. Its interface brought me into the rarified world of web programming. My experience provides a way of entry into the Drupal platform – a platform I believe is re-programmable. The paper introduces the concept of re-programmability as a process by which users and code interact to alter software’s running code, and works out this concept through the case of Drupal and how its modular code can be re-programmed by its users. The paper utilizes the theory of transduction to flip the critique of web2.0 platforms on its head – focusing on the processes of becoming a platform, rather than the platform as a final state. This offers a new line of critique for web2.0 platforms, namely how they enact their re-programming.

  10. Low Computational Complexity Network Coding For Mobile Networks

    DEFF Research Database (Denmark)

    Heide, Janus

    2012-01-01

    Network Coding (NC) is a technique that can provide benefits in many types of networks, some examples from wireless networks are: In relay networks, either the physical or the data link layer, to reduce the number of transmissions. In reliable multicast, to reduce the amount of signaling and enable......-flow coding technique. One of the key challenges of this technique is its inherent computational complexity which can lead to high computational load and energy consumption in particular on the mobile platforms that are the target platform in this work. To increase the coding throughput several...
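Random linear network coding over GF(2), the cheapest point on the complexity spectrum highlighted above, can be sketched as follows: the sender transmits random XOR combinations of the original packets, and the receiver recovers them by Gaussian elimination once the received coefficient vectors reach full rank. The helper names are illustrative.

```python
import random

def nc_encode(packets, rng):
    """Produce one coded packet: a random GF(2) combination of the originals."""
    coeffs = [rng.randint(0, 1) for _ in packets]
    payload = [0] * len(packets[0])
    for c, pkt in zip(coeffs, packets):
        if c:
            payload = [a ^ b for a, b in zip(payload, pkt)]
    return coeffs, payload

def nc_decode(coded, k):
    """Gaussian elimination over GF(2); returns the k originals, or None
    if the received coefficient vectors do not yet have full rank."""
    rows = [list(c) + list(p) for c, p in coded]
    rank = 0
    for col in range(k):
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col]), None)
        if pivot is None:
            return None
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][col]:
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return [row[k:] for row in rows[:k]]
```

On GF(2) every operation is an XOR, which is why such fields are attractive for energy-constrained mobile platforms; larger fields lower the overhead of rank-deficient receptions at the cost of multiplications.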

  11. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.
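As an illustration of one of the local component codes named above, here is a minimal Hamming(7,4) encoder with single-error-correcting syndrome decoding. This is plain hard-decision decoding, not the MAP Ashikhmin-Lytsin decoding used in the paper; bit positions follow the standard parity layout.

```python
def hamming74_encode(data):
    """Encode 4 data bits into a 7-bit Hamming codeword [p1,p2,d1,p3,d2,d3,d4]."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(codeword):
    """Correct up to one bit error, then return the 4 data bits."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4,5,6,7
    error_pos = s1 + 2 * s2 + 4 * s3  # syndrome spells the 1-based error position
    if error_pos:
        c[error_pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

A GLDPC decoder would run many such local decoders (on soft information) in parallel across the overlapping constraint sets of the global code.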

  12. Independent assessment of TRAC-PD2 and RELAP5/MOD1 codes at BNL in FY 1981

    International Nuclear Information System (INIS)

    Saha, P.; Jo, J.H.; Neymotin, L.; Rohatgi, U.S.; Slovik, G.

    1982-12-01

    This report documents the independent assessment calculations performed with the TRAC-PD2 and RELAP/MOD1 codes at Brookhaven National Laboratory (BNL) during Fiscal Year 1981. A large variety of separate-effects experiments dealing with (1) steady-state and transient critical flow, (2) level swell, (3) flooding and entrainment, (4) steady-state flow boiling, (5) integral economizer once-through steam generator (IEOTSG) performance, (6) bottom reflood, and (7) two-dimensional phase separation of two-phase mixtures were simulated with TRAC-PD2. In addition, the early part of an overcooling transient which occurred at the Rancho Seco nuclear power plant on March 20, 1978 was also computed with an updated version of TRAC-PD2. Three separate-effects tests dealing with (1) transient critical flow, (2) steady-state flow boiling, and (3) IEOTSG performance were also simulated with RELAP5/MOD1 code. Comparisons between the code predictions and the test data are presented

  13. Code Generation from Pragmatics Annotated Coloured Petri Nets

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    limited work has been done on transforming CPN model to protocol implementations. The goal of the thesis is to be able to automatically generate high-quality implementations of communication protocols based on CPN models. In this thesis, we develop a methodology for generating implementations of protocols...... third party libraries and the code should be easily usable by third party code. Finally, the code should be readable by developers with expertise on the considered platforms. In this thesis, we show that our code generation approach is able to generate code for a wide range of platforms without altering...... such as games and rich web applications. Finally, we conclude the evaluation of the criteria of our approach by using the WebSocket PA-CPN model to show that we are able to verify fairly large protocols....

  14. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    International Nuclear Information System (INIS)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold

    2016-01-01

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen(1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU- accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this
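The 2%/2 mm gamma criterion used for the validation above can be illustrated with a simplified one-dimensional global gamma computation; the real comparison is performed on 3D dose grids, so the function below is an illustrative reduction with invented names.

```python
def gamma_pass_rate(ref_dose, eval_dose, positions, dose_tol=0.02, dist_tol=2.0):
    """Fraction of reference points with gamma <= 1
    (1D, global normalization to the maximum reference dose)."""
    d_max = max(ref_dose)
    passed = 0
    for x_r, d_r in zip(positions, ref_dose):
        # search all evaluated points for the best dose/distance trade-off
        gamma_sq = min(
            ((x_e - x_r) / dist_tol) ** 2 + ((d_e - d_r) / (dose_tol * d_max)) ** 2
            for x_e, d_e in zip(positions, eval_dose)
        )
        if gamma_sq <= 1.0:
            passed += 1
    return passed / len(ref_dose)
```

Two identical profiles give a 100% pass rate, while a grossly rescaled profile fails wherever neither a nearby dose match nor the 2 mm search radius can compensate.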

  15. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model.

    Science.gov (United States)

    Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold

    2016-07-01

    The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen(1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU- accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and

  16. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold, E-mail: hli@radonc.wustl.edu [Department of Radiation Oncology, Washington University School of Medicine, 4921 Parkview Place, Campus Box 8224, St. Louis, Missouri 63110 (United States)

    2016-07-15

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this
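    The abstract above mentions voxelized transport with Woodcock tracking; the core of that technique can be sketched as follows (a minimal illustration, not code from gPENELOPE; the function names and the callable sigma(x, y, z) are hypothetical):

    ```python
    import math
    import random

    def woodcock_step(pos, direction, sigma, sigma_max, rng=random):
        """Advance a particle to its next real collision with Woodcock
        (delta) tracking: sample free paths against a constant majorant
        cross-section sigma_max, then accept each tentative collision with
        probability sigma(x, y, z) / sigma_max; rejections are "virtual"
        collisions and the particle keeps flying undisturbed."""
        x, y, z = pos
        while True:
            # Free path sampled against the majorant (1 - r avoids log(0)).
            s = -math.log(1.0 - rng.random()) / sigma_max
            x += s * direction[0]
            y += s * direction[1]
            z += s * direction[2]
            if rng.random() < sigma(x, y, z) / sigma_max:
                return (x, y, z)          # real collision site
    ```

    The appeal for voxelized geometries is that the particle never has to stop at voxel boundaries; only the local cross-section lookup depends on position.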

  17. SALOME. A software integration platform for multi-physics, pre-processing and visualisation

    International Nuclear Information System (INIS)

    Bergeaud, Vincent; Lefebvre, Vincent

    2010-01-01

    In order to ease the development of applications integrating simulation codes, CAD modelers and post-processing tools, CEA and EDF R&D have invested in the SALOME platform, a tool dedicated to the environment of scientific codes. The platform comes in the shape of a toolbox which offers functionalities for CAD, meshing, code coupling, visualization and GUI development. These tools can be combined to create integrated applications that make the scientific codes easier to use and well interfaced with their environment, be it other codes, CAD and meshing tools or visualization software. Many projects in CEA and EDF R&D now use SALOME, bringing technical coherence to the software suites of our institutions. (author)

  18. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  19. Cross platform SCA component using C++ builder and KYLIX

    International Nuclear Information System (INIS)

    Nishimura, Hiroshi; Timossi, Chris; McDonald, James L.

    2003-01-01

    A cross-platform component for EPICS Simple Channel Access (SCA) has been developed. EPICS client programs with GUIs become portable at the C++ source-code level on both Windows and Linux by using Borland C++ Builder 6 and Kylix 3 on these platforms, respectively.

  20. pix2code: Generating Code from a Graphical User Interface Screenshot

    OpenAIRE

    Beltramelli, Tony

    2017-01-01

    Transforming a graphical user interface screenshot created by a designer into computer code is a typical task conducted by a developer in order to build customized software, websites, and mobile applications. In this paper, we show that deep learning methods can be leveraged to train a model end-to-end to automatically generate code from a single input image with over 77% accuracy for three different platforms (i.e. iOS, Android and web-based technologies).

  1. An imprinted non-coding genomic cluster at 14q32 defines clinically relevant molecular subtypes in osteosarcoma across multiple independent datasets

    OpenAIRE

    Hill, Katherine E.; Kelly, Andrew D.; Kuijjer, Marieke L.; Barry, William; Rattani, Ahmed; Garbutt, Cassandra C.; Kissick, Haydn; Janeway, Katherine; Perez-Atayde, Antonio; Goldsmith, Jeffrey; Gebhardt, Mark C.; Arredouani, Mohamed S.; Cote, Greg; Hornicek, Francis; Choy, Edwin

    2017-01-01

    Background: A microRNA (miRNA) collection on the imprinted 14q32 MEG3 region has been associated with outcome in osteosarcoma. We assessed the clinical utility of this miRNA set and their association with methylation status. Methods: We integrated coding and non-coding RNA data from three independent annotated clinical osteosarcoma cohorts (n = 65, n = 27, and n = 25) and miRNA and methylation data from one in vitro (19 cell lines) and one clinical (NCI Therapeutically Applicable Research to ...

  2. Integration of the program TNXYZ in the platform SALOME

    International Nuclear Information System (INIS)

    Chaparro V, F. J.; Silva A, L.; Del Valle G, E.; Gomez T, A. M.; Vargas E, S.

    2013-10-01

    This work presents the procedure used to integrate the TNXYZ code as a processing tool into the SALOME graphic simulation platform. The TNXYZ code solves the neutron transport equation in steady state, for several energy groups, discretizing the angular variable by the discrete ordinates method and the spatial variable by nodal methods. The SALOME platform is a graphical environment designed for the construction, editing and simulation of mechanical models aimed at industry and, unlike other software, it allows external source codes to be integrated into the environment to form a complete scheme of execution, supervision, and pre- and post-processing of information. The TNXYZ code was written in the 1990s in a Fortran compiler, so to be usable today it had to be updated to current compiler characteristics; in addition, the original scheme underwent a modularization process, that is, the main program was divided into sections where the code carries out important operations, with the intention of making the data extraction process flexible along its processing sequence, which can be useful in a later development of coupling. Finally, to verify the integration, a BWR fuel assembly was modeled, as well as a control cell. The cross sections were obtained with the Serpent Monte Carlo code. Some results obtained with Serpent were used to verify and begin the validation of the code, obtaining an acceptable comparison in the infinite multiplication factor. The validation process will be extended and is planned for presentation in a future work. This work is part of the development of the research group formed between the Escuela Superior de Fisica y Matematicas del Instituto Politecnico Nacional (IPN) and the Instituto Nacional de Investigaciones Nucleares (ININ), in which a Mexican simulation platform for nuclear reactors is being developed. (Author)

  3. Physical, taxonomic code, and other data from current meter and other instruments in New York Bight from DOLPHIN and other platforms; 14 March 1971 to 03 August 1975 (NODC Accession 7601385)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Physical, taxonomic code, and other data were collected using current meter and other instruments from DOLPHIN and other platforms in New York Bight. Data were...

  4. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...
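    The feedback-driven rate adaptation described above can be illustrated with a toy incremental-redundancy scheme: random parity checks and brute-force decoding stand in for actual BCH codes, and all names here are hypothetical:

    ```python
    import itertools
    import random

    def parities(word, masks):
        """Parity of word's bits selected by each mask (one check bit each)."""
        return [bin(word & m).count("1") & 1 for m in masks]

    def candidates(y, n, masks, checks, max_flips):
        """All n-bit words within Hamming distance max_flips of the side
        information y that satisfy every parity check received so far."""
        out = []
        for d in range(max_flips + 1):
            for pos in itertools.combinations(range(n), d):
                c = y
                for p in pos:
                    c ^= 1 << p
                if parities(c, masks) == checks:
                    out.append(c)
        return out

    def transmit(x, y, n, rng, max_flips=2):
        """Encoder sends one random parity of x at a time; the (noiseless)
        feedback channel requests more until the decoder's candidate set,
        seeded by the correlated side information y, shrinks to one word."""
        masks, checks = [], []
        while True:
            m = rng.getrandbits(n)
            masks.append(m)
            checks.append(bin(x & m).count("1") & 1)
            cands = candidates(y, n, masks, checks, max_flips)
            if len(cands) == 1:
                return cands[0], len(masks)   # decoded word, checks sent
    ```

    Because x itself always survives the checks, a unique survivor is necessarily correct; the rate adapts because the number of checks sent depends on how hard the instance is.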

  5. Advances in the development of the Mexican platform for analysis and design of nuclear reactors: AZTLAN Platform; Avances en el desarrollo de la plataforma mexicana para analisis y diseno de reactores nucleares: AZTLAN Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gomez T, A. M.; Puente E, F. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Del Valle G, E. [IPN, Escuela Superior de Fisica y Matematicas, Av. IPN s/n, 07738 Ciudad de Mexico (Mexico); Francois L, J. L. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Paseo Cuauhnahuac 8532, Col. Progreso, 62550 Jiutepec, Morelos (Mexico); Espinosa P, G., E-mail: armando.gomez@inin.gob.mx [Universidad Autonoma Metropolitana, Unidad Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico)

    2017-09-15

    The AZTLAN platform project: development of a Mexican platform for the analysis and design of nuclear reactors, financed by the SENER-CONACYT Energy Sustainability Fund, was approved in early 2014 and formally began at the end of that year. It is a national project led by the Instituto Nacional de Investigaciones Nucleares (ININ) and with the collaboration of Instituto Politecnico Nacional (IPN), the Universidad Autonoma Metropolitana (UAM) and Universidad Nacional Autonoma de Mexico (UNAM) as part of the development team and with the participation of the Laguna Verde Nuclear Power Plant, the National Commission of Nuclear Safety and Safeguards, the Ministry of Energy and the Karlsruhe Institute of Technology (KIT, Germany) as part of the user group. The general objective of the project is to modernize, improve and integrate the neutronic, thermo-hydraulic and thermo-mechanical codes, developed in Mexican institutions, in an integrated platform, developed and maintained by Mexican experts for the benefit of Mexican institutions. Two years into the process, important steps have been taken that have consolidated the platform. The main results of these first two years have been presented in different national and international forums. In this congress, some of the most recent results that have been implemented in the platform codes are shown in more detail. The current status of the platform from a more executive viewpoint is summarized in this paper. (Author)

  6. Modeling report of DYMOND code (DUPIC version)

    International Nuclear Information System (INIS)

    Park, Joo Hwan; Yacout, Abdellatif M.

    2003-04-01

    The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was first developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and LWR-FBR mixed plants. Since extensive application of the DYMOND code has been requested, the first version of DYMOND has been modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts: the source language platform, input supply, and output; these parts are, however, not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (called the DYMOND-DUPIC version). It is divided into five parts. Part A deals with the model of reactor history, which includes the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C covers the reprocessing model, which includes the recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amount of spent fuel in storage and disposal. Part D covers the model of other fuel cycles, which considers the thorium fuel cycle for the MSR and RTF reactors. Part E is the economics model, which gives all cost information, such as uranium mining cost, reactor operating cost, fuel cost, etc.

  7. Modeling report of DYMOND code (DUPIC version)

    Energy Technology Data Exchange (ETDEWEB)

    Park, Joo Hwan [KAERI, Taejon (Korea, Republic of); Yacout, Abdellatif M [Argonne National Laboratory, Ilinois (United States)

    2003-04-01

    The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was first developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and LWR-FBR mixed plants. Since extensive application of the DYMOND code has been requested, the first version of DYMOND has been modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts: the source language platform, input supply, and output; these parts are, however, not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (called the DYMOND-DUPIC version). It is divided into five parts. Part A deals with the model of reactor history, which includes the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C covers the reprocessing model, which includes the recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amount of spent fuel in storage and disposal. Part D covers the model of other fuel cycles, which considers the thorium fuel cycle for the MSR and RTF reactors. Part E is the economics model, which gives all cost information, such as uranium mining cost, reactor operating cost, fuel cost, etc.

  8. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  9. RELAP5/MOD3 code manual: Summaries and reviews of independent code assessment reports. Volume 7, Revision 1

    International Nuclear Information System (INIS)

    Moore, R.L.; Sloan, S.M.; Schultz, R.R.; Wilson, G.E.

    1996-10-01

    Summaries of RELAP5/MOD3 code assessments, a listing of the assessment matrix, and a chronology of the various versions of the code are given. Results from these code assessments have been used to formulate a compilation of some of the strengths and weaknesses of the code. These results are documented in the report. Volume 7 was designed to be updated periodically and to include the results of the latest code assessments as they become available. Consequently, users of Volume 7 should ensure that they have the latest revision available

  10. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
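    The levelization property described above (the "uses" relationships form a directed acyclic graph, with each package strictly above everything it uses) can be checked mechanically with a topological sort; a minimal sketch, assuming a hypothetical `uses` map rather than anything from the EAP code base:

    ```python
    from collections import defaultdict, deque

    def levelize(uses):
        """uses[p] = set of packages that p directly uses.  Returns a
        {package: level} map in which level-0 packages use nothing and every
        package sits strictly above everything it uses; raises ValueError if
        the 'uses' graph has a cycle (the set is then not levelized)."""
        indeg = {p: len(u) for p, u in uses.items()}
        used_by = defaultdict(list)
        for p, u in uses.items():
            for dep in u:
                used_by[dep].append(p)
        tentative = defaultdict(int)   # best level implied by finished deps
        level = {}
        queue = deque(p for p, d in indeg.items() if d == 0)
        while queue:                   # Kahn's algorithm
            p = queue.popleft()
            level[p] = tentative[p]
            for client in used_by[p]:
                tentative[client] = max(tentative[client], level[p] + 1)
                indeg[client] -= 1
                if indeg[client] == 0:
                    queue.append(client)
        if len(level) != len(uses):
            raise ValueError("cycle detected: package set is not levelized")
        return level
    ```

    Running such a check in CI is one way to keep a freshly disentangled package set from silently re-entangling.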

  11. FENICIA: a generic plasma simulation code using a flux-independent field-aligned coordinate approach

    International Nuclear Information System (INIS)

    Hariri, Farah

    2013-01-01

    The primary thrust of this work is the development and implementation of a new approach to the problem of field-aligned coordinates in magnetized plasma turbulence simulations called the FCI approach (Flux-Coordinate Independent). The method exploits the elongated nature of micro-instability driven turbulence which typically has perpendicular scales on the order of a few ion gyro-radii, and parallel scales on the order of the machine size. Mathematically speaking, it relies on local transformations that align a suitable coordinate to the magnetic field to allow efficient computation of the parallel derivative. However, it does not rely on flux coordinates, which permits discretizing any given field on a regular grid in the natural coordinates such as (x, y, z) in the cylindrical limit. The new method has a number of advantages over methods constructed starting from flux coordinates, allowing for more flexible coding in a variety of situations including X-point configurations. In light of these findings, a plasma simulation code FENICIA has been developed based on the FCI approach with the ability to tackle a wide class of physical models. The code has been verified on several 3D test models. The accuracy of the approach is tested in particular with respect to the question of spurious radial transport. Tests on 3D models of the drift wave propagation and of the Ion Temperature Gradient (ITG) instability in cylindrical geometry in the linear regime demonstrate again the high quality of the numerical method. Finally, the FCI approach is shown to be able to deal with an X-point configuration such as one with a magnetic island with good convergence and conservation properties. (author) [fr
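    The central trick of the FCI approach described above, computing the parallel derivative by following field lines between regular z-planes and interpolating where they land, can be illustrated for a uniform slab field (a toy sketch, not FENICIA code; the grid layout and function names are hypothetical):

    ```python
    import math

    def interp_periodic(row, x, dx):
        """Linear interpolation of periodic 1-D samples row (spacing dx) at x."""
        n = len(row)
        t = x / dx
        i = math.floor(t)
        frac = t - i
        return (1.0 - frac) * row[i % n] + frac * row[(i + 1) % n]

    def fci_parallel_derivative(f, i, k, dx, dz, bx, bz):
        """Parallel derivative of grid data f[k][i] for a uniform slab field
        b = (bx, 0, bz): trace the field line one z-plane up and one down,
        interpolate f where it lands, and centre-difference along the line.
        This is the essence of the flux-coordinate independent approach on a
        regular (x, z) grid -- no flux coordinates are ever constructed."""
        shift = dz * bx / bz                   # x-displacement per plane
        ds = dz * math.hypot(bx, bz) / bz      # arc length per plane
        x = i * dx
        up = interp_periodic(f[k + 1], x + shift, dx)
        down = interp_periodic(f[k - 1], x - shift, dx)
        return (up - down) / (2.0 * ds)
    ```

    In a real code the field-line map varies with position and the interpolation is higher order, but the structure, trace, interpolate, difference along the line, is the same.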

  12. Online Crowdfunding Campaign for an Independent Video Game

    OpenAIRE

    Kivikangas, Inessa

    2014-01-01

    Over the past several years, online reward-model crowdfunding platforms have become a popular tool for raising funds among independent game developers. The big success of several brilliant indie titles has brought hundreds of hopeful independent developers to the online crowdfunding platforms Kickstarter and Indiegogo. However, apart from creating an excellent game, indie developers have to be able to reach out to their audience and capture the attention of potential supporters and gaming media. Time and e...

  13. Memory for pictures and sounds: independence of auditory and visual codes.

    Science.gov (United States)

    Thompson, V A; Paivio, A

    1994-09-01

    Three experiments examined the mnemonic independence of auditory and visual nonverbal stimuli in free recall. Stimulus lists consisted of (1) pictures, (2) the corresponding environmental sounds, or (3) picture-sound pairs. In Experiment 1, free recall was tested under three learning conditions: standard intentional, intentional with a rehearsal-inhibiting distracter task, or incidental with the distracter task. In all three groups, recall was best for the picture-sound items. In addition, recall for the picture-sound stimuli appeared to be additive relative to pictures or sounds alone when the distracter task was used. Experiment 2 included two additional groups: In one, two copies of the same picture were shown simultaneously; in the other, two different pictures of the same concept were shown. There was no difference in recall among any of the picture groups; in contrast, recall in the picture-sound condition was greater than recall in either single-modality condition. However, doubling the exposure time in a third experiment resulted in additively higher recall for repeated pictures with different exemplars than ones with identical exemplars. The results are discussed in terms of dual coding theory and alternative conceptions of the memory trace.

  14. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding

    Directory of Open Access Journals (Sweden)

    Charlotte D’Hulst

    2016-07-01

    Full Text Available Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs in the main olfactory epithelium express the same odorant receptor (OR in a singular fashion and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these “MouSensors.” In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines, expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction.

  15. The 1996 ENDF pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1996-01-01

    The codes are named 'the Pre-processing' codes because they are designed to pre-process ENDF/B data for later, further processing for use in applications. This is a modular set of computer codes, each of which reads and writes evaluated nuclear data in the ENDF/B format. Each code performs one or more independent operations on the data, as described below. These codes are designed to be computer independent, and are presently operational on every type of computer, from large mainframe computers to small personal computers such as the IBM-PC and Power MAC. The codes are available from the IAEA Nuclear Data Section, free of charge upon request. (author)
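    For readers unfamiliar with the ENDF format these codes read and write: each 80-column line carries six 11-column data fields plus MAT/MF/MT identifiers, and floating-point numbers omit the exponent letter to save a column. A hedged sketch of a line reader (an illustration of the format, not one of the Pre-processing codes themselves):

    ```python
    def endf_float(field):
        """Convert an 11-column ENDF number like ' 1.001000+3' (the 'E' is
        omitted, so this means 1.001000e+3) to a float; blanks -> 0.0."""
        s = field.strip()
        if not s:
            return 0.0
        # The last '+'/'-' not preceded by an exponent letter separates
        # mantissa from exponent; re-insert the missing 'E'.
        for j in range(len(s) - 1, 0, -1):
            if s[j] in "+-" and s[j - 1] not in "eEdD":
                return float(s[:j] + "E" + s[j:])
        return float(s)

    def parse_endf_line(line):
        """Split one 80-column ENDF-6 line into its six 11-column data
        fields and the MAT / MF / MT identifiers in columns 67-75."""
        line = line.ljust(80)
        fields = [endf_float(line[i:i + 11]) for i in range(0, 66, 11)]
        mat = int(line[66:70])
        mf = int(line[70:72])
        mt = int(line[72:75])
        return fields, mat, mf, mt
    ```

    The fixed-width, self-identifying layout is what lets each Pre-processing code operate independently on the same files.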

  16. Volttron: An Agent Platform for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Haack, Jereme N.; Akyol, Bora A.; Carpenter, Brandon J.; Tews, Cody W.; Foglesong, Lance W.

    2013-05-06

    The VOLTTRON platform enables the deployment of intelligent sensors and controllers in the smart grid and provides a stable, secure and flexible framework that expands sensing and control capabilities. The VOLTTRON platform provides services fulfilling the essential requirements of resource management and security for agent operation in the power grid. The facilities provided by the platform allow agent developers to focus on the implementation of their agent system rather than on the necessary 'plumbing' code. For example, a simple collaborative demand response application was written in less than 200 lines of Python.
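    The "plumbing" point can be illustrated with a toy in-process message bus; note this is not the real VOLTTRON agent API, just a sketch of why an agent shrinks to a few lines once the platform supplies publish/subscribe (all topic names and classes here are invented):

    ```python
    from collections import defaultdict

    class Bus:
        """Toy in-process publish/subscribe bus standing in for the
        platform 'plumbing' (the real VOLTTRON message bus differs)."""
        def __init__(self):
            self.subs = defaultdict(list)

        def subscribe(self, topic, fn):
            self.subs[topic].append(fn)

        def publish(self, topic, msg):
            for fn in self.subs[topic]:
                fn(topic, msg)

    class DemandResponseAgent:
        """Sheds load when the price signal crosses a threshold; with the
        bus plumbing factored out, the agent itself is a few lines."""
        def __init__(self, bus, threshold):
            self.bus, self.threshold = bus, threshold
            bus.subscribe("grid/price", self.on_price)

        def on_price(self, topic, price):
            setpoint = "shed" if price > self.threshold else "normal"
            self.bus.publish("building/hvac/setpoint", setpoint)
    ```

    Everything outside `on_price`, transport, security, lifecycle, is exactly what the platform is meant to absorb.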

  17. Authorization request for potential non-compliance with the American Standard Safety Code for Elevators Dumbwaiters and Escalators

    Energy Technology Data Exchange (ETDEWEB)

    Boyd, J.E.

    1964-09-28

    A Third Party inspection of the reactor work platforms was conducted by representatives of the Travelers Insurance Company in 1958. An inspection report submitted by these representatives described hazardous conditions noted and presented a series of recommendations to improve the operational safety of the systems. Project CGI-960, ``C`` & ``D`` Work Platform Safety Improvements -- All Reactors, was initiated to modify the platforms in compliance with the Third Party recommendations. The American Standard Safety Code for Elevators Dumbwaiters and Escalators (A-17.1) is used as a guide by the Third Party in formulating their recommendations. This code is used because there is no other applicable code for this type of equipment. While the work platforms do not and in some cases can not comply with this code because of operational use, every effort is made to comply with the intent of the code.

  18. Linear-Time Non-Malleable Codes in the Bit-Wise Independent Tampering Model

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Döttling, Nico

    Non-malleable codes were introduced by Dziembowski et al. (ICS 2010) as coding schemes that protect a message against tampering attacks. Roughly speaking, a code is non-malleable if decoding an adversarially tampered encoding of a message m produces the original message m or a value m' (eventually abort) completely unrelated with m. It is known that non-malleability is possible only for restricted classes of tampering functions. Since their introduction, a long line of works has established feasibility results of non-malleable codes against different families of tampering functions. However... This work extends the non-malleable codes of Agrawal et al. (TCC 2015) and of Cheraghchi and Guruswami (TCC 2014) and improves the previous result in the bit-wise tampering model: it builds the first non-malleable codes with linear-time complexity and optimal rate (i.e. rate 1 - o(1)).

  19. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    Science.gov (United States)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limitations in their ability to integrate with modern software and hardware and to leverage parallel computing, which have left voids in optimization, pre-, and post-processing tools. Advances in technology and our scientific understanding of environmental processes that have occurred over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premiere tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source language suited to high-performance computing (HPC); and (2) convert the model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and shown
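    The Python-plus-Numba strategy described above can be sketched for a hydrology-flavored inner loop (a toy linear-reservoir routine, not converted HSPF code; the graceful fallback when Numba is absent is an assumption of this sketch, not of the HSPF project):

    ```python
    try:
        from numba import njit        # JIT-compile the hot loop when available
    except ImportError:               # pure-Python fallback so the sketch
        def njit(fn=None, **kwargs):  # still runs without Numba installed
            return fn if fn is not None else (lambda f: f)

    @njit
    def route_linear_reservoir(inflow, k, dt):
        """Explicit-Euler linear reservoir, dS/dt = I - S/k: the kind of
        per-time-step loop HSPF's hydrologic modules spend their time in.
        Returns (final storage, total outflow volume)."""
        storage = 0.0
        total_out = 0.0
        for i in inflow:
            outflow = storage / k
            storage += (i - outflow) * dt
            total_out += outflow * dt
        return storage, total_out
    ```

    Decorating the tight numeric loop and leaving orchestration in plain Python is the pattern that lets an interpreted language approach compiled-FORTRAN speed.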

  20. Simulator platform for fast reactor operation and safety technology demonstration

    International Nuclear Information System (INIS)

    Vilim, R.B.; Park, Y.S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J.

    2012-01-01

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  1. Simulator platform for fast reactor operation and safety technology demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, R. B.; Park, Y. S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J. (Nuclear Engineering Division)

    2012-07-30

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  2. Brake for rollable platform

    Science.gov (United States)

    Morris, A. L.

    1974-01-01

    Frame-mounted brake is independent of wheels and consists of a simple lever-actuated foot. The brake makes good contact with the surface even when the foot pad is at a higher or lower level than the wheels; this is particularly important when a rollable platform is used on an irregular surface.

  3. NESTLE: A nodal kinetics code

    International Nuclear Information System (INIS)

    Al-Chalabi, R.M.; Turinsky, P.J.; Faure, F.-X.; Sarsour, H.N.; Engrand, P.R.

    1993-01-01

    The NESTLE nodal kinetics code has been developed for utilization as a stand-alone code for steady-state and transient reactor neutronic analysis and for incorporation into system transient codes, such as TRAC and RELAP. The latter is desirable to increase the simulation fidelity over that obtained from currently employed zero- and one-dimensional neutronic models, and is now feasible due to advances in computer performance and the efficiency of nodal methods. As a stand-alone code, the requirement is that it operate on a range of computing platforms, from memory-limited personal computers (PCs) to supercomputers with vector processors. This paper summarizes the features of NESTLE that reflect the utilization and requirements just noted.

  4. The Definitive Guide to NetBeans™ Platform 7

    CERN Document Server

    Bock, Heiko

    2011-01-01

    The NetBeans Platform is the world's only modular Swing application framework, used by very large organizations in mission-critical scenarios, such as at Boeing and Northrop Grumman, as well as in the financial sector and in the oil/gas industry. The book is particularly relevant for such large enterprise customers, who are increasingly interested in Maven and OSGi. The Definitive Guide to NetBeans Platform 7 is a thorough and authoritative introduction to the open-source NetBeans Platform, covering all its major APIs in detail, with relevant code examples used throughout.

  5. JBoss Weld CDI for Java platform

    CERN Document Server

    Finnegan, Ken

    2013-01-01

    This book is a mini tutorial with plenty of code examples and strategies to give you numerous options when building your own applications. "JBoss Weld CDI for Java Platform" is written for developers who are new to dependency injection. A rudimentary knowledge of Java is required.

  6. Remembering to learn: independent place and journey coding mechanisms contribute to memory transfer.

    Science.gov (United States)

    Bahar, Amir S; Shapiro, Matthew L

    2012-02-08

    The neural mechanisms that integrate new episodes with established memories are unknown. When rats explore an environment, CA1 cells fire in place fields that indicate locations. In goal-directed spatial memory tasks, some place fields differentiate behavioral histories ("journey-dependent" place fields) while others do not ("journey-independent" place fields). To investigate how these signals inform learning and memory for new and familiar episodes, we recorded CA1 and CA3 activity in rats trained to perform a "standard" spatial memory task in a plus maze and in two new task variants. A "switch" task exchanged the start and goal locations in the same environment; an "altered environment" task contained unfamiliar local and distal cues. In the switch task, performance was mildly impaired, new firing maps were stable, but the proportion and stability of journey-dependent place fields declined. In the altered environment, overall performance was strongly impaired, new firing maps were unstable, and stable proportions of journey-dependent place fields were maintained. In both tasks, memory errors were accompanied by a decline in journey codes. The different dynamics of place and journey coding suggest that they reflect separate mechanisms and contribute to distinct memory computations. Stable place fields may represent familiar relationships among environmental features that are required for consistent memory performance. Journey-dependent activity may correspond with goal-directed behavioral sequences that reflect expectancies that generalize across environments. The complementary signals could help link current events with established memories, so that familiarity with either a behavioral strategy or an environment can inform goal-directed learning.

  7. Coupling methodology within the software platform alliances

    Energy Technology Data Exchange (ETDEWEB)

    Montarnal, Ph; Deville, E; Adam, E; Bengaouer, A [CEA Saclay, Dept. de Modelisation des Systemes et Structures 91 - Gif-sur-Yvette (France); Dimier, A; Gaombalet, J; Loth, L [Agence Nationale pour la Gestion des Dechets Radioactifs (ANDRA), 92 - Chatenay Malabry (France); Chavant, C [Electricite de France (EDF), 92 - Clamart (France)

    2005-07-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES, whose aim is to produce a tool for the simulation of nuclear waste storage and disposal repositories. This type of simulation deals with highly coupled thermo-hydro-mechanical and chemical (T-H-M-C) processes. A key objective of Alliances is to provide the capability to develop coupling algorithms between existing codes. The aim of this paper is to present the coupling methodology used in the context of this software platform. (author)

  8. Coupling methodology within the software platform alliances

    International Nuclear Information System (INIS)

    Montarnal, Ph.; Deville, E.; Adam, E.; Bengaouer, A.; Dimier, A.; Gaombalet, J.; Loth, L.; Chavant, C.

    2005-01-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES, whose aim is to produce a tool for the simulation of nuclear waste storage and disposal repositories. This type of simulation deals with highly coupled thermo-hydro-mechanical and chemical (T-H-M-C) processes. A key objective of Alliances is to provide the capability to develop coupling algorithms between existing codes. The aim of this paper is to present the coupling methodology used in the context of this software platform. (author)

  9. Planning for Pre-Exascale Platform Environment (Fiscal Year 2015 Level 2 Milestone 5216)

    Energy Technology Data Exchange (ETDEWEB)

    Springmeyer, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lang, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Noe, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-09-28

    This Plan for ASC Pre-Exascale Platform Environments document constitutes the deliverable for the fiscal year 2015 (FY15) Advanced Simulation and Computing (ASC) Program Level 2 milestone Planning for Pre-Exascale Platform Environment. It acknowledges and quantifies challenges and recognized gaps for moving the ASC Program towards effective use of exascale platforms and recommends strategies to address these gaps. This document also presents an update to the concerns, strategies, and plans presented in the FY08 predecessor document that dealt with the upcoming (at the time) petascale high performance computing (HPC) platforms. With the looming push towards exascale systems, a review of the earlier document was appropriate in light of the myriad architectural choices currently under consideration. The ASC Program believes the platforms to be fielded in the 2020s will be fundamentally different systems that stress ASC’s ability to modify codes to take full advantage of new or unique features. In addition, the scale of components will increase the difficulty of maintaining an error-free system, thus driving new approaches to resilience and error detection/correction. The code revamps of the past, from serial- to vector-centric code to distributed memory to threaded implementations, will be revisited as codes adapt to a new message passing interface (MPI) plus “x” or more advanced and dynamic programming models based on architectural specifics. Development efforts are already underway in some cases, and more difficult or uncertain aspects of the new architectures will require research and analysis that may inform future directions for program choices. In addition, the potential diversity of system architectures may require parallel if not duplicative efforts to analyze and modify environments, codes, subsystems, libraries, debugging tools, and performance analysis techniques as well as exploring new monitoring methodologies. It is difficult if not impossible to

  10. Strategies for comparing gene expression profiles from different microarray platforms: application to a case-control experiment.

    Science.gov (United States)

    Severgnini, Marco; Bicciato, Silvio; Mangano, Eleonora; Scarlatti, Francesca; Mezzelani, Alessandra; Mattioli, Michela; Ghidoni, Riccardo; Peano, Clelia; Bonnal, Raoul; Viti, Federica; Milanesi, Luciano; De Bellis, Gianluca; Battaglia, Cristina

    2006-06-01

    Meta-analysis of microarray data is increasingly important, considering both the availability of multiple platforms using disparate technologies and the accumulation in public repositories of data sets from different laboratories. We addressed the issue of comparing gene expression profiles from two microarray platforms by devising a standardized investigative strategy. We tested this procedure by studying MDA-MB-231 cells, which undergo apoptosis on treatment with resveratrol. Gene expression profiles were obtained using high-density, short-oligonucleotide, single-color microarray platforms: GeneChip (Affymetrix) and CodeLink (Amersham). Interplatform analyses were carried out on 8414 common transcripts represented on both platforms, as identified by LocusLink ID, representing 70.8% and 88.6% of annotated GeneChip and CodeLink features, respectively. We identified 105 differentially expressed genes (DEGs) on CodeLink and 42 DEGs on GeneChip. Among them, only 9 DEGs were commonly identified by both platforms. Multiple analyses (BLAST alignment of probes with target sequences, gene ontology, literature mining, and quantitative real-time PCR) permitted us to investigate the factors contributing to the generation of platform-dependent results in single-color microarray experiments. An effective approach to cross-platform comparison involves microarrays of similar technologies, samples prepared by identical methods, and a standardized battery of bioinformatic and statistical analyses.
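    The overlap analysis described above amounts to intersecting the two platforms' DEG lists over the subset of shared identifiers. A toy sketch with hypothetical gene IDs (not the study's data):

    ```python
    def common_degs(degs_a, degs_b, shared_ids):
        """Return DEGs reported by both platforms, restricted to transcripts
        represented on both (the comparable subset, keyed by a shared ID
        such as the LocusLink IDs used in the study)."""
        comparable = set(shared_ids)
        return (set(degs_a) & set(degs_b)) & comparable

    # Hypothetical identifiers for illustration only.
    shared = ["ID1", "ID2", "ID3", "ID4", "ID5"]
    codelink_degs = ["ID1", "ID2", "ID4", "ID9"]  # ID9 absent from the shared set
    genechip_degs = ["ID2", "ID4", "ID5"]

    overlap = common_degs(codelink_degs, genechip_degs, shared)
    ```

    Restricting to the shared-identifier subset first is what makes the cross-platform counts comparable at all; the small residual overlap is then the quantity of interest.
    
    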

  11. Auto Code Generation for Simulink-Based Attitude Determination Control System

    Science.gov (United States)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto-generate C code from a Simulink-Based Attitude Determination Control System (ADCS) to be used in target platforms. NASA Marshall Engineers have developed an ADCS Simulink simulation to be used as a component for the flight software of a satellite. This generated code can be used for carrying out hardware-in-the-loop testing of components for a satellite in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink Model.

  12. Linking Training Course Support to Fleet Platforms: An Equipment-Based Approach.

    Science.gov (United States)

    1981-01-01

    Sample course records: (1) Requirement sponsor: OP-04; resource sponsor: OP-04; course title: ECONOMIC ANAL; activity address: NAVSCOLCECOFF PT HUENEME; find code = 2 (no specific ...). (2) Resource sponsor: OP-01; course title: DD-963 MPU MAINTENANCE; activity address: COMBATSYSTECHSCOLCOM; find code = 5 (specific platform in CANTRAC ship).

  13. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides a scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It provides a clear connection between the whys, hows, and whats, thus enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource Information: What information is available, and how can it be useful? Resource Platform: What kinds of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory, and acoustic properties and the transmission capacity of the devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  14. Development of an IHE MRRT-compliant open-source web-based reporting platform.

    Science.gov (United States)

    Pinto Dos Santos, Daniel; Klos, G; Kloeckner, R; Oberle, R; Dueber, C; Mildenberger, P

    2017-01-01

    To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. • A platform for structured reporting using IHE MRRT-compliant templates is presented. • Incorporating structured reporting into clinical routine is feasible. • Full source code will be provided upon request under a free license.

  15. User's manual for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the User's Manual for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to make it easier to interpret moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, also can be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code then will interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data.
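    Interpolation over two independent variables, as the manual describes, can be sketched as a bilinear lookup. A minimal illustration (the grid values are invented, not TMAD's benchmark library):

    ```python
    import bisect

    def bilinear(grid, xs, ys, x, y):
        """Interpolate a response table keyed by two independent variables
        (e.g. anomaly size xs and moisture content ys). grid[i][j] is the
        detector response at (xs[i], ys[j]); xs and ys must be sorted."""
        # Locate the bracketing cell, clamped to the table edges.
        i = max(1, min(bisect.bisect_left(xs, x), len(xs) - 1))
        j = max(1, min(bisect.bisect_left(ys, y), len(ys) - 1))
        x0, x1 = xs[i - 1], xs[i]
        y0, y1 = ys[j - 1], ys[j]
        tx = (x - x0) / (x1 - x0)
        ty = (y - y0) / (y1 - y0)
        r00, r01 = grid[i - 1][j - 1], grid[i - 1][j]
        r10, r11 = grid[i][j - 1], grid[i][j]
        return (r00 * (1 - tx) * (1 - ty) + r10 * tx * (1 - ty)
                + r01 * (1 - tx) * ty + r11 * tx * ty)
    ```

    Matching measured responses then reduces to evaluating this lookup for each candidate anomaly type and minimizing the misfit to the measurements.
    
    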

  16. Automated Testing Infrastructure and Result Comparison for Geodynamics Codes

    Science.gov (United States)

    Heien, E. M.; Kellogg, L. H.

    2013-12-01

    The geodynamics community uses a wide variety of codes on a wide variety of both software and hardware platforms to simulate geophysical phenomena. These codes are generally variants of finite difference or finite element calculations involving Stokes flow or wave propagation. A significant problem is that even codes of low complexity will return different results depending on the platform due to slight differences in hardware, software, compilers, and libraries. Furthermore, changes to the codes during development may affect solutions in unexpected ways such that previously validated results are altered. The Computational Infrastructure for Geodynamics (CIG) is funded by the NSF to enhance the capabilities of the geodynamics community through software development. CIG has recently done extensive work in setting up an automated testing and result validation system based on the BaTLab system developed at the University of Wisconsin, Madison. This system uses 16 variants of Linux and Mac platforms on both 32- and 64-bit processors to test several CIG codes, and has also recently been extended to support testing on the XSEDE TACC (Texas Advanced Computing Center) Stampede cluster. In this work we overview the system design and demonstrate how automated testing and validation occurs and results are reported. We also examine several results from the system from different codes and discuss how changes in compilers and libraries affect the results. Finally we detail some result comparison tools for different types of output (scalar fields, velocity fields, seismogram data), and discuss within what margins different results can be considered equivalent.
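    The "margins" within which two runs are considered equivalent are typically expressed as a mixed absolute/relative tolerance test, since bitwise equality is too strict across platforms. A minimal sketch (the tolerance values are illustrative, not CIG's):

    ```python
    def fields_equivalent(a, b, rel_tol=1e-6, abs_tol=1e-12):
        """Compare two scalar-field outputs element-wise. Platform-dependent
        rounding makes bitwise equality too strict, so accept values whose
        difference is within a relative tolerance (for large magnitudes) or
        an absolute floor (for values near zero)."""
        if len(a) != len(b):
            return False
        return all(abs(x - y) <= max(abs_tol, rel_tol * max(abs(x), abs(y)))
                   for x, y in zip(a, b))
    ```

    The same pattern generalizes to velocity fields and seismograms by comparing component-wise or sample-wise; the design choice is mainly where to set the two tolerances for each output type.
    
    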

  17. Autonomous platform for distributed sensing and actuation over bluetooth

    OpenAIRE

    Carvalhal, Paulo; Coelho, Ezequiel T.; Ferreira, Manuel João Oliveira; Afonso, José A.; Santos, Cristina

    2006-01-01

    This paper presents a short-range wireless network platform based on Bluetooth technology and on a Round Robin scheduling algorithm. The main goal is to provide an application-independent platform in order to support a distributed data acquisition and control system used to control a model of a greenhouse. This platform enables the advantages of wireless communications while assuring low weight, small energy consumption and reliable communications.

  18. Linear-time non-malleable codes in the bit-wise independent tampering model

    NARCIS (Netherlands)

    R.J.F. Cramer (Ronald); I.B. Damgård (Ivan); N.M. Döttling (Nico); I. Giacomelli (Irene); C. Xing (Chaoping)

    2017-01-01

    textabstractNon-malleable codes were introduced by Dziembowski et al. (ICS 2010) as coding schemes that protect a message against tampering attacks. Roughly speaking, a code is non-malleable if decoding an adversarially tampered encoding of a message m produces the original message m or a value m′

  19. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding.

    Science.gov (United States)

    D'Hulst, Charlotte; Mina, Raena B; Gershon, Zachary; Jamet, Sophie; Cerullo, Antonio; Tomoiaga, Delia; Bai, Li; Belluscio, Leonardo; Rogers, Matthew E; Sirotin, Yevgeniy; Feinstein, Paul

    2016-07-26

    Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs) in the main olfactory epithelium express the same odorant receptor (OR) in a singular fashion and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these "MouSensors." In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines, expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code, as long as the input to or output from the module remains unchanged.

  1. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high-frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed decision-making method. The decision-making process will be described by a formal grammar. As there are many CEP solutions, we take the following characteristics into consideration: processing in real time, the ability to process high-volume data from multiple sources, platform independence, the ability to integrate a user solution, and an open license. First we discuss existing CEP tools and their specific uses in practice. Then we describe the design of a method for formalizing the business rules used for decision making. Afterwards, we focus on the two platforms that seem the best fit for integrating our solution and list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated decision-support method.

  2. A EU simulation platform for nuclear reactor safety: multi-scale and multi-physics calculations, sensitivity and uncertainty analysis (NURESIM project)

    International Nuclear Information System (INIS)

    Chauliac, Christian; Bestion, Dominique; Crouzet, Nicolas; Aragones, Jose-Maria; Cacuci, Dan Gabriel; Weiss, Frank-Peter; Zimmermann, Martin A.

    2010-01-01

    The NURESIM project, the numerical simulation platform, is developed in the frame of the NURISP European Collaborative Project (FP7), which includes 22 organizations from 14 European countries. NURESIM intends to be a reference platform providing high-quality software tools, physical models, generic functions and assessment results. The NURESIM platform provides an accurate representation of the physical phenomena by promoting and incorporating the latest advances in core physics, two-phase thermal-hydraulics and fuel modelling. It includes multi-scale and multi-physics features, especially for coupling core physics and thermal-hydraulics models for reactor safety. Easy coupling of the different codes and solvers is provided through the use of a common data structure and generic functions (e.g., for interpolation between non-conforming meshes). More generally, the platform includes generic pre-processing, post-processing and supervision functions through the open-source SALOME software, in order to make the codes more user-friendly. The platform also provides the informatics environment for testing and comparing different codes. The contribution summarizes the achievements and ongoing developments of the simulation platform in core physics, thermal-hydraulics, multi-physics, uncertainties and code integration.

  3. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  4. Analytical validation of the CACECO containment analysis code

    International Nuclear Information System (INIS)

    Peak, R.D.

    1979-08-01

    The CACECO containment analysis code was developed to predict the thermodynamic responses of LMFBR containment facilities to a variety of accidents. This report covers the verification of the CACECO code using problems that can be solved by hand calculation or by reference to textbook and literature examples. The verification concentrates on the accuracy of the material and energy balances maintained by the code and on the independence of the four cells analyzed by the code, so that the user can be assured that the code analyses are numerically correct and independent of the organization of the input data submitted to the code.

  5. Cotton phenotyping with lidar from a track-mounted platform

    Science.gov (United States)

    French, Andrew N.; Gore, Michael A.; Thompson, Alison

    2016-05-01

    High-Throughput Phenotyping (HTP) is a discipline for rapidly identifying plant architectural and physiological responses to environmental factors such as heat and water stress. Experiments conducted since 2010 at Maricopa, Arizona with a three-fold sensor group, including thermal infrared radiometers, active visible/near infrared reflectance sensors, and acoustic plant height sensors, have shown the validity of HTP with a tractor-based system. However, results from these experiments also show that the accuracy of plant phenotyping is limited by the system's inability to discriminate plant components and their local environmental conditions. This limitation may be overcome with plant imaging and laser scanning, which can help map details in plant architecture and sunlit/shaded leaves. To test the capability for mapping cotton plants with a laser system, a track-mounted platform consisting of a scanning LIDAR driven by Arduino-controlled stepper motors was deployed in 2015 over a full-canopy and defoliated cotton crop. Using custom Python and Tkinter code, the platform moved autonomously along a pipe-track at 0.1 m/s while collecting LIDAR scans at 25 Hz (0.1667 deg. beam). These tests showed that an autonomous LIDAR platform can reduce HTP logistical problems and provide the capability to accurately map cotton plants and cotton bolls.
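    The stated traverse speed (0.1 m/s) and scan rate (25 Hz) fix the along-track sampling density. A small sketch of that arithmetic (the track length is a made-up example, not from the paper):

    ```python
    def scan_plan(track_length_m, speed_m_s=0.1, scan_hz=25.0):
        """Along-track sampling for a constant-speed LIDAR pass:
        number of scans per run and the travel between successive scans."""
        duration_s = track_length_m / speed_m_s   # time to traverse the track
        n_scans = round(duration_s * scan_hz)     # scans collected at scan_hz
        spacing_m = speed_m_s / scan_hz           # platform travel per scan
        return n_scans, spacing_m

    # Hypothetical 10 m pipe-track: one scan every 4 mm of travel.
    n, spacing = scan_plan(10.0)
    ```

    At these settings the scan spacing (4 mm) is far finer than a cotton boll, which is what makes boll-level mapping plausible from a moving platform.
    
    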

  6. 2009 Analysis Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Analysis platform review meeting, held on February 18, 2009, at the Marriott Residence Inn, National Harbor, Maryland.

  7. 2009 Infrastructure Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Infrastructure platform review meeting, held on February 19, 2009, at the Marriott Residence Inn, National Harbor, Maryland.

  8. Taxonomic code, physical, and other data collected from NOAA Ship DELAWARE II and other platforms in New York Bight from net casts and other instruments; 1973-02-20 to 1975-12-16 (NODC Accession 7601402)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Taxonomic Code, physical, and other data were collected using net casts and other instruments in the New York Bight from NOAA Ship DELAWARE II and other platforms....

  9. Integration of the TNXYZ computer program inside the platform Salome

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro V, F. J.

    2014-07-01

    The present work shows the procedure carried out to integrate the TNXYZ code as a calculation tool into the graphical simulation platform Salome. The TNXYZ code provides a numerical solution of the neutron transport equation in several energy groups, at steady state and in three-dimensional geometry. In order to discretize the variables of the transport equation, the code uses the discrete ordinates method for the angular variable and a nodal method for the spatial dependence. The Salome platform is a graphical environment designed for building, editing and simulating mechanical models, mainly focused on industry; unlike other software, it can integrate and control an external source code so as to form a complete scheme of pre- and post-processing of information. Before the integration into the Salome platform, the TNXYZ code was upgraded. TNXYZ was programmed in the 90s using a Fortran 77 compiler; for this reason the code was adapted to the characteristics of current Fortran compilers. In addition, with the intention of extracting partial results over the process sequence, the original structure of the program underwent a modularization process, i.e. the main program was divided into sections where the code performs major operations. This procedure is controlled by the supervision module (YACS) of the Salome platform, and it could be useful for a subsequent coupling with thermal-hydraulics codes. Finally, with the help of the Monte Carlo code Serpent, several study cases were defined in order to check the integration; the verification consisted of comparing the results obtained with the code executed stand-alone against those obtained after it was modernized, integrated and controlled by the Salome platform. (Author)

  10. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  11. Combining independent de novo assemblies optimizes the coding transcriptome for nonconventional model eukaryotic organisms.

    Science.gov (United States)

    Cerveau, Nicolas; Jackson, Daniel J

    2016-12-09

    Next-generation sequencing (NGS) technologies are arguably the most revolutionary technical development to join the list of tools available to molecular biologists since PCR. For researchers working with nonconventional model organisms one major problem with the currently dominant NGS platform (Illumina) stems from the obligatory fragmentation of nucleic acid material that occurs prior to sequencing during library preparation. This step creates a significant bioinformatic challenge for accurate de novo assembly of novel transcriptome data. This challenge becomes apparent when a variety of modern assembly tools (of which there is no shortage) are applied to the same raw NGS dataset. With the same assembly parameters these tools can generate markedly different assembly outputs. In this study we present an approach that generates an optimized consensus de novo assembly of eukaryotic coding transcriptomes. This approach does not represent a new assembler, rather it combines the outputs of a variety of established assembly packages, and removes redundancy via a series of clustering steps. We test and validate our approach using Illumina datasets from six phylogenetically diverse eukaryotes (three metazoans, two plants and a yeast) and two simulated datasets derived from metazoan reference genome annotations. All of these datasets were assembled using three currently popular assembly packages (CLC, Trinity and IDBA-tran). In addition, we experimentally demonstrate that transcripts unique to one particular assembly package are likely to be bioinformatic artefacts. For all eight datasets our pipeline generates more concise transcriptomes that in fact possess more unique annotatable protein domains than any of the three individual assemblers we employed. Another measure of assembly completeness (using the purpose built BUSCO databases) also confirmed that our approach yields more information. Our approach yields coding transcriptome assemblies that are more likely to be

  12. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    International Nuclear Information System (INIS)

    Szplet, R; Jachna, Z; Kwiatkowski, P; Rozyc, K

    2013-01-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device Spartan-6 (Xilinx). To obtain both high precision and wide measurement range the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCL). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter the coarse resolution of the counting method equal to 2 ns (period of the 500 MHz reference clock) is firstly improved fourfold in the FIS and next even more than 400 times in the SIS. The proposed solution allows us to overcome the technological limitation in achievable resolution and improve the precision of conversion of integrated interpolators based on tapped delay lines. (paper)
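
The virtual equivalent coding line (ECL) idea, merging the taps of several independent coding lines (TCLs) so that the combined line resolves finer time bins than any single line, can be illustrated with a toy model. The tap spacings and offsets below are invented for illustration and are not the counter's real delays:

```python
# Toy model: several independent tapped delay lines (TCLs) whose tap
# positions are slightly offset merge into one "equivalent coding line"
# with finer time bins. All numbers are illustrative only.

def equivalent_bins(lines):
    # Each line is a list of tap times (ns); the union of all taps
    # defines the bins of the virtual equivalent line.
    taps = sorted(set(t for line in lines for t in line))
    widths = [b - a for a, b in zip(taps, taps[1:])]
    return taps, widths

# Four hypothetical lines with 0.4 ns tap spacing, offset by 0.1 ns.
lines = [[round(off + 0.4 * k, 3) for k in range(5)]
         for off in (0.0, 0.1, 0.2, 0.3)]
taps, widths = equivalent_bins(lines)
# The merged line resolves 0.1 ns bins although each single line
# only provides 0.4 ns.
```

This mirrors the abstract's observation that the ECL resolution improves with the number of TCLs combined.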

  13. 2009 Feedstocks Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program‘s Feedstock platform review meeting, held on April 8–10, 2009, at the Grand Hyatt Washington, Washington, D.C.

  14. Performance awareness: execution performance of HEP codes on RISC platforms, issues and solutions

    CERN Document Server

    Yaari, R; Yaari, Refael; Jarp, Sverre

    1995-01-01

    The work described in this paper was started during the migration of Aleph's production jobs from the IBM mainframe/CRAY supercomputer to several RISC/Unix workstation platforms. The aim was to understand why Aleph did not obtain the performance on the RISC platforms that was "promised" after a CERN Unit comparison between these RISC platforms and the IBM mainframe. Remedies were also sought. Since the work with the Aleph jobs in turn led to the related task of understanding compilers and their options, the conditions under which the CERN benchmarks (and other benchmarks) were run, kernel routines and frequently used CERNLIB routines, the whole undertaking expanded to try to look at all the factors that influence the performance of High Energy Physics (HEP) jobs in general. Finally, key performance issues were reviewed against the programs of one of the LHC collaborations (Atlas) with the hope that the conclusions would be of long-term interest during the establishment of their simulation, reconstruction and...

  15. 2011 Biomass Program Platform Peer Review: Feedstock

    Energy Technology Data Exchange (ETDEWEB)

    McCann, Laura [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Feedstock Platform Review meeting.

  16. 2011 Biomass Program Platform Peer Review. Sustainability

    Energy Technology Data Exchange (ETDEWEB)

    Eng, Alison Goss [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Sustainability Platform Review meeting.

  17. 2011 Biomass Program Platform Peer Review. Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Lindauer, Alicia [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Infrastructure Platform Review meeting.

  18. 2011 Biomass Program Platform Peer Review: Algae

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Joyce [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Algae Platform Review meeting.

  19. 2011 Biomass Program Platform Peer Review: Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haq, Zia [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Analysis Platform Review meeting.

  20. Hardware/Software Co-design for Heterogeneous Multi-core Platforms: The hArtes Toolchain

    CERN Document Server

    2012-01-01

    This book describes the results and outcome of the FP6 project known as hArtes, which focuses on the development of an integrated tool chain targeting a heterogeneous multi-core platform comprising a general purpose processor (ARM or PowerPC), a DSP (the Diopsis) and an FPGA. The tool chain takes existing source code and proposes transformations and mappings such that legacy code can easily be ported to a modern, multi-core platform. Benefits of the hArtes approach, described in this book, include: Uses a familiar programming paradigm: hArtes proposes a familiar programming paradigm which is compatible with widely used programming practice, irrespective of the target platform. Enables users to view multiple cores as a single processor: the hArtes approach abstracts away the heterogeneity as well as the multi-core aspect of the underlying hardware, so the developer can view the platform as consisting of a single, general purpose processor. Facilitates easy porting of existing applications: hArtes provid...

  1. Lattice QCD simulations using the OpenACC platform

    International Nuclear Information System (INIS)

    Majumdar, Pushan

    2016-01-01

    In this article we will explore the OpenACC platform for programming Graphics Processing Units (GPUs). The OpenACC platform offers a directive based programming model for GPUs which avoids the detailed data flow control and memory management necessary in a CUDA programming environment. In the OpenACC model, programs can be written in high level languages with OpenMP like directives. We present some examples of QCD simulation codes using OpenACC and discuss their performance on the Fermi and Kepler GPUs. (paper)

  2. Balanced distributed coding of omnidirectional images

    Science.gov (United States)

    Thirumalai, Vijayaraghavan; Tosic, Ivana; Frossard, Pascal

    2008-01-01

    This paper presents a distributed coding scheme for the representation of 3D scenes captured by stereo omni-directional cameras. We consider a scenario where images captured from two different viewpoints are encoded independently, with a balanced rate distribution among the different cameras. The distributed coding is built on multiresolution representation and partitioning of the visual information in each camera. The encoder transmits one partition after entropy coding, as well as the syndrome bits resulting from the channel encoding of the other partition. The decoder exploits the intra-view correlation and attempts to reconstruct the source image by combination of the entropy-coded partition and the syndrome information. At the same time, it exploits the inter-view correlation using motion estimation between images from different cameras. Experiments demonstrate that the distributed coding solution performs better than a scheme where images are handled independently, and that the coding rate stays balanced between encoders.

  3. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  4. Long non-coding RNA HOTAIR is an independent prognostic marker of metastasis in estrogen receptor-positive primary breast cancer

    DEFF Research Database (Denmark)

    Sørensen, Kristina P; Thomassen, Mads; Tan, Qihua

    2013-01-01

    Expression of HOX transcript antisense intergenic RNA (HOTAIR)-a long non-coding RNA-has been examined in a variety of human cancers, and overexpression of HOTAIR is correlated with poor survival among breast, colon, and liver cancer patients. In this retrospective study, we examine HOTAIR......-negative tumor samples, we are not able to detect a prognostic value of HOTAIR expression, probably due to the limited sample size. These results are successfully validated in an independent dataset with similar associations (P = 0.018, HR 1.825). In conclusion, our findings suggest that HOTAIR expression may...

  5. Pre-processing of input files for the AZTRAN code

    International Nuclear Information System (INIS)

    Vargas E, S.; Ibarra, G.

    2017-09-01

    The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Due to the complexity of generating an input file for the code, a script based on the D language was developed to make its elaboration easier. It is based on a new input file format with specific cards, divided into two blocks: mandatory cards and optional cards. The script includes a pre-processing of the input file to identify possible errors within it, as well as an image generator for the specific problem based on the Python interpreter. (Author)
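
The card-based pre-processing the abstract describes can be sketched as follows. The card names, the mandatory/optional split, and the error messages are hypothetical illustrations, not AZTRAN's actual input format:

```python
# Hypothetical sketch of input-deck pre-processing: cards are split
# into mandatory and optional blocks and checked for missing or
# unknown entries before the solver runs. Card names are invented.

MANDATORY = {"GEOMETRY", "MATERIALS", "QUADRATURE"}
OPTIONAL = {"OUTPUT", "PLOT"}

def preprocess(deck_text):
    cards = {}
    for line in deck_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        name, _, body = line.partition(" ")
        cards[name.upper()] = body
    missing = MANDATORY - cards.keys()
    unknown = cards.keys() - MANDATORY - OPTIONAL
    errors = [f"missing mandatory card: {c}" for c in sorted(missing)]
    errors += [f"unknown card: {c}" for c in sorted(unknown)]
    return cards, errors

deck = """
GEOMETRY 10 10 10
MATERIALS fuel clad water
PLOT xy
"""
cards, errors = preprocess(deck)
# errors reports the missing mandatory QUADRATURE card.
```

Catching such errors before the transport solver starts is exactly the benefit the abstract claims for the pre-processor.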

  6. Analysis of offshore platforms lifting with fixed pile structure type (fixed platform) based on ASD89

    Science.gov (United States)

    Sugianto, Agus; Indriani, Andi Marini

    2017-11-01

    The GTS (Gathering Testing Satellite) platform is an offshore construction of the fixed-pile (fixed platform) type that supports the mining of petroleum exploitation. After fabrication, the platform was moved onto barges and then shipped to the installation site. The moving process is generally done by pulling or pushing, following the construction design determined during planning; but when lifting equipment (cranes) is available in the work area, the move can instead be done by lifting, so that the operation can be completed more quickly. This paper analyzes moving the GTS platform in this less usual way, by lifting; the central question is whether structural reinforcement is required so that the construction can be moved safely. The working stresses that occur during the lifting process were analyzed and checked against the AISC code standard using the SAP2000 structural analysis program. The analysis showed that in its existing condition the platform cannot be moved by lifting, because the stress ratio exceeds the maximum allowable value of 0.950 (AISC-ASD89): overstress occurs in members 295 and 324, with stress ratios of 0.97 and 0.95, so structural reinforcement is required. Applying box plates to both members reduces the stress ratios to 0.78 for member 295 and 0.77 for member 324. These results indicate that, with this reinforcement, the construction qualifies to be moved by lifting.

  7. cMapper: gene-centric connectivity mapper for EBI-RDF platform.

    Science.gov (United States)

    Shoaib, Muhammad; Ansari, Adnan Ahmad; Ahn, Sung-Min

    2017-01-15

    In this era of biological big data, data integration has become a common task and a challenge for biologists. The Resource Description Framework (RDF) was developed to enable interoperability of heterogeneous datasets. The EBI-RDF platform enables efficient data integration of six independent biological databases using RDF technologies and shared ontologies. However, to take advantage of this platform, biologists need to be familiar with RDF technologies and the SPARQL query language. To overcome this practical limitation of the EBI-RDF platform, we developed cMapper, a web-based tool that enables biologists to search the EBI-RDF databases in a gene-centric manner without a thorough knowledge of RDF and SPARQL. cMapper allows biologists to search data entities in the EBI-RDF platform that are connected to genes or small molecules of interest in multiple biological contexts. The input to cMapper consists of a set of genes or small molecules, and the output is a set of data entities in six independent EBI-RDF databases connected with the given genes or small molecules in the user's query. cMapper presents its output as a graph in which nodes represent data entities and edges represent connections between data entities and the input set of genes or small molecules. Furthermore, users can apply filters based on database, taxonomy, organ and pathways in order to focus on a core connectivity graph of their interest. Data entities from multiple databases are differentiated by background colors. cMapper also enables users to investigate shared connections between genes or small molecules of interest. Users can view the output graph in a web browser or download it in either GraphML or JSON format. cMapper is available as a web application with an integrated MySQL database. The web application was developed using Java and deployed on a Tomcat server. We developed the user interface using HTML5, JQuery and the Cytoscape Graph API. cMapper can be accessed at
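
The graph output cMapper describes (nodes for data entities, edges back to the queried genes, database-based filtering, JSON export) might look schematically like this. The field names and identifiers are assumptions for illustration, not cMapper's actual schema:

```python
import json

# Sketch of a gene-centric connectivity graph: nodes are data entities
# tagged with their source database, edges link them to queried genes.
# Schema and identifiers are invented for illustration.

def build_graph(query_genes, hits):
    nodes = [{"id": g, "type": "gene"} for g in query_genes]
    edges = []
    for entity, database, gene in hits:
        nodes.append({"id": entity, "type": "entity", "db": database})
        edges.append({"source": gene, "target": entity})
    return {"nodes": nodes, "edges": edges}

def filter_by_db(graph, db):
    # Keep the query genes plus entities from one database only,
    # mimicking the "filter to a core connectivity graph" feature.
    keep = {n["id"] for n in graph["nodes"]
            if n["type"] == "gene" or n.get("db") == db}
    return {"nodes": [n for n in graph["nodes"] if n["id"] in keep],
            "edges": [e for e in graph["edges"] if e["target"] in keep]}

graph = build_graph(["GENE1"], [("mol-17", "ChEMBL", "GENE1"),
                                ("rec-42", "Ensembl", "GENE1")])
core = filter_by_db(graph, "ChEMBL")
as_json = json.dumps(core)  # JSON export, as the tool offers
```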

  8. A Dual Coding View of Vocabulary Learning

    Science.gov (United States)

    Sadoski, Mark

    2005-01-01

    A theoretical perspective on acquiring sight vocabulary and developing meaningful vocabulary is presented. Dual Coding Theory assumes that cognition occurs in two independent but connected codes: a verbal code for language and a nonverbal code for mental imagery. The mixed research literature on using pictures in teaching sight vocabulary is…

  9. Comparison and validation of the results of the AZNHEX v.1.0 code with the MCNP code simulating the core of a fast reactor cooled with sodium

    International Nuclear Information System (INIS)

    Galicia A, J.; Francois L, J. L.; Bastida O, G. E.; Esquivel E, J.

    2016-09-01

    The development of the AZTLAN platform for the analysis and design of nuclear reactors is led by the Instituto Nacional de Investigaciones Nucleares (ININ) and divided into four working groups, which have well-defined activities to achieve significant progress in this project, individually and jointly. Among these working groups is the users group, whose main task is to use the codes that make up the AZTLAN platform and provide feedback to the developers, so that the final versions of the codes are efficient and at the same time reliable and easy to understand. In this paper we present the results provided by the AZNHEX v.1.0 code when simulating the core of a sodium-cooled fast reactor at steady state. The validation of these results is a fundamental part of the platform development and the responsibility of the users group, so in this research the results obtained with AZNHEX are compared and analyzed against those provided by the Monte Carlo code MCNP-5, software used and recognized worldwide. A description of the methodology used with MCNP-5 for the calculation of the variables of interest is also presented, together with the differences obtained with respect to the values calculated with AZNHEX. (Author)

  10. 2009 Biochemical Conversion Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Biochemical Conversion platform review meeting, held on April 14-16, 2009, at the Sheraton Denver Downtown, Denver, Colorado.

  11. A Platform for Simulating Language Evolution

    Science.gov (United States)

    Vogel, Carl; Woods, Justin

    A platform for conducting experiments in the simulation of natural language evolution is presented. The system is parameterized for independent specification of important features such as the number of agents, communication attempt frequency, agent short-term memory capacity, and communicative urgency. Representative experiments are demonstrated.

  12. 2009 Thermochemical Conversion Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Thermochemical Conversion platform review meeting, held on April 14-16, 2009, at the Sheraton Denver Downtown, Denver, Colorado.

  13. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subject to hard conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in hard conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. That inevitably leads to scattering of the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, in association with deterministic evaluations with computer codes, probabilistic and statistical methods in order to predict the structural component response. This work initiates the extension of the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting from the deterministic analysis performed with the CANTUP computer code, a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose the structure of the deterministic CANTUP computer code was reviewed. The code was adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation.
All the values of these properties obtained for all the values for

  14. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  15. Modality independence of order coding in working memory: Evidence from cross-modal order interference at recall.

    Science.gov (United States)

    Vandierendonck, André

    2016-01-01

    Working memory researchers do not agree on whether order in serial recall is encoded by dedicated modality-specific systems or by a more general modality-independent system. Although previous research supports the existence of autonomous modality-specific systems, it has been shown that serial recognition memory is prone to cross-modal order interference by concurrent tasks. The present study used a serial recall task, which was performed in a single-task condition and in a dual-task condition with an embedded memory task in the retention interval. The modality of the serial task was either verbal or visuospatial, and the embedded tasks were in the other modality and required either serial or item recall. Care was taken to avoid modality overlaps during presentation and recall. In Experiment 1, visuospatial but not verbal serial recall was more impaired when the embedded task was an order than when it was an item task. Using a more difficult verbal serial recall task, verbal serial recall was also more impaired by another order recall task in Experiment 2. These findings are consistent with the hypothesis of modality-independent order coding. The implications for views on short-term recall and the multicomponent view of working memory are discussed.

  16. Design Patterns for Sparse-Matrix Computations on Hybrid CPU/GPU Platforms

    Directory of Open Access Journals (Sweden)

    Valeria Cardellini

    2014-01-01

    We apply object-oriented software design patterns to develop code for scientific software involving sparse matrices. Design patterns arise when multiple independent developments produce similar designs which converge onto a generic solution. We demonstrate how to use design patterns to implement an interface for sparse matrix computations on NVIDIA GPUs starting from PSBLAS, an existing sparse matrix library, and from existing sets of GPU kernels for sparse matrices. We also compare the throughput of the PSBLAS sparse matrix–vector multiplication on two platforms exploiting the GPU with that obtained by a CPU-only PSBLAS implementation. Our experiments exhibit encouraging results regarding the comparison between CPU and GPU executions in double precision, obtaining a speedup of up to 35.35 on NVIDIA GTX 285 with respect to AMD Athlon 7750, and up to 10.15 on NVIDIA Tesla C2050 with respect to Intel Xeon X5650.
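
The kernel at the center of this CPU/GPU comparison is the sparse matrix-vector product. A minimal sketch for the common CSR (compressed sparse row) storage format is shown below; a GPU backend would expose the same interface, which is the design-pattern point, with a different kernel underneath:

```python
# Sparse matrix-vector product y = A*x for a matrix stored in CSR
# format: row_ptr marks where each row's entries start, col_idx and
# vals hold the column index and value of each nonzero.

def csr_matvec(row_ptr, col_idx, vals, x):
    y = [0.0] * (len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += vals[k] * x[col_idx[k]]
    return y

# 3x3 example matrix: [[2, 0, 1], [0, 3, 0], [4, 0, 5]]
row_ptr = [0, 2, 3, 5]
col_idx = [0, 2, 1, 0, 2]
vals = [2.0, 1.0, 3.0, 4.0, 5.0]
y = csr_matvec(row_ptr, col_idx, vals, [1.0, 1.0, 1.0])
```

Keeping this operation behind a stable interface is what lets a library like PSBLAS swap a CPU loop for a GPU kernel without changing client code.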

  17. Bit-wise arithmetic coding for data compression

    Science.gov (United States)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
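
The core idea, fixed-length codewords whose bits are modeled as independent so that an arithmetic coder spends roughly the binary entropy H(p_j) bits on bit position j, can be illustrated by estimating that ideal rate rather than implementing a full coder. The 3-bit source below is invented for illustration:

```python
import math

# Estimate the ideal rate of bit-wise arithmetic coding: each bit
# position j of the fixed-length codewords is modeled independently,
# costing about H(p_j) bits, where p_j is the fraction of ones.

def bit_plane_rate(symbols, width):
    n = len(symbols)
    rate = 0.0
    for j in range(width):
        ones = sum((s >> j) & 1 for s in symbols)
        p = ones / n
        if 0 < p < 1:
            rate += -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return rate  # expected bits/symbol under the independence model

# Skewed 3-bit source: small magnitudes dominate, so the high bit
# plane is all zeros and the total cost falls well below 3 bits/symbol.
symbols = [0] * 60 + [1] * 20 + [2] * 12 + [3] * 8
rate = bit_plane_rate(symbols, 3)
```

The gap between this rate and the raw codeword length is what the article's block-adaptive scheme exploits, at the price of the overhead it evaluates.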

  18. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  19. Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform

    Science.gov (United States)

    Roche, Rigoberto J.; Shalkhauser, Mary Jo; Hickey, Joseph P.; Briones, Janette C.

    2016-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. The NASA Glenn Research Center (GRC) team made a software defined radio (SDR) platform STRS compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper, capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort serves to provide a framework for waveform development on an STRS compliant platform to support future space communication systems for advanced exploration missions. The use of validated STRS compliant applications provides tested code with extensive documentation to potentially reduce risk, cost and effort in the development of space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto a Reconfigurable, Intelligently-Adaptive, Communication System (RIACS) SDR platform, and the test waveform and wrapper development efforts. The paper emphasizes the infusion of the STRS Architecture onto the RIACS platform for potential use in next generation flight system SDRs for advanced exploration missions.

  20. Upgrade of The Cyber R and D Platform

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Chang Seong; Lee, Kyung Jong; Yang, Tae Chun; Kim, Tae Sung; Hong, Hyun Joo [Kyungsung Univ., Seoul (Korea, Republic of)

    2007-02-15

    Recently it has become necessary to demonstrate the performance assessment program and let people experience the safety of radiation disposal by themselves. The objective of this research is to develop a cyber-based performance assessment program for radiation disposal suited to the Korean environment. This research covers the following four areas: - Development of a Java-based MDPSA pre-processor - Linking the MDPSA code to the pre- and post-processor - Linking the MDPSA code to the Cyber R and D platform - Modification of the Cyber R and D platform. The results of this research can be used as a PR database for the central and local governments. Using the web-based system, any person or interest group can plug into the system and experience the safety and clarity of atomic energy and radiation disposal. Also, within KAERI, research-related knowledge can be stored in a structured format. This enables the sharing, reusability, transparency, reliability and transferability of research results, and promotes the efficiency of research efforts within and outside of the research team.

  1. Upgrade of The Cyber R and D Platform

    International Nuclear Information System (INIS)

    Ko, Chang Seong; Lee, Kyung Jong; Yang, Tae Chun; Kim, Tae Sung; Hong, Hyun Joo

    2007-02-01

    Recently, a need has arisen to let the public run the assessment program and experience the safety of radioactive waste disposal for themselves. The objective of this research is to develop a cyber-based performance assessment program for radioactive waste disposal suited to the Korean environment. This research covers the following four areas: - Development of a Java-based MDPSA pre-processor - Linking the MDPSA code to the pre- and post-processor - Linking the MDPSA code to the Cyber R and D platform - Modification of the Cyber R and D platform. The results of this research can be used as a PR database for central and local government. Using the web-based system, any person or interest group can plug into the system and experience the safety and transparency of atomic energy and radioactive waste disposal. Also, within KAERI, research-related knowledge can be stored in a structured format. This enables the sharing, reusability, transparency, reliability and transferability of research results, and promotes the efficiency of research efforts within and outside the research team

  2. Feasibility analysis of the modified ATHLET code for supercritical water cooled systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhou Chong, E-mail: ch.zhou@sjtu.edu.cn [School of Nuclear Science and Engineering, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai 200240 (China); Institute of Fusion and Reactor Technology, Karlsruhe Institute of Technology, Vincenz-Priessnitz-Str. 3, 76131 Karlsruhe (Germany); Yang Yanhua [School of Nuclear Science and Engineering, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai 200240 (China); Cheng Xu [Institute of Fusion and Reactor Technology, Karlsruhe Institute of Technology, Vincenz-Priessnitz-Str. 3, 76131 Karlsruhe (Germany)

    2012-09-15

    Highlights: • Modification of the system code ATHLET for supercritical water applications. • Development and assessment of a heat transfer package for supercritical water. • Validation of the modified code at supercritical pressures against the theoretical point-hydraulics model and the SASC code. • Application of the modified code to LOCA analysis of a supercritical water cooled in-pile fuel qualification test loop. - Abstract: Since the existing thermal-hydraulic computer codes for light water reactors are not applicable to supercritical water cooled reactors (SCWRs), owing to the limitations of their physical models and numerical treatments, the development of a reliable thermal-hydraulic computer code is very important for the design analysis and safety assessment of SCWRs. Based on an earlier modification of ATHLET for SCWRs, a general interface is implemented in the code, which serves as the platform for information exchange between ATHLET and external, independent physical modules. A heat transfer package containing five correlations for supercritical water is connected to the ATHLET code through the interface. The correlations are assessed against experimental data. To verify the modified ATHLET code, the Edwards-O'Brian blow-down test is simulated. As a first validation at supercritical pressures, a simplified supercritical water cooled loop is modeled and its stability behavior is analyzed. Results are compared with those of the theoretical model and the SASC code in the reference and show good agreement. To evaluate its feasibility, the modified ATHLET code is applied to a supercritical water cooled in-pile fuel qualification test loop. Loss of coolant accidents (LOCAs) due to breaks of the coolant supply lines are calculated for the loop. Sensitivity analysis of some safety system parameters is performed to gain further knowledge about their influence on the function of the
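
    The plug-in arrangement described above, in which a general interface lets the host code exchange information with external, independent physics modules, can be sketched as a small correlation registry. This is a minimal illustration, not ATHLET's actual interface; the classic Dittus-Boelter form is used only as a stand-in for the five supercritical-water correlations mentioned in the abstract.

```python
# Sketch of a plug-in interface through which a system code can call
# external heat transfer correlations; names and the correlation form
# are illustrative assumptions, not taken from ATHLET itself.

class CorrelationInterface:
    """Registry that decouples the host code from external physics modules."""

    def __init__(self):
        self._correlations = {}

    def register(self, name, func):
        self._correlations[name] = func

    def nusselt(self, name, re, pr):
        # The host code only knows the interface, not the module internals.
        return self._correlations[name](re, pr)

def dittus_boelter(re, pr):
    # Classic single-phase correlation, used here purely as a placeholder.
    return 0.023 * re ** 0.8 * pr ** 0.4

interface = CorrelationInterface()
interface.register("dittus_boelter", dittus_boelter)
nu = interface.nusselt("dittus_boelter", re=1.0e5, pr=1.0)
```

    Swapping in a different correlation then requires only a new `register` call, leaving the host code untouched, which is the point of the interface described in the record.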

  3. 2011 Biomass Program Platform Peer Review. Integrated Biorefineries

    Energy Technology Data Exchange (ETDEWEB)

    Rossmeissl, Neil [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s IBR Platform Review meeting.

  4. 2011 Biomass Program Platform Peer Review: Biochemical Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Pezzullo, Leslie [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Biochemical Conversion Platform Review meeting.

  5. 2011 Biomass Program Platform Peer Review. Thermochemical Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Grabowski, Paul E. [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Thermochemical Conversion Platform Review meeting.

  6. 2009 Integrated Biorefinery Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program‘s Integrated Biorefinery (IBR) platform review meeting, held on February 18–19, 2009, at the Westin National Harbor, National Harbor, Maryland.

  7. Massively parallel Fokker-Planck code ALLAp

    International Nuclear Information System (INIS)

    Batishcheva, A.A.; Krasheninnikov, S.I.; Craddock, G.G.; Djordjevic, V.

    1996-01-01

    The Fokker-Planck code ALLA, recently developed for workstations, simulates the temporal evolution of 1V, 2V and 1D2V collisional edge plasmas. In this work we present the results of parallelizing the code on the CRI T3D massively parallel platform (the ALLAp version). We also benchmark the 1D2V parallel version against an analytic self-similar solution of the collisional kinetic equation. This test is not trivial, as it demands a very strong spatial temperature and density variation within the simulation domain. (orig.)

  8. Pro Smartphone Cross-Platform Development IPhone, Blackberry, Windows Mobile, and Android Development and Distribution

    CERN Document Server

    Allen, Sarah; Lundrigan, Lee

    2010-01-01

    Learn the theory behind cross-platform development, and put the theory into practice with code using the invaluable information presented in this book. With in-depth coverage of development and distribution techniques for iPhone, BlackBerry, Windows Mobile, and Android, you'll learn the native approach to working with each of these platforms. With detailed coverage of emerging frameworks like PhoneGap and Rhomobile, you'll learn the art of creating applications that will run across all devices. You'll also be introduced to the code-signing process and the distribution of applications through t

  9. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system
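
    The modular structure described above, in which each component module honors a stable input/output contract and can run alone or chained into a system, can be sketched as follows. The module names and the placeholder arithmetic are illustrative assumptions, not the actual ETF models.

```python
# Sketch of a modular system-code structure: each module reads and writes a
# shared parameter dictionary, so a module can be updated or run alone as long
# as its inputs/outputs stay unchanged. The formulas are placeholders only.

def tf_coil_module(params):
    # Placeholder scaling: coil cost grows with field squared and major radius.
    params["tf_coil_cost"] = 0.5 * params["b_field"] ** 2 * params["major_radius"]
    return params

def cost_module(params):
    # Placeholder aggregation of component costs into a total.
    params["total_cost"] = params["tf_coil_cost"] + params.get("other_cost", 0.0)
    return params

def run_system(params, modules):
    for module in modules:
        params = module(params)
    return params

# Run one module alone for a component study...
single = tf_coil_module({"b_field": 10.0, "major_radius": 6.0})
# ...or chain modules as a system to capture global effects.
system = run_system({"b_field": 10.0, "major_radius": 6.0, "other_cost": 100.0},
                    [tf_coil_module, cost_module])
```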

  10. Overview of codes and tools for nuclear engineering education

    Science.gov (United States)

    Yakovlev, D.; Pryakhin, A.; Medvedeva, L.

    2017-01-01

    Recent world trends in nuclear education have developed in the direction of social education, networking, virtual tools and codes. MEPhI, as a global leader in the world education market, implements new advanced technologies for distance and online learning and for student research work. MEPhI has produced special codes, tools and web resources based on an internet platform to support education in the field of nuclear technology. At the same time, MEPhI actively uses codes and tools from third parties. Several types of tools are considered: calculation codes, nuclear data visualization tools, virtual labs, PC-based educational simulators for nuclear power plants (NPPs), CLP4NET, educational web platforms, and distance courses (MOOCs and controlled and managed content systems). The university pays special attention to integrated products such as CLP4NET, which is not a learning course but serves to automate the process of learning through distance technologies. CLP4NET organizes all tools in the same information space. Up to now, MEPhI has achieved significant results in the field of distance education and online system implementation.

  11. IP-MLI: An Independency of Learning Materials from Platforms in a Mobile Learning using Intelligent Method

    Directory of Open Access Journals (Sweden)

    Mohammed Abdallh Otair

    2006-06-01

    Full Text Available Attempting to deliver a monolithic mobile learning system is too inflexible in view of the heterogeneous mixture of hardware and services available, the desirability of facilitating blended approaches to learning delivery, and the difficulty of building learning materials that run on all platforms [1]. This paper proposes a framework for a mobile learning system using an intelligent method (IP-MLI). A fuzzy matching method is used to find a suitable learning material design, providing the best match for each specific platform type and each learner. The main contribution of the proposed method is the use of a software layer to insulate learning materials from device-specific features. Consequently, many versions of learning materials can be designed to work on many platform types.
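
    The paper does not publish its exact fuzzy matching method, but the idea of scoring learning-material designs against a device profile can be sketched with triangular membership functions; all attributes, thresholds and material names below are assumptions made for illustration.

```python
# Hypothetical sketch of fuzzy matching between learning-material versions and
# a device profile; the attributes and membership functions are assumptions.

def triangular(x, low, peak, high):
    """Triangular fuzzy membership in [0, 1]."""
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

def match_score(device, material):
    # Degree to which the device screen suits the material's target size.
    screen = triangular(device["screen_inches"],
                        material["min_screen"], material["ideal_screen"],
                        material["max_screen"])
    # Degree to which the device bandwidth suits the material's media weight.
    bw = triangular(device["bandwidth_mbps"],
                    material["min_bw"], material["ideal_bw"], material["max_bw"])
    return min(screen, bw)  # fuzzy AND

materials = {
    "text_only":  {"min_screen": 1, "ideal_screen": 3, "max_screen": 7,
                   "min_bw": 0, "ideal_bw": 0.5, "max_bw": 10},
    "rich_media": {"min_screen": 4, "ideal_screen": 7, "max_screen": 12,
                   "min_bw": 2, "ideal_bw": 8, "max_bw": 100},
}
phone = {"screen_inches": 2.5, "bandwidth_mbps": 0.4}
best = max(materials, key=lambda m: match_score(phone, materials[m]))
```

    For this small phone profile the low-bandwidth, small-screen material wins, which is the kind of per-platform selection the framework's software layer would perform.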

  12. Continuous-variable quantum erasure correcting code

    DEFF Research Database (Denmark)

    Lassen, Mikael Østergaard; Sabuncu, Metin; Huck, Alexander

    2010-01-01

    We experimentally demonstrate a continuous variable quantum erasure-correcting code, which protects coherent states of light against complete erasure. The scheme encodes two coherent states into a bi-party entangled state, and the resulting 4-mode code is conveyed through 4 independent channels...

  13. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant, and they are spread across different institutions and private companies. Time-consuming operations like data processing and information harmonisation are the main reasons the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground parkings, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others are continuously modified. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and subsequent behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has now become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages like GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) - financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis

  14. Morphology Independent Learning in Modular Robots

    DEFF Research Database (Denmark)

    Christensen, David Johan; Bordignon, Mirko; Schultz, Ulrik Pagh

    2009-01-01

    speed its modules independently and in parallel adjust their behavior based on a single global reward signal. In simulation, we study the learning strategy’s performance on different robot configurations. On the physical platform, we perform learning experiments with ATRON robots learning to move as fast...

  16. Developing an online platform for gamified library instruction

    Directory of Open Access Journals (Sweden)

    Jared Cowing

    2017-01-01

    Full Text Available Gamification is a concept that has been catching fire for a while now in education, particularly in libraries. This article describes a pilot effort to create an online gamified platform for use in the Woodbury University Library’s information literacy course. The objectives of this project were both to increase student engagement and learning and to serve as an opportunity for me to further develop my web development skills. The platform was developed using the CodeIgniter web framework and consisted of several homework exercises ranging from a top-down two-dimensional library exploration game to a tutorial on cleaning up machine-generated APA citations. This article details the project’s planning and development process, the gamification concepts that helped guide the conceptualization of each exercise, reflections on the platform’s implementation in four course sections, and aspirations for the future of the project. It is hoped that this article will serve as an example of the opportunities, and challenges, that await both librarians and instructors who wish to add coding to their existing skill set.

  17. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  18. DEVELOPMENT OF A SALES APPLICATION FOR PREPAID ELECTRICITY VOUCHERS BASED ON THE ANDROID PLATFORM USING QUICK RESPONSE CODE (QR CODE)

    Directory of Open Access Journals (Sweden)

    Ricky Akbar

    2017-09-01

    Full Text Available Perusahaan Listrik Negara (PLN) has implemented a smart electricity system, i.e. prepaid electricity. Customers pay for an electricity voucher before using the electricity. The token contained in the voucher purchased by the customer is entered into the Meter Prabayar (MPB) installed at the customer's location. When customers purchase a voucher, they receive a receipt containing their identity and the 20-digit voucher code (token) to be entered into the MPB in exchange for electrical energy credit. The receipt obtained by the customer is, of course, vulnerable to loss or misuse by unauthorized parties. In this study, the authors designed and developed an Android-based application that uses QR code technology as a replacement for the paper receipt, encoding the customer's identity and the 20-digit voucher code. The application was developed following the waterfall methodology: (1) analysis of the functional requirements of the system through a preliminary study and data collection based on field studies and literature; (2) system design using UML diagrams, Business Process Model and Notation (BPMN) and Entity Relationship Diagrams (ERD); (3) implementation using object-oriented programming (OOP) techniques, with the web application built on the Laravel PHP framework and a MySQL database, and the mobile application built with B4A; (4) testing of the developed system using the black-box method. The final result of this research is a set of web and mobile applications for the sale of electricity vouchers using QR code technology.
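
    A minimal sketch of the payload such a voucher QR code might carry: the customer identity plus the 20-digit token. The field names and JSON layout are assumptions, since the paper does not publish its data format; actual QR image rendering would be delegated to a QR library and is omitted here.

```python
import json

# Hypothetical sketch of a voucher QR payload: customer identity plus the
# 20-digit token. Field names and JSON layout are assumptions, not PLN's or
# the paper's actual format; QR rendering itself is out of scope.

def build_payload(customer_id, customer_name, token):
    if len(token) != 20 or not token.isdigit():
        raise ValueError("token must be exactly 20 digits")
    return json.dumps({"customer_id": customer_id,
                       "customer_name": customer_name,
                       "token": token})

def read_token(payload):
    # The app scans the QR code, decodes the payload, and extracts the token
    # the customer keys into the prepaid meter (MPB).
    return json.loads(payload)["token"]

payload = build_payload("471234567890", "A. Customer", "12345678901234567890")
token = read_token(payload)
```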

  19. Recent advances in neutral particle transport methods and codes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    An overview of ORNL's three-dimensional neutral particle transport code, TORT, is presented. Special features of the code that make it invaluable for large applications are summarized for the prospective user. Advanced capabilities currently under development and installation in the production release of TORT are discussed; they include: multitasking on Cray platforms running the UNICOS operating system; the Adjacent-cell Preconditioning acceleration scheme; and graphics codes for displaying computed quantities such as the flux. Further developments of TORT and its companion codes to enhance its present capabilities and expand its range of applications are discussed. Speculation on the next generation of neutral particle transport codes at ORNL, especially regarding unstructured grids and high-order spatial approximations, is also offered

  20. Parallel computing by Monte Carlo codes MVP/GMVP

    International Nuclear Information System (INIS)

    Nagaya, Yasunobu; Nakagawa, Masayuki; Mori, Takamasa

    2001-01-01

    The general-purpose Monte Carlo codes MVP/GMVP are well vectorized and thus enable us to perform high-speed Monte Carlo calculations. In order to achieve further speedups, we parallelized the codes on different types of parallel computing platforms, or by using the standard parallelization library MPI. The platforms used for the benchmark calculations were a distributed-memory vector-parallel computer (Fujitsu VPP500), a distributed-memory massively parallel computer (Intel Paragon) and distributed-memory scalar-parallel computers (Hitachi SR2201, IBM SP2). As is generally the case, linear speedup could be obtained for large-scale problems, but parallelization efficiency decreased as the batch size per processing element (PE) became smaller. It was also found that the statistical uncertainty for assembly powers was less than 0.1% for the PWR full-core calculation with more than 10 million histories, which took about 1.5 hours with massively parallel computing. (author)
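
    The batch statistics behind the quoted 0.1% uncertainty can be illustrated with a toy Monte Carlo tally split into batches, the same decomposition a parallel run assigns to processing elements, computed serially here for simplicity. The tally (estimating pi/4) and the batch sizes are illustrative, not from MVP/GMVP.

```python
import random
import statistics

# Toy illustration of batch-wise Monte Carlo statistics: histories are split
# into batches (one batch per processing element in a parallel run; serial
# here), and the relative uncertainty of the batch means falls roughly as
# 1/sqrt(total histories). The tally estimates pi/4 and is a placeholder.

def run_batch(n_histories, rng):
    hits = sum(1 for _ in range(n_histories)
               if rng.random() ** 2 + rng.random() ** 2 < 1.0)
    return hits / n_histories

rng = random.Random(42)
batch_means = [run_batch(10_000, rng) for _ in range(20)]
mean = statistics.fmean(batch_means)
std_err = statistics.stdev(batch_means) / len(batch_means) ** 0.5
relative_uncertainty = std_err / mean
```

    Shrinking the per-batch history count while keeping the batch count fixed raises the relative uncertainty, mirroring the efficiency trade-off the record describes for small batch sizes per PE.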

  1. Determination of current loads of floating platform for special purposes

    Science.gov (United States)

    Ma, Guang-ying; Yao, Yun-long; Zhao, Chen-yao

    2017-08-01

    This article studied a new floating offshore platform for special purposes, assembled from standard floating modules. The environmental load calculation of the platform is an important part of ocean platform research and has long received attention from engineers. In addition to wave loads, wind loads and current loads are important environmental factors that affect the dynamic response of an offshore platform, and the current loads on the bottom structure should not be ignored. Using the Fluent software, the hydrostatic conditions and external current loads of the platform were calculated in this paper. The current force coefficient, which is independent of the current velocity, can be fitted from the computed current loads and used for the subsequent hydrodynamic and mooring analyses.
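
    The fitting step described above can be sketched under the usual drag-law assumption F = 0.5 * rho * Cd * A * v**2, with the coefficient recovered by least squares over the computed load points. The density, area and load data below are invented for illustration, not taken from the paper.

```python
# Sketch of extracting a velocity-independent current force coefficient from
# computed current loads, assuming the standard drag law
# F = 0.5 * rho * Cd * A * v**2. All numbers below are illustrative.

RHO = 1025.0   # seawater density, kg/m^3
AREA = 50.0    # projected area of the submerged structure, m^2 (assumed)

def fit_drag_coefficient(velocities, forces):
    # Least squares for F = k * v**2 through the origin:
    # k = sum(F * v**2) / sum(v**4), then Cd = k / (0.5 * rho * A).
    k = (sum(f * v ** 2 for v, f in zip(velocities, forces))
         / sum(v ** 4 for v in velocities))
    return k / (0.5 * RHO * AREA)

velocities = [0.5, 1.0, 1.5, 2.0]   # current speed, m/s
forces = [0.5 * RHO * 1.2 * AREA * v ** 2 for v in velocities]  # synthetic, Cd = 1.2
cd = fit_drag_coefficient(velocities, forces)
```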

  2. Motion-adaptive intraframe transform coding of video signals

    NARCIS (Netherlands)

    With, de P.H.N.

    1989-01-01

    Spatial transform coding has been widely applied for image compression because of its high coding efficiency. However, in many intraframe systems, in which every TV frame is independently processed, coding of moving objects in the case of interlaced input signals is not addressed. In this paper, we

  3. Validation of the AZTRAN 1.1 code with problems Benchmark of LWR reactors

    International Nuclear Information System (INIS)

    Vallejo Q, J. A.; Bastida O, G. E.; Francois L, J. L.; Xolocostli M, J. V.; Gomez T, A. M.

    2016-09-01

    The AZTRAN module is a computational program that is part of the AZTLAN platform (the Mexican modeling platform for the analysis and design of nuclear reactors) and that solves the neutron transport equation in three dimensions using the discrete ordinates method S_N, in steady state and Cartesian geometry. As part of the activities of Working Group 4 (the users group) of the AZTLAN project, this work validates the AZTRAN code using the 2002 Yamamoto benchmark for LWR reactors. For comparison, the commercial code CASMO-4 and the free code Serpent-2 are used; in addition, the results are compared with the data obtained from an article of the PHYSOR 2002 conference. The benchmark consists of a fuel pin, two UO_2 cells and two MOX cells, with one problem per cell for each reactor type, PWR and BWR. Although the AZTRAN code is at an early stage of development, the results obtained are encouraging and close to those reported with other internationally accepted codes and methodologies. (Author)
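
    The discrete ordinates (S_N) method that AZTRAN solves in three dimensions can be illustrated in its simplest one-dimensional slab form: an S2 diamond-difference sweep with source iteration. One group, isotropic scattering, vacuum boundaries; all parameter values are illustrative and this toy is in no way AZTRAN's implementation.

```python
# Toy 1-D slab, one-group S2 discrete ordinates solver with diamond
# differencing and source iteration -- a minimal illustration of the method
# family AZTRAN implements in 3-D. All parameters are illustrative.

def sn_slab(n_cells=50, width=5.0, sig_t=1.0, sig_s=0.5, q=1.0, tol=1e-8):
    dx = width / n_cells
    mus = [0.5773502691896258, -0.5773502691896258]  # S2 Gauss quadrature
    wts = [1.0, 1.0]                                  # weights sum to 2
    phi = [0.0] * n_cells                             # scalar flux
    for _ in range(500):                              # source iterations
        src = [(sig_s * p + q) / 2.0 for p in phi]    # isotropic source per mu
        phi_new = [0.0] * n_cells
        for mu, w in zip(mus, wts):
            cells = range(n_cells) if mu > 0 else range(n_cells - 1, -1, -1)
            psi_in = 0.0                              # vacuum boundary
            for i in cells:
                a = abs(mu) / dx
                # Diamond difference: psi_avg = (psi_in + psi_out) / 2
                psi_out = (src[i] + (a - sig_t / 2.0) * psi_in) / (a + sig_t / 2.0)
                phi_new[i] += w * 0.5 * (psi_in + psi_out)
                psi_in = psi_out
        converged = max(abs(x - y) for x, y in zip(phi, phi_new)) < tol
        phi = phi_new
        if converged:
            break
    return phi

flux = sn_slab()
```

    As a sanity check, the centerline flux stays below the infinite-medium value q/(sig_t - sig_s) = 2 because of leakage through the vacuum boundaries, and the profile is symmetric about the slab midplane.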

  4. Research and Design in Unified Coding Architecture for Smart Grids

    Directory of Open Access Journals (Sweden)

    Gang Han

    2013-09-01

    Full Text Available A standardized and shared information platform is the foundation of the Smart Grid. In order to improve the information integration of power grid dispatching centers and achieve efficient data exchange, sharing and interoperability, a unified coding architecture is proposed. The architecture includes a coding management layer, a coding generation layer, an information models layer and an application system layer. The hierarchical design allows the whole coding architecture to adapt to different application environments, different interfaces and loosely coupled requirements, realizing the integrated model management function of the power grid. The life cycle and survival evaluation method of the unified coding architecture are also proposed, to ensure its stability and availability. Finally, the future direction of coding technology for Smart Grids is discussed.

  5. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrating software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  6. Acceptance and validation test report for HANSF code version 1.3.2

    International Nuclear Information System (INIS)

    PIEPHO, M.G.

    2001-01-01

    The HANSF code, Version 1.3.2, is a stand-alone code that runs only in DOS. As a result, it runs on any Windows(trademark) platform, since each Windows platform can create a DOS-prompt window and execute HANSF in that window. The HANSF code is proprietary to Fauske and Associates, Inc. (FAI) of Burr Ridge, IL, the developers of the code. The SNF Project has a license from FAI to run the HANSF code on any computer, but only for work related to the SNF Project. The SNF Project owns the MCO.FOR routine, which is the main routine in HANSF for CVDF applications. The HANSF code calculates physical variables such as temperature, pressure, and oxidation rates due to chemical reactions of uranium metal/fuel with water or oxygen. The code is used by the Spent Nuclear Fuel (SNF) Project at Hanford; see, for example, the report Thermal Analysis of Cold Vacuum Drying of Spent Nuclear Fuel (HNF-SD-SNF-CN-023). The primary facilities of interest are the K-Basins, the Cold Vacuum Drying Facility (CVDF), the Canister Storage Building (CSB) and T Plant. The overall summary is presented in Section 2.0, variances in Section 3.0, the comprehensive assessment in Section 4.0, results in Section 5.0, the evaluation in Section 6.0, and the summary of activities in Section 7.0

  7. Empirical validation of the triple-code model of numerical processing for complex math operations using functional MRI and group Independent Component Analysis of the mental addition and subtraction of fractions.

    Science.gov (United States)

    Schmithorst, Vincent J; Brown, Rhonda Douglas

    2004-07-01

    The suitability of a previously hypothesized triple-code model of numerical processing, involving analog magnitude, auditory verbal, and visual Arabic codes of representation, was investigated for the complex mathematical task of the mental addition and subtraction of fractions. Functional magnetic resonance imaging (fMRI) data from 15 normal adult subjects were processed using exploratory group Independent Component Analysis (ICA). Separate task-related components were found with activation in bilateral inferior parietal, left perisylvian, and ventral occipitotemporal areas. These results support the hypothesized triple-code model corresponding to the activated regions found in the individual components and indicate that the triple-code model may be a suitable framework for analyzing the neuropsychological bases of the performance of complex mathematical tasks. Copyright 2004 Elsevier Inc.

  8. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.
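
    The role of a published code list can be sketched as a simple selection of EMR events whose codes appear in a study's list, which is the reproducibility step the repository is meant to support. The codes and records below are invented for illustration and styled loosely after Read-type codes; they are not from ClinicalCodes.

```python
import csv
import io

# Sketch of applying a published clinical code list to EMR event data:
# patients are flagged as cases when any of their event codes appears in
# the list. Codes and records here are invented.

diabetes_code_list = {"C10E.", "C10F."}   # hypothetical Read-style codes

emr_events = io.StringIO(
    "patient_id,event_code\n"
    "p1,C10E.\n"
    "p2,H33..\n"
    "p3,C10F.\n"
)

cases = {row["patient_id"]
         for row in csv.DictReader(emr_events)
         if row["event_code"] in diabetes_code_list}
```

    Publishing the `diabetes_code_list` alongside a study is precisely what lets a second team reproduce `cases` from the same database.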

  9. Analysis of the development of cross-platform mobile applications

    OpenAIRE

    Pinedo Escribano, Diego

    2012-01-01

    The development of mobile phone applications is a huge market nowadays. Many companies invest a lot of money to develop successful and profitable applications. The problem emerges when trying to develop an application to be used by every user independently of the platform they are using (Android, iOS, BlackBerry OS, Windows Phone, etc.). For this reason, in recent years many different technologies have appeared that make the development of cross-platform applications easier. In...

  10. User's and Programmer's Guide for HPC Platforms in CIEMAT; Guia de Utilizacion y programacion de las Plataformas de Calculo del CIEMAT

    Energy Technology Data Exchange (ETDEWEB)

    Munoz Roldan, A

    2003-07-01

    This Technical Report presents a description of the High Performance Computing platforms available to researchers at CIEMAT, dedicated mainly to scientific computing. It targets users and programmers and tries to help in the process of developing new code and porting code across platforms. A brief review is also presented of the historical evolution of the field of HPC, i.e., the programming paradigms and underlying architectures. (Author) 32 refs.

  11. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amount of data which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expression, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp).
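
    The condition-based filtering HTDP performs on character-delimited column data can be sketched in a few lines of Python. This is a hypothetical illustration using the standard csv module, not HTDP's actual implementation:

    ```python
    import csv
    import io

    def filter_delimited(text, column, predicate, delimiter="\t"):
        """Keep only the rows of character-delimited column data whose
        value in `column` satisfies `predicate` (header is preserved)."""
        reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
        kept = [row for row in reader if predicate(row[column])]
        return reader.fieldnames, kept

    # Toy BED-like table: chromosome, start, end
    data = "chrom\tstart\tend\nchr1\t100\t200\nchr2\t50\t80\nchr1\t500\t900\n"
    header, rows = filter_delimited(data, "chrom", lambda v: v == "chr1")
    # rows now holds the two chr1 records
    ```

    A tool like HTDP layers a GUI, multi-file merging, and externally supplied condition lists on top of this basic operation.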

  12. Current status and applications of integrated safety assessment and simulation code system for ISA

    Energy Technology Data Exchange (ETDEWEB)

    Izquierdo, J. M.; Hortal, J.; Perea, M. Sanchez; Melendez, E. [Modeling and Simulation Area (MOSI), Nuclear Safety Council (CSN), Madrid (Spain); Queral, E.; Rivas-Lewicky, J. [Energy and Fuels Department, Technical University of Madrid (UPM), Madrid (Spain)

    2017-03-15

    This paper reviews the current status of the unified approach known as integrated safety assessment (ISA), as well as the associated SCAIS (simulation code system for ISA) computer platform. These constitute a proposal resulting from collaborative work among the Nuclear Safety Council (CSN), the Technical University of Madrid (UPM), and NFQ Solutions S.L., aiming to allow independent regulatory verification of industry quantitative risk assessments. The paper elaborates on discussions of the classical treatment of time in conventional probabilistic safety assessment (PSA) sequences and states important conclusions that can be used to avoid systematic and unacceptable underestimation of failure exceedance frequencies. The unified ISA method meets this challenge by coupling deterministic and probabilistic mutual influences. The feasibility of the approach is illustrated with some examples of its application to a real-size plant.

  13. Diagnostic and prognostic signatures from the small non-coding RNA transcriptome in prostate cancer

    DEFF Research Database (Denmark)

    Martens-Uzunova, E S; Jalava, S E; Dits, N F

    2011-01-01

    Prostate cancer (PCa) is the most frequent male malignancy and the second most common cause of cancer-related death in Western countries. Current clinical and pathological methods are limited in the prediction of postoperative outcome. It is becoming increasingly evident that small non-coding RNA...... signatures of 102 fresh-frozen patient samples during PCa progression by miRNA microarrays. Both platforms were cross-validated by quantitative reverse transcriptase-PCR. Besides the altered expression of several miRNAs, our deep sequencing analyses revealed strong differential expression of small nucleolar...... RNAs (snoRNAs) and transfer RNAs (tRNAs). From microarray analysis, we derived a miRNA diagnostic classifier that accurately distinguishes normal from cancer samples. Furthermore, we were able to construct a PCa prognostic predictor that independently forecasts postoperative outcome. Importantly...

  14. Development of platform to compare different wall heat transfer packages for system analysis codes

    International Nuclear Information System (INIS)

    Kim, Min-Gil; Lee, Won Woong; Lee, Jeong Ik; Shin, Sung Gil

    2016-01-01

    System thermal hydraulic (STH) analysis codes are used for analyzing and evaluating the safety of a designed nuclear system. A system thermal hydraulic analysis code typically solves mass, momentum and energy conservation equations for multiple phases, with sets of selected empirical constitutive equations to close the problem. Several STH codes are utilized in academia, industry and by regulators, such as MARS-KS, SPACE, RELAP5, COBRA-TF, TRACE, and so on. Each system thermal hydraulic code consists of different sets of governing equations and correlations; however, the packages and sets of correlations of these codes have not yet been compared quantitatively. The wall heat transfer mode transition maps of SPACE and MARS-KS differ slightly for the transition from the wall nucleate heat transfer mode to the wall film heat transfer mode. Both codes have the same heat transfer packages and correlations in most regions except the wall film heat transfer mode. Most of the heat transfer coefficients calculated for the range of selected variables in SPACE are the same as those of MARS-KS. For wall temperatures between 500 K and 540 K, however, MARS-KS selects the wall film heat transfer mode and the Bromley correlation while SPACE selects the wall nucleate heat transfer mode and the Chen correlation, because the transition from nucleate boiling to film boiling occurs earlier in MARS-KS than in SPACE. A more detailed analysis of the heat transfer package and flow regime package will follow in the near future.
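
    The mode-map disagreement described above can be illustrated with a toy selector. The transition temperatures below are purely illustrative placeholders, not the actual criteria used by MARS-KS or SPACE:

    ```python
    def select_wall_heat_transfer_mode(t_wall_K, t_transition_K):
        """Toy wall heat transfer mode map: below the transition temperature
        use nucleate boiling (Chen correlation), above it use film boiling
        (Bromley correlation)."""
        if t_wall_K < t_transition_K:
            return ("wall nucleate boiling", "Chen")
        return ("wall film boiling", "Bromley")

    # An earlier transition reproduces the reported disagreement window:
    # for wall temperatures between the two thresholds, the two maps pick
    # different modes and hence different correlations.
    marsks_like = select_wall_heat_transfer_mode(520.0, t_transition_K=500.0)
    space_like = select_wall_heat_transfer_mode(520.0, t_transition_K=540.0)
    # marsks_like -> film boiling / Bromley; space_like -> nucleate / Chen
    ```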

  15. Licensing experience with SPINLINE digital I/C platform - 15099

    International Nuclear Information System (INIS)

    Jegou, H.; Duthou, A.; Bach, J.; Burzynski, M.

    2015-01-01

    Rolls-Royce recently received a safety evaluation report from the NRC for the SPINLINE 3 digital safety instrumentation and control platform. The main Rolls-Royce interest in the NRC review was approval of the fail-safe, fault-tolerance, self-monitoring, deterministic, and communication independence features of the platform. The SPINLINE 3 platform consists of a set of standardized, modular hardware and software components and associated development tools. Rolls-Royce used a set of EPRI guidance documents to successfully develop a commercial grade dedication case of the platform. It was important to describe the technical critical characteristics for performance and dependability in the documentation submitted to NRC. The NRC audit forum was an important opportunity to effectively communicate complex technical information about the SPINLINE 3 platform. The NRC review had five interesting focus areas that offer opportunities for lessons learned. The main lesson learned is to put the same emphasis on the review for communication effectiveness as is put on the review for technical completeness and accuracy

  16. Integration of the program TNXYZ in the platform SALOME; Integracion del programa TNXYZ en la plataforma SALOME

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro V, F. J.; Silva A, L.; Del Valle G, E. [IPN, Escuela Superior de Fisica y Matematicas, Av. Instituto Politecnico Nacional s/n, U.P. Adolfo Lopez Mateos, Edificio 9, Col. San Pedro Zacatenco, 07738 Mexico D. F. (Mexico); Gomez T, A. M.; Vargas E, S., E-mail: javier.paquito@hotmail.com [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    This work presents the procedure used to integrate the code TNXYZ as a processing tool into the graphical simulation platform SALOME. The code TNXYZ solves the steady-state neutron transport equation for several energy groups, discretizing the angular variable by the discrete ordinates method and the spatial variable by nodal methods. The SALOME platform is a graphical environment designed for the construction, editing and simulation of mechanical models aimed at industry; unlike other software, it allows external source codes to be integrated into the environment, forming a complete scheme of execution, supervision, and pre- and post-processing of information. The code TNXYZ was written in the 1990s in Fortran and had to be updated to the characteristics of current compilers; in addition, the original scheme was modularized, that is, the main program was divided into sections where the code carries out important operations, in order to make the extraction of data along its processing sequence more flexible, which may be useful in a later development of coupling. Finally, to verify the integration, a BWR fuel assembly and a control cell were modeled. The cross sections were obtained with the Serpent Monte Carlo code. Some results obtained with Serpent were used to verify the code and to begin its validation, and acceptable agreement was obtained in the infinite multiplication factor. The validation process will be extended and is planned to be presented in a future work. This work is part of the development of the research group formed between the Escuela Superior de Fisica y Matematicas of the Instituto Politecnico Nacional (IPN) and the Instituto Nacional de Investigaciones Nucleares (ININ), in which a Mexican simulation platform for nuclear reactors is being developed. (Author)

  17. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transition of WVSNs from theoretical research to practical application; therefore, it is necessary to study the construction of WVSN verification platforms. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, selects AODV as the routing protocol, and sets up a verification platform for WVSNs called AdvanWorks. Experiments show that AdvanWorks can successfully perform image acquisition, coding and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for follow-up applications of WVSNs.

  18. 3D shape measurement system developed on mobile platform

    Science.gov (United States)

    Wu, Zhoujie; Chang, Meng; Shi, Bowen; Zhang, Qican

    2017-02-01

    Three-dimensional (3-D) shape measurement technology based on structured light has become a hot research field, driven by increasing requirements. Many methods have been implemented and applied in industry, but most of the equipment is large, complex and not portable. Meanwhile, the popularity of smart mobile terminals, such as smartphones, provides a platform for the miniaturization and portability of this technology. A measurement system based on a phase-shift algorithm and Gray-code patterns was studied and developed under the Android platform on a mobile phone, and it has been encapsulated into a mobile phone application so that 3-D shape data can be reconstructed on the smartphone easily and quickly. The experimental results for two measured objects are given in this paper and demonstrate that the application we developed for the mobile platform is effective.

  19. Independent Mobility Achieved through a Wireless Brain-Machine Interface.

    Directory of Open Access Journals (Sweden)

    Camilo Libedinsky

    Full Text Available Individuals with tetraplegia lack independent mobility, making them highly dependent on others to move from one place to another. Here, we describe how two macaques were able to use a wireless integrated system to control a robotic platform, over which they were sitting, to achieve independent mobility using the neuronal activity in their motor cortices. The activity of populations of single neurons was recorded using multiple electrode arrays implanted in the arm region of primary motor cortex, and decoded to achieve brain control of the platform. We found that free-running brain control of the platform (which was not equipped with any machine intelligence) was fast and accurate, resembling the performance achieved using joystick control. The decoding algorithms can be trained in the absence of joystick movements, as would be required for use by tetraplegic individuals, demonstrating that the non-human primate model is a good pre-clinical model for developing such a cortically-controlled movement prosthetic. Interestingly, we found that the response properties of some neurons differed greatly depending on the mode of control (joystick or brain control), suggesting different roles for these neurons in encoding movement intention and movement execution. These results demonstrate that independent mobility can be achieved without first training on prescribed motor movements, opening the door for the implementation of this technology in persons with tetraplegia.

  20. Scoping review and evaluation of SMS/text messaging platforms for mHealth projects or clinical interventions.

    Science.gov (United States)

    Iribarren, Sarah J; Brown, William; Giguere, Rebecca; Stone, Patricia; Schnall, Rebecca; Staggers, Nancy; Carballo-Diéguez, Alex

    2017-05-01

    Mobile technology supporting text messaging interventions (TMIs) continues to evolve, presenting challenges for researchers and healthcare professionals who need to choose software solutions to best meet their program needs. The objective of this review was to systematically identify and compare text messaging platforms and to summarize their advantages and disadvantages as described in peer-reviewed literature. A scoping review was conducted using four steps: 1) identify currently available platforms through online searches and in mHealth repositories; 2) expand the evaluation criteria of an mHealth mobile messaging toolkit and integrate prior user experiences as researchers; 3) evaluate each platform's functions and features based on the expanded criteria and a vendor survey; and 4) assess the documentation of platform use in the peer-reviewed literature. Platforms meeting inclusion criteria were assessed independently by three reviewers and discussed until consensus was reached. The PRISMA guidelines were followed to report findings. Of the 1041 potentially relevant search results, 27 platforms met inclusion criteria. Most were excluded because they were not platforms (e.g., guides, toolkits, reports, or SMS gateways). Of the 27 platforms, only 12 were identified in existing mHealth repositories, 10 from Google searches, while five were found in both. The expanded evaluation criteria included 22 items. Results indicate no uniform presentation of platform features and functions, often making these difficult to discern. Fourteen of the platforms were reported as open source, 10 focused on health care, and 16 were tailored to meet the needs of low-resource settings (not mutually exclusive). Fifteen platforms had do-it-yourself setup (programming not required) while the remainder required coding/programming skills or setups that could be built to specification by the vendor. Frequently described features included data security and access to the platform via cloud-based systems.

  1. The Accurate Particle Tracer Code

    OpenAIRE

    Wang, Yulei; Liu, Jian; Qin, Hong; Yu, Zhi

    2016-01-01

    The Accurate Particle Tracer (APT) code is designed for large-scale particle simulations on dynamical systems. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and non-linear problems. Under the well-designed integrated and modularized framework, APT serves as a universal platform for researchers from different fields, such as plasma physics, accelerator physics, space science, fusio...

  2. A photon dominated region code comparison study

    NARCIS (Netherlands)

    Roellig, M.; Abel, N. P.; Bell, T.; Bensch, F.; Black, J.; Ferland, G. J.; Jonkheid, B.; Kamp, I.; Kaufman, M. J.; Le Bourlot, J.; Le Petit, F.; Meijerink, R.; Morata, O.; Ossenkopf, Volker; Roueff, E.; Shaw, G.; Spaans, M.; Sternberg, A.; Stutzki, J.; Thi, W.-F.; van Dishoeck, E. F.; van Hoof, P. A. M.; Viti, S.; Wolfire, M. G.

    Aims. We present a comparison between independent computer codes, modeling the physics and chemistry of interstellar photon dominated regions (PDRs). Our goal was to understand the mutual differences in the PDR codes and their effects on the physical and chemical structure of the model clouds, and

  3. Comparison of Learning Software Architecture by Developing Social Applications versus Games on the Android Platform

    Directory of Open Access Journals (Sweden)

    Bian Wu

    2012-01-01

    Full Text Available This paper describes an empirical study focused on discovering differences and similarities between students developing social applications and students developing games on the same Android development platform. In 2010-2011, students attending the software architecture course at the Norwegian University of Science and Technology (NTNU) could choose between four types of projects. Independently of the chosen type of project, all students had to go through the same phases, produce the same documents based on the same templates, and follow exactly the same process. This study focuses on one of the projects, the Android project, to see how much the application domain affects the course project independently of the chosen technology. Our results revealed some positive effects for the students doing game development compared to social application development when learning software architecture: higher motivation to work with games, a better focus on quality attributes such as modifiability and testability during development, production of software architectures of higher complexity, and more productive coding work on the project. However, we did not find significant differences in awarded grades between students choosing the two domains.

  4. CALTRANS: A parallel, deterministic, 3D neutronics code

    Energy Technology Data Exchange (ETDEWEB)

    Carson, L.; Ferguson, J.; Rogers, J.

    1994-04-01

    Our efforts to parallelize the deterministic solution of the neutron transport equation have culminated in a new neutronics code, CALTRANS, which has full 3D capability. In this article, we describe the layout and algorithms of CALTRANS and present performance measurements of the code on a variety of platforms. Explicit implementations of the parallel algorithms of CALTRANS, using both the function calls of the Parallel Virtual Machine software package (PVM 3.2) and the Meiko CS-2 tagged message passing library (based on the Intel NX/2 interface), are provided in appendices.

  5. RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations

    Science.gov (United States)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy

    RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross-platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node, and have improved performance and scalability, enhanced accuracy and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.

  6. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  7. Development of an IHE MRRT-compliant open-source web-based reporting platform

    Energy Technology Data Exchange (ETDEWEB)

    Pinto dos Santos, Daniel; Klos, G.; Kloeckner, R.; Oberle, R.; Dueber, C.; Mildenberger, P. [University Medical Center of the Johannes Gutenberg-University Mainz, Department of Diagnostic and Interventional Radiology, Mainz (Germany)

    2017-01-15

    To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. (orig.)

  8. Development of an IHE MRRT-compliant open-source web-based reporting platform

    International Nuclear Information System (INIS)

    Pinto dos Santos, Daniel; Klos, G.; Kloeckner, R.; Oberle, R.; Dueber, C.; Mildenberger, P.

    2017-01-01

    To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. (orig.)

  9. Energy and Data Throughput for Asymmetric Inter-Session Network Coding

    DEFF Research Database (Denmark)

    Paramanathan, Achuthan; Heide, Janus; Pahlevani, Peyman

    2012-01-01

    In this paper we investigate the impact of asymmetric traffic patterns on the energy consumption and throughput in a wireless multi-hop network. Network coding is a novel technique for communication systems and a viable solution for wireless multi-hop networks. State-of-the-art research is mainly ... on commercial platforms. The outcome of this paper confirms the analytical expression, and the results show that even with a large asymmetric data rate there is a gain in terms of energy consumption and throughput when network coding is applied, compared to the case when network coding is not applied.
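
    The basic inter-session gain can be seen in the classic two-flow relay example: instead of forwarding two packets separately, the relay broadcasts their XOR, and each receiver decodes using the packet it already holds, saving one transmission. A minimal sketch (illustrative, not the paper's protocol):

    ```python
    def xor_bytes(a: bytes, b: bytes) -> bytes:
        """Bitwise XOR of two equal-length byte strings."""
        return bytes(x ^ y for x, y in zip(a, b))

    p1 = b"ALICE"             # packet from session 1
    p2 = b"BOBBY"             # packet from session 2
    coded = xor_bytes(p1, p2)  # one broadcast instead of two unicasts

    # Each receiver XORs the coded packet with the packet it already has:
    recovered_p2 = xor_bytes(coded, p1)  # receiver holding p1 recovers p2
    recovered_p1 = xor_bytes(coded, p2)  # receiver holding p2 recovers p1
    ```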

  10. The formative platform of the Congress of Panama (1810-1826): the Pan-American conjecture revisited

    OpenAIRE

    De la Reza, Germán A.

    2013-01-01

    This article examines the formative platform of the Congress of Panama of 1826. It seeks to support the hypothesis that the nature and scope of the first test of integration in the Western Hemisphere depended critically on the platform created by Simón Bolívar and other Latin American independence heroes, from the Declaration of Independence of Venezuela in 1810 until the last bilateral agreement of 1826. In that respect, it corroborates the Latin American identity of the initiative.

  11. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  12. The OpenMC Monte Carlo particle transport code

    International Nuclear Information System (INIS)

    Romano, Paul K.; Forget, Benoit

    2013-01-01

    Highlights: ► An open source Monte Carlo particle transport code, OpenMC, has been developed. ► Solid geometry and continuous-energy physics allow high-fidelity simulations. ► Development has focused on high performance and modern I/O techniques. ► OpenMC is capable of scaling up to hundreds of thousands of processors. ► Results on a variety of benchmark problems agree with MCNP5. -- Abstract: A new Monte Carlo code called OpenMC is currently under development at the Massachusetts Institute of Technology as a tool for simulation on high-performance computing platforms. Given that many legacy codes do not scale well on existing and future parallel computer architectures, OpenMC has been developed from scratch with a focus on high performance scalable algorithms as well as modern software design practices. The present work describes the methods used in the OpenMC code and demonstrates the performance and accuracy of the code on a variety of problems.

  13. Evaluation of data discretization methods to derive platform independent isoform expression signatures for multi-class tumor subtyping.

    Science.gov (United States)

    Jung, Segun; Bi, Yingtao; Davuluri, Ramana V

    2015-01-01

    Many supervised learning algorithms have been applied in deriving gene signatures for patient stratification from gene expression data. However, transferring multi-gene signatures from one analytical platform to another without loss of classification accuracy is a major challenge. Here, we compared three unsupervised data discretization methods--Equal-width binning, Equal-frequency binning, and k-means clustering--in accurately classifying the four known subtypes of glioblastoma multiforme (GBM) when the classification algorithms were trained on the isoform-level gene expression profiles from the exon-array platform and tested on the corresponding profiles from RNA-seq data. We applied an integrated machine learning framework that involves three sequential steps: feature selection, data discretization, and classification. For models trained and tested on exon-array data, the addition of a data discretization step led to robust and accurate predictive models with fewer variables in the final models. For models trained on exon-array data and tested on RNA-seq data, the addition of a data discretization step dramatically improved the classification accuracies, with Equal-frequency binning showing the highest improvement, reaching more than 90% accuracy for all the models with features chosen by Random Forest based feature selection. Overall, an SVM classifier coupled with Equal-frequency binning achieved the best accuracy (> 95%); without data discretization, only 73.6% accuracy was achieved at most. The classification algorithms, trained and tested on data from the same platform, yielded similar accuracies in predicting the four GBM subgroups. However, when dealing with cross-platform data, from exon-array to RNA-seq, the classifiers yielded stable models with the highest classification accuracies on data transformed by Equal-frequency binning. The approach presented here is generally applicable to other cancer types for classification and identification of
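
    The two binning schemes compared above differ only in how the bin edges are chosen: equal-width splits the value range evenly, while equal-frequency places edges at quantiles so each bin holds roughly the same number of samples. A minimal NumPy sketch (illustrative, not the study's pipeline):

    ```python
    import numpy as np

    def equal_width_bins(values, k):
        """Labels 0..k-1 from k bins spanning equal value ranges."""
        edges = np.linspace(values.min(), values.max(), k + 1)
        return np.digitize(values, edges[1:-1])  # interior edges only

    def equal_frequency_bins(values, k):
        """Labels 0..k-1 from k bins holding roughly equal sample counts."""
        edges = np.quantile(values, np.linspace(0.0, 1.0, k + 1))
        return np.digitize(values, edges[1:-1])

    expr = np.array([0.1, 0.2, 0.3, 0.4, 5.0, 9.0])  # skewed toy "expression" values
    w = equal_width_bins(expr, 3)      # outliers dominate the widths: [0 0 0 0 1 2]
    f = equal_frequency_bins(expr, 3)  # balanced occupancy:            [0 0 1 1 2 2]
    ```

    One plausible reason for the cross-platform robustness of equal-frequency binning is that quantile-based edges depend only on ranks, not on platform-specific measurement scales.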

  14. Application of genotyping-by-sequencing on semiconductor sequencing platforms: a comparison of genetic and reference-based marker ordering in barley.

    Directory of Open Access Journals (Sweden)

    Martin Mascher

    Full Text Available The rapid development of next-generation sequencing platforms has enabled the use of sequencing for routine genotyping across a range of genetics studies and breeding applications. Genotyping-by-sequencing (GBS), a low-cost, reduced representation sequencing method, is becoming a common approach for whole-genome marker profiling in many species. With quickly developing sequencing technologies, adapting current GBS methodologies to new platforms will leverage these advancements for future studies. To test new semiconductor sequencing platforms for GBS, we genotyped a barley recombinant inbred line (RIL) population. Based on a previous GBS approach, we designed barcode and adapter sets for the Ion Torrent platforms. Four sets of 24-plex libraries, consisting of 94 RILs and the two parents, were constructed and sequenced on two Ion platforms. In parallel, a 96-plex library of the same RILs was sequenced on the Illumina HiSeq 2000. We applied two different computational pipelines to analyze the sequencing data: the reference-independent TASSEL pipeline and a reference-based pipeline using SAMtools. Sequence contigs positioned on the integrated physical and genetic map were used for read mapping and variant calling. We found high agreement in genotype calls between the different platforms and high concordance between genetic and reference-based marker order. There was, however, a paucity of SNPs jointly discovered by the different pipelines, indicating a strong effect of alignment and filtering parameters on SNP discovery. We show the utility of the current barley genome assembly as a framework for developing very low-cost genetic maps, facilitating high resolution genetic mapping and negating the need for developing de novo genetic maps for future studies in barley.
Through demonstration of GBS on semiconductor sequencing platforms, we conclude that the GBS approach is amenable to a range of platforms and can easily be modified as new
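The cross-platform agreement in genotype calls reported above can be quantified with a simple concordance measure over the markers shared by both platforms. The sketch below uses invented marker identifiers and genotype strings, not the barley dataset:

```python
def concordance(calls_a, calls_b):
    """Fraction of shared markers with identical genotype calls.

    calls_a/calls_b map marker id -> genotype string; markers missing on
    either platform are ignored. Names and data are illustrative only.
    """
    shared = set(calls_a) & set(calls_b)
    if not shared:
        return 0.0
    agree = sum(1 for m in shared if calls_a[m] == calls_b[m])
    return agree / len(shared)

ion_torrent = {"m1": "AA", "m2": "AB", "m3": "BB", "m4": "AA"}
hiseq       = {"m1": "AA", "m2": "AB", "m3": "AB", "m5": "BB"}
# Three shared markers, two agreeing calls:
assert abs(concordance(ion_torrent, hiseq) - 2 / 3) < 1e-9
```

Restricting the comparison to jointly discovered markers is exactly where the pipeline effect noted above bites: a small shared SNP set makes the concordance estimate noisier.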

  15. Integration of DYN3D inside the NURESIM platform

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Sanchez E, V. H.; Kliem, S.; Gommlich, A.; Rohde, U.

    2010-10-01

    The NURISP project (NUclear Reactor Integrated Simulation Project) is focused on the further development of the European Nuclear Reactor Simulation (NURESIM) platform for advanced numerical reactor design and safety analysis tools. NURESIM is based on an open source platform - called SALOME - that offers flexible and powerful capabilities for pre- and post-processing as well as for coupling of multi-physics and multi-scale solutions. The developments within the NURISP project are concentrated in the areas of reactor physics, thermal hydraulics, multi-physics, and sensitivity and uncertainty methodologies. The aim is to develop experimentally validated advanced simulation tools including capabilities for uncertainty and sensitivity quantification. A unique feature of NURESIM is the flexibility in selecting the solvers for the area of interest and the interpolation and mapping schemes according to the problem under consideration. Subproject 3 (SP3) of NURISP is focused on the development of multi-physics methodologies at different scales and covering different physical fields (neutronics, thermal hydraulics and pin mechanics). One of the objectives of SP3 is the development of multi-physics methodologies beyond the state-of-the-art for improved prediction of local safety margins and design at pin-by-pin scale. The Karlsruhe Institute of Technology and the Research Center Dresden-Rossendorf are involved in the integration of the reactor dynamics code DYN3D into the SALOME platform for coupling with a thermal hydraulic sub-channel code (FLICA4) at fuel assembly and pin level. In this paper, the main capabilities of the SALOME platform, the steps of the integration process of DYN3D, as well as selected preliminary results obtained for the DYN3D/FLICA4 coupling are presented and discussed. Finally, the next steps for the validation of the coupling scheme on a fuel assembly and pin basis are given. (Author)
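The kind of code coupling a platform like SALOME orchestrates can be illustrated by a generic fixed-point (Picard) iteration between two single-physics solvers that exchange fields until they stop changing. The toy linear feedback model below stands in for a neutronics/thermal-hydraulics pair; it is not the DYN3D/FLICA4 implementation:

```python
def couple(neutronics, thermo, t0, tol=1e-8, max_iter=100):
    """Fixed-point (Picard) iteration between two single-physics solvers.

    `neutronics(T)` returns a power for a given temperature, and
    `thermo(P)` returns the temperature produced by that power; the loop
    exchanges the two fields until the temperature converges. A generic
    sketch of multi-physics coupling, not any specific platform's code.
    """
    temperature = t0
    for _ in range(max_iter):
        power = neutronics(temperature)
        new_temperature = thermo(power)
        if abs(new_temperature - temperature) < tol:
            return power, new_temperature
        temperature = new_temperature
    raise RuntimeError("coupling iteration did not converge")

# Toy feedback: power falls as temperature rises; temperature rises with power.
power_of = lambda temp: 100.0 - 0.1 * temp
temp_of = lambda power: 300.0 + 0.5 * power
p, t = couple(power_of, temp_of, t0=300.0)
assert abs(t - (300.0 + 0.5 * p)) < 1e-6   # fields are mutually consistent
assert abs(p - (100.0 - 0.1 * t)) < 1e-6
```

In a real coupling, the scalar exchange above becomes the interpolation and mapping of whole fields between the neutronics and sub-channel meshes, which is precisely the service the platform provides.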

  16. NORTICA - a new code for cyclotron analysis

    International Nuclear Information System (INIS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-01-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is assisting in setting and tuning the cyclotrons taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station with the UNIX operating system and X-Windows graphical interface. A multiple programming language approach was used in order to combine the reliability of the numerical algorithms developed over a long period of time in the laboratory and the friendliness of a modern style user interface. This paper describes the capability and features of the codes in their present state.

  17. NORTICA—a new code for cyclotron analysis

    Science.gov (United States)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1] developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is assisting in setting and tuning the cyclotrons taking into account the main field and extraction channel imperfections. The computer platform for the package is Alpha Station with UNIX operating system and X-Windows graphic interface. A multiple programming language approach was used in order to combine the reliability of the numerical algorithms developed over the long period of time in the laboratory and the friendliness of modern style user interface. This paper describes the capability and features of the codes in the present state.

  18. The Transcriptome Analysis and Comparison Explorer--T-ACE: a platform-independent, graphical tool to process large RNAseq datasets of non-model organisms.

    Science.gov (United States)

    Philipp, E E R; Kraemer, L; Mountfort, D; Schilhabel, M; Schreiber, S; Rosenstiel, P

    2012-03-15

    Next generation sequencing (NGS) technologies allow a rapid and cost-effective compilation of large RNA sequence datasets in model and non-model organisms. However, the storage and analysis of transcriptome information from different NGS platforms is still a significant bottleneck, leading to a delay in data dissemination and subsequent biological understanding. In particular, database interfaces with transcriptome analysis modules that go beyond mere read counts are missing. Here, we present the Transcriptome Analysis and Comparison Explorer (T-ACE), a tool designed for the organization and analysis of large sequence datasets, and especially suited for transcriptome projects of non-model organisms with little or no a priori sequence information. T-ACE offers a TCL-based interface, which accesses a PostgreSQL database via a PHP script. Within T-ACE, information belonging to single sequences or contigs, such as annotation or read coverage, is linked to the respective sequence and immediately accessible. Sequences and assigned information can be searched via keyword or BLAST search. Additionally, T-ACE provides within- and between-transcriptome analysis modules on the level of expression, GO terms, KEGG pathways and protein domains. Results are visualized and can be easily exported for external analysis. We developed T-ACE for laboratory environments, which have only a limited amount of bioinformatics support, and for collaborative projects in which different partners work on the same dataset from different locations or platforms (Windows/Linux/MacOS). For laboratories with some experience in bioinformatics and programming, the low complexity of the database structure and open-source code provides a framework that can be customized according to the different needs of the user and transcriptome project.

  19. The VOLNA-OP2 Tsunami Code (Version 1.0)

    KAUST Repository

    Reguly, Istvan Z.

    2018-03-08

    In this paper, we present the VOLNA-OP2 tsunami model and implementation: a finite volume non-linear shallow water equations (NSWE) solver built on the OP2 domain specific language for unstructured mesh computations. VOLNA-OP2 is unique among tsunami solvers in its support for several high performance computing platforms: CPUs, the Intel Xeon Phi, and GPUs. This is achieved by keeping the scientific code separate from the various parallel implementations, enabling easy maintainability. It has already been used in production for several years; here we discuss how it can be integrated into various workflows, such as a statistical emulator. The scalability of the code is demonstrated on three supercomputers, built with classical Xeon CPUs, the Intel Xeon Phi, and NVIDIA P100 GPUs. VOLNA-OP2 shows an ability to deliver productivity to its users, as well as performance and portability on a number of platforms.

  20. The VOLNA-OP2 Tsunami Code (Version 1.0)

    KAUST Repository

    Reguly, Istvan Z.; Gopinathan, Devaraj; Beck, Joakim H.; Giles, Michael B.; Guillas, Serge; Dias, Frederic

    2018-01-01

    In this paper, we present the VOLNA-OP2 tsunami model and implementation: a finite volume non-linear shallow water equations (NSWE) solver built on the OP2 domain specific language for unstructured mesh computations. VOLNA-OP2 is unique among tsunami solvers in its support for several high performance computing platforms: CPUs, the Intel Xeon Phi, and GPUs. This is achieved by keeping the scientific code separate from the various parallel implementations, enabling easy maintainability. It has already been used in production for several years; here we discuss how it can be integrated into various workflows, such as a statistical emulator. The scalability of the code is demonstrated on three supercomputers, built with classical Xeon CPUs, the Intel Xeon Phi, and NVIDIA P100 GPUs. VOLNA-OP2 shows an ability to deliver productivity to its users, as well as performance and portability on a number of platforms.

  1. QUIL: a chemical equilibrium code

    International Nuclear Information System (INIS)

    Lunsford, J.L.

    1977-02-01

    A chemical equilibrium code QUIL is described, along with two support codes FENG and SURF. QUIL is designed to allow calculations on a wide range of chemical environments, which may include surface phases. QUIL was written specifically to calculate distributions associated with complex equilibria involving fission products in the primary coolant loop of the high-temperature gas-cooled reactor. QUIL depends upon an energy-data library called ELIB. This library is maintained by FENG and SURF. FENG enters into the library all reactions having standard free energies of reaction that are independent of concentration. SURF enters all surface reactions into ELIB. All three codes are interactive codes written to be used from a remote terminal, with paging control provided. Plotted output is also available.

  2. The formative platform of the Congress of Panama (1810-1826): the Pan-American conjecture revisited

    Directory of Open Access Journals (Sweden)

    Germán A. De la Reza

    2013-01-01

    Full Text Available This article examines the formative platform of the Congress of Panama of 1826. It seeks to support the hypothesis that the nature and scope of the first test of integration in the Western Hemisphere depended critically on the platform created by Simón Bolívar and other Latin American independence heroes, from the Declaration of Independence of Venezuela in 1810 until the last bilateral agreement of 1826. In that respect, it corroborates the Latin American identity of the initiative.

  3. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV and, as such, is nearly machine-independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)
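A single-parameter criticality search of the kind described above can be sketched as a bisection on the parameter until the multiplication factor reaches unity. The boron-worth model below is a made-up illustration, not PANDA's physics or code:

```python
def seek_criticality(k_eff, lo, hi, tol=1e-10):
    """Vary one parameter by bisection until k_eff(parameter) = 1.

    Assumes k_eff is monotonic on [lo, hi] and that the interval brackets
    criticality. A sketch of the search idea, not PANDA's implementation.
    """
    f_lo = k_eff(lo) - 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if (k_eff(mid) - 1.0) * f_lo > 0:
            lo, f_lo = mid, k_eff(mid) - 1.0
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy model: multiplication factor falls linearly as boron concentration rises.
k_eff = lambda boron_ppm: 1.2 - 2e-4 * boron_ppm
critical_boron = seek_criticality(k_eff, 0.0, 2000.0)
assert abs(k_eff(critical_boron) - 1.0) < 1e-6
```

The same loop serves the other search modes (maximize reactivity, minimize peaking) by swapping the objective evaluated at each trial point.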

  4. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to program than conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand challenge computational task: many of the largest computers in the world are tasked with this field, and the requirements of resolution and complexity of scientists in this field are far from being sated. However, single thread performance has stalled, even sometimes decreased, over the last decade, and has been replaced by ever more parallel systems: both as conventional multicore CPUs and in the emerging world of accelerators. At the same time, the need of scientists to couple ever-more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations

  5. Application Security for the Android Platform Processes, Permissions, and Other Safeguards

    CERN Document Server

    Six, Jeff

    2011-01-01

    This book will educate readers on the need for application security and secure coding practices when designing any app. No prior knowledge of security or secure programming techniques is assumed. The book will discuss the need for such practices, how the Android environment is structured with respect to security considerations, what services and techniques are available on the platform to protect data, and how developers can build and code applications that address the risk to their applications and the data processed by them. This text is especially important now, as Android is fast becoming

  6. Design of a Distributed Food Traceability Platform and Its Application in Food Traceability at Guangdong Province

    Directory of Open Access Journals (Sweden)

    Luo Haibiao

    2017-01-01

    Full Text Available Food traceability is an important measure to secure food safety. This paper designed a food traceability platform based on a distributed framework and implemented it in Guangdong province. The platform can provide traceability, production and management services for food enterprises, provide forward and backward traceability over the whole cycle of food production and circulation, and provide various methods of food traceability for the public. One characteristic of the platform is that it opens up the data flow among production, circulation and supervising departments, and builds a unified commodity circulation data pool. Based on the flow data pool, not only the production and circulation information of a food product can be traced, but also its inspection and quarantine information. Another characteristic of the platform is that its database and data interface were developed based on the food electronic traceability standards formulated by the National Food and Drug Administration. Its interface standardization and compatibility with other food traceability platforms can thus be guaranteed. The platform is running in Guangdong province for key supervised products: infant formula foods (including milk powder, rice flour, farina, etc.), edible oil and liquor. The public can use the Guangdong food traceability portal, mobile app, WeChat or the self-service terminals in supermarkets to trace food products by scanning or entering the traceability code or product code and to verify its authenticity. It will help to promote consumer confidence in food safety.
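Forward and backward traceability over a shared circulation data pool can be sketched as reachability in a directed graph of batch flows. The data model and the codes below are hypothetical illustrations, not the platform's actual schema:

```python
# Hypothetical circulation records: (source batch/site, destination).
flows = [
    ("farm-01", "dairy-07"),   # raw milk -> processor
    ("dairy-07", "dc-03"),     # milk powder batch -> distribution centre
    ("dc-03", "store-21"),     # -> retailers
    ("dc-03", "store-35"),
]

def trace(code, forward=True):
    """Return every node reachable from `code` along (or against) the flows."""
    step = {}
    for src, dst in flows:
        a, b = (src, dst) if forward else (dst, src)
        step.setdefault(a, []).append(b)
    seen, stack = set(), [code]
    while stack:
        for nxt in step.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Forward trace answers "where did this batch go?";
# backward trace answers "where did this product come from?".
assert trace("dairy-07", forward=True) == {"dc-03", "store-21", "store-35"}
assert trace("store-21", forward=False) == {"dc-03", "dairy-07", "farm-01"}
```

Attaching inspection and quarantine records to the same nodes is what lets one query surface both circulation history and safety status.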

  7. REVEAL: Software Documentation and Platform Migration

    Science.gov (United States)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  8. The design and verification of probabilistic safety analysis platform NFRisk

    International Nuclear Information System (INIS)

    Hu Wenjun; Song Wei; Ren Lixia; Qian Hongtao

    2010-01-01

    To increase the technical capability in the Probabilistic Safety Analysis (PSA) field in China, it is necessary and important to study and develop an indigenous professional PSA platform. Following the principle of 'from structure simplification to modulization to production of cut sets to minimization of cut sets', the algorithms, including the simplification algorithm, the modulization algorithm, the algorithm for conversion from fault tree to binary decision diagram (BDD), the cut set solving algorithm, the cut set minimization algorithm, and so on, were designed and developed independently; the design of the data management and operation platform was completed in-house; and the verification and validation of the NFRisk platform, based on 3 typical fault trees, was carried out independently. (authors)
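The cut set production and minimization steps named above belong to a well-known algorithm family (MOCUS-style top-down expansion). The sketch below expands a small fault tree and keeps only minimal cut sets; it is an illustration of the algorithm family, not NFRisk's BDD-based implementation, and the tree is invented:

```python
from itertools import product

def cut_sets(gate, tree):
    """Expand a fault tree into cut sets, then keep only the minimal ones.

    `tree` maps a gate name to ("AND"|"OR", [children]); any name not in
    `tree` is a basic event. MOCUS-style sketch for illustration only.
    """
    if gate not in tree:
        return [frozenset([gate])]
    op, children = tree[gate]
    child_sets = [cut_sets(c, tree) for c in children]
    if op == "OR":
        sets = [s for cs in child_sets for s in cs]
    else:  # AND: union of one cut set chosen from each child
        sets = [frozenset().union(*combo) for combo in product(*child_sets)]
    # Minimization: drop any cut set that is a proper superset of another.
    return [s for s in sets if not any(o < s for o in sets)]

tree = {
    "TOP": ("OR", ["G1", "e3"]),       # top event fails if G1 or e3 occurs
    "G1": ("AND", ["e1", "e2"]),       # G1 needs both basic events
}
assert sorted(cut_sets("TOP", tree), key=len) == [frozenset({"e3"}),
                                                  frozenset({"e1", "e2"})]
```

BDD-based solvers reach the same minimal cut sets far more efficiently on large trees, which is why the platform converts fault trees to BDDs first.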

  9. Web Services for Telegeriatric and Independent Living of the Elderly ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... models. The platform design follows a patient centric philosophy along with the ... aging population in the World). ... independent living environment for older people at home ...... impact scope. .... Configuring a Trusted Cloud.

  10. A not-so-short description of the PERFECT platform

    International Nuclear Information System (INIS)

    Bugat, S.; Zeghadi, A.; Adjanor, G.

    2010-01-01

    This article describes the building of the so-called 'PERFECT platform', whose main purpose was to enable the development of the PERFECT end-products dedicated to the prediction of the degradation of material properties due to irradiation. First, the general principles used to build the platform are detailed. These principles guided the choices of preferred development language, architecture, and operating system. The architecture of the platform is then described. It allows easy development of the end-products, and a 'black-box' integration of the codes developed during the project. Each end-product can be seen as a sequence of modules, each module representing a physical phenomenon in time and space. The platform is very flexible, so that different methodologies can be tested and compared inside an end-product. The second part is devoted to the description of a classical PERFECT study, defined using the graphical user interface developed in the project. Particular focus is placed on how modules are selected, how input data can be entered, and how the study execution is fully controlled by the user. Finally, the post-processing facilities for the results are described.

  11. A GPU code for analytic continuation through a sampling method

    Directory of Open Access Journals (Sweden)

    Johan Nordström

    2016-01-01

    Full Text Available We here present a code for performing analytic continuation of fermionic Green’s functions and self-energies as well as bosonic susceptibilities on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVIDIA. Detailed scaling tests are presented, for two different GPUs, in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.

  12. microRNA dependent and independent deregulation of long non-coding RNAs by an oncogenic herpesvirus.

    Directory of Open Access Journals (Sweden)

    Sunantha Sethuraman

    2017-07-01

    Full Text Available Kaposi's sarcoma (KS is a highly prevalent cancer in AIDS patients, especially in sub-Saharan Africa. Kaposi's sarcoma-associated herpesvirus (KSHV is the etiological agent of KS and other cancers like Primary Effusion Lymphoma (PEL. In KS and PEL, all tumors harbor latent KSHV episomes and express latency-associated viral proteins and microRNAs (miRNAs. The exact molecular mechanisms by which latent KSHV drives tumorigenesis are not completely understood. Recent developments have highlighted the importance of aberrant long non-coding RNA (lncRNA expression in cancer. Deregulation of lncRNAs by miRNAs is a newly described phenomenon. We hypothesized that KSHV-encoded miRNAs deregulate human lncRNAs to drive tumorigenesis. We performed lncRNA expression profiling of endothelial cells infected with wt and miRNA-deleted KSHV and identified 126 lncRNAs as putative viral miRNA targets. Here we show that KSHV deregulates host lncRNAs in both a miRNA-dependent fashion by direct interaction and in a miRNA-independent fashion through latency-associated proteins. Several lncRNAs that were previously implicated in cancer, including MEG3, ANRIL and UCA1, are deregulated by KSHV. Our results also demonstrate that KSHV-mediated UCA1 deregulation contributes to increased proliferation and migration of endothelial cells.

  13. RunJumpCode: An Educational Game for Educating Programming

    Science.gov (United States)

    Hinds, Matthew; Baghaei, Nilufar; Ragon, Pedrito; Lambert, Jonathon; Rajakaruna, Tharindu; Houghton, Travers; Dacey, Simon

    2017-01-01

    Programming promotes critical thinking, problem solving and analytic skills through creating solutions that can solve everyday problems. However, learning programming can be a daunting experience for a lot of students. "RunJumpCode" is an educational 2D platformer video game, designed and developed in Unity, to teach players the…

  14. A portable virtual machine target for proof-carrying code

    DEFF Research Database (Denmark)

    Franz, Michael; Chandra, Deepak; Gal, Andreas

    2005-01-01

    Virtual Machines (VMs) and Proof-Carrying Code (PCC) are two techniques that have been used independently to provide safety for (mobile) code. Existing virtual machines, such as the Java VM, have several drawbacks: First, the effort required for safety verification is considerable. Second and more… simultaneously providing efficient just-in-time compilation and target-machine independence. In particular, our approach reduces the complexity of the required proofs, resulting in fewer proof obligations that need to be discharged at the target machine.

  15. Energy and Power Measurements for Network Coding in the Context of Green Mobile Clouds

    DEFF Research Database (Denmark)

    Paramanathan, Achuthan; Pedersen, Morten Videbæk; Roetter, Daniel Enrique Lucani

    2013-01-01

    This paper presents an in-depth power and energy measurement campaign for inter- and intra-session network coding enabled communication in mobile clouds. The measurements are carried out on different commercial platforms with focus on routers and mobile phones with different CPU capabilities. Our results for inter-session network coding in Open-Mesh routers underline that the energy invested in performing network coding pays off by dramatically reducing the total energy for the transmission of data over wireless links. We also show measurements for intra-session network coding in three different…
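The energy saving of inter-session network coding comes from replacing two relay transmissions with a single coded packet. A minimal XOR sketch of the classic two-flow relay case follows; it illustrates the coding principle only and is not the paper's measurement code:

```python
def xor_packets(a, b):
    """Combine two equal-length packets into one coded packet."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two sources each send a packet through a relay. Instead of forwarding
# both packets, the relay broadcasts one coded packet, saving one
# transmission and the energy it would cost.
p1, p2 = b"hello", b"world"
coded = xor_packets(p1, p2)

# Each receiver already overheard one original and recovers the other:
assert xor_packets(coded, p1) == p2
assert xor_packets(coded, p2) == p1
```

The measurement question the paper addresses is whether the CPU energy spent computing such combinations (far more elaborate in practice) is outweighed by the radio energy saved, on each class of device.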

  16. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.

  17. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), the parallel active column solver and the substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab
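The distinction between the two categories can be made concrete. For first-category codes the individual runs are independent (e.g. one run per harmonic frequency), so a plain parallel map over cases suffices and no module of the solver itself needs parallelising. The sketch below uses a toy response function, not FAIR or TABS:

```python
from concurrent.futures import ThreadPoolExecutor

def analyse(load_case):
    """Stand-in for one independent FEM run, e.g. one harmonic frequency.

    The response formula is a toy illustration, not a structural model.
    """
    frequency, amplitude = load_case
    return amplitude / (1.0 + frequency ** 2)

# First-category parallelism: independent cases mapped across workers.
cases = [(f, 10.0) for f in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    responses = list(pool.map(analyse, cases))

# The parallel map reproduces the serial result exactly.
assert responses == [analyse(c) for c in cases]
```

Second-category codes cannot be treated this way because a single run's equation solution spans the whole mesh; that is where schemes like domain decomposition enter.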

  18. Towards Automatic Learning of Heuristics for Mechanical Transformations of Procedural Code

    Directory of Open Access Journals (Sweden)

    Guillermo Vigueras

    2017-01-01

    Full Text Available The current trends in next-generation exascale systems go towards integrating a wide range of specialized (co-)processors into traditional supercomputers. Due to the efficiency of heterogeneous systems in terms of Watts and FLOPS per surface unit, opening the access of heterogeneous platforms to a wider range of users is an important problem to be tackled. However, heterogeneous platforms limit the portability of the applications and increase development complexity due to the programming skills required. Program transformation can help make programming heterogeneous systems easier by defining a step-wise transformation process that translates a given initial code into a semantically equivalent final code adapted to a specific platform. Program transformation systems require the definition of efficient transformation strategies to tackle the combinatorial problem that emerges due to the large set of transformations applicable at each step of the process. In this paper we propose a machine learning-based approach to learn heuristics to define program transformation strategies. Our approach proposes a novel combination of reinforcement learning and classification methods to efficiently tackle the problems inherent to this type of systems. Preliminary results demonstrate the suitability of this approach.
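The reinforcement-learning ingredient of such an approach can be sketched with tabular Q-learning over a toy transformation space: states are versions of a program, actions are mechanical transformations, and reaching the platform-ready version pays a reward. The state and action names below are invented for illustration, and the classification half of the proposed combination is omitted:

```python
import random

def learn_strategy(transitions, goal, episodes=600, alpha=0.5, gamma=0.9, seed=0):
    """Tabular Q-learning over a deterministic transformation space.

    `transitions` maps state -> {action: next_state}; reaching `goal`
    yields reward 1. A minimal sketch, not the paper's system.
    """
    rng = random.Random(seed)
    q = {}
    for _ in range(episodes):
        state = "initial"
        for _ in range(20):  # cap episode length
            action = rng.choice(sorted(transitions[state]))
            nxt = transitions[state][action]
            reward = 1.0 if nxt == goal else 0.0
            future = max((q.get((nxt, a), 0.0) for a in transitions.get(nxt, {})),
                         default=0.0)
            key = (state, action)
            q[key] = (1 - alpha) * q.get(key, 0.0) + alpha * (reward + gamma * future)
            if nxt == goal:
                break
            state = nxt
    return q

# Invented space: inlining then GPU mapping reaches the goal; unrolling
# here leads to a slow version that must be undone.
transitions = {
    "initial": {"inline": "flat", "unroll": "slow"},
    "flat": {"map-to-gpu": "ready", "unroll": "slow"},
    "slow": {"undo": "initial"},
}
q = learn_strategy(transitions, goal="ready")
best = lambda s: max(transitions[s], key=lambda a: q.get((s, a), 0.0))
assert best("initial") == "inline" and best("flat") == "map-to-gpu"
```

The learned greedy policy is exactly a "transformation strategy": at each step it names the single transformation worth applying, pruning the combinatorial search the abstract describes.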

  19. User's and Programmer's Guide for HPC Platforms in CIEMAT; Guia de Utilizacion y programacion de las Plataformas de Calculo del CIEMAT

    Energy Technology Data Exchange (ETDEWEB)

    Munoz Roldan, A.

    2003-07-01

    This Technical Report presents a description of the High Performance Computing platforms available to researchers in CIEMAT and dedicated mainly to scientific computing. It is targeted at users and programmers, and tries to help in the processes of developing new code and porting code across platforms. A brief review is also presented of the historical evolution of the field of HPC, i.e., the programming paradigms and underlying architectures. (Author) 32 refs.

  20. Design of batch audio/video conversion platform based on JavaEE

    Science.gov (United States)

    Cui, Yansong; Jiang, Lianpin

    2018-03-01

    With the rapid development of the digital publishing industry, audio/video publishing is characterized by a diversity of coding standards for audio and video files, massive data volumes and other significant features. Faced with massive and diverse data, converting quickly and efficiently to a unified coding format has posed great difficulties for digital publishing organizations. In view of this demand and the present situation, this paper proposes a distributed online audio and video format conversion platform with a B/S structure, based on the Spring+SpringMVC+Mybatis development architecture and combined with the open-source FFmpeg format conversion tool. Based on the Java language, the key technologies and strategies in the design of the platform architecture are analyzed, and an efficient audio and video format conversion system is designed and developed, composed of a front display system, a core scheduling server and a conversion server. The test results show that, compared with an ordinary audio and video conversion scheme, the batch audio and video format conversion platform can effectively improve the conversion efficiency of audio and video files and reduce the complexity of the work. Practice has proved that the key technology discussed in this paper can be applied to large-batch file processing, and has practical application value.
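
    The conversion server's core step, handing a queued file to FFmpeg, can be sketched as follows (shown in Python for brevity rather than the platform's Java; the codec choices are illustrative, while -y, -i, -c:v and -c:a are standard FFmpeg options):

```python
import subprocess

def build_ffmpeg_cmd(src, dst, vcodec="libx264", acodec="aac"):
    """Assemble a conversion command for one queued file.
    Codec defaults are illustrative, not the platform's actual targets."""
    return ["ffmpeg", "-y", "-i", src, "-c:v", vcodec, "-c:a", acodec, dst]

def convert(src, dst):
    # A conversion-server worker would run one of these per queued file,
    # reporting success/failure back to the core scheduling server.
    return subprocess.run(build_ffmpeg_cmd(src, dst), check=True)
```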

  1. Psynteract: A flexible, cross-platform, open framework for interactive experiments.

    Science.gov (United States)

    Henninger, Felix; Kieslich, Pascal J; Hilbig, Benjamin E

    2017-10-01

    We introduce a novel platform for interactive studies, that is, any form of study in which participants' experiences depend not only on their own responses, but also on those of other participants who complete the same study in parallel, for example a prisoner's dilemma or an ultimatum game. The software thus especially serves the rapidly growing field of strategic interaction research within psychology and behavioral economics. In contrast to all available software packages, our platform does not handle stimulus display and response collection itself. Instead, we provide a mechanism to extend existing experimental software to incorporate interactive functionality. This approach allows us to draw upon the capabilities already available, such as accuracy of temporal measurement, integration with auxiliary hardware such as eye-trackers or (neuro-)physiological apparatus, and recent advances in experimental software, for example capturing response dynamics through mouse-tracking. Through integration with OpenSesame, an open-source graphical experiment builder, studies can be assembled via a drag-and-drop interface requiring little or no further programming skills. In addition, by using the same communication mechanism across software packages, we also enable interoperability between systems. Our source code, which provides support for all major operating systems and several popular experimental packages, can be freely used and distributed under an open source license. The communication protocols underlying its functionality are also well documented and easily adapted to further platforms. Code and documentation are available at https://github.com/psynteract/.

  2. Opportunistic Adaptive Transmission for Network Coding Using Nonbinary LDPC Codes

    Directory of Open Access Journals (Sweden)

    Cocco Giuseppe

    2010-01-01

    Full Text Available Network coding makes it possible to exploit the spatial diversity naturally present in mobile wireless networks and can be seen as an example of cooperative communication at the link layer and above. Such a promising technique needs to rely on a suitable physical layer in order to achieve its best performance. In this paper, we present an opportunistic packet scheduling method based on physical layer considerations. We extend the channel adaptation proposed for the broadcast phase of asymmetric two-way bidirectional relaying to a generic number of sinks and apply it to a network context. The method consists of adapting the information rate for each receiving node according to its channel status, independently of the other nodes. In this way, a higher network throughput can be achieved at the expense of a slightly higher complexity at the transmitter. This configuration allows rate adaptation to be performed while fully preserving the benefits of channel and network coding. We carry out an information-theoretical analysis of this approach and of that typically used in network coding. Numerical results based on nonbinary LDPC codes confirm the effectiveness of our approach with respect to previously proposed opportunistic scheduling techniques.
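
    The rate-adaptation idea can be illustrated with Shannon capacities (illustrative SNR values; the paper's analysis works with nonbinary LDPC codes over actual channel realizations): with a single common broadcast rate, every sink is limited by the worst channel, whereas per-sink adaptation uses each channel fully.

```python
import math

def capacities(snrs_db):
    """AWGN Shannon capacity, bits per channel use, for each sink."""
    return [math.log2(1.0 + 10 ** (s / 10.0)) for s in snrs_db]

def common_rate_throughput(snrs_db):
    # One rate for all sinks: every sink receives at the worst channel's rate.
    c = capacities(snrs_db)
    return len(c) * min(c)

def adaptive_throughput(snrs_db):
    # Per-sink rate adaptation, as in the opportunistic scheme.
    return sum(capacities(snrs_db))
```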

  3. AZTLAN platform: Mexican platform for analysis and design of nuclear reactors

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Puente E, F.; Del Valle G, E.; Francois L, J. L.; Martin del Campo M, C.; Espinosa P, G.

    2014-10-01

    The Aztlan platform project is a national initiative led by the Instituto Nacional de Investigaciones Nucleares (ININ) which brings together the main public institutions of higher education in Mexico, such as the Instituto Politecnico Nacional, the Universidad Nacional Autonoma de Mexico and the Universidad Autonoma Metropolitana, in an effort to take a significant step toward autonomy in calculation and analysis, seeking to place Mexico, in the medium term, at a competitive international level in software for the analysis of nuclear reactors. This project aims to modernize, improve and integrate the neutronic, thermal-hydraulic and thermo-mechanical codes developed in Mexican institutions within an integrated platform, developed and maintained by Mexican experts for the benefit of those same institutions. The project is financed by the mixed fund SENER-CONACYT for Energy Sustainability, and aims to substantially strengthen research institutions as well as educational institutions, contributing to the formation of highly qualified human resources in the area of analysis and design of nuclear reactors. As an innovative component, the project includes the creation of a user group, made up of members of the project institutions as well as the Comision Nacional de Seguridad Nuclear y Salvaguardias, the Central Nucleoelectrica de Laguna Verde (CNLV), the Secretaria de Energia (Mexico) and the Karlsruhe Institute of Technology (Germany), among others. This user group will be responsible for using the software and providing feedback to the development team so that progress meets the needs of the regulator and industry; in this case the CNLV. Finally, in order to bridge the gap between similar developments globally, the project will make use of the latest supercomputing technology to speed up calculation times. 
This work intends to present the project to the national nuclear community, so a description of the proposed methodology is given, as well as the goals and objectives to be pursued for the development of the

  4. Evidence for gene-specific rather than transcription rate-dependent histone H3 exchange in yeast coding regions.

    Science.gov (United States)

    Gat-Viks, Irit; Vingron, Martin

    2009-02-01

    In eukaryotic organisms, histones are dynamically exchanged independently of DNA replication. Recent reports show that different coding regions differ in their amount of replication-independent histone H3 exchange. The current paradigm is that this histone exchange variability among coding regions is a consequence of transcription rate. Here we put forward the idea that this variability might also be modulated in a gene-specific manner, independently of transcription rate. To that end, we study transcription rate-independent, replication-independent coding region histone H3 exchange. We term such events relative exchange. Our genome-wide analysis shows conclusively that in yeast, relative exchange is a novel consistent feature of coding regions. Outside of replication, each coding region has a characteristic pattern of histone H3 exchange that is either higher or lower than what would be expected from its RNAPII transcription rate alone. Histone H3 exchange in coding regions might be a way to add or remove certain histone modifications that are important for transcription elongation. Therefore, our result that gene-specific coding region histone H3 exchange is decoupled from transcription rate might hint at a new epigenetic mechanism of transcription regulation.

  5. Self-Organizing Wearable Device Platform for Assisting and Reminding Humans in Real Time

    Directory of Open Access Journals (Sweden)

    Yu Jin Park

    2016-01-01

    Full Text Available Most older persons would prefer “aging in my place,” that is, to remain in good health and live independently in their own home as long as possible. For assisting the independent living of older people, the ability to gather and analyze a user's daily activity data would constitute a significant technical advance, enhancing their quality of life. However, the general approach based on a centralized server has several problems, such as usage complexity, the high cost of deployment and expansion, and the difficulty of identifying an individual person. To address these problems, we propose a wearable device platform for the life assistance of older persons that automatically records and analyzes their daily activity without intentional human intervention or a centralized server (i.e., a cloud server). The proposed platform contains self-organizing protocols, a Delay-Tolerant Messaging system, knowledge-based analysis and alerting for daily activities, and a hardware platform that provides low power consumption. We implemented a prototype smart watch, called Personal Activity Assisting and Reminding (PAAR), as a testbed for the proposed platform, and evaluated the power consumption and the service time of example scenarios.

  6. Cross-platform comparison of microarray data using order restricted inference

    Science.gov (United States)

    Klinglmueller, Florian; Tuechler, Thomas; Posch, Martin

    2013-01-01

    Motivation: Titration experiments measuring gene expression from two different tissues, along with total RNA mixtures of the pure samples, are frequently used for quality evaluation of microarray technologies. Such a design implies that the true mRNA expression of each gene is either constant or follows a monotonic trend between the mixtures, lending itself to the use of order restricted inference procedures. Exploiting only the postulated monotonicity of titration designs, we propose three statistical analysis methods for the validation of high-throughput genetic data and corresponding preprocessing techniques. Results: Our methods allow for inference of accuracy, repeatability and cross-platform agreement, with minimal assumptions regarding the underlying data generating process. Therefore, they are readily applicable to all sorts of genetic high-throughput data, independent of the degree of preprocessing. An application to the EMERALD dataset demonstrates how our methods provide a rich spectrum of easily interpretable quality metrics and allow the comparison of different microarray technologies and normalization methods. The results are on par with previous work, but provide additional new insights that cast doubt on the utility of popular preprocessing techniques, specifically concerning the EMERALD project's dataset. Availability: All datasets are available on EBI's ArrayExpress web site (http://www.ebi.ac.uk/microarray-as/ae/) under accession numbers E-TABM-536, E-TABM-554 and E-TABM-555. Source code implemented in C and R is available at: http://statistics.msi.meduniwien.ac.at/float/cross_platform/. Methods for testing and variance decomposition have been made available in the R-package orQA, which can be downloaded and installed from CRAN http://cran.r-project.org. PMID:21317143
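
    The monotonicity constraint at the heart of order restricted inference is commonly fitted with the pool-adjacent-violators algorithm. A minimal sketch follows (in Python for illustration; the authors' implementation is the C/R orQA package, and their test procedures go well beyond this primitive):

```python
def pava(y):
    """Pool-adjacent-violators: least-squares fit that is non-decreasing.
    Here y would be a gene's expression across the titration mixtures."""
    blocks = []                       # list of [mean, weight] blocks
    for v in y:
        blocks.append([float(v), 1.0])
        # merge backwards while the monotonicity constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    out = []
    for m, w in blocks:
        out.extend([m] * int(w))
    return out
```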

  7. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  8. Novel Biochip Platform for Nucleic Acid Analysis

    Directory of Open Access Journals (Sweden)

    Juan J. Diaz-Mochon

    2012-06-01

    Full Text Available This manuscript describes the use of a novel biochip platform for the rapid analysis/identification of nucleic acids, including DNA and microRNAs, with very high specificity. This approach combines a unique dynamic chemistry approach for nucleic acid testing and analysis developed by DestiNA Genomics with the STMicroelectronics In-Check platform, which comprises two microfluidic optimized and independent PCR reaction chambers, and a sequential microarray area for nucleic acid capture and identification by fluorescence. With its compact bench-top “footprint” requiring only a single technician to operate, the biochip system promises to transform and expand routine clinical diagnostic testing and screening for genetic diseases, cancers, drug toxicology and heart disease, as well as employment in the emerging companion diagnostics market.

  9. CISP: Simulation Platform for Collective Instabilities in the BRing of HIAF project

    Science.gov (United States)

    Liu, J.; Yang, J. C.; Xia, J. W.; Yin, D. Y.; Shen, G. D.; Li, P.; Zhao, H.; Ruan, S.; Wu, B.

    2018-02-01

    To simulate collective instabilities during the complicated beam manipulation in the BRing (Booster Ring) of HIAF (High Intensity heavy-ion Accelerator Facility) or other high intensity accelerators, a code named CISP (Simulation Platform for Collective Instabilities) has been designed and constructed at the Institute of Modern Physics (IMP) in China. The CISP is a scalable multi-macroparticle simulation platform that can perform longitudinal and transverse tracking when chromaticity, the space charge effect, nonlinear magnets and wakes are included. Owing to its sound object-oriented design, the CISP is also a basic platform used to develop many other applications (like feedback). Several simulations completed by the CISP in this paper agree with analytical results very well, which shows that the CISP is now fully functional and is a powerful platform for further collective-instability research in the BRing or other accelerators. In the future, the CISP can also be extended easily into a physics control system for HIAF or other facilities.
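
    A minimal sketch of the kind of longitudinal macroparticle tracking kernel such a platform builds on (an assumed simple kick-drift synchrotron map with illustrative coefficients, not CISP's actual model, which adds chromaticity, space charge and wakes):

```python
import math

def track(particles, a=0.05, b=0.02, turns=5000):
    """Apply a one-turn longitudinal map to (phase, energy-error) pairs.
    The drift-kick split is symplectic, so the motion stays bounded."""
    out = []
    for phi, delta in particles:
        for _ in range(turns):
            phi += a * delta             # drift: phase slip from energy error
            delta -= b * math.sin(phi)   # kick: RF restoring force
        out.append((phi, delta))
    return out
```

    Wakefields would enter as an additional, bunch-dependent kick computed from the macroparticle distribution each turn, which is where the collective behaviour comes from.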

  10. Paracantor: A two group, two region reactor code

    Energy Technology Data Exchange (ETDEWEB)

    Stone, Stuart

    1956-07-01

    Paracantor I is a two-energy-group, two-region, time-independent reactor code, which obtains a closed solution for a critical reactor assembly. The code deals with cylindrical reactors of finite length and with a radial reflector of finite thickness. It is programmed for the I.B.M. Magnetic Drum Data-Processing Machine, Type 650. The limited memory space available does not permit a flux solution to be included in the basic Paracantor code. A supplementary code, Paracantor II, has been programmed which computes fluxes, including adjoint fluxes, from the output of Paracantor I.
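
    Paracantor treats a reflected two-region cylinder; as a hedged, textbook simplification (bare finite cylinder, illustrative cross sections rather than values from the report), the two-group diffusion estimate of k_eff can be sketched as:

```python
import math

def two_group_keff(R, H, nsf1, nsf2, sa1, sa2, s12, D1, D2):
    """k_eff of a bare finite cylinder in two-group diffusion theory.
    nsf: nu*Sigma_f, sa: absorption, s12: fast-to-thermal scattering,
    D: diffusion coefficients. All values illustrative."""
    B2 = (2.405 / R) ** 2 + (math.pi / H) ** 2   # geometric buckling
    fast = sa1 + s12 + D1 * B2                   # fast-group removal
    thermal = sa2 + D2 * B2                      # thermal-group removal
    return (nsf1 + nsf2 * s12 / thermal) / fast
```

    A reflected core, as handled by Paracantor, replaces the simple buckling term with matching conditions at the core-reflector interface, which is what makes the closed solution nontrivial.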

  11. Development of the next generation code system as an engineering modelling language. 3. Study with prototyping. 2

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Chiba, Go; Kasahara, Naoto; Ishikawa, Makoto

    2004-04-01

    In fast reactor development, numerical simulations using analysis codes play an important role in complementing theory and experiment. In order to efficiently advance the research and development of fast reactors, JNC promotes the development of the next generation simulation code (NGSC). This report describes the results of an investigation by prototyping carried out for the conceptual design of the NGSC. In the context of the cooperative research with CEA (Commissariat a l'Energie Atomique) in France, a survey of trends in several platforms for numerical analysis and an evaluation of the applicability of CEA's SALOME platform to the NGSC were carried out. As a result of the evaluation, it was confirmed that SALOME satisfied the requirements of efficiency, openness, universality, extensibility and completeness set by the NGSC. In addition, it was confirmed that SALOME provided the control-layer concept required by the NGSC and would be one of the important candidates for a platform for the NGSC. In the field of structural analysis, the prototype of the PRTS.NET code was reexamined from the viewpoint of class structure and input/output specification in order to improve data processing efficiency and maintainability. In the field of reactor physics analysis, a development test of a new code in C++ and a reuse test of an existing code written in Fortran were carried out with a view to utilizing SALOME for the NGSC. (author)

  12. Development of the DTNTES code

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Morales Dorado, M.D.; Alonso Santos, A.

    1987-01-01

    The DTNTES code has been developed in the Department of Nuclear Technology of the Polytechnical University in Madrid as part of the Research Program on Quantitative Risk Analysis. The DTNTES code calculates several time-dependent probabilistic characteristics of basic events, minimal cut sets and the top event of a fault tree. The code assumes that basic events are statistically independent and have failure and repair distributions. It computes the minimal cut set upper bound approximation for the top event unavailability, and the time-dependent unreliability of the top event by means of different methods selected by the user. These methods are: expected number of system failures, failure rate, Barlow-Proschan bound, steady-state upper bound, and the T* method. (author)
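
    The minimal cut set upper bound mentioned above can be sketched as follows, using the standard constant-rate unavailability model for repairable components (an assumed simplification; DTNTES supports general failure and repair distributions):

```python
import math

def unavail(lam, mu, t):
    """Point unavailability of a repairable component with constant
    failure rate lam and repair rate mu."""
    s = lam + mu
    return lam / s * (1.0 - math.exp(-s * t))

def mcs_upper_bound(cut_sets, t):
    """Minimal cut set upper bound on top-event unavailability:
    Q_top <= 1 - prod_over_cut_sets(1 - prod_over_components(q_i(t)))."""
    prod = 1.0
    for cs in cut_sets:
        q_cs = 1.0
        for lam, mu in cs:
            q_cs *= unavail(lam, mu, t)
        prod *= 1.0 - q_cs
    return 1.0 - prod
```

    The bound always lies between the largest single cut set probability and the rare-event sum of all cut set probabilities, which is why it is a convenient conservative estimate.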

  13. RIPE [robot independent programming environment]: A robot independent programming environment

    International Nuclear Information System (INIS)

    Miller, D.J.; Lennox, R.C.

    1990-01-01

    Remote manual operations in radiation environments are typically performed very slowly. Sensor-based computer-controlled robots hold great promise for increasing the speed and safety of remote operations; however, the programming of robotic systems has proven to be expensive and difficult. Generalized approaches to robot programming that reuse available software modules and employ programming languages which are independent of the specific robotic and sensory devices being used are needed to speed software development and increase overall system reliability. This paper discusses the robot independent programming environment (RIPE) developed at Sandia National Laboratories (SNL). The RIPE is an object-oriented approach to robot system architectures; it is a software environment that facilitates rapid design and implementation of complex robot systems for diverse applications. An architecture based on hierarchies of distributed multiprocessors provides the computing platform for a layered programming structure that models applications using software objects. These objects are designed to support model-based automated programming of robotic and machining devices, real-time sensor-based control, error handling, and robust communication.

  14. System verification and validation report for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including test procedures, used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests.
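
    Interpolation over a benchmark library in two independent variables, as described above, can be sketched with bilinear interpolation (a plausible minimal scheme for illustration; the report does not specify TMAD's actual interpolation formula, and the grid values here are hypothetical):

```python
from bisect import bisect_right

def bilinear(xs, ys, table, x, y):
    """Interpolate a detector response on a rectilinear benchmark grid.
    xs, ys: sorted axis values (e.g. anomaly size, moisture content);
    table[i][j]: benchmark response at (xs[i], ys[j])."""
    i = min(max(bisect_right(xs, x) - 1, 0), len(xs) - 2)
    j = min(max(bisect_right(ys, y) - 1, 0), len(ys) - 2)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * table[i][j] + tx * (1 - ty) * table[i + 1][j]
            + (1 - tx) * ty * table[i][j + 1] + tx * ty * table[i + 1][j + 1])
```

    Matching measured responses then amounts to searching this interpolated surface for the grid combination closest to the measurements.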

  15. Nuclear Criticality Safety Assessment Using the SCALE Computer Code Package. A demonstration based on an independent review of a real application

    International Nuclear Information System (INIS)

    Mennerdahl, Dennis

    1998-06-01

    The purpose of this project was to instruct a young scientist from the Lithuanian Energy Institute (LEI) on how to carry out an independent review of a safety report. In particular, emphasis was to be put on how to use the personal computer version of the calculation system SCALE 4.3 in this process. Nuclear criticality safety, together with radiation shielding from gamma and neutron sources, were areas of interest. This report concentrates on nuclear criticality safety aspects, while a separate report covers radiation shielding. The application was a proposed storage cask for irradiated fuel assemblies from the Ignalina RBMK reactors in Lithuania. The safety report contained various documents involving many design and safety considerations. A few other documents describing the Ignalina reactors and their operation were available. The time for the project was limited to approximately one month, starting 'clean' with a SCALE 4.3 CD-ROM, a thick safety report and a fast personal computer. The results should be of general interest to Swedish authorities, in particular in relation to shielding, where experience in using advanced computer codes like those available in SCALE is limited. It has been known for many years that criticality safety is very complicated, and that independent reviews are absolutely necessary to reduce the risk from quite common errors in safety assessments. Several important results were obtained during the project. Concerning use of SCALE 4.3, it was confirmed that a young scientist, without extensive previous experience in the code system, can learn to use essentially all options. During the project, it was obvious that familiarity with personal computers, operating systems (including network systems) and office software (word processing, spreadsheet and Internet browser software) saved a lot of time. Some of the Monte Carlo calculations took several hours. Experience is valuable in quickly picking out input or source document errors. 
Understanding

  16. Independent walking as a major skill for the development of anticipatory postural control: evidence from adjustments to predictable perturbations.

    Directory of Open Access Journals (Sweden)

    Fabien Cignetti

    Full Text Available Although there is suggestive evidence that a link exists between independent walking and the ability to establish an anticipatory strategy to stabilize posture, the extent to which this skill facilitates the development of anticipatory postural control remains largely unknown. Here, we examined the role of independent walking in infants' ability to anticipate predictable external perturbations. Non-walking infants, walking infants and adults were seated on a platform that produced continuous rotation in the frontal plane. Surface electromyography (EMG) of neck and lower back muscles and the positions of markers located on the platform, the upper body and the head were recorded. Results from cross-correlation analysis between rectified and filtered EMGs and platform movement indicated that although muscle activation already occurred before platform movement in non-walking infants, only walking infants demonstrated an adult-like ability for anticipation. Moreover, results from further cross-correlation analysis between segmental angular displacement and platform movement, together with measures of balance control at the end-points of rotation of the platform, evidenced two sorts of behaviour. The adults behaved as a non-rigid, non-inverted pendulum, stabilizing the head in space, while both the walking and non-walking infants followed the platform, behaving as a rigid inverted pendulum. These results suggest that the acquisition of independent walking plays a role in the development of anticipatory postural control, likely improving the internal model for the sensorimotor control of posture. However, despite such improvement, integrating the dynamics of an external object, here the platform, within the model to maintain balance still remains challenging for infants.

  17. [Exploration and practice of genetics teaching assisted by network technology platform].

    Science.gov (United States)

    Li, Ya-Xuan; Zhang, Fei-Xiong; Zhao, Xin; Cai, Min-Hua; Yan, Yue-Ming; Hu, Ying-Kao

    2010-04-01

    New teaching techniques have gradually emerged along with the development of new technologies. On the basis of traditional teaching methods, a new platform for the teaching process has been set up using network technology. In genetics teaching, it is possible to use the network platform to guide student learning, promote students' interest in learning, and enable them to study independently. After years of exploration and application, network teaching has proved to be one of the most useful methods, with unique advantages over traditional ones in genetics teaching. The establishment of the network teaching platform, its advantages and deficiencies, and relevant strategies are introduced in this paper.

  18. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master's thesis was performed at CERN, and more specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the Siemens SCADA system WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software tool for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  19. Extending CANTUP code analysis to probabilistic evaluations

    International Nuclear Information System (INIS)

    Florea, S.

    2001-01-01

    Structural analysis with numerical methods based on the finite element method at present plays a central role in evaluations and predictions for structural systems which require safe and reliable operation in aggressive environmental conditions. This is also the case for the CANDU-600 fuel channel, where, besides the corrosive and thermal aggression upon the Zr97.5Nb2.5 pressure tubes, prolonged irradiation is added, with marked consequences for the evolution of material properties. This results in an unavoidable spread of material properties over time, subject to high uncertainties. Consequently, deterministic evaluations with computation codes based on the finite element method are supplemented by statistical and probabilistic methods for evaluating the response of structural components. This paper reports work on extending the thermo-mechanical evaluation of the fuel channel components into the framework of probabilistic structural mechanics, based on statistical methods and developed upon deterministic CANTUP code analyses. The CANTUP code was ported from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station 4.0 platform. To test the statistical evaluation of the creep behaviour of the pressure tube, the longitudinal elasticity modulus (Young's modulus) was used as a random variable, normally distributed around the value used in deterministic analyses. The influence of this random quantity upon the hog and effective stress developed in the pressure tube was studied for two time values, specific to primary and secondary creep. The results obtained after five years of creep, corresponding to secondary creep, are presented.
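
    The statistical treatment of a normally distributed Young's modulus can be sketched with a Monte Carlo loop (the response model below, a constrained thermal stress sigma = E*alpha*dT, is a hypothetical stand-in for the CANTUP finite element creep response; all numbers are illustrative):

```python
import random
import statistics

def mc_stress(e_mean, e_sd, alpha, dT, n=20000, seed=42):
    """Propagate a normally distributed Young's modulus E through a
    simple constrained-thermal-stress model sigma = E * alpha * dT,
    returning the mean and standard deviation of the stress."""
    rng = random.Random(seed)
    sigmas = [rng.gauss(e_mean, e_sd) * alpha * dT for _ in range(n)]
    return statistics.mean(sigmas), statistics.stdev(sigmas)
```

    In the paper's setting, each sample would instead trigger a deterministic CANTUP run, and the spread of the resulting stresses quantifies the uncertainty propagated from the material property.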

  20. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows…
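
    The coding operations whose cost these parameters trade off can be sketched for the binary field GF(2) (field size is itself one of the tunable parameters; payloads here are plain Python ints combined by XOR, and the coefficient vectors are bitmasks):

```python
import random

def rlnc_encode(packets, n_coded, seed=7):
    """Emit random linear combinations of the source packets over GF(2)."""
    rng = random.Random(seed)
    k = len(packets)
    coded = []
    while len(coded) < n_coded:
        coeffs = rng.getrandbits(k)          # random GF(2) coefficient vector
        if coeffs == 0:
            continue
        payload = 0
        for i in range(k):
            if coeffs >> i & 1:
                payload ^= packets[i]        # addition in GF(2) is XOR
        coded.append((coeffs, payload))
    return coded

def rlnc_decode(coded, k):
    """Gaussian elimination over GF(2); returns packets, or None if rank < k."""
    pivots = [None] * k
    for c, p in coded:
        for i in range(k):
            if c >> i & 1:
                if pivots[i] is None:
                    pivots[i] = (c, p)
                    break
                c ^= pivots[i][0]
                p ^= pivots[i][1]
    if any(v is None for v in pivots):
        return None
    out = [0] * k
    for i in reversed(range(k)):             # back-substitution
        c, p = pivots[i]
        for j in range(i + 1, k):
            if c >> j & 1:
                p ^= out[j]
        out[i] = p
    return out
```

    The elimination loop above is exactly the per-packet decoding work whose energy cost grows with generation size k and field size, which is the trade-off the paper quantifies for the Tmote Sky.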

  1. FUZZY CONTROLLER FOR THE CONTROL OF THE MOBILE PLATFORM OF THE CORBYS ROBOTIC GAIT REHABILITATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Maria Kyrarini

    2014-12-01

    Full Text Available In this paper, an inverse kinematics based control algorithm for the joystick control of the mobile platform of the novel mobile robot-assisted gait rehabilitation system CORBYS is presented. The mobile platform has four independently steered and driven wheels. Given the linear and angular velocities of the mobile platform, the inverse kinematics algorithm gives as its output the steering angle and the driving angular velocity of each of the four wheels. The paper is focused on the steering control of the platform for which a fuzzy logic controller is developed and implemented. The experimental results of the real-world steering of the platform are presented in the paper.
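
    The inverse kinematics step has a compact closed form: each wheel contact point moves with the platform's linear velocity plus the contribution of the rotation about the platform centre, v_i = v + ω × r_i. A sketch with assumed wheel positions and wheel radius follows (the actual CORBYS geometry is not given in the abstract):

```python
import math

WHEEL_RADIUS = 0.1  # m, assumed for illustration
# Assumed wheel positions (x, y) in the platform frame, m
WHEELS = [(0.4, 0.3), (0.4, -0.3), (-0.4, 0.3), (-0.4, -0.3)]

def inverse_kinematics(vx, vy, omega):
    """Per-wheel steering angle (rad) and driving angular velocity (rad/s).

    Wheel contact-point velocity: v_i = v + omega x r_i
    -> (vx - omega*y_i, vy + omega*x_i)
    """
    result = []
    for x, y in WHEELS:
        wvx = vx - omega * y
        wvy = vy + omega * x
        steer = math.atan2(wvy, wvx)                  # steering angle
        drive = math.hypot(wvx, wvy) / WHEEL_RADIUS   # wheel angular velocity
        result.append((steer, drive))
    return result
```

    Pure translation yields identical steering angles on all four wheels, while pure rotation steers each wheel tangentially to its position vector.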

  2. Peripheral Codes in ASTRA for the TJ-II

    International Nuclear Information System (INIS)

    Lopez-Bruna, D.; Reynolds, J. M.; Cappa, A.; Martinell, J.; Garcia, J.; Gutierrez-Tapia, C.

    2010-01-01

    The study of data from the TJ-II device is often done with transport calculations based on the ASTRA transport system. However, complicated independent codes are used to obtain fundamental ingredients in these calculations, such as the particle and/or energy sources. These codes are accessible from ASTRA through the procedures explained in this report. (Author) 37 refs.

  3. Bit-Wise Arithmetic Coding For Compression Of Data

    Science.gov (United States)

    Kiely, Aaron

    1996-01-01

    Bit-wise arithmetic coding is a data-compression scheme intended especially for use with uniformly quantized data from a source with a Gaussian, Laplacian, or similar probability distribution function. Code words are of fixed length, and bits are treated as independent. The scheme serves as a means of progressive transmission or of overcoming the buffer-overflow or rate-constraint limitations that sometimes arise when data compression is used.
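
    Because the bits are treated as independent, the coder needs no context model: each bit simply narrows the current interval in proportion to its probability. A floating-point sketch of such a coder for an i.i.d. binary source follows; practical implementations use integer arithmetic with renormalization to avoid the precision loss this toy version suffers on long inputs.

```python
def ac_encode(bits, p1):
    """Narrow [low, high) once per bit; return a number inside the final interval."""
    low, high = 0.0, 1.0
    for b in bits:
        mid = low + (high - low) * (1.0 - p1)  # [low, mid) codes 0, [mid, high) codes 1
        if b:
            low = mid
        else:
            high = mid
    return (low + high) / 2.0                  # any point in the interval identifies it

def ac_decode(x, n, p1):
    """Replay the same interval narrowing to recover n bits."""
    bits = []
    low, high = 0.0, 1.0
    for _ in range(n):
        mid = low + (high - low) * (1.0 - p1)
        if x >= mid:
            bits.append(1)
            low = mid
        else:
            bits.append(0)
            high = mid
    return bits
```

    The final interval width equals the product of the per-bit probabilities, so roughly -log2(width) output bits suffice; that is how a skewed source (p1 far from 0.5) compresses.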

  4. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (material behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  5. THREEDANT: A code to perform three-dimensional, neutral particle transport calculations

    International Nuclear Information System (INIS)

    Alcouffe, R.E.

    1994-01-01

    The THREEDANT code solves the three-dimensional neutral particle transport equation in its first-order, multigroup, discrete ordinates form. The code allows an unlimited number of groups (depending upon the cross section set), angular quadrature up to S-100, and unlimited Pn order, again depending upon the cross section set. The code has three options for spatial differencing: diamond with set-to-zero fixup, adaptive weighted diamond, and linear nodal. The geometry options are XYZ and RZΘ, with a special XYZ option based upon a volume fraction method. This allows objects or bodies of any shape to be modelled as input, which gives the code as much geometric description flexibility as the Monte Carlo code MCNP. The transport equation is solved by source iteration accelerated by the DSA method. Both inner and outer iterations are so accelerated. Some results are presented which demonstrate the effectiveness of these techniques. The code is available on several types of computing platforms.
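
    The diamond-difference relation with set-to-zero fixup, solved by source iteration, can be sketched in one dimension. This is a slab analogue of the scheme, not THREEDANT itself; the cross sections, mesh and quadrature order below are arbitrary, and the DSA acceleration is omitted for brevity.

```python
import numpy as np

def solve_slab(nx=50, dx=0.2, sig_t=1.0, sig_s=0.5, src=1.0, n_mu=8, tol=1e-8):
    """1-D slab S_N transport: diamond differencing with set-to-zero fixup,
    isotropic scattering, vacuum boundaries, unaccelerated source iteration."""
    mus, wts = np.polynomial.legendre.leggauss(n_mu)   # quadrature; weights sum to 2
    phi = np.zeros(nx)                                 # scalar flux
    for _ in range(500):
        q = 0.5 * (sig_s * phi + src)                  # isotropic emission density
        phi_new = np.zeros(nx)
        for mu, w in zip(mus, wts):
            a = abs(mu) / dx
            cells = range(nx) if mu > 0 else range(nx - 1, -1, -1)
            psi_in = 0.0                               # vacuum boundary
            for i in cells:                            # transport sweep
                psi_out = (q[i] + psi_in * (a - 0.5 * sig_t)) / (a + 0.5 * sig_t)
                if psi_out < 0.0:                      # set-to-zero fixup
                    psi_out = 0.0
                    psi_c = (q[i] + a * psi_in) / sig_t  # re-balance the cell
                else:
                    psi_c = 0.5 * (psi_in + psi_out)   # diamond relation
                phi_new[i] += w * psi_c
                psi_in = psi_out
        if np.max(np.abs(phi_new - phi)) < tol * np.max(phi_new):
            return phi_new                             # source iteration converged
        phi = phi_new
    return phi
```

    For a symmetric slab the converged flux is symmetric and bounded by the infinite-medium value src/(sig_t - sig_s); the fixup branch only activates in optically thick cells.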

  6. DANTSYS: A diffusion accelerated neutral particle transport code system

    Energy Technology Data Exchange (ETDEWEB)

    Alcouffe, R.E.; Baker, R.S.; Brinkley, F.W.; Marr, D.R.; O'Dell, R.D.; Walters, W.F.

    1995-06-01

    The DANTSYS code package includes the following transport codes: ONEDANT, TWODANT, TWODANT/GQ, TWOHEX, and THREEDANT. The DANTSYS code package is a modular computer program package designed to solve the time-independent, multigroup discrete ordinates form of the Boltzmann transport equation in several different geometries. The modular construction of the package separates the input processing, the transport equation solving, and the post processing (or edit) functions into distinct code modules: the Input Module, one or more Solver Modules, and the Edit Module, respectively. The Input and Edit Modules are very general in nature and are common to all the Solver Modules. The ONEDANT Solver Module contains a one-dimensional (slab, cylinder, and sphere), time-independent transport equation solver using the standard diamond-differencing method for space/angle discretization. Also included in the package are Solver Modules named TWODANT, TWODANT/GQ, THREEDANT, and TWOHEX. The TWODANT Solver Module solves the time-independent two-dimensional transport equation using the diamond-differencing method for space/angle discretization. The authors have also introduced an adaptive weighted diamond differencing (AWDD) method for the spatial and angular discretization into TWODANT as an option. The TWOHEX Solver Module solves the time-independent two-dimensional transport equation on an equilateral triangle spatial mesh. The THREEDANT Solver Module solves the time-independent, three-dimensional transport equation for XYZ and RZΘ symmetries using both diamond differencing with set-to-zero fixup and the AWDD method. The TWODANT/GQ Solver Module solves the 2-D transport equation in XY and RZ symmetries using a spatial mesh of arbitrary quadrilaterals. The spatial differencing method is based upon the diamond differencing method with set-to-zero fixup with changes to accommodate the generalized spatial meshing.

  7. DANTSYS: A diffusion accelerated neutral particle transport code system

    International Nuclear Information System (INIS)

    Alcouffe, R.E.; Baker, R.S.; Brinkley, F.W.; Marr, D.R.; O'Dell, R.D.; Walters, W.F.

    1995-06-01

    The DANTSYS code package includes the following transport codes: ONEDANT, TWODANT, TWODANT/GQ, TWOHEX, and THREEDANT. The DANTSYS code package is a modular computer program package designed to solve the time-independent, multigroup discrete ordinates form of the Boltzmann transport equation in several different geometries. The modular construction of the package separates the input processing, the transport equation solving, and the post processing (or edit) functions into distinct code modules: the Input Module, one or more Solver Modules, and the Edit Module, respectively. The Input and Edit Modules are very general in nature and are common to all the Solver Modules. The ONEDANT Solver Module contains a one-dimensional (slab, cylinder, and sphere), time-independent transport equation solver using the standard diamond-differencing method for space/angle discretization. Also included in the package are Solver Modules named TWODANT, TWODANT/GQ, THREEDANT, and TWOHEX. The TWODANT Solver Module solves the time-independent two-dimensional transport equation using the diamond-differencing method for space/angle discretization. The authors have also introduced an adaptive weighted diamond differencing (AWDD) method for the spatial and angular discretization into TWODANT as an option. The TWOHEX Solver Module solves the time-independent two-dimensional transport equation on an equilateral triangle spatial mesh. The THREEDANT Solver Module solves the time-independent, three-dimensional transport equation for XYZ and RZΘ symmetries using both diamond differencing with set-to-zero fixup and the AWDD method. The TWODANT/GQ Solver Module solves the 2-D transport equation in XY and RZ symmetries using a spatial mesh of arbitrary quadrilaterals. The spatial differencing method is based upon the diamond differencing method with set-to-zero fixup with changes to accommodate the generalized spatial meshing.

  8. Dual Coding and Bilingual Memory.

    Science.gov (United States)

    Paivio, Allan; Lambert, Wallace

    1981-01-01

    Describes a study which tested a dual coding approach to bilingual memory using tasks that permit comparison of the effects of bilingual encoding with verbal-nonverbal dual encoding of items. Results provide strong support for a version of the independent or separate stores view of bilingual memory. (Author/BK)

  9. Development of parallel Fokker-Planck code ALLAp

    International Nuclear Information System (INIS)

    Batishcheva, A.A.; Sigmar, D.J.; Koniges, A.E.

    1996-01-01

    We report on our ongoing development of the 3D Fokker-Planck code ALLA for a highly collisional scrape-off-layer (SOL) plasma. A SOL with strong gradients of density and temperature in the spatial dimension is modeled. Our method is based on a 3-D adaptive grid (in space, magnitude of the velocity, and cosine of the pitch angle) and a second-order conservative scheme. Note that the grid size is typically 100 x 257 x 65 nodes. It was shown in our previous work that only these capabilities make it possible to benchmark a 3D code against a spatially-dependent self-similar solution of a kinetic equation with the Landau collision term. In the present work we show results of a more precise benchmarking against the exact solutions of the kinetic equation using a new parallel code ALLAp with an improved method of parallelization and a modified boundary condition at the plasma edge. We also report first results from the code parallelization using the Message Passing Interface (MPI) for the massively parallel CRI T3D platform. We evaluate the ALLAp code performance versus the number of T3D processors used and compare its efficiency against a Work/Data Sharing parallelization scheme and a workstation version.

  10. A simple in-surge pressure analysis using the SPACE code

    International Nuclear Information System (INIS)

    Youn, Bum Soo; Kim, Yo Han; Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2010-01-01

    Currently, all nuclear safety analysis codes used in Korea were developed overseas. Using these codes requires substantial licensing fees, and permission must be obtained for their use in the country. In addition, winning orders for nuclear power plants requires a safety analysis code based on independent domestic technology. Therefore, the Korea Electric Power Research Institute (KEPRI) is developing a domestic nuclear safety analysis code, SPACE (Safety and Performance Analysis Code for nuclear power plants). To assess the capability of the pressurizer model in the SPACE code under development, its results were compared with those of an existing commercial nuclear safety analysis code, RETRAN

  11. Independent assessment of the TRAC-BD1/MOD1 computer code at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wilson, G.E.; Charboneau, B.L.; Dallman, R.J.; Kullberg, C.M.; Wagner, K.C.; Wheatley, P.D.

    1984-01-01

    Under the auspices of the United States Nuclear Regulatory Commission, its primary boiling water reactor safety analysis code (TRAC-BWR) is being assessed with simulations of a wide range of experimental data. The FY-1984 assessment activities were associated with the latest version (TRAC-BD1/MOD1) of this code. Typical results of the assessment studies are given. Conclusions formulated from these results are presented. These conclusions relate to the overall applicability of the current code to safety analysis, and to future work which would further enhance the code's quality and ease of use

  12. Efficient Sensor Integration on Platforms (NeXOS)

    Science.gov (United States)

    Memè, S.; Delory, E.; Del Rio, J.; Jirka, S.; Toma, D. M.; Martinez, E.; Frommhold, L.; Barrera, C.; Pearlman, J.

    2016-12-01

    In-situ ocean observing platforms provide power and information transmission capability to sensors. Ocean observing platforms can be mobile, such as ships, autonomous underwater vehicles, drifters and profilers, or fixed, such as buoys, moorings and cabled observatories. The process of integrating sensors on platforms can imply substantial engineering time and resources. Constraints range from stringent mechanical requirements to proprietary communication and control firmware. In NeXOS, the implementation of a PUCK plug and play capability is being done with applications to multiple sensors and platforms. This is complemented with a sensor web enablement that addresses the flow of information from sensor to user. Open standards are being tested in order to assess their costs and benefits in existing and future observing systems. Part of the testing implied open-source coding and hardware prototyping of specific control devices, in particular for closed commercial platforms where firmware upgrading is not straightforward or possible without prior agreements or service fees. Some platform manufacturers, such as the European companies ALSEAMAR [1] and NKE Instruments [2], are currently upgrading their control and communication firmware as part of their activities in NeXOS. The sensor development companies Sensorlab [3], SMID [4] and TRIOS [5] upgraded their firmware with this plug and play functionality. Other industrial players in Europe and the US have been sent NeXOS sensor emulators to test the new protocol on their platforms. We are currently demonstrating that with little effort, it is also possible to have such middleware implemented on very low-cost compact computers such as the open Raspberry Pi [6], and have a full end-to-end interoperable communication path from sensor to user with sensor plug and play capability. The result is an increase in sensor integration cost-efficiency, and the demonstration will be used to highlight the benefit to users and ocean observatory

  13. Multitasking the three-dimensional transport code TORT on CRAY platforms

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    The multitasking options in the three-dimensional neutral particle transport code TORT originally implemented for Cray's CTSS operating system are revived and extended to run on Cray Y/MP and C90 computers using the UNICOS operating system. These include two coarse-grained domain decompositions; across octants, and across directions within an octant, termed Octant Parallel (OP), and Direction Parallel (DP), respectively. Parallel performance of the DP is significantly enhanced by increasing the task grain size and reducing load imbalance via dynamic scheduling of the discrete angles among the participating tasks. Substantial Wall Clock speedup factors, approaching 4.5 using 8 tasks, have been measured in a time-sharing environment, and generally depend on the test problem specifications, number of tasks, and machine loading during execution
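
    The load-imbalance reduction from dynamically scheduling the discrete angles can be illustrated with a toy makespan simulation: cyclic static assignment versus greedy assignment of each work item to the currently least-loaded worker. This is a scheduling analogue, not TORT code; the task times are invented.

```python
import heapq

def static_makespan(task_times, n_workers):
    """Cyclic static assignment: task i always goes to worker i mod n."""
    loads = [0.0] * n_workers
    for i, t in enumerate(task_times):
        loads[i % n_workers] += t
    return max(loads)

def dynamic_makespan(task_times, n_workers):
    """Greedy dynamic scheduling: each task goes to the least-loaded worker."""
    heap = [0.0] * n_workers          # current load of each worker
    heapq.heapify(heap)
    for t in task_times:
        heapq.heappush(heap, heapq.heappop(heap) + t)
    return max(heap)
```

    With uneven task times (as with angular work in a sweep), the greedy schedule keeps all workers busy and yields a noticeably lower makespan than the static cyclic split.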

  14. Running code as part of an open standards policy

    OpenAIRE

    Shah, Rajiv; Kesan, Jay

    2009-01-01

    Governments around the world are considering implementing or even mandating open standards policies. They believe these policies will provide economic, socio-political, and technical benefits. In this article, we analyze the failure of the Massachusetts’s open standards policy as applied to document formats. We argue it failed due to the lack of running code. Running code refers to multiple independent, interoperable implementations of an open standard. With running code, users have choice ...

  15. Providing Device Independence to Mobile Services

    OpenAIRE

    Nylander, Stina; Bylund, Markus

    2002-01-01

    People want user interfaces to services that are functional and well suited to the device they choose for access. To provide this, services must be able to offer device specific user interfaces for the wide range of devices available today. We propose to combine the two dominant approaches to platform independence, "Write Once, Run Everywhere™" and "different version for each device", to create multiple device specific user interfaces for mobile services. This gives possibilities to minimize...

  16. Beam Dynamics Simulation Platform and Studies of Beam Breakup in Dielectric Wakefield Structures

    International Nuclear Information System (INIS)

    Schoessow, P.; Kanareykin, A.; Jing, C.; Kustov, A.; Altmark, A.; Gai, W.

    2010-01-01

    A particle-Green's function beam dynamics code (BBU-3000) to study beam breakup effects is incorporated into a parallel computing framework based on the BOINC software environment, and supports both task farming on a heterogeneous cluster and local grid computing. User access to the platform is through a web browser.

  17. High Angular Momentum Halo Gas: A Feedback and Code-independent Prediction of LCDM

    Science.gov (United States)

    Stewart, Kyle R.; Maller, Ariyeh H.; Oñorbe, Jose; Bullock, James S.; Joung, M. Ryan; Devriendt, Julien; Ceverino, Daniel; Kereš, Dušan; Hopkins, Philip F.; Faucher-Giguère, Claude-André

    2017-07-01

    We investigate angular momentum acquisition in Milky Way-sized galaxies by comparing five high resolution zoom-in simulations, each implementing identical cosmological initial conditions but utilizing different hydrodynamic codes: Enzo, Art, Ramses, Arepo, and Gizmo-PSPH. Each code implements a distinct set of feedback and star formation prescriptions. We find that while many galaxy and halo properties vary between the different codes (and feedback prescriptions), there is qualitative agreement on the process of angular momentum acquisition in the galaxy’s halo. In all simulations, cold filamentary gas accretion to the halo results in ˜4 times more specific angular momentum in cold halo gas (λ cold ≳ 0.1) than in the dark matter halo. At z > 1, this inflow takes the form of inspiraling cold streams that are co-directional in the halo of the galaxy and are fueled, aligned, and kinematically connected to filamentary gas infall along the cosmic web. Due to the qualitative agreement among disparate simulations, we conclude that the buildup of high angular momentum halo gas and the presence of these inspiraling cold streams are robust predictions of Lambda Cold Dark Matter galaxy formation, though the detailed morphology of these streams is significantly less certain. A growing body of observational evidence suggests that this process is borne out in the real universe.

  18. High Angular Momentum Halo Gas: A Feedback and Code-independent Prediction of LCDM

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Kyle R. [Department of Mathematical Sciences, California Baptist University, 8432 Magnolia Ave., Riverside, CA 92504 (United States); Maller, Ariyeh H. [Department of Physics, New York City College of Technology, 300 Jay St., Brooklyn, NY 11201 (United States); Oñorbe, Jose [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Bullock, James S. [Center for Cosmology, Department of Physics and Astronomy, The University of California at Irvine, Irvine, CA 92697 (United States); Joung, M. Ryan [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Devriendt, Julien [Department of Physics, University of Oxford, The Denys Wilkinson Building, Keble Rd., Oxford OX1 3RH (United Kingdom); Ceverino, Daniel [Zentrum für Astronomie der Universität Heidelberg, Institut für Theoretische Astrophysik, Albert-Ueberle-Str. 2, D-69120 Heidelberg (Germany); Kereš, Dušan [Department of Physics, Center for Astrophysics and Space Sciences, University of California at San Diego, 9500 Gilman Dr., La Jolla, CA 92093 (United States); Hopkins, Philip F. [California Institute of Technology, 1200 E. California Blvd., Pasadena, CA 91125 (United States); Faucher-Giguère, Claude-André [Department of Physics and Astronomy and CIERA, Northwestern University, 2145 Sheridan Rd., Evanston, IL 60208 (United States)

    2017-07-01

    We investigate angular momentum acquisition in Milky Way-sized galaxies by comparing five high resolution zoom-in simulations, each implementing identical cosmological initial conditions but utilizing different hydrodynamic codes: Enzo, Art, Ramses, Arepo, and Gizmo-PSPH. Each code implements a distinct set of feedback and star formation prescriptions. We find that while many galaxy and halo properties vary between the different codes (and feedback prescriptions), there is qualitative agreement on the process of angular momentum acquisition in the galaxy’s halo. In all simulations, cold filamentary gas accretion to the halo results in ∼4 times more specific angular momentum in cold halo gas (λ_cold ≳ 0.1) than in the dark matter halo. At z > 1, this inflow takes the form of inspiraling cold streams that are co-directional in the halo of the galaxy and are fueled, aligned, and kinematically connected to filamentary gas infall along the cosmic web. Due to the qualitative agreement among disparate simulations, we conclude that the buildup of high angular momentum halo gas and the presence of these inspiraling cold streams are robust predictions of Lambda Cold Dark Matter galaxy formation, though the detailed morphology of these streams is significantly less certain. A growing body of observational evidence suggests that this process is borne out in the real universe.

  19. Multilevel LDPC Codes Design for Multimedia Communication CDMA System

    Directory of Open Access Journals (Sweden)

    Hou Jia

    2004-01-01

    Full Text Available We design multilevel coding (MLC) with a semi-bit interleaved coded modulation (BICM) scheme based on low density parity check (LDPC) codes. Different from traditional designs, we joined the MLC and BICM together by using the Gray mapping, which is suitable to transmit the data over several equivalent channels with different code rates. To perform well at signal-to-noise ratios (SNR) very close to the capacity of the additive white Gaussian noise (AWGN) channel, a random regular LDPC code and a simple semialgebra LDPC (SA-LDPC) code are discussed in MLC with parallel independent decoding (PID). The numerical results demonstrate that the proposed scheme could achieve both power and bandwidth efficiency.
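
    The Gray mapping that joins the MLC and BICM levels guarantees that adjacent constellation points differ in exactly one bit, so a nearest-neighbour demodulation error corrupts only a single level. A small sketch of the binary-reflected Gray labelling used for such mappings, here for an 8-level (3-bit) constellation:

```python
def gray(n):
    """Binary-reflected Gray code of integer n."""
    return n ^ (n >> 1)

def gray_map_pam(bits_per_symbol):
    """Map symbol indices 0..M-1 of an M-level constellation to Gray-coded labels."""
    m = 1 << bits_per_symbol
    return [gray(i) for i in range(m)]
```

    For 3 bits the labels come out as 0, 1, 3, 2, 6, 7, 5, 4: each pair of neighbouring amplitude levels differs in exactly one bit position.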

  20. Gas Identification Using Passive UHF RFID Sensor Platform

    Directory of Open Access Journals (Sweden)

    Muhammad Ali AKBAR

    2015-11-01

    Full Text Available The concept of a passive Radio Frequency Identification (RFID) sensor tag is introduced to remove the dependency of current RFID platforms on battery life. In this paper, a gas identification system is presented using a passive RFID sensor tag along with a processing unit. The RFID system is compliant with the Electronic Product Code Generation 2 (EPC-Gen2) protocol in the 902-928 MHz ISM band, whereas the processing unit is implemented and analyzed on software and hardware platforms. The software platform uses MATLAB, whereas a High Level Synthesis (HLS) tool is used to implement the processing unit on a Zynq platform. Moreover, two sets of different gases are used along with Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) based feature reduction approaches to analyze in detail the best feature reduction approach for efficient classification of gas data. It is found that for the first set of gases, 90 % of gases are identified using the first three principal components, which is 7 % more efficient than LDA. However, in terms of hardware overhead, LDA requires 50 % less hardware resources than PCA. The classification results for the second set of gases reveal that 91 % of gas classification is obtained using LDA and the first four principal components, while LDA requires 52 % less hardware resources than PCA. The RFID tag used for transmission is implemented in a 0.13 µm CMOS process, with a simulated average power consumption of 2.6 µW from a 1.2 V supply. A ThingMagic M6e embedded reader is used for the RFID platform implementation. It provides an output power of 31.5 dBm, which allows a read range of up to 9 meters.
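
    The PCA feature reduction used above can be sketched with a plain SVD: centre the data, project onto the leading right singular vectors, and measure the captured variance. Synthetic data stands in for the gas-sensor measurements, and LDA (which additionally uses class labels) is omitted; all sizes and names below are illustrative.

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the first k principal components."""
    Xc = X - X.mean(axis=0)                    # centre each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                     # reduced k-dimensional features
    explained = (s[:k] ** 2).sum() / (s ** 2).sum()   # variance captured
    return scores, explained

rng = np.random.default_rng(0)
# Synthetic "sensor" data: 200 samples, 8 channels driven by 3 latent factors
latent = rng.normal(size=(200, 3))
mixing = rng.normal(size=(3, 8))
X = latent @ mixing + 0.05 * rng.normal(size=(200, 8))

scores, explained = pca_reduce(X, 3)
```

    When the data really are driven by three latent factors, the first three components capture nearly all of the variance, mirroring the abstract's observation that three principal components sufficed for the first gas set.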

  1. The OpenPMU Platform for Open Source Phasor Measurements

    OpenAIRE

    Laverty, David M.; Best, Robert J.; Brogan, Paul; Al-Khatib, Iyad; Vanfretti, Luigi; Morrow, D John

    2013-01-01

    OpenPMU is an open platform for the development of phasor measurement unit (PMU) technology. A need has been identified for an open-source alternative to commercial PMU devices tailored to the needs of the university researcher and for enabling the development of new synchrophasor instruments from this foundation. OpenPMU achieves this through open-source hardware design specifications and software source code, allowing duplicates of the OpenPMU to be fabricated under open-source licenses. Th...

  2. Payment Platform

    DEFF Research Database (Denmark)

    Hjelholt, Morten; Damsgaard, Jan

    2012-01-01

    thoroughly and substitute current payment standards in the decades to come. This paper portrays how digital payment platforms evolve in socio-technical niches and how various technological platforms aim for institutional attention in their attempt to challenge earlier platforms and standards. The paper applies a co-evolutionary multilevel perspective to model the interplay and processes between technology and society wherein digital payment platforms potentially will substitute other payment platforms just like the credit card negated the check. On this basis this paper formulates a multilevel conceptual…

  3. The future of the independent Egyptian music in the digital era

    OpenAIRE

    Maraghah, Mohammad

    2013-01-01

    Master's thesis in music management - University of Agder 2013 This thesis investigates the impact of the digital era, with its technologically advanced components and revolutionary information platforms, on shaping the future of independent Egyptian music. The author investigated this impact by conducting fifteen semi-structured qualitative interviews between the 15th of December 2012 and the 25th of January 2013 with relevant independent Egyptian music stakeholders who gave the ...

  4. Interactive game programming with Python (CodeSkulptor)

    OpenAIRE

    Ajayi, Richard Olugbenga

    2014-01-01

    Over the years, several types of gaming platforms have been created to encourage a more organised and friendly atmosphere for game lovers in various walks of life, cultures, and environments. This thesis focuses on the concept of interactive programming using Python. It encourages the use of Python to create simple interactive game applications based on basic human concepts and ideas. CodeSkulptor is a browser-based IDE programming environment that uses the Python programming language. O...

  5. Eu-NORSEWInD - Assessment of Viability of Open Source CFD Code for the Wind Industry

    DEFF Research Database (Denmark)

    Stickland, Matt; Scanlon, Tom; Fabre, Sylvie

    2009-01-01

    Part of the overall NORSEWInD project is the use of LiDAR remote sensing (RS) systems mounted on offshore platforms to measure wind velocity profiles at a number of locations offshore. The data acquired from the offshore RS measurements will be fed into a large and novel wind speed dataset suitab… between the results of simulations created by the commercial code FLUENT and the open source code OpenFOAM. An assessment of the ease with which the open source code can be used is also included…

  6. Two-dimensional QR-coded metamaterial absorber

    Science.gov (United States)

    Sui, Sai; Ma, Hua; Wang, Jiafu; Pang, Yongqiang; Zhang, Jieqiu; Qu, Shaobo

    2016-01-01

    In this paper, the design of metamaterial absorbers is proposed based on QR coding and topology optimization. Such absorbers look like QR codes and can be recognized by decoding software as well as mobile phones. To verify the design, two lightweight wideband absorbers are designed, which can achieve wideband absorption above 90 % in 6.68-19.30 and 7.00-19.70 GHz, respectively. More importantly, polarization-independent absorption over 90 % can be maintained under incident angles within 55°. The QR code absorber can not only achieve wideband absorption but also carry information such as text and Web sites. Such absorbers are of important value in applications such as identification and electromagnetic protection.

  7. An Efficient and Flexible Implementation of Aspect-Oriented Languages

    NARCIS (Netherlands)

    Bockisch, Christoph

    2008-01-01

    Compilers for modern object-oriented programming languages generate code in a platform independent intermediate language preserving the concepts of the source language; for example, classes, fields, methods, and virtual or static dispatch can be directly identified within the intermediate code. To
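
    Python's bytecode offers a convenient analogue of such an intermediate language: it is platform independent, and source-level concepts such as local-variable loads and returns remain directly identifiable in it. A quick illustration with the standard dis module:

```python
import dis

def add(a, b):
    """A trivial function whose intermediate code we inspect."""
    return a + b

# Collect the opcode names of the compiled function's bytecode
opnames = [ins.opname for ins in dis.get_instructions(add)]
```

    The listing contains instructions such as LOAD_FAST for the parameters and RETURN_VALUE for the return statement, so the mapping back to source constructs is direct (exact opcode names vary slightly between Python versions).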

  8. Platform capitalism: The intermediation and capitalization of digital economic circulation

    Directory of Open Access Journals (Sweden)

    Paul Langley

    2017-10-01

    Full Text Available A new form of digital economic circulation has emerged, wherein ideas, knowledge, labour and use rights for otherwise idle assets move between geographically distributed but connected and interactive online communities. Such circulation is apparent across a number of digital economic ecologies, including social media, online marketplaces, crowdsourcing, crowdfunding and other manifestations of the so-called ‘sharing economy’. Prevailing accounts deploy concepts such as ‘co-production’, ‘prosumption’ and ‘peer-to-peer’ to explain digital economic circulation as networked exchange relations characterised by their disintermediated, collaborative and democratising qualities. Building from the neologism of platform capitalism, we place ‘the platform’ – understood as a distinct mode of socio-technical intermediary and business arrangement that is incorporated into wider processes of capitalisation – at the centre of the critical analysis of digital economic circulation. To create multi-sided markets and coordinate network effects, platforms enrol users through a participatory economic culture and mobilise code and data analytics to compose immanent infrastructures. Platform intermediation is also nested in the ex-post construction of a replicable business model. Prioritising rapid up-scaling and extracting revenues from circulations and associated data trails, the model performs the structure of venture capital investment which capitalises on the potential of platforms to realise monopoly rents.

  9. The nuclear reaction model code MEDICUS

    International Nuclear Information System (INIS)

    Ibishia, A.I.

    2008-01-01

    The new computer code MEDICUS has been used to calculate cross sections of nuclear reactions. The code, implemented in the MATLAB 6.5, Mathematica 5, and Fortran 95 programming languages, can be run in graphical and command-line mode. A Graphical User Interface (GUI) has been built that allows the user to perform calculations and to plot results just by mouse clicking. The MS Windows XP and Red Hat Linux platforms are supported. MEDICUS is a modern nuclear reaction code that can compute charged-particle-, photon-, and neutron-induced reactions in the energy range from thresholds to about 200 MeV. The calculation of the cross sections of nuclear reactions is done in the framework of the Exact Many-Body Nuclear Cluster Model (EMBNCM), Direct Nuclear Reactions, Pre-equilibrium Reactions, Optical Model, DWBA, and Exciton Model with Cluster Emission. The code can also be used for the calculation of the nuclear cluster structure of nuclei. We have calculated nuclear cluster models for some nuclei such as 177Lu, 90Y, and 27Al. It has been found that the nucleus 27Al can be represented through two different nuclear cluster models: 25Mg + d and 24Na + 3He. Cross sections as a function of energy for the reaction 27Al(3He,x)22Na, established as a production method of 22Na, are calculated by the code MEDICUS. Theoretical calculations of cross sections are in good agreement with experimental results. Reaction mechanisms are taken into account. (author)

  10. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code to modern hardware generations, but despite breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.
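    The vectorization idea above can be illustrated with a toy particle-position update. This is a hedged Python/numpy sketch of the loop structure only, not VPIC's actual C/C++ kernels, and all function names are hypothetical: the scalar loop mirrors code a compiler must auto-vectorize, while the array form expresses the same update as one bulk operation.

```python
import numpy as np

def push_positions_scalar(x, v, dt):
    # Scalar loop: one particle at a time, the shape a compiler
    # must recognise before it can auto-vectorize the update.
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = x[i] + v[i] * dt
    return out

def push_positions_vectorized(x, v, dt):
    # Array-at-a-time form: the whole update is a single bulk operation,
    # analogous to the auto-vectorizable loops discussed in the abstract.
    return x + v * dt

x = np.array([0.0, 1.0, 2.0])
v = np.array([1.0, 1.0, -1.0])
assert np.allclose(push_positions_scalar(x, v, 0.5),
                   push_positions_vectorized(x, v, 0.5))
```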

  11. Comparison of microarray platforms for measuring differential microRNA expression in paired normal/cancer colon tissues.

    Directory of Open Access Journals (Sweden)

    Maurizio Callari

    Full Text Available BACKGROUND: Microarray technology applied to microRNA (miRNA) profiling is a promising tool in many research fields; nevertheless, independent studies characterizing the same pathology have often reported poorly overlapping results. miRNA analysis methods have only recently been systematically compared, but only in a few cases using clinical samples. METHODOLOGY/PRINCIPAL FINDINGS: We investigated the inter-platform reproducibility of four miRNA microarray platforms (Agilent, Exiqon, Illumina, and Miltenyi), comparing nine paired tumor/normal colon tissues. The most concordant and selected discordant miRNAs were further studied by quantitative RT-PCR. Globally, a poor overlap among the differentially expressed miRNAs identified by each platform was found. Nevertheless, for eight miRNAs high agreement in differential expression among the four platforms and comparability to qRT-PCR was observed. Furthermore, most of the miRNA sets identified by each platform are coherently enriched in data from the other platforms, and the great majority of colon-cancer-associated miRNA sets derived from the literature were validated in our data, independently of the platform. Computational integration of miRNA and gene expression profiles suggested that anti-correlated predicted target genes of differentially expressed miRNAs are commonly enriched in cancer-related pathways and in genes involved in glycolysis and nutrient transport. CONCLUSIONS: Technical and analytical challenges in measuring miRNAs still remain, and further research is required in order to increase consistency between different microarray-based methodologies. However, better inter-platform agreement was found by looking at miRNA sets instead of single miRNAs and through a miRNA-gene expression integration approach.

  12. EPICS: operating system independent device/driver support

    International Nuclear Information System (INIS)

    Kraimer, M.R.

    2003-01-01

    Originally EPICS input/output controllers (IOCs) were only supported on VME-based systems running the vxWorks operating system. Now IOCs are supported on many systems: vxWorks, RTEMS, Solaris, HPUX, Linux, WIN32, and Darwin. A challenge is to provide operating-system-independent device and driver support. This paper presents some techniques for providing such support. EPICS (Experimental Physics and Industrial Control System) is a set of software tools, libraries, and applications developed collaboratively and used worldwide to create distributed, real-time control systems for scientific instruments such as particle accelerators, telescopes, and other large scientific experiments. An important component of all EPICS-based control systems is a collection of input/output controllers (IOCs). An IOC has three primary components: (1) a real-time database; (2) channel access, which provides network access to the database; and (3) device/driver support for interfacing to equipment. This paper describes some projects related to providing device/driver support on non-vxWorks systems. In order to support IOCs on platforms other than vxWorks, operating-system-independent (OSI) application program interfaces (APIs) were defined for threads, semaphores, timers, etc. Providing support for a new platform consists of providing an operating-system-dependent implementation of the OSI APIs.
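    The OSI layering described above can be sketched in miniature. The following Python stand-in is illustrative only (EPICS's real OSI APIs are C interfaces in libCom for threads, semaphores, timers, and so on, and the names below are hypothetical): application code calls only the OSI functions, and each supported platform supplies its own implementation underneath, here played by Python's threading module.

```python
import threading

# Hypothetical stand-in for an OSI (operating-system-independent) API layer.
# Application code calls only these functions; each platform provides its own
# implementation underneath (here, Python's threading module plays that role).

def osi_thread_create(name, entry, *args):
    """Start a named thread running `entry(*args)` and return a handle."""
    t = threading.Thread(name=name, target=entry, args=args)
    t.start()
    return t

def osi_event_create():
    """Return an event object a thread can signal and another can wait on."""
    return threading.Event()

results = []
done = osi_event_create()

def worker(n):
    results.append(n * 2)
    done.set()  # signal completion through the OSI event

osi_thread_create("ioc-worker", worker, 21)
done.wait(timeout=5)
assert results == [42]
```

Porting to a new platform then amounts to reimplementing the small OSI surface, not the application code built on top of it.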

  13. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the

  14. Bacterial diversity in water injection systems of Brazilian offshore oil platforms.

    Science.gov (United States)

    Korenblum, Elisa; Valoni, Erika; Penna, Mônica; Seldin, Lucy

    2010-01-01

    Biogenic souring and microbial-influenced corrosion is a common scenario in water-flooded petroleum reservoirs. Water injection systems are continuously treated to control bacterial contamination, but some bacteria that cause souring and corrosion can persist even after different treatments have been applied. Our aim was to increase our knowledge of the bacterial communities that persist in the water injection systems of three offshore oil platforms in Brazil. To achieve this goal, we used a culture-independent molecular approach (16S ribosomal RNA gene clone libraries) to analyze seawater samples that had been subjected to different treatments. Phylogenetic analyses revealed that the bacterial communities from the different platforms were taxonomically different. A predominance of bacterial clones affiliated with Gammaproteobacteria, mostly belonging to the genus Marinobacter (60.7%), were observed in the platform A samples. Clones from platform B were mainly related to the genera Colwellia (37.9%) and Achromobacter (24.6%), whereas clones obtained from platform C were all related to unclassified bacteria. Canonical correspondence analyses showed that different treatments such as chlorination, deoxygenation, and biocide addition did not significantly influence the bacterial diversity in the platforms studied. Our results demonstrated that the injection water used in secondary oil recovery procedures contained potentially hazardous bacteria, which may ultimately cause souring and corrosion.

  15. Why comply with a code of ethics?

    Science.gov (United States)

    Spielthenner, Georg

    2015-05-01

    A growing number of professional associations and occupational groups are creating codes of ethics with the goal of guiding their members, protecting service users, and safeguarding the reputation of the profession. There is a great deal of literature dealing with the question of the extent to which ethical codes can achieve their desired objectives. The present paper does not contribute to this debate. Its aim is rather to investigate how rational it is to comply with codes of conduct. It is natural and virtually inevitable for a reflective person to ask why one should pay any attention to ethical codes, in particular if following a code is not in one's own interest. In order to achieve the aim of this paper, I shall (in the "Quasi-reasons for complying with an ethical code" section) discuss reasons that only appear to be reasons for complying with a code. In the "Code-independent reasons" section, I shall present genuine practical reasons that, however, turn out to be reasons of the wrong kind. The "Code-dependent reasons" section, finally, presents the most important reasons for complying with ethical codes. The paper argues that while ethical codes do not necessarily yield reasons for action, professionals can have genuine reasons for complying with a code, which may, however, be rather weak and easily overridden by reasons for deviating from the code.

  16. Decoding the encoding of functional brain networks: An fMRI classification comparison of non-negative matrix factorization (NMF), independent component analysis (ICA), and sparse coding algorithms.

    Science.gov (United States)

    Xie, Jianwen; Douglas, Pamela K; Wu, Ying Nian; Brody, Arthur L; Anderson, Ariana E

    2017-04-15

    Brain networks in fMRI are typically identified using spatial independent component analysis (ICA), yet other mathematical constraints provide alternate biologically-plausible frameworks for generating brain networks. Non-negative matrix factorization (NMF) would suppress negative BOLD signal by enforcing positivity. Spatial sparse coding algorithms (L1 Regularized Learning and K-SVD) would impose local specialization and a discouragement of multitasking, where the total observed activity in a single voxel originates from a restricted number of possible brain networks. The assumptions of independence, positivity, and sparsity to encode task-related brain networks are compared; the resulting brain networks within scan for different constraints are used as basis functions to encode observed functional activity. These encodings are then decoded using machine learning, by using the time series weights to predict within scan whether a subject is viewing a video, listening to an audio cue, or at rest, in 304 fMRI scans from 51 subjects. The sparse coding algorithm of L1 Regularized Learning outperformed 4 variations of ICA and the other coding algorithms. Holding constant the effect of the extraction algorithm, encodings using sparser spatial networks (containing more zero-valued voxels) had higher classification accuracy. The superior classification performance of sparse coding algorithms suggests that algorithms which enforce sparsity, discourage multitasking, and promote local specialization may capture better the underlying source processes than those which allow inexhaustible local processes such as ICA. Negative BOLD signal may capture task-related activations. Copyright © 2017 Elsevier B.V. All rights reserved.
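    The L1-regularized sparse coding constraint compared above can be sketched with a minimal iterative soft-thresholding (ISTA) solver. This is an illustrative numpy implementation of the generic technique, not the study's actual pipeline; the dictionary and signal below are toy values chosen to show how the L1 penalty drives small coefficients exactly to zero, which is the sparsity the paper measures.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm: shrinks coefficients toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_code_ista(D, y, lam=0.5, n_iter=200):
    """L1-regularised sparse coding via ISTA:
    minimise 0.5 * ||y - D a||^2 + lam * ||a||_1 over the code a."""
    a = np.zeros(D.shape[1])
    eta = 1.0 / np.linalg.norm(D, 2) ** 2  # step size from the Lipschitz constant
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)           # gradient of the quadratic term
        a = soft_threshold(a - eta * grad, eta * lam)
    return a

# With an orthonormal dictionary the solution is a soft-thresholded projection,
# so small coefficients are driven exactly to zero.
D = np.eye(4)
y = np.array([3.0, 0.1, 0.0, 2.0])
a = sparse_code_ista(D, y, lam=0.5)
assert np.count_nonzero(np.round(a, 6)) == 2  # sparse: 2 of 4 coefficients survive
```

An unpenalized least-squares fit of the same signal would keep all four coefficients nonzero; the L1 term is what produces the zero-valued "voxels".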

  17. Performance of scientific computing platforms with MCNP4B

    International Nuclear Information System (INIS)

    McLaughlin, H.E.; Hendricks, J.S.

    1998-01-01

    Several computing platforms were evaluated with the MCNP4B Monte Carlo radiation transport code. The DEC AlphaStation 500/500 was the fastest to run MCNP4B. Compared to the HP 9000-735, the fastest platform 4 years ago, the AlphaStation is 335% faster, the HP C180 is 133% faster, the SGI Origin 2000 is 82% faster, the Cray T94/4128 is 1% faster, the IBM RS/6000-590 is 93% as fast, the DEC 3000/600 is 81% as fast, the Sun Sparc20 is 57% as fast, the Cray YMP 8/8128 is 57% as fast, the Sun Sparc5 is 33% as fast, and the Sun Sparc2 is 13% as fast. All results presented are reproducible and allow for comparison to computer platforms not included in this study. Timing studies are seen to be very problem dependent. The performance gains resulting from advances in software were also investigated. Various compilers and operating systems were seen to have a modest impact on performance, whereas hardware improvements have resulted in a factor of 4 improvement. MCNP4B also ran approximately as fast as MCNP4A.

  18. Navigation and Positioning System Using High Altitude Platforms Systems (HAPS)

    Science.gov (United States)

    Tsujii, Toshiaki; Harigae, Masatoshi; Harada, Masashi

    Recently, some countries have begun conducting feasibility studies and R&D projects on High Altitude Platform Systems (HAPS). Japan has been investigating the use of an airship system that will function as a stratospheric platform for applications such as environmental monitoring, communications and broadcasting. If pseudolites were mounted on the airships, their GPS-like signals would provide stable augmentations that would improve the accuracy, availability, and integrity of GPS-based positioning systems. Moreover, a sufficient number of HAPS can function as a positioning system independent of GPS. In this paper, a system design of the HAPS-based positioning system and its positioning error analyses are described.
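    Pseudolite-based positioning of the kind described above reduces, at its core, to a range-based least-squares fix. The following is a hedged 2-D Gauss-Newton sketch with hypothetical names and noise-free ranges, not the paper's actual estimator (which would work in 3-D and model clock and error terms):

```python
import numpy as np

def position_fix(anchors, ranges, x0, n_iter=20):
    """Estimate a 2-D receiver position from ranges to HAPS-mounted
    pseudolites by Gauss-Newton least squares (illustrative sketch)."""
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        diffs = x - anchors                    # vectors from each platform to receiver
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges
        J = diffs / dists[:, None]             # Jacobian of the range model
        r = dists - ranges                     # range residuals
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
    return x

# Three platforms with known positions, noise-free ranges to a known truth point.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
truth = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - truth, axis=1)
est = position_fix(anchors, ranges, x0=[1.0, 1.0])
assert np.allclose(est, truth, atol=1e-6)
```

With noisy ranges the same solver returns the least-squares position, and the geometry of the platforms (as with GPS dilution of precision) governs how range errors map into position error.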

  19. Verification of the AZNHEX code v.1.4 with MCNP6 for different reference cases

    International Nuclear Information System (INIS)

    Galicia A, J.; Francois L, J. L.; Bastida O, G. E.; Del Valle G, E.

    2017-09-01

    The codes that make up the AZTLAN platform (AZTHECA, AZTRAN, AZKIND and AZNHEX) are currently in the testing phase, simulating a variety of nuclear reactor assemblies and cores in order to compare and validate the results obtained for a particular case against codes widely used in the nuclear area such as CASMO, Serpent and MCNP. The objective of this work is to continue improving future versions of the codes of the AZTLAN platform so that accurate and reliable results can be obtained by the user. To test the current version of the AZNHEX code, 3 cases were considered: the first is the simulation of a VVER-440 reactor assembly; for the second case, an assembly of a helium-cooled fast reactor was simulated; and for the third case it was decided to revisit the core of a sodium-cooled fast reactor, because the previous versions of AZNHEX did not produce adequate results for it and, in addition, presented a considerable number of limitations. The comparison and validation of the results (neutron multiplication factor, radial power, radial flux, axial power) for these three cases were made using the code MCNP6. The results obtained show that this version of AZNHEX produces values of the neutron multiplication factor and of the neutron flux and power distributions very close to those of MCNP6. (Author)

  20. Learning by Doing: How to Develop a Cross-Platform Web App

    Directory of Open Access Journals (Sweden)

    Minh Q. Huynh

    2015-06-01

    Full Text Available As mobile devices become prevalent, there is always a need for apps. How hard is it to develop an app, especially a cross-platform app? This paper shares experience from a project involving the development of a student-services web app that can run on cross-platform mobile devices. The paper first describes the background of the project, the clients, and the proposed solution. Then, it focuses on the step-by-step development process and provides illustrations of the written code and techniques used. The goal is for readers to gain an understanding of how to develop a mobile-friendly web app. The paper concludes with teaching implications and offers thoughts for further development.

  1. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by
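    The dose-comparison metric used above (average dose difference restricted to the region receiving more than 10% of the maximum dose) can be sketched as follows. Function and variable names are illustrative, not from goMC or gDPM, and the grids are toy 1-D arrays standing in for 3-D dose distributions:

```python
import numpy as np

def avg_dose_difference(dose_a, dose_b, threshold=0.10):
    """Mean absolute difference between two dose grids, restricted to voxels
    receiving more than `threshold` of the maximum dose, expressed as a
    percentage of the maximum dose (illustrative sketch of the metric)."""
    dmax = dose_a.max()
    mask = dose_a > threshold * dmax          # region of interest: > 10% of max dose
    rel_diff = np.abs(dose_a[mask] - dose_b[mask]) / dmax
    return 100.0 * rel_diff.mean()

dose_a = np.array([0.01, 0.5, 1.0, 0.8])      # reference engine
dose_b = np.array([0.50, 0.5, 0.99, 0.81])    # engine under comparison
# Only the last three voxels exceed 10% of the max; the first is ignored,
# so the large disagreement in the low-dose voxel does not inflate the metric.
print(round(avg_dose_difference(dose_a, dose_b), 3))  # → 0.667
```

Restricting the comparison this way is a common convention in MC dose-engine validation, since relative differences in near-zero-dose voxels are dominated by statistical noise.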

  2. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by

  3. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    International Nuclear Information System (INIS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-01-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon–electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783–97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48–0.53% for the electron beam cases and 0.15–0.17% for the photon beam cases. In terms of efficiency, goMC was ∼4–16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was

  4. Linking CATHENA with other computer codes through a remote process

    Energy Technology Data Exchange (ETDEWEB)

    Vasic, A.; Hanna, B.N.; Waddington, G.M. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Sabourin, G. [Atomic Energy of Canada Limited, Montreal, Quebec (Canada); Girard, R. [Hydro-Quebec, Montreal, Quebec (Canada)

    2005-07-01

    'Full text:' CATHENA (Canadian Algorithm for THErmalhydraulic Network Analysis) is a computer code developed by Atomic Energy of Canada Limited (AECL). The code uses a transient, one-dimensional, two-fluid representation of two-phase flow in piping networks. CATHENA is used primarily for the analysis of postulated upset conditions in CANDU reactors; however, the code has found a wider range of applications. In the past, the CATHENA thermalhydraulics code included other specialized codes, i.e. ELOCA and the Point LEPreau CONtrol system (LEPCON) as callable subroutine libraries. The combined program was compiled and linked as a separately named code. This code organizational process is not suitable for independent development, maintenance, validation and version tracking of separate computer codes. The alternative solution to provide code development independence is to link CATHENA to other computer codes through a Parallel Virtual Machine (PVM) interface process. PVM is a public domain software package, developed by Oak Ridge National Laboratory and enables a heterogeneous collection of computers connected by a network to be used as a single large parallel machine. The PVM approach has been well accepted by the global computing community and has been used successfully for solving large-scale problems in science, industry, and business. Once development of the appropriate interface for linking independent codes through PVM is completed, future versions of component codes can be developed, distributed separately and coupled as needed by the user. This paper describes the coupling of CATHENA to the ELOCA-IST and the TROLG2 codes through a PVM remote process as an illustration of possible code connections. ELOCA (Element Loss Of Cooling Analysis) is the Industry Standard Toolset (IST) code developed by AECL to simulate the thermo-mechanical response of CANDU fuel elements to transient thermalhydraulics boundary conditions. A separate ELOCA driver program

  5. Linking CATHENA with other computer codes through a remote process

    International Nuclear Information System (INIS)

    Vasic, A.; Hanna, B.N.; Waddington, G.M.; Sabourin, G.; Girard, R.

    2005-01-01

    'Full text:' CATHENA (Canadian Algorithm for THErmalhydraulic Network Analysis) is a computer code developed by Atomic Energy of Canada Limited (AECL). The code uses a transient, one-dimensional, two-fluid representation of two-phase flow in piping networks. CATHENA is used primarily for the analysis of postulated upset conditions in CANDU reactors; however, the code has found a wider range of applications. In the past, the CATHENA thermalhydraulics code included other specialized codes, i.e. ELOCA and the Point LEPreau CONtrol system (LEPCON) as callable subroutine libraries. The combined program was compiled and linked as a separately named code. This code organizational process is not suitable for independent development, maintenance, validation and version tracking of separate computer codes. The alternative solution to provide code development independence is to link CATHENA to other computer codes through a Parallel Virtual Machine (PVM) interface process. PVM is a public domain software package, developed by Oak Ridge National Laboratory and enables a heterogeneous collection of computers connected by a network to be used as a single large parallel machine. The PVM approach has been well accepted by the global computing community and has been used successfully for solving large-scale problems in science, industry, and business. Once development of the appropriate interface for linking independent codes through PVM is completed, future versions of component codes can be developed, distributed separately and coupled as needed by the user. This paper describes the coupling of CATHENA to the ELOCA-IST and the TROLG2 codes through a PVM remote process as an illustration of possible code connections. ELOCA (Element Loss Of Cooling Analysis) is the Industry Standard Toolset (IST) code developed by AECL to simulate the thermo-mechanical response of CANDU fuel elements to transient thermalhydraulics boundary conditions. A separate ELOCA driver program starts, ends

  6. Current Status of the LIFE Fast Reactors Fuel Performance Codes

    International Nuclear Information System (INIS)

    Yacout, A.M.; Billone, M.C.

    2013-01-01

    The LIFE-4 (Rev. 1) code was calibrated and validated using data from (U,Pu)O2 mixed-oxide fuel pins and UO2 blanket rods which were irradiation tested under steady-state and transient conditions. It integrates a broad material and fuel-pin irradiation database into a consistent framework for use and extrapolation of the database to reactor design applications. The code is available and runs on different computer platforms (UNIX and PC), and detailed documentation of the code's models, routines, and calibration and validation data sets is available. The LIFE-METAL code is based on LIFE-4 with modifications to include key phenomena applicable to metallic fuel, and metallic fuel properties. It was calibrated with a large database from irradiations in EBR-II; further effort on its calibration and detailed documentation is under way. Recent activities with the codes are related to reactor design studies and support of licensing efforts for the 4S and KAERI SFR designs. Future activities are related to re-assessment of the codes' calibration and validation and inclusion of models for advanced fuels (transmutation fuels)

  7. A PC version of the Monte Carlo criticality code OMEGA

    International Nuclear Information System (INIS)

    Seifert, E.

    1996-05-01

    A description of the PC version of the Monte Carlo criticality code OMEGA is given. The report contains a general description of the code together with a detailed input description. Furthermore, some examples are given illustrating the generation of an input file. The main field of application is the calculation of the criticality of arrangements of fissionable material. Geometrically complicated arrangements that often appear inside and outside a reactor, e.g. in a fuel storage or transport container, can be considered essentially without geometrical approximations. For example, the real geometry of assemblies containing hexagonal or square lattice structures can be described in full detail. Moreover, the code can be used for special investigations in the field of reactor physics and neutron transport. Many years of practical experience and comparisons with reference cases have shown that the code, together with the built-in data libraries, gives reliable results. OMEGA is completely independent of other widely used criticality codes (KENO, MCNP, etc.) with respect to both programming and the data base. It is good practice to run difficult criticality safety problems with different independent codes in order to mutually verify the results. In this way, OMEGA can be used as a redundant code within the family of criticality codes. An advantage of OMEGA is the short calculation time: a typical criticality safety application takes only a few minutes on a Pentium PC. Therefore, the influence of parameter variations can simply be investigated by running many variants of a problem. (orig.)
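    The kind of Monte Carlo criticality estimate OMEGA performs can be illustrated, in drastically simplified form, by a toy infinite-medium calculation. This sketch is not OMEGA's algorithm (a real code tracks neutrons through 3-D geometry with evaluated cross-section libraries); all cross sections and names are illustrative, and the analytic answer k-inf = nu * sigma_f / (sigma_f + sigma_c) gives a check on the sampled estimate.

```python
import random

def k_infinity_mc(sigma_f, sigma_c, nu, n_histories=100_000, seed=1):
    """Toy analog Monte Carlo estimate of k-infinity for an infinite medium.
    Each source neutron is eventually absorbed; a fraction
    sigma_f / (sigma_f + sigma_c) of absorptions cause fission, each
    releasing nu new neutrons on average."""
    rng = random.Random(seed)
    p_fission = sigma_f / (sigma_f + sigma_c)
    produced = 0.0
    for _ in range(n_histories):
        if rng.random() < p_fission:
            produced += nu          # fission: nu neutrons born
    return produced / n_histories   # neutrons produced per neutron absorbed

# Analytic value: k_inf = 2.5 * 0.04 / (0.04 + 0.01) = 2.0
k = k_infinity_mc(sigma_f=0.04, sigma_c=0.01, nu=2.5)
assert abs(k - 2.0) < 0.05
```

The statistical character of the estimate is also why, as the abstract notes for real problems, run time and history count trade off directly against the uncertainty of the result.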

  8. Short-term memory coding in children with intellectual disabilities.

    Science.gov (United States)

    Henry, Lucy

    2008-05-01

    To examine visual and verbal coding strategies, I asked children with intellectual disabilities and peers matched for MA and CA to perform picture memory span tasks with phonologically similar, visually similar, long, or nonsimilar named items. The CA group showed effects consistent with advanced verbal memory coding (phonological similarity and word length effects). Neither the intellectual disabilities group nor the MA group showed evidence of memory coding strategies. However, children in these groups with MAs above 6 years showed significant visual similarity and word length effects, broadly consistent with an intermediate stage of dual visual and verbal coding. These results suggest that developmental progressions in memory coding strategies are independent of intellectual disabilities status and consistent with MA.

  9. Continuous integration in a social-coding world : empirical evidence from GitHub

    NARCIS (Netherlands)

    Vasilescu, B.N.; van Schuylenburg, S.B.; Wulms, Jules; Serebrenik, A.; Brand, van den M.G.J.

    2014-01-01

    Continuous integration is a software engineering practice of frequently merging all developer working copies with a shared main branch, e.g., several times a day. With the advent of GitHub, a platform well known for its "social coding" features that aid collaboration and sharing, and currently the

  10. The ethical commitment of independent directors in different contexts of investor protection

    Directory of Open Access Journals (Sweden)

    Isabel María García-Sánchez

    2015-04-01

    Full Text Available The purpose of this study is to compare, for countries with different legal environments, the degree to which boards of directors may improve corporate ethical behaviour by designing codes of ethics. These codes address issues such as a company's responsibility regarding the quality of its products and services, compliance with laws and regulations, conflicts of interest, corruption and fraud, and protection of the natural environment. Using a sample of firms from 12 countries, we obtain evidence that a greater presence of independent directors on the board leads to the existence of more complex codes of ethics. Moreover, there are significant differences between countries with high levels and countries with low levels of investor protection as regards the effectiveness of independent directors in constraining unethical behaviour by managers.

  11. UNIPIC code for simulations of high power microwave devices

    International Nuclear Information System (INIS)

    Wang Jianguo; Zhang Dianhui; Wang Yue; Qiao Hailiang; Li Xiaoze; Liu Chunliang; Li Yongdong; Wang Hongguang

    2009-01-01

    In this paper, UNIPIC code, a new member in the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order, finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time-step reduction of the conformal-path (CP) FDTD method, a CP weakly conditionally stable FDTD (CP WCS-FDTD) method, which combines the WCS-FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including WINDOWS, LINUX, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or load previously created structures. Numerical experiments on some typical HPM devices using the UNIPIC code are given. The results are compared with those obtained from some well-known PIC codes and agree well with them.
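
    The relativistic Newton-Lorentz particle push mentioned above is commonly implemented with the Boris scheme; the sketch below is a generic textbook version in Python, not UNIPIC's C++ source, and the field values, charge-to-mass ratio, and time step used with it are illustrative.

```python
import math

C = 299792458.0  # speed of light (m/s)

def cross(a, b):
    """Vector cross product of two 3-tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def boris_push(u, E, B, q_over_m, dt):
    """One relativistic Boris step for u = gamma*v: half electric kick,
    magnetic rotation, half electric kick. In a pure magnetic field the
    rotation preserves |u|, i.e. the particle energy."""
    h = q_over_m * dt / 2.0
    u_minus = tuple(u[i] + h * E[i] for i in range(3))
    gamma = math.sqrt(1.0 + sum(x * x for x in u_minus) / C ** 2)
    t = tuple(h * B[i] / gamma for i in range(3))
    t2 = sum(x * x for x in t)
    rot1 = cross(u_minus, t)
    u_prime = tuple(u_minus[i] + rot1[i] for i in range(3))
    s = tuple(2.0 * x / (1.0 + t2) for x in t)
    rot2 = cross(u_prime, s)
    u_plus = tuple(u_minus[i] + rot2[i] for i in range(3))
    return tuple(u_plus[i] + h * E[i] for i in range(3))
```

    The half-kick/rotation/half-kick split is what makes the scheme second-order accurate and energy-conserving in a static magnetic field.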

  12. UNIPIC code for simulations of high power microwave devices

    Science.gov (United States)

    Wang, Jianguo; Zhang, Dianhui; Liu, Chunliang; Li, Yongdong; Wang, Yue; Wang, Hongguang; Qiao, Hailiang; Li, Xiaoze

    2009-03-01

    In this paper, UNIPIC code, a new member in the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order, finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time-step reduction of the conformal-path (CP) FDTD method, a CP weakly conditionally stable FDTD (CP WCS-FDTD) method, which combines the WCS-FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including WINDOWS, LINUX, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or load previously created structures. Numerical experiments on some typical HPM devices using the UNIPIC code are given. The results are compared with those obtained from some well-known PIC codes and agree well with them.

  13. AZTLAN platform: Mexican platform for analysis and design of nuclear reactors; AZTLAN platform: plataforma mexicana para el analisis y diseno de reactores nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Gomez T, A. M.; Puente E, F. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Del Valle G, E. [IPN, Escuela Superior de Fisica y Matematicas, Av. IPN s/n, Edif. 9, Col. San Pedro Zacatenco, 07738 Mexico D. F. (Mexico); Francois L, J. L.; Martin del Campo M, C. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Paseo Cuauhnahuac 8532, 62550 Jiutepec, Morelos (Mexico); Espinosa P, G., E-mail: armando.gomez@inin.gob.mx [Universidad Autonoma Metropolitana, Unidad Iztapalapa, Av. San Rafael Atlixco No. 186, Col. Vicentina, 09340 Mexico D. F. (Mexico)

    2014-10-15

    The AZTLAN platform project is a national initiative led by the Instituto Nacional de Investigaciones Nucleares (ININ) which brings together the main public institutions of higher education in Mexico, namely the Instituto Politecnico Nacional, the Universidad Nacional Autonoma de Mexico and the Universidad Autonoma Metropolitana, in an effort to take a significant step toward autonomy in nuclear reactor calculation and analysis, seeking to place Mexico, in the medium term, at a competitive international level in reactor analysis software. The project aims to modernize, improve and integrate the neutronic, thermal-hydraulic and thermo-mechanical codes developed in Mexican institutions within an integrated platform, developed and maintained by Mexican experts for the benefit of those same institutions. The project is financed by the SENER-CONACYT mixed fund for Energy Sustainability and aims to substantially strengthen both research and educational institutions, contributing to the formation of highly qualified human resources in the area of analysis and design of nuclear reactors. As an innovative component, the project includes the creation of a user group made up of members of the project institutions as well as the Comision Nacional de Seguridad Nuclear y Salvaguardias, the Central Nucleoelectrica de Laguna Verde (CNLV), the Secretaria de Energia (Mexico) and the Karlsruhe Institute of Technology (Germany), among others. This user group will be responsible for using the software and providing feedback to the development teams so that progress meets the needs of the regulator and industry, in this case the CNLV. Finally, in order to bridge the gap with similar developments globally, the project will make use of the latest supercomputing technology to speed up calculation times. 
This work intends to present the project to the national nuclear community, so a description of the proposed methodology is given, as well as the goals and objectives to be pursued for the development of the

  14. Product Platform Performance

    DEFF Research Database (Denmark)

    Munk, Lone

    The aim of this research is to improve understanding of platform-based product development by studying platform performance in relation to internal effects in companies. Platform-based product development makes it possible to deliver product variety and at the same time reduce the needed resources...... engaging in platform-based product development. Similarly platform assessment criteria lack empirical verification regarding relevance and sufficiency. The thesis focuses on • the process of identifying and estimating internal effects, • verification of performance of product platforms, (i...... experienced representatives from the different life systems phase systems of the platform products. The effects are estimated and modeled within different scenarios, taking into account financial and real option aspects. The model illustrates and supports estimation and quantification of internal platform...

  15. Validation of the transportation computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND

    International Nuclear Information System (INIS)

    Maheras, S.J.; Pippen, H.K.

    1995-05-01

    The computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND were used to estimate radiation doses from the transportation of radioactive material in the Department of Energy Programmatic Spent Nuclear Fuel Management and Idaho National Engineering Laboratory Environmental Restoration and Waste Management Programs Environmental Impact Statement. HIGHWAY and INTERLINE were used to estimate transportation routes for truck and rail shipments, respectively. RADTRAN 4 was used to estimate collective doses from incident-free transportation and the risk (probability × consequence) from transportation accidents. RISKIND was used to estimate incident-free radiation doses for maximally exposed individuals and the consequences from reasonably foreseeable transportation accidents. The purpose of this analysis is to validate the estimates made by these computer codes; critiques of the conceptual models used in RADTRAN 4 are also discussed. Validation is defined as "the test and evaluation of the completed software to ensure compliance with software requirements." In this analysis, validation means that the differences between the estimates generated by these codes and independent observations are small (i.e., within the acceptance criterion established for the validation analysis). In some cases, the independent observations used in the validation were measurements; in other cases, they were generated using hand calculations. The results of the validation analyses performed for HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND show that the differences between the estimates generated using the computer codes and independent observations were small. Based on the acceptance criterion established for the validation analyses, the codes yielded acceptable results; in all cases the estimates met the requirements for successful validation.
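
    The acceptance-criterion test described above amounts to a relative-difference check between a code estimate and an independent observation. A minimal sketch follows; the 25% criterion used as the default is a hypothetical placeholder, not the criterion actually established for the EIS validation analyses.

```python
def within_acceptance(code_estimate, independent_observation, criterion=0.25):
    """Return (relative difference, pass/fail) for a validation comparison.
    The default 25% criterion is illustrative only."""
    rel_diff = (abs(code_estimate - independent_observation)
                / abs(independent_observation))
    return rel_diff, rel_diff <= criterion
```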

  16. Fast QC-LDPC code for free space optical communication

    Science.gov (United States)

    Wang, Jin; Zhang, Qi; Udeh, Chinonso Paschal; Wu, Rangzhong

    2017-02-01

    Free Space Optical (FSO) communication systems use the atmosphere as a propagation medium, so atmospheric turbulence effects lead to multiplicative noise related to signal intensity. In order to suppress the signal fading induced by multiplicative noise, we propose a fast Quasi-Cyclic (QC) Low-Density Parity-Check (LDPC) code for FSO communication systems. As linear block codes based on sparse matrices, QC-LDPC codes perform extremely close to the Shannon limit. Current studies of LDPC codes in FSO communications have mainly focused on the Gaussian and Rayleigh channels; in this study, the LDPC code is designed for the atmospheric turbulence channel, which is neither Gaussian nor Rayleigh and is closer to the practical situation. Based on the characteristics of the atmospheric channel, modeled with the logarithmic-normal distribution and the K-distribution, we designed a special QC-LDPC code and deduced the log-likelihood ratio (LLR). An irregular QC-LDPC code for fast coding, with variable rates, is proposed in this paper. The proposed code achieves the excellent performance of LDPC codes, offering high efficiency at low rates, stability at high rates, and a small number of decoding iterations. Belief propagation (BP) decoding shows that the bit error rate (BER) is markedly reduced as the Signal-to-Noise Ratio (SNR) increases. Therefore, LDPC channel coding can effectively improve the performance of FSO systems. Moreover, the post-decoding BER continues to fall as the SNR increases, without exhibiting an error floor.
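
    As an illustration of the kind of LLR computation that feeds BP decoding, the sketch below assumes on-off keying with a known fading gain `h` and additive Gaussian receiver noise; the paper's actual LLR derivation for the log-normal and K-distributed channels is not reproduced here, and all parameter values are illustrative.

```python
def ook_llr(y, h, sigma):
    """LLR = ln p(y | bit=0) / p(y | bit=1) for on-off keying:
    the received mean is 0 for bit 0 and h for bit 1, with Gaussian
    noise of standard deviation sigma. Simplifies to h*(h - 2y)/(2*sigma^2)."""
    return h * (h - 2.0 * y) / (2.0 * sigma ** 2)
```

    A positive LLR favors bit 0, a negative LLR favors bit 1, and the decision boundary sits at y = h/2; in a turbulence channel the fading gain h varies from symbol block to symbol block, which is where the channel statistics enter.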

  17. Annual report on compliance with the codes of good conduct and independence of electricity grid and natural gas network operators. November 2005

    International Nuclear Information System (INIS)

    2005-11-01

    In France, system operators belong to groups that also conduct business in the energy sector, in fields governed by competition rules. They could therefore be tempted to use their privileged position to their group's benefit, which would disadvantage end consumers. Non-discriminatory access to electricity and gas transmission and distribution networks is at the core of the market-opening approach implemented by the European Union since the end of the 1990s. EU and national enactments in force highlight two tools to ensure non-discrimination: compliance programmes and independence of system operators with regard to their parent companies. Firstly, compliance programmes contain measures taken to ensure that discrimination is completely excluded and that their application is subject to appropriate monitoring. Secondly, system operator independence plays a part in preventing discrimination against competitors with other business activities (generation, supply, etc.) within the same group. In application of these enactments, every electricity or natural gas transmission or distribution system operator serving more than 100,000 customers provided CRE, the Energy Regulatory Commission, with its annual report on the application of its compliance programme. This document is CRE's November 2005 report on the compliance programmes and independence of electricity and natural gas system operators. It has been prepared using the codes of good conduct and the annual reports supplied by network operators. CRE also launched a public consultation of the market players in October 2005 and heard the network operators' views. Moreover, it carried out a certain number of checks on operators' practices.

  18. Reliability issues and solutions for coding social communication performance in classroom settings.

    Science.gov (United States)

    Olswang, Lesley B; Svensson, Liselotte; Coggins, Truman E; Beilinson, Jill S; Donaldson, Amy L

    2006-10-01

    To explore the utility of time-interval analysis for documenting the reliability of coding social communication performance of children in classroom settings. Of particular interest was finding a method for determining whether independent observers could reliably judge both occurrence and duration of ongoing behavioral dimensions for describing social communication performance. Four coders participated in this study. They observed and independently coded 6 social communication behavioral dimensions using handheld computers. The dimensions were mutually exclusive and accounted for all verbal and nonverbal productions during a specified time frame. The technology allowed for coding frequency and duration for each entered code. Data were collected from 20 different 2-min video segments of children in kindergarten through 3rd-grade classrooms. Data were analyzed for interobserver and intraobserver agreement using time-interval sorting and Cohen's kappa. Further, interval size and total observation length were manipulated to determine their influence on reliability. The data revealed interval sorting and kappa to be a suitable method for examining reliability of occurrence and duration of ongoing social communication behavioral dimensions. Nearly all comparisons yielded medium to large kappa values; interval size and length of observation minimally affected results. Implications: The analysis procedure described in this research solves a challenge in reliability: comparing coding by independent observers of both occurrence and duration of behaviors. Results indicate the utility of a new coding taxonomy and technology for application in online observations of social communication in a classroom setting.
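
    Time-interval sorting reduces each observer's record to a sequence of per-interval codes, to which Cohen's kappa can then be applied. A minimal sketch, assuming the two observers' data have already been sorted into the same equal time intervals (the code labels are invented for illustration):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two observers' interval-by-interval behavior codes.
    Assumes both sequences cover the same intervals in the same order."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # observed proportion of intervals where the two coders agree
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # chance agreement from each coder's marginal code frequencies
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1.0 - expected)
```

    Because every interval carries exactly one code, agreement on both occurrence and duration of a behavior is captured at once: a behavior that both observers code over the same run of intervals contributes agreement proportional to its duration.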

  19. The Platformization of the Web: Making Web Data Platform Ready

    NARCIS (Netherlands)

    Helmond, A.

    2015-01-01

    In this article, I inquire into Facebook’s development as a platform by situating it within the transformation of social network sites into social media platforms. I explore this shift with a historical perspective on, what I refer to as, platformization, or the rise of the platform as the dominant

  20. The world anti-doping code : a South African perspective : research ...

    African Journals Online (AJOL)

    During February 2003 the World Anti-Doping Agency adopted the World Anti-Doping Code in Copenhagen in an effort to create an independent anti-doping body and to co-ordinate the harmonisation of doping regulations. The Code encompasses the principles around which the anti-doping effort in sport will revolve in ...

  1. Multicore and Accelerator Development for a Leadership-Class Stellar Astrophysics Code

    Energy Technology Data Exchange (ETDEWEB)

    Messer, Bronson [ORNL; Harris, James A [ORNL; Parete-Koon, Suzanne T [ORNL; Chertkow, Merek A [ORNL

    2013-01-01

    We describe recent development work on the core-collapse supernova code CHIMERA. CHIMERA has consumed more than 100 million CPU-hours on Oak Ridge Leadership Computing Facility (OLCF) platforms in the past 3 years, ranking it among the most important applications at the OLCF. Most of the work described has been focused on exploiting the multicore nature of the current platform (Jaguar) via, e.g., multithreading using OpenMP. In addition, we have begun a major effort to marshal the computational power of GPUs with CHIMERA. The impending upgrade of Jaguar to Titan, a 20+ PF machine with an NVIDIA GPU on many nodes, makes this work essential.

  2. Signal-independent timescale analysis (SITA) and its application for neural coding during reaching and walking.

    Science.gov (United States)

    Zacksenhouse, Miriam; Lebedev, Mikhail A; Nicolelis, Miguel A L

    2014-01-01

    What are the relevant timescales of neural encoding in the brain? This question is commonly investigated with respect to well-defined stimuli or actions. However, neurons often encode multiple signals, including hidden or internal, which are not experimentally controlled, and thus excluded from such analysis. Here we consider all rate modulations as the signal, and define the rate-modulations signal-to-noise ratio (RM-SNR) as the ratio between the variance of the rate and the variance of the neuronal noise. As the bin-width increases, RM-SNR increases while the update rate decreases. This tradeoff is captured by the ratio of RM-SNR to bin-width, and its variations with the bin-width reveal the timescales of neural activity. Theoretical analysis and simulations elucidate how the interactions between the recovery properties of the unit and the spectral content of the encoded signals shape this ratio and determine the timescales of neural coding. The resulting signal-independent timescale analysis (SITA) is applied to investigate timescales of neural activity recorded from the motor cortex of monkeys during: (i) reaching experiments with Brain-Machine Interface (BMI), and (ii) locomotion experiments at different speeds. Interestingly, the timescales during BMI experiments did not change significantly with the control mode or training. During locomotion, the analysis identified units whose timescale varied consistently with the experimentally controlled speed of walking, though the specific timescale reflected also the recovery properties of the unit. Thus, the proposed method, SITA, characterizes the timescales of neural encoding and how they are affected by the motor task, while accounting for all rate modulations.
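
    The ratio of RM-SNR to bin width can be sketched as follows, under the simplifying assumption that the neuronal noise is Poisson, so the per-bin noise variance equals the mean count; the paper's actual estimator may differ, and the spike trains used to exercise it are invented.

```python
def rm_snr_per_binwidth(spike_times, total_time, bin_width):
    """Estimate RM-SNR = var(rate modulation) / var(noise) for one bin width,
    then divide by the bin width (the tradeoff ratio discussed in the abstract).
    Noise variance is approximated by the mean count (Poisson assumption)."""
    n_bins = int(total_time / bin_width)
    counts = [0] * n_bins
    for t in spike_times:
        idx = min(int(t / bin_width), n_bins - 1)
        counts[idx] += 1
    mean = sum(counts) / n_bins
    var = sum((c - mean) ** 2 for c in counts) / n_bins
    noise_var = mean                        # Poisson noise assumption
    rm_var = max(var - noise_var, 0.0)      # variance attributed to rate modulation
    rm_snr = rm_var / noise_var if noise_var > 0 else 0.0
    return rm_snr / bin_width
```

    Sweeping `bin_width` and looking for the peak of this ratio is the basic idea behind identifying a unit's encoding timescale: wider bins raise the RM-SNR but lower the update rate.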

  3. Improving system modeling accuracy with Monte Carlo codes

    International Nuclear Information System (INIS)

    Johnson, A.S.

    1996-01-01

    The use of computer codes based on Monte Carlo methods to perform criticality calculations has become commonplace. Although results frequently published in the literature report calculated k-eff values to four decimal places, people who use the codes in their everyday work say that they only believe the first two decimal places of any result. The lack of confidence in the computed k-eff values may be due to the tendency of the reported standard deviation to underestimate errors associated with the Monte Carlo process. The standard deviation as reported by the codes is the standard deviation of the mean of the k-eff values for individual generations in the computer simulation, not the standard deviation of the computed k-eff value compared with the physical system. A more subtle problem with the standard deviation of the mean as reported by the codes is that the k-eff values from the separate generations are not statistically independent, since the k-eff of a given generation is a function of the k-eff of the previous generation, which is ultimately based on the starting source. To produce a standard deviation that is more representative of the physical system, statistically independent values of k-eff are needed.
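
    The underestimation caused by correlated generations can be demonstrated with a batch-means comparison. The AR(1) series below is an illustrative stand-in for correlated per-generation k-eff estimates, not output from any real criticality code; the correlation coefficient and noise scale are invented.

```python
import random

def naive_and_batch_sem(series, batch_size):
    """Standard error of the mean two ways: treating samples as independent
    (what codes typically report) and via batch means, which absorbs most
    of the generation-to-generation correlation."""
    def sem(xs):
        n = len(xs)
        m = sum(xs) / n
        var = sum((x - m) ** 2 for x in xs) / (n - 1)
        return (var / n) ** 0.5
    batches = [sum(series[i:i + batch_size]) / batch_size
               for i in range(0, len(series) - batch_size + 1, batch_size)]
    return sem(series), sem(batches)

# AR(1) stand-in for correlated per-generation k-eff estimates
rng = random.Random(7)
phi, keff = 0.8, 1.0
series, x = [], 0.0
for _ in range(20000):
    x = phi * x + rng.gauss(0.0, 0.001)
    series.append(keff + x)
```

    For positively correlated generations the batch-means estimate comes out noticeably larger than the naive one, which is the gap between the reported standard deviation and one representative of the physical system.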

  4. An Automatic Instruction-Level Parallelization of Machine Code

    Directory of Open Access Journals (Sweden)

    MARINKOVIC, V.

    2018-02-01

    Full Text Available Prevailing multicores and novel manycores have created a great challenge of the modern day: parallelization of embedded software that is still written as sequential. In this paper, automatic code parallelization is considered, focusing on developing a parallelization tool at the binary level as well as on the validation of this approach. A novel instruction-level parallelization algorithm for assembly code is developed; it uses register names after SSA conversion to find independent blocks of code and then schedules those blocks using METIS to achieve good load balance. Sequential consistency is verified, and validation is done by measuring program execution time on the target architecture. Great speedup, taken as the performance measure in the validation process, and optimal load balancing are achieved for multicore RISC processors with 2 to 16 cores (e.g., MIPS, MicroBlaze). In particular, for 16 cores the average speedup is 7.92x, while in some cases it reaches 14x. The approach to automatic parallelization provided by this paper is useful to researchers and developers in the area of parallelization as the basis for further optimizations, as the back-end of a compiler, or as a code parallelization tool for an embedded system.
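
    The load-balancing objective can be illustrated with a greedy longest-processing-time scheduler. This is a simplified stand-in for the METIS partitioning the paper actually uses, and the block costs in the example are invented.

```python
import heapq

def schedule_blocks(block_costs, n_cores):
    """Assign independent code blocks (with estimated execution costs) to
    cores: largest block first, always onto the currently least-loaded core.
    Returns the per-core assignment and the makespan (maximum core load)."""
    heap = [(0, core) for core in range(n_cores)]  # (load, core id)
    heapq.heapify(heap)
    assignment = {core: [] for core in range(n_cores)}
    for block, cost in sorted(enumerate(block_costs), key=lambda bc: -bc[1]):
        load, core = heapq.heappop(heap)           # least-loaded core
        assignment[core].append(block)
        heapq.heappush(heap, (load + cost, core))
    makespan = max(load for load, _ in heap)
    return assignment, makespan
```

    The makespan is what bounds the achievable speedup: with perfectly balanced loads it approaches (total cost) / (number of cores), which is the ideal the 7.92x average on 16 cores is measured against.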

  5. Performance simulation in high altitude platforms (HAPs) communications systems

    Science.gov (United States)

    Ulloa-Vásquez, Fernando; Delgado-Penin, J. A.

    2002-07-01

    This paper considers the analysis by simulation of a digital narrowband communication system for a scenario consisting of a High-Altitude aeronautical Platform (HAP) and fixed/mobile terrestrial transceivers. The aeronautical channel is modelled using geometrical (angle of elevation vs. horizontal distance of the terrestrial reflectors) and statistical arguments, and under these circumstances a serially concatenated coded digital transmission is analysed under several hypotheses related to radio coverage areas. The results indicate good feasibility for the proposed communication system.

  6. Developing HYDMN code to include the transient of MNSR

    International Nuclear Information System (INIS)

    Al-Barhoum, M.

    2000-11-01

    A description of the programs added to the HYDMN code (a code for the thermal-hydraulic steady state of the MNSR) to include the transient of the same MNSR is presented. The code asks for the initial conditions: the power (in kW) and the cold initial core inlet temperature (in degrees centigrade). A time-dependent study of the coolant inlet and outlet temperatures, the coolant velocity, and the pool and tank temperatures is done for the MNSR in general and for the Syrian MNSR in particular. The study solves the differential equations taken from reference (1) using numerical methods found in reference (3). In this way, the code becomes independent of any external information source. (Author)
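
    A transient calculation of this kind can be sketched as a lumped-parameter heat balance integrated in time. The model and every parameter value below are generic illustrations, not the HYDMN equations from reference (1).

```python
def transient_outlet_temp(power_kw, t_inlet_c, mass_flow_kg_s=0.2,
                          cp_kj_kg_k=4.18, tau_s=30.0, dt=0.1, t_end=300.0):
    """Lumped-parameter sketch of a transient coolant heat balance: the outlet
    temperature relaxes with time constant tau toward the steady-state value
    T_in + P / (m_dot * cp). Illustrative only, not the HYDMN model."""
    t_steady = t_inlet_c + power_kw / (mass_flow_kg_s * cp_kj_kg_k)
    t_out = t_inlet_c                       # start from the cold initial core
    steps = int(t_end / dt)
    for _ in range(steps):
        # explicit Euler step of dT/dt = (T_steady - T) / tau
        t_out += dt * (t_steady - t_out) / tau_s
    return t_out, t_steady
```

    Given initial power and cold inlet temperature, the integration traces the outlet temperature from the cold start toward its steady-state value, which is the qualitative behavior the transient extension of a steady-state code has to reproduce.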

  7. Overview of the U.S. DOE Hydrogen Safety, Codes and Standards Program. Part 4: Hydrogen Sensors; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J.; Rivkin, Carl; Burgess, Robert; Brosha, Eric; Mukundan, Rangachary; James, C. Will; Keller, Jay

    2016-12-01

    Hydrogen sensors are recognized as a critical element in the safety design for any hydrogen system. In this role, sensors can perform several important functions including indication of unintended hydrogen releases, activation of mitigation strategies to preclude the development of dangerous situations, activation of alarm systems and communication to first responders, and initiation of system shutdown. The functionality of hydrogen sensors in this capacity is decoupled from the system being monitored, thereby providing an independent safety component that is not affected by the system itself. The importance of hydrogen sensors has been recognized by DOE and by the Fuel Cell Technologies Office's Safety and Codes Standards (SCS) program in particular, which has for several years supported hydrogen safety sensor research and development. The SCS hydrogen sensor programs are currently led by the National Renewable Energy Laboratory, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory. The current SCS sensor program encompasses the full range of issues related to safety sensors, including development of advanced sensor platforms with exemplary performance, development of sensor-related codes and standards, outreach to stakeholders on the role sensors play in facilitating deployment, technology evaluation, and support on the proper selection and use of sensors.

  8. Testing Quick Response (QR) Codes as an Innovation to Improve Feedback Among Geographically-Separated Clerkship Sites.

    Science.gov (United States)

    Snyder, Matthew J; Nguyen, Dana R; Womack, Jasmyne J; Bunt, Christopher W; Westerfield, Katie L; Bell, Adriane E; Ledford, Christy J W

    2018-03-01

    Collection of feedback regarding medical student clinical experiences for formative or summative purposes remains a challenge across clinical settings. The purpose of this study was to determine whether the use of a quick response (QR) code-linked online feedback form improves the frequency and efficiency of rater feedback. In 2016, we compared paper-based feedback forms, an online feedback form, and a QR code-linked online feedback form at 15 family medicine clerkship sites across the United States. Outcome measures included usability, number of feedback submissions per student, number of unique raters providing feedback, and timeliness of feedback provided to the clerkship director. The feedback method was significantly associated with usability, with QR code scoring the highest, and paper second. Accessing feedback via QR code was associated with the shortest time to prepare feedback. Across four rotations, separate repeated measures analyses of variance showed no effect of feedback system on the number of submissions per student or the number of unique raters. The results of this study demonstrate that preceptors in the family medicine clerkship rate QR code-linked feedback as a high usability platform. Additionally, this platform resulted in faster form completion than paper or online forms. An overarching finding of this study is that feedback forms must be portable and easily accessible. Potential implementation barriers and the social norm for providing feedback in this manner need to be considered.

  9. LncRNAWiki: harnessing community knowledge in collaborative curation of human long non-coding RNAs

    KAUST Repository

    Ma, L.; Li, A.; Zou, D.; Xu, X.; Xia, L.; Yu, J.; Bajic, Vladimir B.; Zhang, Z.

    2014-01-01

    Long non-coding RNAs (lncRNAs) perform a diversity of functions in numerous important biological processes and are implicated in many human diseases. In this report we present lncRNAWiki (http://lncrna.big.ac.cn), a wiki-based platform that is open

  10. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu; Ghanem, Bernard; Liu, Si; Xu, Changsheng; Ahuja, Narendra

    2013-01-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  11. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu

    2013-12-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  12. Toward ubiquitous healthcare services with a novel efficient cloud platform.

    Science.gov (United States)

    He, Chenguang; Fan, Xiaomao; Li, Ye

    2013-01-01

    Ubiquitous healthcare services are becoming more and more popular, especially under the urgent demand of the global aging issue. Cloud computing has pervasive, on-demand, service-oriented characteristics that fit the needs of healthcare services very well. However, handling multimodal, heterogeneous, and nonstationary physiological signals to provide persistent personalized services, while sustaining highly concurrent online analysis for the public, is a challenge for a general-purpose cloud. In this paper, we propose a private cloud platform architecture which includes six layers according to the specific requirements. This platform utilizes a message queue as the cloud engine, and each layer thereby achieves relative independence through this loosely coupled means of communication with a publish/subscribe mechanism. Furthermore, a plug-in algorithm framework is also presented, and massive semistructured or unstructured medical data are accessed adaptively by this cloud architecture. As the testing results show, the proposed cloud platform, with its robust, stable, and efficient features, can satisfy highly concurrent requests from ubiquitous healthcare services.
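
    The loosely coupled publish/subscribe communication between layers can be sketched with a minimal in-process bus. This is illustrative only: the topic names and payload are invented, and the authors' platform presumably uses a standalone message-queue service rather than in-process queues.

```python
import queue
import threading

class MessageBus:
    """Minimal publish/subscribe engine in the spirit of the message-queue
    cloud engine described in the abstract: publishers and subscribers only
    know topics, never each other."""
    def __init__(self):
        self.subscribers = {}          # topic -> list of subscriber queues
        self.lock = threading.Lock()

    def subscribe(self, topic):
        q = queue.Queue()
        with self.lock:
            self.subscribers.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, message):
        with self.lock:
            targets = list(self.subscribers.get(topic, []))
        for q in targets:              # fan out to every subscriber
            q.put(message)

# Each platform layer touches only the bus, never another layer directly,
# so layers can be added or replaced independently.
bus = MessageBus()
analysis_in = bus.subscribe("ecg/raw")   # analysis layer's inbox
storage_in = bus.subscribe("ecg/raw")    # storage layer's inbox
bus.publish("ecg/raw", {"patient": "p001", "samples": [0.1, 0.4, 0.2]})
```

    Because publishing fans out to every subscriber queue, the acquisition layer needs no knowledge of how many downstream layers consume a signal, which is the "relative independence" the architecture aims for.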

  13. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jingchao; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; He, Qingyun; Ye, Minyou

    2015-11-15

    Highlights: • A specific correction scheme has been adopted to correct the computed results on non-orthogonal meshes. • The MHD code developed on the OpenFOAM platform has been validated against benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends, and manifolds are very common, and the characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to such complex geometries, an MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to process the non-orthogonal meshes found in complex geometries. The present paper focuses on the validation of the code under critical conditions. One analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match the benchmark data well.

  14. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    International Nuclear Information System (INIS)

    Feng, Jingchao; Chen, Hongli; He, Qingyun; Ye, Minyou

    2015-01-01

    Highlights: • A specific correction scheme has been adopted to correct the computed results on non-orthogonal meshes. • The MHD code developed on the OpenFOAM platform has been validated against benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends, and manifolds are very common, and the characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to such complex geometries, an MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to process the non-orthogonal meshes found in complex geometries. The present paper focuses on the validation of the code under critical conditions. One analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match the benchmark data well.

  15. The impact of time step definition on code convergence and robustness

    Science.gov (United States)

    Venkateswaran, S.; Weiss, J. M.; Merkle, C. L.

    1992-01-01

    We have implemented preconditioning for multi-species reacting flows in two independent codes: an implicit (ADI) code developed in-house and the RPLUS code (developed at LeRC). The RPLUS code was modified to use a four-stage Runge-Kutta scheme. The performance of both codes was tested, and it was shown that preconditioning can improve convergence by a factor of two to a hundred depending on the problem. Our efforts are currently focused on evaluating the effect of chemical source terms and on assessing how preconditioning may be applied to improve convergence and robustness in the calculation of reacting flows.

  16. The Overshoot Phenomenon in Geodynamics Codes

    Science.gov (United States)

    Kommu, R. K.; Heien, E. M.; Kellogg, L. H.; Bangerth, W.; Heister, T.; Studley, E. H.

    2013-12-01

    The overshoot phenomenon is a common occurrence in numerical software when a continuous function on a finite-dimensional discretized space is used to approximate a discontinuous jump, for example in temperature or material concentration. The resulting solution overshoots and undershoots the discontinuous jump. Numerical simulations play an extremely important role in mantle convection research, both because of the strong temperature and stress dependence of viscosity and because of the inaccessibility of the deep Earth. Under these circumstances, it is essential that mantle convection simulations be extremely accurate and reliable. CitcomS and ASPECT are two finite-element-based mantle convection codes developed and maintained by the Computational Infrastructure for Geodynamics. CitcomS is designed to run on multiple high-performance computing platforms. ASPECT, an adaptive mesh refinement (AMR) code built on the deal.II library, also scales well on various HPC platforms. Both CitcomS and ASPECT exhibit the overshoot phenomenon. One attempt at controlling the overshoot uses the entropy viscosity method, which introduces an artificial diffusion term in the energy equation of mantle convection; this artificial diffusion term is small where the temperature field is smooth. We present results from CitcomS and ASPECT that quantify the effect of the entropy viscosity method in reducing the overshoot phenomenon. In the discontinuous Galerkin (DG) finite element method, the test functions are continuous within each element but discontinuous across inter-element boundaries, so the solution space is discontinuous. FEniCS is a collection of free software tools that automate the solution of differential equations using finite element methods. In this work we also present results from a finite element mantle convection

  17. Code development for nuclear reactor simulation

    International Nuclear Information System (INIS)

    Chauliac, C.; Verwaerde, D.; Pavageau, O.

    2006-01-01

    Full text of publication follows: For several years, CEA, EDF and FANP have developed several numerical codes which are currently used for nuclear industry applications and will remain in use for the coming years. Complementary to this set of codes, and in order to better meet present and future needs, a new system is being developed through a joint venture between CEA, EDF and FANP, with a ten-year prospect and strong intermediate milestones. The focus is on a multi-scale and multi-physics approach, enabling phenomena from the microscopic to the macroscopic scale to be taken into account and describing interactions between various physical fields such as neutronics (DESCARTES), thermal-hydraulics (NEPTUNE) and fuel behaviour (PLEIADES). This approach is based on a more rational design of the software and uses a common integration platform providing pre-processing, supervision of computation and post-processing. This paper will describe the overall system under development and present the first results obtained. (authors)

  18. Preparation of the TRANSURANUS code for TEMELIN NPP

    International Nuclear Information System (INIS)

    Klouzal, J.

    2011-01-01

    In 2010 the Temelin NPP started using TVSA-T fuel supplied by JSC TVEL. The transition process included the implementation of several new core reload design codes, and the TRANSURANUS code was selected for evaluating the fuel rod thermomechanical performance. The adaptation and validation of the code was performed by the Nuclear Research Institute Rez (NRI). The TRANSURANUS code contains a wide selection of alternative models for most phenomena important to fuel behaviour. It was therefore necessary to select, based on comparison with experimental data, those most suitable for modeling TVSA-T fuel rods; in some cases, new models were implemented. Software tools and a methodology for evaluating the proposed core reload design using the TRANSURANUS code were also developed at NRI. The software tools include an interface to the core physics code ANDREA and a set of scripts for automated execution and processing of the computational runs. Independent confirmation of some of the vendor-specified core reload design criteria was performed using TRANSURANUS. (authors)

  19. Pre-processing of input files for the AZTRAN code; Pre procesamiento de archivos de entrada para el codigo AZTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Vargas E, S. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Ibarra, G., E-mail: samuel.vargas@inin.gob.mx [IPN, Av. Instituto Politecnico Nacional s/n, 07738 Ciudad de Mexico (Mexico)

    2017-09-15

    The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Because generating an input file for the code is complex, a script based on the D language was developed to make its preparation easier. The script is based on a new input-file format whose specific cards are divided into two blocks, mandatory cards and optional cards. It pre-processes the input file to identify possible errors within it, and it includes an image generator for the specific problem based on the Python interpreter. (Author)
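
    A mandatory/optional card check of the kind described above can be sketched as a small pre-processing routine. The card names below are hypothetical placeholders, not the real AZTRAN card set.

```python
MANDATORY = {"GEOM", "MAT", "SOURCE"}   # hypothetical card names
OPTIONAL = {"PRINT", "PLOT"}

def check_deck(cards):
    """Pre-process a card deck: flag missing mandatory cards and unknown cards."""
    errors = []
    # First token on each non-blank line is taken as the card name
    names = [line.split()[0].upper() for line in cards if line.strip()]
    for missing in sorted(MANDATORY - set(names)):
        errors.append("missing mandatory card: " + missing)
    for name in names:
        if name not in MANDATORY | OPTIONAL:
            errors.append("unknown card: " + name)
    return errors

errors = check_deck(["GEOM 10 10 10", "MAT fuel clad", "TYPO 1 2 3"])
```

    Reporting all problems at once, rather than stopping at the first, is what makes such a pre-check useful before launching a long transport calculation.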

  20. The impact of employment statutes on independent contract workers

    International Nuclear Information System (INIS)

    Molnar, L.F.

    1999-01-01

    The minimum statutory entitlements that apply to employees and independent contractors engaged in Alberta's petroleum industry were reviewed, and the importance of employers being aware of the potential application of these statutes to their independent contractors was emphasized. The employment relationship between independent contractors and employers is regulated by the Employment Standards Code and the Human Rights, Citizenship and Multiculturalism Act. The paper also described the obligations that the Occupational Health and Safety Act and the Workers' Compensation Act impose on both employers and independent contractors, and it listed the criteria used in determining whether a person is performing a contract of employment or a contract for service. An employee/non-employee questionnaire was also included

  1. A method for modeling co-occurrence propensity of clinical codes with application to ICD-10-PCS auto-coding.

    Science.gov (United States)

    Subotin, Michael; Davis, Anthony R

    2016-09-01

    Natural language processing methods for medical auto-coding, or automatic generation of medical billing codes from electronic health records, generally assign each code independently of the others. They may thus assign codes for closely related procedures or diagnoses to the same document, even when they do not tend to occur together in practice, simply because the right choice can be difficult to infer from the clinical narrative. We propose a method that injects awareness of the propensities for code co-occurrence into this process. First, a model is trained to estimate the conditional probability that one code is assigned by a human coder, given that another code is known to have been assigned to the same document. Then, at runtime, an iterative algorithm applies this model to the output of an existing statistical auto-coder to modify the confidence scores of the codes. We tested this method in combination with a primary auto-coder for International Statistical Classification of Diseases-10 procedure codes, achieving a 12% relative improvement in F-score over the primary auto-coder baseline. The proposed method can be used, with appropriate features, in combination with any auto-coder that generates codes with different levels of confidence. The promising results obtained for International Statistical Classification of Diseases-10 procedure codes suggest that the proposed method may have wider applications in auto-coding.
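
    A minimal sketch of the runtime rescoring idea, under the assumption (ours, not necessarily the authors') that each code's confidence is blended with co-occurrence support from the other currently likely codes:

```python
def rescore(scores, cooccur, rounds=3, weight=0.5):
    """Iteratively blend each code's base confidence with support derived
    from co-occurrence propensities of the other codes.

    scores:  {code: base confidence from the primary auto-coder}
    cooccur: {(other, code): P(code assigned | other assigned)}
    """
    s = dict(scores)
    for _ in range(rounds):
        updated = {}
        for code, base in scores.items():
            # Support: conditional propensity weighted by the other code's
            # current confidence, averaged over all other codes
            support = [cooccur.get((other, code), 0.0) * s[other]
                       for other in s if other != code]
            boost = sum(support) / max(len(support), 1)
            updated[code] = (1 - weight) * base + weight * boost
        s = updated
    return s

# Toy numbers: A and B tend to co-occur, C has no co-occurrence support
scores = {"A": 0.6, "B": 0.6, "C": 0.6}
cooccur = {("A", "B"): 0.9, ("B", "A"): 0.9}
adjusted = rescore(scores, cooccur)
```

    With equal base scores, the mutually supporting codes A and B end up ranked above the isolated code C, which is the qualitative behavior the paper's method aims for.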

  2. Vaccine platform recombinant measles virus.

    Science.gov (United States)

    Mühlebach, Michael D

    2017-10-01

    The classic development of vaccines is lengthy, tedious, and may not necessarily be successful, as demonstrated by the case of HIV. This is especially a problem for emerging pathogens that are newly introduced into the human population and carry the inherent risk of pandemic spread in a naïve population. For such situations, a considerable number of different platform technologies are under development. They are also under development for pathogens where directly derived vaccines are regarded as too complicated or even dangerous, due to the induction of inefficient or unwanted immune responses causing considerable side effects, as for dengue virus. Among these platform technologies are plasmid-based DNA vaccines, RNA replicons, single-round infectious vector particles, and replicating vaccine-based vectors encoding critical antigen(s) of the target pathogens. Among the latter, recombinant measles viruses derived from vaccine strains have been tested. Measles vaccines are among the most effective and safest live-attenuated vaccines known. Therefore, the development of Schwarz-, Moraten-, or AIK-C-strain derived recombinant vaccines against a wide range of mostly viral, but also bacterial, pathogens was quite straightforward. These vaccines generally induce powerful humoral and cellular immune responses in appropriate animal models, i.e., transgenic mice or non-human primates. In the recent first phase I clinical trial, the results were also quite encouraging: the trial indicated the expected safety and efficacy in human patients, interestingly independent of the level of prevalent anti-measles immunity before the trial. Recombinant measles vaccines expressing additional antigens are thereby a promising platform for future vaccines.

  3. The queueing perspective of asynchronous network coding in two-way relay network

    Science.gov (United States)

    Liang, Yaping; Chang, Qing; Li, Xianxu

    2018-04-01

    Asynchronous network coding (NC) has the potential to improve wireless network performance compared with routing or synchronous network coding. Recent research concentrates on the trade-off between throughput or energy consumption and delay for a pair of independent input flows. However, the implementation of NC requires a thorough investigation of its impact on the relevant queueing systems, which few works address. Moreover, few works study the probability density function (pdf) of the output in a network coding scenario. In this paper, a scenario with two independent Poisson input flows and one output flow is considered. The asynchronous NC-based strategy is that a new arrival evicts the head packet held in its queue while that packet waits for a packet from the other flow to encode with. The pdf of the output flow, which contains both coded and uncoded packets, is derived, and the statistical characteristics of this strategy are analyzed. These results are verified by numerical simulations.
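
    The eviction strategy can be illustrated with a toy discrete-event simulation. Arrival order is randomized in proportion to the two flow rates, which is a simplification of the full Poisson arrival model analyzed in the paper.

```python
import random

def simulate(rate_a, rate_b, n_events, seed=0):
    """Toy simulation of the eviction strategy: a packet waits to be coded
    with one from the other flow; a new same-flow arrival evicts it."""
    rng = random.Random(seed)
    waiting = None          # flow label of the packet currently held, if any
    coded = uncoded = 0
    for _ in range(n_events):
        flow = "A" if rng.random() < rate_a / (rate_a + rate_b) else "B"
        if waiting is None:
            waiting = flow          # hold the packet, wait for a partner
        elif waiting != flow:
            coded += 1              # pair found: emit one coded packet
            waiting = None
        else:
            uncoded += 1            # same-flow arrival evicts the head,
            waiting = flow          # which leaves the queue uncoded
    return coded, uncoded

coded, uncoded = simulate(1.0, 1.0, 1000)
```

    The output stream thus mixes coded and uncoded packets, which is exactly why the paper derives the pdf of the combined output flow rather than of the coded packets alone.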

  4. NEWSPEC: A computer code to unfold neutron spectra from Bonner sphere data

    International Nuclear Information System (INIS)

    Lemley, E.C.; West, L.

    1996-01-01

    A new computer code, NEWSPEC, is in development at the University of Arkansas. The NEWSPEC code allows a user to unfold, fold, rebin, display, and manipulate neutron spectra as applied to Bonner sphere measurements. The SPUNIT unfolding algorithm, a new rebinning algorithm, and the graphical capabilities of Microsoft (MS) Windows and MS Excel are utilized to perform these operations. The computer platform for NEWSPEC is a personal computer (PC) running MS Windows 3.x or Win95, while the code is written in MS Visual Basic (VB) and MS VB for Applications (VBA) under Excel. One of the most useful attributes of the NEWSPEC software is the link to Excel allowing additional manipulation of program output or creation of program input

  5. A resilient and secure software platform and architecture for distributed spacecraft

    Science.gov (United States)

    Otte, William R.; Dubey, Abhishek; Karsai, Gabor

    2014-06-01

    A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, where information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objectives of this layer.

  6. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. 
Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
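
    The effect of centered allele coding on breeding-value contrasts can be demonstrated with a toy genotype matrix: centering makes each marker's codes average zero, while differences between animals' genomic breeding values are unchanged, since the constant shift is absorbed by the general mean. The marker effects below are invented numbers, purely for illustration.

```python
# Genotype codes for three animals at three markers: 0, 1 or 2 copies of an allele
genotypes = [[0.0, 1.0, 2.0],
             [1.0, 1.0, 0.0],
             [2.0, 0.0, 1.0]]
n_anim = len(genotypes)
n_mark = len(genotypes[0])

# Centered allele coding: subtract each marker's mean so its codes average zero
means = [sum(row[j] for row in genotypes) / n_anim for j in range(n_mark)]
centered = [[row[j] - means[j] for j in range(n_mark)] for row in genotypes]

# Toy marker effects (illustrative values)
effects = [0.2, -0.1, 0.4]

def gbv(matrix):
    """Genomic breeding values: sum of genotype codes times marker effects."""
    return [sum(g * e for g, e in zip(row, effects)) for row in matrix]

raw, cen = gbv(genotypes), gbv(centered)
# Contrasts between animals are identical under either coding; only the
# overall level shifts, and that shift goes into the general mean
contrast_raw = raw[0] - raw[1]
contrast_cen = cen[0] - cen[1]
```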

  7. Scientific Programming Using Java and C: A Remote Sensing Example

    Science.gov (United States)

    Prados, Donald; Johnson, Michael; Mohamed, Mohamed A.; Cao, Chang-Yong; Gasser, Jerry; Powell, Don; McGregor, Lloyd

    1999-01-01

    This paper presents results of a project to port code for processing remotely sensed data from the UNIX environment to Windows. Factors considered during this process include time schedule, cost, resource availability, reuse of existing code, rapid interface development, ease of integration, and platform independence. The approach selected for this project used both Java and C. By using Java for the graphical user interface and C for the domain model, the strengths of both languages were utilized and the resulting code can easily be ported to other platforms. The advantages of this approach are discussed in this paper.

  8. Thermohydraulic analysis of nuclear power plant accidents by computer codes

    International Nuclear Information System (INIS)

    Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.

    1982-01-01

    The RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana, and input models of NPP Krsko were prepared for the first three codes. Because of the high computing cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been performed with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been carried out with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)

  9. Energy meshing techniques for processing ENDF/B-VI cross sections using the AMPX code system

    International Nuclear Information System (INIS)

    Dunn, M.E.; Greene, N.M.; Leal, L.C.

    1999-01-01

    Modern techniques for the establishment of criticality safety for fissile systems invariably require the use of neutronic transport codes with applicable cross-section data. Accurate cross-section data are essential for solving the Boltzmann transport equation for fissile systems. In the absence of applicable critical experimental data, the use of independent calculational methods is crucial for the establishment of subcritical limits. Moreover, various independent modern transport codes are available to the criticality safety analyst (e.g., KENO V.a, MCNP, and MONK). In contrast, there is currently only one complete software package that processes data from the Version 6 format of the Evaluated Nuclear Data File (ENDF) to a format usable by criticality safety codes. To facilitate independent cross-section processing, Oak Ridge National Laboratory (ORNL) is upgrading the AMPX code system to enable independent processing of Version 6 formats using state-of-the-art procedures. The AMPX code system has been in continuous use at ORNL since the early 1970s and is the premier processor for providing multigroup cross sections for criticality safety analysis codes. Within the AMPX system, the module POLIDENT is used to access the resonance parameters in File 2 of an ENDF/B library, generate point cross-section data, and combine the cross sections with File 3 point data. At the heart of any point cross-section processing code is the generation of a suitable energy mesh for representing the data. The purpose of this work is to facilitate the AMPX upgrade through the development of a new and innovative energy meshing technique for processing point cross-section data.
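
    One common meshing strategy, sketched here as a generic illustration rather than the specific AMPX/POLIDENT technique, is to take the union of the input energy grids and then adaptively insert midpoints until linear interpolation reproduces the point data within a tolerance. The 1/v-like "cross section" below is a toy function.

```python
import math

def union_mesh(*grids):
    """Merge per-reaction energy grids into a single ascending union mesh."""
    return sorted({e for g in grids for e in g})

def refine(a, b, f, tol, out):
    """Recursively subdivide [a, b] at the geometric midpoint (natural for
    the many-decade neutron energy range) until linear interpolation of f
    matches f at the midpoint to within a relative tolerance."""
    mid = math.sqrt(a * b)
    lin = 0.5 * (f(a) + f(b))
    if abs(lin - f(mid)) > tol * abs(f(mid)):
        refine(a, mid, f, tol, out)
        out.append(mid)
        refine(mid, b, f, tol, out)

def adaptive_mesh(grid, f, tol=0.01):
    out = [grid[0]]
    for a, b in zip(grid, grid[1:]):
        refine(a, b, f, tol, out)
        out.append(b)
    return out

# Toy 1/v-like "cross section" over a typical neutron energy range (eV)
grid = union_mesh([1.0e-5, 1.0, 2.0e7], [0.0253, 2.0e7])
mesh = adaptive_mesh(grid, lambda e: 1.0 / math.sqrt(e))
```

    The resulting mesh is densest where the function curves most in linear-linear interpolation, which is the basic behavior any point cross-section mesher must deliver.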

  10. Platform Performance and Challenges - using Platforms in Lego Company

    DEFF Research Database (Denmark)

    Munk, Lone; Mortensen, Niels Henrik

    2009-01-01

    needs focus on the incentive of using the platform. This problem lacks attention in literature, as well as industry, where assessment criteria do not cover this aspect. Therefore, we recommend including user incentive in platform assessment criteria to these challenges. Concrete solution elements...... ensuring user incentive in platforms is an object for future research...

  11. CFD Code Validation against Stratified Air-Water Flow Experimental Data

    International Nuclear Information System (INIS)

    Terzuoli, F.; Galassi, M.C.; Mazzini, D.; D'Auria, F.

    2008-01-01

    Pressurized thermal shock (PTS) modelling has been identified as one of the most important industrial needs related to nuclear reactor safety. A severe PTS scenario limiting the reactor pressure vessel (RPV) lifetime is the cold water emergency core cooling (ECC) injection into the cold leg during a loss of coolant accident (LOCA). Since it represents a big challenge for numerical simulations, this scenario was selected within the European Platform for Nuclear Reactor Simulations (NURESIM) Integrated Project as a reference two-phase problem for computational fluid dynamics (CFD) code validation. This paper presents a CFD analysis of a stratified air-water flow experimental investigation performed at the Institut de Mécanique des Fluides de Toulouse in 1985, which shares some common physical features with the ECC injection in a PWR cold leg. Numerical simulations have been carried out with two commercial codes (Fluent and Ansys CFX) and a research code (NEPTUNE CFD). The aim of this work, carried out at the University of Pisa within the NURESIM IP, is to validate the free surface flow model implemented in the codes against experimental data, and to perform code-to-code benchmarking. The obtained results suggest the relevance of three-dimensional effects and stress the importance of suitable interface drag modelling

  12. CFD Code Validation against Stratified Air-Water Flow Experimental Data

    Directory of Open Access Journals (Sweden)

    F. Terzuoli

    2008-01-01

    Full Text Available Pressurized thermal shock (PTS) modelling has been identified as one of the most important industrial needs related to nuclear reactor safety. A severe PTS scenario limiting the reactor pressure vessel (RPV) lifetime is the cold water emergency core cooling (ECC) injection into the cold leg during a loss of coolant accident (LOCA). Since it represents a big challenge for numerical simulations, this scenario was selected within the European Platform for Nuclear Reactor Simulations (NURESIM) Integrated Project as a reference two-phase problem for computational fluid dynamics (CFD) code validation. This paper presents a CFD analysis of a stratified air-water flow experimental investigation performed at the Institut de Mécanique des Fluides de Toulouse in 1985, which shares some common physical features with the ECC injection in a PWR cold leg. Numerical simulations have been carried out with two commercial codes (Fluent and Ansys CFX) and a research code (NEPTUNE CFD). The aim of this work, carried out at the University of Pisa within the NURESIM IP, is to validate the free surface flow model implemented in the codes against experimental data, and to perform code-to-code benchmarking. The obtained results suggest the relevance of three-dimensional effects and stress the importance of suitable interface drag modelling.

  13. Platform Constellations

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    This research paper presents an initial attempt to introduce and explain the emergence of a new phenomenon, which we refer to as platform constellations. Functioning as highly modular systems, platform constellations are collections of highly connected platforms which co-exist in parallel and a......’ acquisition and users’ engagement rates as well as unlock new sources of value creation and diversify revenue streams....

  14. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
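
    A classic example of generating independent samples from an arbitrary density, of the kind such a monograph covers, is rejection sampling:

```python
import random

def rejection_sample(pdf, proposal_draw, proposal_pdf, m, n, seed=0):
    """Rejection sampler: draw x from the proposal and accept it with
    probability pdf(x) / (m * proposal_pdf(x)).
    Requires pdf(x) <= m * proposal_pdf(x) for all x."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = proposal_draw(rng)
        if rng.random() <= pdf(x) / (m * proposal_pdf(x)):
            out.append(x)
    return out

# Target: triangular density p(x) = 2x on [0, 1]; proposal: uniform on [0, 1].
# The envelope constant m = 2 makes p(x) <= m * 1 hold everywhere on [0, 1].
samples = rejection_sample(lambda x: 2.0 * x,
                           lambda rng: rng.random(),
                           lambda x: 1.0,
                           m=2.0, n=2000)
```

    Accepted draws are exact, mutually independent samples from the target; the price is the rejected proposals, whose expected fraction grows with the envelope constant m.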

  15. On-board attitude determination for the Explorer Platform satellite

    Science.gov (United States)

    Jayaraman, C.; Class, B.

    1992-01-01

    This paper describes the attitude determination algorithm for the Explorer Platform satellite. The algorithm, which is baselined on the Landsat code, is a six-element linear quadratic state estimation processor, in the form of a Kalman filter augmented by an adaptive filter process. Improvements to the original Landsat algorithm were required to meet mission pointing requirements. These consisted of a more efficient sensor processing algorithm and the addition of an adaptive filter which acts as a check on the Kalman filter during satellite slew maneuvers. A 1750A processor will be flown on board the satellite for the first time as a coprocessor (COP) in addition to the NASA Standard Spacecraft Computer. The attitude determination algorithm, which will be resident in the COP's memory, will make full use of its improved processing capabilities to meet mission requirements. Additional benefits were gained by writing the attitude determination code in Ada.
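
    The predict/update structure of such an estimator can be shown with a scalar Kalman filter. This toy stand-in uses identity dynamics and invented noise variances, unlike the six-state flight algorithm described above.

```python
def kalman_step(x, p, z, q=1e-4, r=1e-2):
    """One predict/update cycle of a scalar Kalman filter.
    x, p: state estimate and its variance; z: new measurement;
    q, r: process and measurement noise variances (illustrative values)."""
    # Predict: identity dynamics; uncertainty grows by the process noise
    p = p + q
    # Update: blend the prediction with the measurement via the Kalman gain
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

# Converge on a constant "attitude angle" near 0.5 from noisy sensor readings
x, p = 0.0, 1.0
for z in [0.5, 0.52, 0.49, 0.51]:
    x, p = kalman_step(x, p, z)
```

    The adaptive filter mentioned in the abstract would sit alongside this loop, watching the innovations (z - x) during slews and intervening when they grow inconsistent with the filter's assumed model.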

  16. Conceptual OOP design of Pilot Code for Two-Fluid, Three-field Model with C++ 6.0

    Energy Technology Data Exchange (ETDEWEB)

    Chung, B. D.; Lee, Y. J

    2006-09-15

    To establish the concept of object-oriented programming (OOP) design for a reactor safety analysis code, a preliminary OOP design for the PILOT code, which is based on a one-dimensional, two-fluid, three-field model, has been attempted with C++ language features. The Microsoft C++ language has been used since it is available as groupware in KAERI, and it can be merged with Compaq Visual Fortran 6.6 in the Visual Studio platform. In the development platform, C++ has been used as the main language and Fortran as a mixed language in connection with the C++ main driver program; this mixed-language environment is a specific feature provided in Visual Studio. Existing Fortran source was utilized for the input routine that reads the steam table from a generated file and for the steam property calculation routine, with the calling convention and argument passing from the C++ driver corrected. Mathematical routines, such as matrix inversion and a tridiagonal matrix solver, have been kept as PILOT Fortran routines. The simple volumes and junctions utilized in the PILOT code can be treated as objects, since they are the basic construction elements of the code system; other routines for the overall solution scheme have been realized as procedural C functions. The conceptual design, which consists of hydraulic loop, component, volume, and junction classes, is described in the appendix in order to give the essential OOP structure of a system safety analysis code. The attempt shows that many parts of a system analysis code can be expressed as objects, although the overall structure should be maintained as procedural functions. The encapsulation of data and functions within an object can provide many beneficial aspects in the programming of a system code.
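
    The tridiagonal matrix solver mentioned above is typically the Thomas algorithm; here is a sketch of it (our reconstruction for illustration, not the PILOT Fortran routine itself):

```python
def thomas(a, b, c, d):
    """Thomas algorithm for a tridiagonal linear system.
    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):            # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):   # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 3x3 system (2,1 / 1,2,1 / 1,2 pattern) with exact solution x = [1, 1, 1]
x = thomas([0.0, 1.0, 1.0], [2.0, 2.0, 2.0], [1.0, 1.0, 0.0], [3.0, 4.0, 3.0])
```

    Such solvers arise naturally from the one-dimensional staggered volume/junction discretization: each volume couples only to its two neighbors, so the pressure system is tridiagonal and solvable in O(n).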

  17. Conceptual OOP design of Pilot Code for Two-Fluid, Three-field Model with C++ 6.0

    International Nuclear Information System (INIS)

    Chung, B. D.; Lee, Y. J.

    2006-09-01

    To establish the concept of object-oriented programming (OOP) design for reactor safety analysis codes, a preliminary OOP design of the PILOT code, which is based on the one-dimensional, two-fluid, three-field model, has been attempted using C++ language features. Microsoft C++ was chosen since it is available as groupware within KAERI, and it can be mixed with Compaq Visual Fortran 6.6 in the Visual Studio platform. In this development platform, C++ has been used as the main language, with Fortran as a mixed language connected to the C++ main driver program; the mixed-language environment is a specific feature provided by Visual Studio. Existing Fortran sources were reused for the input routine that reads the steam table from a generated file and for the steam property calculation routine; the calling convention and argument passing from the C++ driver were corrected accordingly. Mathematical routines, such as matrix inversion and a tridiagonal matrix solver, have been retained as PILOT Fortran routines. The simple volumes and junctions used in the PILOT code can be treated as objects, since they are the basic construction elements of the code system, while other routines for the overall solution scheme have been realized as procedural C functions. The conceptual design, which consists of hydraulic loop, component, volume, and junction classes, is described in the appendix in order to convey the essential OOP structure of a system safety analysis code. The attempt shows that many parts of a system analysis code can be expressed as objects, although the overall structure should be maintained as procedural functions. The encapsulation of data and functions within an object can provide many benefits in the programming of system codes.

  18. ITER Dynamic Tritium Inventory Modeling Code

    International Nuclear Information System (INIS)

    Cristescu, Ioana-R.; Doerr, L.; Busigin, A.; Murdoch, D.

    2005-01-01

    A tool for tritium inventory evaluation within each sub-system of the fuel cycle of ITER is vital with respect to both the process of licensing ITER and its operation. It is very likely that measurements of total tritium inventories will not be possible for all sub-systems; however, tritium accounting may be achieved by modeling its hold-up within each sub-system and by validating these models in real time against the monitored flows and tritium streams between the systems. To get reliable results, accurate dynamic modeling of the tritium content in each sub-system is necessary. In order to optimize the configuration and operation of the ITER fuel cycle, a dynamic fuel cycle model was developed progressively in the decade up to 2000-2001. As the designs of some sub-systems of the fuel cycle (i.e. vacuum pumping, Neutral Beam Injectors (NBI)) have substantially progressed in the meantime, a new code has been developed under a different platform to incorporate these modifications. The new code takes over the models and algorithms for some subsystems, such as the Isotope Separation System (ISS); where simplified models had previously been considered, more detailed ones have been introduced, as for the Water Detritiation System (WDS). To reflect all these changes, the new code developed by the EU participating team was named TRIMO (Tritium Inventory Modeling), to emphasize the use of the code for assessing the tritium inventory within ITER

  19. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing previously unknown challenges with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design, and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers and other key stakeholders.

  20. Comparison of Simulations and Offshore Measurement Data of a Combined Floating Wind and Wave Energy Demonstration Platform

    DEFF Research Database (Denmark)

    Yde, Anders; Larsen, Torben J.; Hansen, Anders Melchior

    2015-01-01

    In this paper, results from comparisons of simulations and measured offshore data from a floating combined wind and wave energy conversion system are presented. The numerical model of the platform is based on the aeroelastic code, HAWC2, developed by DTU Wind Energy, which is coupled with a special...... external system that reads the output generated directly by the wave analysis software WAMIT. The main focus of the comparison is on the statistical trends of the platform motion, mooring loads, and turbine loads in measurements and simulations during different operational conditions. Finally, challenges...

  1. Signal-Independent Timescale Analysis (SITA) and its Application for Neural Coding during Reaching and Walking

    Directory of Open Access Journals (Sweden)

    Miriam Zacksenhouse

    2014-08-01

    What are the relevant timescales of neural encoding in the brain? This question is commonly investigated with respect to well-defined stimuli or actions. However, neurons often encode multiple signals, including hidden or internal ones, which are not experimentally controlled and are thus excluded from such analysis. Here we consider all rate modulations as the signal, and define the rate-modulation signal-to-noise ratio (RM-SNR) as the ratio between the variance of the rate and the variance of the neuronal noise. As the bin-width increases, RM-SNR increases while the update rate decreases. This tradeoff is captured by the ratio of RM-SNR to bin-width, and its variation with the bin-width reveals the timescales of neural activity. Theoretical analysis and simulations elucidate how the interactions between the recovery properties of the unit and the spectral content of the encoded signals shape this ratio and determine the timescales of neural coding. The resulting signal-independent timescale analysis (SITA) is applied to investigate timescales of neural activity recorded from the motor cortex of monkeys during (i) reaching experiments with a Brain-Machine Interface (BMI) and (ii) locomotion experiments at different speeds. Interestingly, the timescales during BMI experiments did not change significantly with the control mode or training. During locomotion, the analysis identified units whose timescale varied consistently with the experimentally controlled speed of walking, though the specific timescale also reflected the recovery properties of the unit. Thus, the proposed method, SITA, characterizes the timescales of neural encoding and how they are affected by the motor task, while accounting for all rate modulations.
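The RM-SNR statistic can be sketched numerically. The following is a crude moment-based estimator (our assumption, not the paper's estimator): spike counts are binned at a given width, the Poisson noise variance is taken to equal the mean count, and the rate variance is estimated as the excess of the total count variance over that noise variance.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// Crude moment-based sketch of the rate-modulation SNR described above.
// Assumption (not the paper's estimator): given spike counts in 1-ms
// slots, bin them at 'binWidthMs', take the Poisson noise variance to
// equal the mean count per bin, and estimate the rate variance as the
// excess of the total count variance over that noise variance.
double rmSnr(const std::vector<int>& spikesPerMs, size_t binWidthMs) {
    std::vector<double> counts;
    for (size_t i = 0; i + binWidthMs <= spikesPerMs.size(); i += binWidthMs) {
        double c = 0.0;
        for (size_t j = 0; j < binWidthMs; ++j) c += spikesPerMs[i + j];
        counts.push_back(c);
    }
    double mean = 0.0;
    for (double c : counts) mean += c;
    mean /= counts.size();
    double var = 0.0;
    for (double c : counts) var += (c - mean) * (c - mean);
    var /= counts.size();
    if (mean <= 0.0) return 0.0;                 // no spikes, no signal
    return std::max(0.0, (var - mean) / mean);   // rate var / noise var
}
```

Dividing rmSnr by the bin width and scanning bin widths reproduces the tradeoff described above: wider bins raise the RM-SNR while lowering the update rate, and the peak of the ratio indicates the timescale.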

  2. Continuous Platform Development

    DEFF Research Database (Denmark)

    Nielsen, Ole Fiil

    low risks and investments but also with relatively fuzzy results. When looking for new platform projects, it is important to make sure that the company and market is ready for the introduction of platforms, and to make sure that people from marketing and sales, product development, and downstream......, but continuous product family evolution challenges this strategy. The concept of continuous platform development is based on the fact that platform development should not be a one-time experience but rather an ongoing process of developing new platforms and updating existing ones, so that product family...

  3. ALLIANCES: simulation platform for radioactive waste disposal

    International Nuclear Information System (INIS)

    Deville, E.; Montarnal, Ph.; Loth, L.; Chavant, C.

    2009-01-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES whose aim is to produce a tool for the simulation of nuclear waste storage and disposal. This type of simulations deals with highly coupled thermo-hydro-mechanical-chemical and radioactive (T-H-M-C-R) processes. ALLIANCES' aim is to accumulate within the same simulation environment the already acquired knowledge and to gradually integrate new knowledge. The current version of ALLIANCES contains the following modules: - Hydraulics and reactive transport in unsaturated and saturated media; - Multi-phase flow; - Mechanical thermal-hydraulics; - Thermo-Aeraulics; - Chemistry/Transport coupling in saturated media; - Alteration of waste package coupled with the environment; - Sensitivity analysis tools. The next releases will include more physical phenomena like: reactive transport in unsaturated flow and multicomponent multiphase flow; incorporation of responses surfaces in sensitivity analysis tools; integration of parallel numerical codes for flow and transport. Since the distribution of the first release of ALLIANCES (December 2003), the platform was used by ANDRA for his safety simulation program and by CEA for reactive transport simulations (migration of uranium in a soil, diffusion of different reactive species on laboratory samples, glass/iron/clay interaction). (authors)

  4. Bioinformatics on the Cloud Computing Platform Azure

    Science.gov (United States)

    Shanahan, Hugh P.; Owen, Anne M.; Harrison, Andrew P.

    2014-01-01

    We discuss the applicability of the Microsoft cloud computing platform, Azure, for bioinformatics. We focus on the usability of the resource rather than its performance. We provide an example of how R can be used on Azure to analyse a large amount of microarray expression data deposited at the public database ArrayExpress. We provide a walk through to demonstrate explicitly how Azure can be used to perform these analyses in Appendix S1 and we offer a comparison with a local computation. We note that the use of the Platform as a Service (PaaS) offering of Azure can represent a steep learning curve for bioinformatics developers who will usually have a Linux and scripting language background. On the other hand, the presence of an additional set of libraries makes it easier to deploy software in a parallel (scalable) fashion and explicitly manage such a production run with only a few hundred lines of code, most of which can be incorporated from a template. We propose that this environment is best suited for running stable bioinformatics software by users not involved with its development. PMID:25050811

  5. MARS 1.3 system analysis code coupling with CONTEMPT4/MOD5/PCCS containment analysis code using dynamic link library

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Lee, Won Jae

    1998-01-01

    The two independent codes MARS 1.3 and CONTEMPT4/MOD5/PCCS have been coupled using the dynamic link library (DLL) technique. The overall configuration of the code system is designed so that MARS is the main driver program, which uses CONTEMPT as associated routines. Using the Digital Visual Fortran compiler, a DLL was generated from the CONTEMPT source code with the interfacing routine names and arguments. Coupling of MARS with CONTEMPT was realized by calling the DLL routines at the appropriate step in the MARS code. Verification of the coupling was carried out for an LBLOCA transient of a typical plant design. It was found that the DLL technique is much more convenient than UNIX process control techniques and is effective on the Windows operating system. Since a DLL can be used by more than one application, and an application program can use many DLLs simultaneously, this technique would enable existing codes to be used more broadly by linking them with others
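The driver/DLL call pattern can be sketched with a function-pointer table standing in for the DLL interface. In the real coupling the pointers would be resolved at run time from the CONTEMPT DLL (on Windows, via LoadLibrary/GetProcAddress); here a toy stub keeps the sketch portable, and every name and number below is invented for illustration.

```cpp
#include <cassert>
#include <cmath>

// Illustrative driver/containment coupling through a function-pointer
// interface. In the actual coupling the pointers would be resolved at
// run time from the CONTEMPT DLL (LoadLibrary/GetProcAddress on
// Windows); a stub stands in here so the sketch is portable. All names
// and values are invented.
struct ContainmentApi {
    void   (*init)(double pressure0);
    void   (*advance)(double dt, double energyIn);
    double (*pressure)();
};

namespace stub {  // stand-in for the containment code
    double p = 0.0;
    void init(double p0) { p = p0; }
    void advance(double dt, double e) { p += 0.001 * e * dt; }  // toy model
    double pressure() { return p; }
}

// The driver calls the containment code at each step through the table,
// in the same way the system code calls the DLL routines at the
// appropriate step of its transient loop.
double runDriver(const ContainmentApi& api, int steps) {
    api.init(1.0e5);                    // initial containment pressure, Pa
    for (int i = 0; i < steps; ++i)
        api.advance(0.1, 5.0e4);        // feed break energy each step
    return api.pressure();
}
```

The same driver runs against any implementation filled into the table, which is the property that lets one application load several DLLs, or several applications share one DLL, without relinking.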

  6. MARS 1.3 system analysis code coupling with CONTEMPT4/MOD5/PCCS containment analysis code using dynamic link library

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Jeong, Jae Jun; Lee, Won Jae [KAERI, Taejon (Korea, Republic of)

    1998-10-01

    The two independent codes MARS 1.3 and CONTEMPT4/MOD5/PCCS have been coupled using the dynamic link library (DLL) technique. The overall configuration of the code system is designed so that MARS is the main driver program, which uses CONTEMPT as associated routines. Using the Digital Visual Fortran compiler, a DLL was generated from the CONTEMPT source code with the interfacing routine names and arguments. Coupling of MARS with CONTEMPT was realized by calling the DLL routines at the appropriate step in the MARS code. Verification of the coupling was carried out for an LBLOCA transient of a typical plant design. It was found that the DLL technique is much more convenient than UNIX process control techniques and is effective on the Windows operating system. Since a DLL can be used by more than one application, and an application program can use many DLLs simultaneously, this technique would enable existing codes to be used more broadly by linking them with others.

  7. Comparison and validation of the results of the AZNHEX v.1.0 code with the MCNP code simulating the core of a fast reactor cooled with sodium; Comparacion y validacion de los resultados del codigo AZNHEX v.1.0 con el codigo MCNP simulando el nucleo de un reactor rapido refrigerado con sodio

    Energy Technology Data Exchange (ETDEWEB)

    Galicia A, J.; Francois L, J. L.; Bastida O, G. E. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Ciudad Universitaria, 04510 Ciudad de Mexico (Mexico); Esquivel E, J., E-mail: blink19871@hotmail.com [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    The development of the AZTLAN platform for the analysis and design of nuclear reactors is led by the Instituto Nacional de Investigaciones Nucleares (ININ) and divided into four working groups, which have well-defined activities to achieve significant progress in this project both individually and jointly. Among these is the users group, whose main task is to use the codes that make up the AZTLAN platform and provide feedback to the developers, so that the final versions of the codes are efficient and, at the same time, reliable and easy to understand. In this paper we present the results provided by the AZNHEX v.1.0 code when simulating the core of a sodium-cooled fast reactor at steady state. The validation of these results is a fundamental part of the platform development and the responsibility of the users group, so in this research the results obtained with AZNHEX are compared and analyzed against those provided by the Monte Carlo code MCNP-5, software used and recognized worldwide. A description of the methodology used with MCNP-5 for the calculation of the variables of interest is also presented, along with the differences obtained with respect to the AZNHEX results. (Author)

  8. Performance Analysis of Faulty Gallager-B Decoding of QC-LDPC Codes with Applications

    Directory of Open Access Journals (Sweden)

    O. Al Rasheed

    2014-06-01

    In this paper we evaluate the performance of the Gallager-B algorithm, used for decoding low-density parity-check (LDPC) codes, under unreliable message computation. Our analysis is restricted to LDPC codes constructed from circulant matrices (QC-LDPC codes). Using Monte Carlo simulation we investigate the effects of different code parameters on coding system performance, under a binary symmetric communication channel and an independent transient-fault model. One possible application of the presented analysis to designing memory architectures with unreliable components is considered.
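A minimal hard-decision decoder in the spirit of Gallager-B can illustrate the decoding loop being analyzed. This sketch simplifies in two ways that should be flagged: it flips bits on the full syndrome rather than passing extrinsic messages, and its 4x6 parity-check matrix (the edge-vertex incidence of the complete graph K4) is a toy stand-in for the circulant QC-LDPC matrices studied in the paper.

```cpp
#include <cassert>
#include <vector>

// Toy hard-decision bit-flipping decoder in the spirit of Gallager-B:
// a bit is flipped when at least 'b' of its parity checks fail. The
// 4x6 parity-check matrix below is the edge-vertex incidence of the
// complete graph K4 (bits = edges, checks = vertices), a toy stand-in
// for the circulant-based QC-LDPC matrices in the paper. Any single
// error makes both checks of the erroneous bit fail while every other
// bit sees at most one failed check, so one error is always corrected.
const int H[4][6] = {
    {1, 1, 1, 0, 0, 0},   // vertex 0: edges (0,1),(0,2),(0,3)
    {1, 0, 0, 1, 1, 0},   // vertex 1: edges (0,1),(1,2),(1,3)
    {0, 1, 0, 1, 0, 1},   // vertex 2: edges (0,2),(1,2),(2,3)
    {0, 0, 1, 0, 1, 1},   // vertex 3: edges (0,3),(1,3),(2,3)
};

bool decode(std::vector<int>& word, int maxIter = 10, int b = 2) {
    for (int it = 0; it < maxIter; ++it) {
        int unsat[4];
        bool clean = true;
        for (int c = 0; c < 4; ++c) {          // evaluate each check
            int parity = 0;
            for (int v = 0; v < 6; ++v) parity ^= H[c][v] & word[v];
            unsat[c] = parity;
            if (parity) clean = false;
        }
        if (clean) return true;                // valid codeword reached
        for (int v = 0; v < 6; ++v) {          // flip heavily accused bits
            int fails = 0;
            for (int c = 0; c < 4; ++c) fails += H[c][v] & unsat[c];
            if (fails >= b) word[v] ^= 1;
        }
    }
    return false;                              // did not converge
}
```

The faulty-decoder analysis in the paper would perturb the message computations inside this loop; the sketch only shows the fault-free baseline.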

  9. European Validation of the Integral Code ASTEC (EVITA)

    International Nuclear Information System (INIS)

    Allelein, H.-J.; Neu, K.; Dorsselaere, J.P. Van

    2005-01-01

    The main objective of the European Validation of the Integral Code ASTEC (EVITA) project is to distribute the severe accident integral code ASTEC to European partners in order to apply the validation strategy issued from the VASA project (4th EC FWP). Partners evaluate the code capability through validation on reference experiments and plant applications accounting for severe accident management measures, and compare results with reference codes. The basis version V0 of ASTEC (Accident Source Term Evaluation Code), commonly developed and basically validated by GRS and IRSN, was made available in late 2000 to the EVITA partners on their individual platforms. Users' training was performed by IRSN and GRS, and the code's portability to different computers was checked to be correct. A 'hot line' assistance service was installed and is continuously available to EVITA code users. The current version, V1, was released to the EVITA partners at the end of June 2002. It allows simulation of the front-end phase with two new modules: - for the reactor coolant system, 2-phase simplified thermal hydraulics (5-equation approach) during both the front-end and core degradation phases; - for core degradation, based on the structure and main models of ICARE2 (IRSN), the reference mechanistic code for core degradation, and on other simplified models. The next priorities are clearly identified: code consolidation in order to increase robustness, extension of all plant applications beyond vessel lower head failure and coupling with fission product modules, and continuous improvement of users' tools. As EVITA has very successfully made the first step towards providing end-users (such as utilities, vendors and licensing authorities) with a well-validated European integral code for the simulation of severe accidents in NPPs, the EVITA partners strongly recommend continuing validation, benchmarking and application of ASTEC. This work will continue in the Severe Accident Research Network (SARNET) in the 6th Framework Programme

  10. Detected-jump-error-correcting quantum codes, quantum error designs, and quantum computation

    International Nuclear Information System (INIS)

    Alber, G.; Mussinger, M.; Beth, Th.; Charnes, Ch.; Delgado, A.; Grassl, M.

    2003-01-01

    The recently introduced detected-jump-correcting quantum codes are capable of stabilizing qubit systems against spontaneous decay processes arising from couplings to statistically independent reservoirs. These embedded quantum codes exploit classical information about which qubit has emitted spontaneously and correspond to an active error-correcting code embedded in a passive error-correcting code. The construction of a family of one-detected-jump-error-correcting quantum codes is shown and the optimal redundancy, encoding, and recovery as well as general properties of detected-jump-error-correcting quantum codes are discussed. By the use of design theory, multiple-jump-error-correcting quantum codes can be constructed. The performance of one-jump-error-correcting quantum codes under nonideal conditions is studied numerically by simulating a quantum memory and Grover's algorithm

  11. SCAMPI: A code package for cross-section processing

    International Nuclear Information System (INIS)

    Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.

    1996-01-01

    The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to facilitate user needs for the preparation of problem-specific, multigroup cross-section libraries. The function of each module contained in the SCAMPI code package is discussed, along with illustrations of their use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis

  12. SCAMPI: A code package for cross-section processing

    Energy Technology Data Exchange (ETDEWEB)

    Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.

    1996-04-01

    The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to facilitate user needs for the preparation of problem-specific, multigroup cross-section libraries. The function of each module contained in the SCAMPI code package is discussed, along with illustrations of their use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis.

  13. Qualification of FEAST 3.0 and FEAT 4.0 computer codes

    International Nuclear Information System (INIS)

    Xu, Z.; Lai, L.; Sim, K.-S.; Huang, F.; Wong, B.

    2005-01-01

    FEAST (Finite Element Analysis for Stresses) is an AECL computer code used to assess the structural integrity of CANDU fuel elements. FEAST models the thermo-elastic, thermo-elasto-plastic and creep deformations in CANDU fuel. FEAT (Finite Element Analysis for Temperature) is another AECL computer code, used to assess the thermal integrity of fuel elements. FEAT models the steady-state and transient heat flows in CANDU fuel under conditions such as flux depression, end-flux peaking, temperature-dependent thermal conductivity, and non-uniform time-dependent boundary conditions. Both computer programs are used in design and qualification analyses of CANDU fuel. Formal qualification (including coding verification and validation) of both codes was performed in accordance with the AECL software quality assurance (SQA) manual and procedures, which are consistent with CSA N286.7-99. Validation of FEAST 3.0 shows very good agreement with independent analytical solutions and measurements. Validation of FEAT 4.0 also shows very good agreement with independent WIMS-AECL calculations, analytical solutions, ANSYS calculations and measurements. (author)

  14. Qualification of FEAST 3.0 and FEAT 4.0 computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Z.; Lai, L.; Sim, K.-S.; Huang, F.; Wong, B. [Atomic Energy of Canada Limited, Mississauga, Ontario (Canada)

    2005-07-01

    FEAST (Finite Element Analysis for Stresses) is an AECL computer code used to assess the structural integrity of CANDU fuel elements. FEAST models the thermo-elastic, thermo-elasto-plastic and creep deformations in CANDU fuel. FEAT (Finite Element Analysis for Temperature) is another AECL computer code, used to assess the thermal integrity of fuel elements. FEAT models the steady-state and transient heat flows in CANDU fuel under conditions such as flux depression, end-flux peaking, temperature-dependent thermal conductivity, and non-uniform time-dependent boundary conditions. Both computer programs are used in design and qualification analyses of CANDU fuel. Formal qualification (including coding verification and validation) of both codes was performed in accordance with the AECL software quality assurance (SQA) manual and procedures, which are consistent with CSA N286.7-99. Validation of FEAST 3.0 shows very good agreement with independent analytical solutions and measurements. Validation of FEAT 4.0 also shows very good agreement with independent WIMS-AECL calculations, analytical solutions, ANSYS calculations and measurements. (author)

  15. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  16. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  17. A multiplexed miRNA and transgene expression platform for simultaneous repression and expression of protein coding sequences.

    Science.gov (United States)

    Seyhan, Attila A

    2016-01-01

    Knockdown of single or multiple gene targets by RNA interference (RNAi) is necessary to overcome escape mutants or isoform redundancy, and multiple RNAi reagents are needed to knock down multiple targets. It is also desirable to express a transgene or positive regulatory elements and inhibit a target gene in a coordinated fashion. This study reports a flexible multiplexed RNAi and transgene platform using endogenous intronic primary microRNAs (pri-miRNAs) as a scaffold located in the green fluorescent protein (GFP) gene as a model for any functional transgene. The multiplexed intronic miRNA-GFP transgene platform was designed to co-express multiple small RNAs within a polycistronic cluster from a Pol II promoter at moderate levels to reduce potential vector toxicity. The native intronic miRNAs are co-transcribed with a precursor GFP mRNA as a single transcript and presumably cleaved out of the precursor (pre-)mRNA by the RNA splicing machinery, the spliceosome. The spliced intron with miRNA hairpins is further processed into mature miRNAs or small interfering RNAs (siRNAs) capable of triggering RNAi effects, while the ligated exons become a mature messenger RNA for the translation of the functional GFP protein. Data show that this approach led to robust RNAi-mediated silencing of multiple Renilla luciferase (R-Luc)-tagged target genes and coordinated expression of functional GFP from a single transcript in transiently transfected HeLa cells. The results demonstrate that this design facilitates the coordinated expression of all mature miRNAs, either individually or as multiple miRNAs, together with the associated protein. The data suggest that it is possible to simultaneously deliver multiple negative (miRNA or shRNA) and positive (transgene) regulatory elements. Because many cellular processes require simultaneous repression and activation of downstream pathways, this approach offers a platform technology to achieve that dual manipulation efficiently.

  18. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty

    International Nuclear Information System (INIS)

    Borges, Ronaldo Celem

    2001-10-01

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the internal uncertainty evaluation process with a thermal-hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code, which allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. Results from this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  19. High-Fidelity Coding with Correlated Neurons

    Science.gov (United States)

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463
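The 'lock-in' effect can be illustrated with a two-neuron toy example (our construction, not the paper's model): anticorrelated responses make the population sum deterministic under one stimulus, so a sum-based readout discriminates perfectly even though each neuron alone is at chance.

```cpp
#include <cassert>

// Toy illustration (not from the paper) of correlations aiding coding.
// Two binary neurons each fire with probability 0.5 under either
// stimulus, so any single-neuron readout is at chance. Under stimulus A
// the responses are perfectly anticorrelated (patterns 01 or 10, sum
// always 1); under stimulus B they are perfectly correlated (00 or 11,
// sum 0 or 2). The population sum then discriminates without error:
// the correlation pattern has 'locked in' the response in the
// stimulus-relevant subspace (the sum), eliminating its variability.
int classifyBySum(int r0, int r1) {
    return (r0 + r1 == 1) ? 'A' : 'B';
}

// Enumerate every response pattern each stimulus can emit and count
// readout errors; with this correlation structure the error is zero.
int discriminationErrors() {
    int errors = 0;
    const int aPatterns[2][2] = { {0, 1}, {1, 0} };  // stimulus A support
    const int bPatterns[2][2] = { {0, 0}, {1, 1} };  // stimulus B support
    for (auto& p : aPatterns)
        if (classifyBySum(p[0], p[1]) != 'A') ++errors;
    for (auto& p : bPatterns)
        if (classifyBySum(p[0], p[1]) != 'B') ++errors;
    return errors;
}
```

With independent neurons at the same marginal firing probabilities, the two stimuli would induce identical response distributions and the error would be at chance, which is the contrast the abstract's astronomical error-suppression factors generalize.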

  20. A New Cyber-enabled Platform for Scale-independent Interoperability of Earth Observations with Hydrologic Models

    Science.gov (United States)

    Rajib, A.; Zhao, L.; Merwade, V.; Shin, J.; Smith, J.; Song, C. X.

    2017-12-01

    Despite their significant potential, remotely sensed earth observations are still not used to their full extent in water resources research, management and education. Inconsistent storage structures, data formats and spatial resolutions among different platforms/sources of earth observations hinder the use of these data. Available web services help with bulk data download and visualization, but they are not sufficiently tailored to the degree of interoperability required for direct application of earth observations in hydrologic modeling at user-defined spatio-temporal scales. Similarly, educators and watershed managers would ideally obtain a time series for any watershed of interest instantly, without spending time and computational resources on data download and post-processing. To address this issue, an open-access online platform named HydroGlobe has been developed that minimizes these processing tasks and delivers ready-to-use data from different earth observation sources. HydroGlobe provides spatially averaged time series of earth observations from the following inputs: (i) data source, (ii) temporal extent in the form of a start/end date, and (iii) geographic units (e.g., grid cell or sub-basin boundary) and extent in the form of a GIS shapefile. In its preliminary version, HydroGlobe simultaneously handles five data sources, including surface and root-zone soil moisture from SMAP (Soil Moisture Active Passive Mission), actual and potential evapotranspiration from MODIS (Moderate Resolution Imaging Spectroradiometer), and precipitation from GPM (Global Precipitation Measurement). This presentation will demonstrate the HydroGlobe interface and its applicability using a few test cases on watersheds from different parts of the globe.
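    The central operation described, averaging a gridded field over a user-supplied polygon boundary for each time step, can be sketched in a few lines. This is an illustrative reconstruction, not HydroGlobe's actual code: the function names are hypothetical and a ray-casting point-in-polygon test stands in for real shapefile handling.

```python
import numpy as np

def point_in_polygon(x, y, poly):
    """Ray-casting test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def spatial_average_series(field, lons, lats, poly):
    """Average a (time, lat, lon) field over the grid cells whose centers
    fall inside the polygon; returns one value per time step."""
    mask = np.array([[point_in_polygon(lon, lat, poly) for lon in lons]
                     for lat in lats])
    return field[:, mask].mean(axis=1)

# Toy example: 3 time steps on a 4x4 grid, averaged over the unit square.
lons = np.array([0.25, 0.75, 1.25, 1.75])
lats = np.array([0.25, 0.75, 1.25, 1.75])
field = np.arange(3 * 4 * 4, dtype=float).reshape(3, 4, 4)
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(spatial_average_series(field, lons, lats, square))
```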

  1. A New profiling and pipelining approach for HEVC Decoder on ZedBoard Platform

    Directory of Open Access Journals (Sweden)

    Habib Smei

    2017-10-01

    Full Text Available New multimedia applications such as mobile video, high-quality Internet video and digital television require high-performance encoding of video signals to meet technical constraints such as runtime, bandwidth and latency. The H.265/HEVC (High Efficiency Video Coding) standard was developed by JCT-VC to replace the MPEG-2, MPEG-4 and H.264 codecs and to respond to these new functional constraints. Several implementations of this standard currently exist: some are based on software acceleration techniques, others on purely hardware acceleration, and some combine the two. Software implementations use several techniques to decrease video coding and decoding time, including data parallelism, task parallelism and combined solutions. In addition, to fulfill the computational demands of the new standard, HEVC includes several coding tools that allow each picture to be divided into partitions that can be processed in parallel, without degrading either the quality or the bitrate. In this paper, we adapt one of these approaches, the Tile coding tool, to propose a pipelined execution of the HEVC/H.265 decoder in its HM Test Model version. This approach is based on fine-grained profiling using code-injection techniques supported by standard profiling tools such as Gprof and Valgrind. Profiling allowed us to divide functions into four groups according to three criteria: the first is minimizing communication between the function groups, so that intergroup communication is minimal and intragroup communication maximal; the second is load balancing between processors; the third is parallelism between functions. The experiments in this paper use the ZedBoard platform, which integrates a Xilinx Zynq chip with a dual-core ARM A9. We start with a purely
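    The load-balancing criterion for grouping profiled functions can be sketched as a greedy partition: assign the next-heaviest function to the currently lightest group. The function names and self-times below are hypothetical stand-ins for Gprof output; a real grouping would also weigh inter-function communication and parallelism, as the abstract notes.

```python
# Hypothetical Gprof self-times (seconds) for decoder functions.
profile = {
    "motion_comp": 42.0, "intra_pred": 17.0, "inv_transform": 15.0,
    "deblock": 12.0, "sao": 8.0, "entropy_decode": 30.0,
    "ref_pic_mgmt": 3.0, "bitstream_parse": 9.0,
}

def partition(profile, n_groups=4):
    """Greedy load balancing: heaviest functions first, each assigned
    to the group with the smallest accumulated load so far."""
    groups = [[] for _ in range(n_groups)]
    loads = [0.0] * n_groups
    for name, t in sorted(profile.items(), key=lambda kv: -kv[1]):
        i = loads.index(min(loads))   # lightest group so far
        groups[i].append(name)
        loads[i] += t
    return groups, loads

groups, loads = partition(profile)
for g, l in zip(groups, loads):
    print(f"{l:5.1f}s  {g}")
```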

  2. Avian leukosis virus is a versatile eukaryotic platform for polypeptide display

    International Nuclear Information System (INIS)

    Khare, Pranay D.; Russell, Stephen J.; Federspiel, Mark J.

    2003-01-01

    Display technology refers to methods of generating libraries of modularly coded biomolecules and screening them for particular properties. Retroviruses are good candidates to be a eukaryotic viral platform for the display of polypeptides synthesized in eukaryotic cells. Here we demonstrate that avian leukosis virus (ALV) provides an ideal platform for display of nonviral polypeptides expressed in a eukaryotic cell substrate. Different sizes of polypeptides were genetically fused to the extreme N-terminus of the ALV envelope glycoprotein in an ALV infectious clone containing an alkaline phosphatase reporter gene. The chimeric envelope glycoproteins were efficiently incorporated into virions and were stably displayed on the surface of the virions through multiple virus replication cycles. The foreign polypeptides did not interfere with the attachment and entry functions of the underlying ALV envelope glycoproteins. The displayed polypeptides were fully functional and could efficiently mediate attachment of the recombinant viruses to their respective cognate receptors. This study demonstrates that ALV is an ideal display platform for the generation and selection of libraries of polypeptides where there is a need for expression, folding, and posttranslational modification in the endoplasmic reticulum of eukaryotic cells.

  3. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms such as Grids, which are usually based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework, as well as some novel techniques, based on standard Grid protocols, which we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  4. OpenVX-based Python Framework for real-time cross platform acceleration of embedded computer vision applications

    Directory of Open Access Journals (Sweden)

    Ori Heimlich

    2016-11-01

    Full Text Available Embedded real-time vision applications are being rapidly deployed in a large realm of consumer electronics, ranging from automotive safety to surveillance systems. However, the relatively limited computational power of embedded platforms is considered a bottleneck for many vision applications, necessitating optimization. OpenVX is a standardized interface, released in late 2014, that attempts to provide both system- and kernel-level optimization to vision applications. With OpenVX, vision processing is modeled with coarse-grained dataflow graphs, which can be optimized and accelerated by the platform implementer. Current full implementations of OpenVX are given in the programming language C, which does not support advanced programming paradigms such as object-oriented, imperative and functional programming, nor does it provide run-time type checking. Here we present a Python-based full implementation of OpenVX, which eliminates much of the discrepancy between the object-oriented paradigm used by many modern applications and the native C implementations. Our open-source implementation can be used for rapid development of OpenVX applications on embedded platforms. Demonstration includes static and real-time image acquisition and processing using a Raspberry Pi and a GoPro camera. Code is given as supplementary information. Code project and linked deployable virtual machine are located on GitHub: https://github.com/NBEL-lab/PythonOpenVX.
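    The coarse-grained dataflow-graph model at the heart of OpenVX can be sketched as a tiny Python graph of kernels that is verified and then executed. This is a minimal illustration of the idea, not the real OpenVX API; all class and kernel names are hypothetical.

```python
class Node:
    def __init__(self, name, fn, inputs):
        self.name, self.fn, self.inputs = name, fn, list(inputs)

class Graph:
    """Coarse-grained dataflow graph: kernels are verified (checked for
    dangling inputs) and then executed in the order they were added.
    A platform implementer could fuse or reorder nodes at verify time."""
    def __init__(self):
        self.nodes = []

    def add(self, name, fn, inputs=()):
        self.nodes.append(Node(name, fn, inputs))

    def verify(self):
        produced = {n.name for n in self.nodes}
        for n in self.nodes:
            for i in n.inputs:
                if i != "input" and i not in produced:
                    raise ValueError(f"dangling input {i!r} of {n.name}")

    def process(self, image):
        self.verify()
        data = {"input": image}
        for n in self.nodes:          # nodes added in topological order
            data[n.name] = n.fn(*(data[i] for i in n.inputs))
        return data[self.nodes[-1].name]

# Two-kernel pipeline: RGB-to-gray followed by thresholding.
g = Graph()
g.add("gray", lambda img: [sum(px) // 3 for px in img], ["input"])
g.add("thresh", lambda img: [255 if v > 127 else 0 for v in img], ["gray"])
print(g.process([(200, 200, 200), (10, 10, 10)]))  # → [255, 0]
```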

  5. A 3D Monte Carlo code for plasma transport in island divertors

    International Nuclear Information System (INIS)

    Feng, Y.; Sardei, F.; Kisslinger, J.; Grigull, P.

    1997-01-01

    A fully 3D self-consistent Monte Carlo code EMC3 (edge Monte Carlo 3D) for modelling the plasma transport in island divertors has been developed. In a first step, the code solves a simplified version of the 3D time-independent plasma fluid equations. Coupled to the neutral transport code EIRENE, the EMC3 code has been used to study the particle, energy and neutral transport in W7-AS island divertor configurations. First results are compared with data from different diagnostics (Langmuir probes, H α cameras and thermography). (orig.)

  6. Hybrid coded aperture and Compton imaging using an active mask

    International Nuclear Information System (INIS)

    Schultz, L.J.; Wallace, M.S.; Galassi, M.C.; Hoover, A.S.; Mocko, M.; Palmer, D.M.; Tornga, S.R.; Kippen, R.M.; Hynes, M.V.; Toolin, M.J.; Harris, B.; McElroy, J.E.; Wakeford, D.; Lanza, R.C.; Horn, B.K.P.; Wehe, D.K.

    2009-01-01

    The trimodal imager (TMI) images gamma-ray sources from a mobile platform using both coded aperture (CA) and Compton imaging (CI) modalities. In this paper we will discuss development and performance of image reconstruction algorithms for the TMI. In order to develop algorithms in parallel with detector hardware we are using a GEANT4 [J. Allison, K. Amako, J. Apostolakis, H. Araujo, P.A. Dubois, M. Asai, G. Barrand, R. Capra, S. Chauvie, R. Chytracek, G. Cirrone, G. Cooperman, G. Cosmo, G. Cuttone, G. Daquino, et al., IEEE Trans. Nucl. Sci. NS-53 (1) (2006) 270] based simulation package to produce realistic data sets for code development. The simulation code incorporates detailed detector modeling, contributions from natural background radiation, and validation of simulation results against measured data. Maximum likelihood algorithms for both imaging methods are discussed, as well as a hybrid imaging algorithm wherein CA and CI information is fused to generate a higher fidelity reconstruction.
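    The fusion of CA and CI data in a single maximum-likelihood reconstruction can be sketched with the classic ML-EM iteration, stacking a system matrix per modality into one joint forward model. This is a toy illustration under assumed 2-pixel geometry; the matrices and variable names are hypothetical, not the TMI's actual response functions.

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """Maximum-likelihood EM for Poisson emission data:
    lambda <- lambda * (A^T (y / A lambda)) / (A^T 1)."""
    lam = np.ones(A.shape[1])
    sens = A.sum(axis=0)                     # A^T 1
    for _ in range(n_iter):
        proj = A @ lam
        lam *= (A.T @ (y / np.maximum(proj, 1e-12))) / sens
    return lam

# Toy two-pixel source; each modality contributes a (hypothetical)
# system matrix, fused simply by stacking rows.
A_ca = np.array([[0.8, 0.1], [0.1, 0.8]])   # coded-aperture response
A_ci = np.array([[0.4, 0.3], [0.2, 0.5]])   # Compton-imaging response
A = np.vstack([A_ca, A_ci])
x_true = np.array([5.0, 2.0])
y = A @ x_true                               # noise-free measurements
x_hat = mlem(A, y)
print(x_hat)
```

With noise-free, identifiable data the iteration converges to the true source; real data would of course include Poisson noise and background.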

  7. ScaMo: Realisation of an OO-functional DSL for cross platform mobile applications development

    Science.gov (United States)

    Macos, Dragan; Solymosi, Andreas

    2013-10-01

    The software market is dynamically changing: the Internet is going mobile, the software applications are shifting from the desktop hardware onto the mobile devices. The largest markets are the mobile applications for iOS, Android and Windows Phone and for the purpose the typical programming languages include Objective-C, Java and C ♯. The realization of the native applications implies the integration of the developed software into the environments of mentioned mobile operating systems to enable the access to different hardware components of the devices: GPS module, display, GSM module, etc. This paper deals with the definition and possible implementation of an environment for the automatic application generation for multiple mobile platforms. It is based on a DSL for mobile application development, which includes the programming language Scala and a DSL defined in Scala. As part of a multi-stage cross-compiling algorithm, this language is translated into the language of the affected mobile platform. The advantage of our method lies in the expressiveness of the defined language and the transparent source code translation between different languages, which implies, for example, the advantages of debugging and development of the generated code.

  8. Japanese national project for establishment of codes and standards for stationary PEFC system

    International Nuclear Information System (INIS)

    Sumi, S.; Ohmura, T.; Yamaguchi, R.; Kikuzawa, H.

    2003-01-01

    For the purpose of practical utilization of the PEFC cogeneration system, we are promoting the national projects of the 'Establishment of Codes and Standards for Stationary PEFC System'. The objective is to prepare the software platforms for wide spreading use, which are required in the introduction stage of the PEFC cogeneration systems, such as code and standards for safety, reliability, performance and so on. For this objective, using test samples of the systems and the stacks, developments of test and evaluation devices, collection of various kinds of data and establishment of test and evaluation methods are under way. (author)

  9. WCDMA Uplink Interference Assessment from Multiple High Altitude Platform Configurations

    Directory of Open Access Journals (Sweden)

    Grace D

    2008-01-01

    Full Text Available We investigate the possibility of multiple high altitude platform (HAP) coverage of a common cell area using a wideband code division multiple access (WCDMA) system. In particular, we study the uplink performance of the system. The results show that, depending on the traffic demand and the type of service used, there is a possibility of deploying 3–6 HAPs covering the same cell area. The results also show the effect of cell radius on performance and the positioning of the multiple HAP base stations that gives the worst performance.

  10. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.

  11. PHoToNs–A parallel heterogeneous and threads oriented code for cosmological N-body simulation

    Science.gov (United States)

    Wang, Qiao; Cao, Zong-Yan; Gao, Liang; Chi, Xue-Bin; Meng, Chen; Wang, Jie; Wang, Long

    2018-06-01

    We introduce a new code for cosmological simulations, PHoToNs, which incorporates features for performing massive cosmological simulations on heterogeneous high-performance computing (HPC) systems with thread-oriented programming. PHoToNs adopts a hybrid scheme to compute the gravitational force, with the conventional Particle-Mesh (PM) algorithm to compute the long-range force, the Tree algorithm to compute the short-range force and the direct-summation Particle-Particle (PP) algorithm to compute gravity from very close particles. A self-similar space-filling Peano-Hilbert curve is used to decompose the computing domain. Thread programming is used to flexibly manage domain communication, PM calculation and synchronization, as well as the Dual Tree Traversal on the CPU+MIC platform. PHoToNs scales well, and the efficiency of the PP kernel reaches 68.6% of peak performance on MIC and 74.4% on CPU platforms. We also test the accuracy of the code against the widely used Gadget-2 and find excellent agreement.
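    The PM/Tree force split underlying such hybrid schemes can be illustrated analytically: a unit 1/r^2 force is divided into an erfc-damped short-range part (handled by the Tree/PP side) and a smooth long-range complement (handled by the PM grid), which sum exactly back to the total. This is a generic Ewald-style split, shown here as a sketch rather than PHoToNs' actual kernel; the splitting scale alpha is illustrative.

```python
import math

def force_short(r, alpha):
    """Short-range part of a unit 1/r^2 force, damped by erfc; in a
    hybrid solver this is what the Tree/PP side computes."""
    return (math.erfc(alpha * r)
            + 2 * alpha * r / math.sqrt(math.pi) * math.exp(-(alpha * r) ** 2)
            ) / r ** 2

def force_long(r, alpha):
    """Smooth long-range complement; in a real code this comes from
    the PM grid via FFTs."""
    return (math.erf(alpha * r)
            - 2 * alpha * r / math.sqrt(math.pi) * math.exp(-(alpha * r) ** 2)
            ) / r ** 2

alpha = 0.5
for r in (0.5, 2.0, 8.0):
    s, l = force_short(r, alpha), force_long(r, alpha)
    print(f"r={r:4.1f}  short={s:.4g}  long={l:.4g}  sum={s + l:.4g}  1/r^2={1 / r ** 2:.4g}")
```

Since erf + erfc = 1 and the Gaussian terms cancel, the two parts always sum to 1/r^2: the short-range term dominates at small r and the long-range term at large r.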

  12. Role of independent director in corporate governance – Reference to India

    Directory of Open Access Journals (Sweden)

    Indrajit Dube

    2013-01-01

    Full Text Available A company is a common platform for various stakeholders, such as customers, employees, investors and shareholders. It is an instrument that can attract huge capital for doing business. Every transaction in a company should be fair and transparent to its stakeholders. A company having good Corporate Governance and an effective Board of Directors attracts investors and ensures investment. Independence of the Board is critical to ensure that the Board fulfills its role objectively and holds the management accountable to the company. Practice across jurisdictions indicates that the presence of Independent Directors is the answer to that. The present write-up delves into the current scenario in the Indian corporate sector and examines the role of the Independent Director in Corporate Governance in particular.

  13. KubeNow: A Cloud Agnostic Platform for Microservice-Oriented Applications

    OpenAIRE

    Capuccini, Marco; Larsson, Anders; Toor, Salman; Spjuth, Ola

    2018-01-01

    KubeNow is a platform for rapid and continuous deployment of microservice-based applications over cloud infrastructure. Within the field of software engineering, the microservice-based architecture is a methodology in which complex applications are divided into smaller, more narrow services. These services are independently deployable and compatible with each other like building blocks. These blocks can be combined in multiple ways, according to specific use cases. Microservices are designed ...

  14. Mobile platform security

    CERN Document Server

    Asokan, N; Dmitrienko, Alexandra

    2013-01-01

    Recently, mobile security has garnered considerable interest in both the research community and industry due to the popularity of smartphones. The current smartphone platforms are open systems that allow application development, also for malicious parties. To protect the mobile device, its user, and other mobile ecosystem stakeholders such as network operators, application execution is controlled by a platform security architecture. This book explores how such mobile platform security architectures work. We present a generic model for mobile platform security architectures: the model illustrat

  15. NARMER-1: a photon point-kernel code with build-up factors

    Science.gov (United States)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

    This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction covering the history and current development context of the code, the paper presents the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, source description, etc. Moreover, specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, parallel operations. Then some points about verification and validation are presented. Finally, we present some tools that help the user with operations such as visualization and pre-treatment.
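    The point-kernel principle such a code implements fits in a few lines: the uncollided flux from a point source is attenuated exponentially and corrected by a build-up factor accounting for scattered photons. The sketch below uses a simple linear build-up form with illustrative constants; production codes like NARMER-1 tabulate or fit build-up factors rather than using this toy form.

```python
import math

def point_kernel_flux(S, r, mu, a=1.0):
    """Photon flux at distance r (cm) from an isotropic point source of
    strength S (photons/s) in a medium with attenuation coefficient mu
    (1/cm), using an illustrative linear build-up factor B = 1 + a*mu*r."""
    mfp = mu * r                          # number of mean free paths
    buildup = 1.0 + a * mfp               # B >= 1: scattered contribution
    return S * buildup * math.exp(-mfp) / (4.0 * math.pi * r ** 2)

# Flux falls off with distance despite the growing build-up factor.
for r in (10.0, 50.0, 100.0):
    print(f"r={r:5.1f} cm  flux={point_kernel_flux(1e9, r, 0.06):.3e}")
```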

  16. Developing a Web Platform to Support a Community of Practice: A Mixed Methods Study in Pediatric Physiotherapy.

    Science.gov (United States)

    Pratte, Gabrielle; Hurtubise, Karen; Rivard, Lisa; Berbari, Jade; Camden, Chantal

    2018-01-01

    Web platforms are increasingly used to support virtual interactions between members of communities of practice (CoP). However, little is known about how to develop these platforms to support the implementation of best practices for health care professionals. The aim of this article is to explore pediatric physiotherapists' (PTs) perspectives regarding the utility and usability of the characteristics of a web platform developed to support virtual communities of practice (vCoP). This study adopted an explanatory sequential mixed methods design. A web platform supporting the interactions of vCoP members was developed for PTs working with children with developmental coordination disorder. Specific strategies and features were created to support the effectiveness of the platform across three domains: social, information-quality, and system-quality factors. Quantitative data were collected from a cross-sectional survey (n = 41) after 5 months of access to the web platform. Descriptive statistics were calculated. Qualitative data were also collected from semistructured interviews (n = 9), which were coded, interpreted, and analyzed by using Boucher's Web Ergonomics Conceptual Framework. The utility of web platform characteristics targeting the three key domain factors was generally perceived positively by PTs. However, web platform usability issues were noted by PTs, including problems with navigation and information retrieval. Web platforms aiming to support vCoPs should be carefully developed to target potential users' needs. Whenever possible, users should co-construct the web platform with vCoP developers. Moreover, each of the developed characteristics (eg, newsletter, search function) should be evaluated in terms of utility and usability for the users.

  17. Lattice Boltzmann Simulation Optimization on Leading Multicore Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel; Carter, Jonathan; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2008-02-01

    We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Clovertown, AMD Opteron X2, Sun Niagara2, STI Cell, as well as the single core Intel Itanium2. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 14x improvement compared with the original code. Additionally, we present detailed analysis of each optimization, which reveals surprising hardware bottlenecks and software challenges for future multicore systems and applications.
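    The auto-tuning idea can be illustrated in miniature: enumerate candidate variants of a kernel (here, block sizes for a cache-blocked transpose), time each on the running hardware, and keep the fastest. This is a sketch of the search loop only; a real tuner such as the one described generates specialized code for each candidate.

```python
import time

def transpose_blocked(a, n, bs):
    """Cache-blocked transpose of an n*n matrix stored as a flat list."""
    out = [0.0] * (n * n)
    for ii in range(0, n, bs):
        for jj in range(0, n, bs):
            for i in range(ii, min(ii + bs, n)):
                for j in range(jj, min(jj + bs, n)):
                    out[j * n + i] = a[i * n + j]
    return out

def autotune(n=256, candidates=(8, 16, 32, 64, 128)):
    """Time each block size on this machine and return the fastest."""
    a = [float(i) for i in range(n * n)]
    timings = {}
    for bs in candidates:
        t0 = time.perf_counter()
        transpose_blocked(a, n, bs)
        timings[bs] = time.perf_counter() - t0
    return min(timings, key=timings.get), timings

best, timings = autotune()
print("best block size on this platform:", best)
```

The winning block size genuinely varies with cache hierarchy, which is exactly why the paper searches per platform instead of hand-tuning once.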

  18. Lattice Boltzmann simulation optimization on leading multicore platforms

    Energy Technology Data Exchange (ETDEWEB)

    Williams, S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Carter, J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Shalf, J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yelick, K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States)

    2008-01-01

    We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Clovertown, AMD Opteron X2, Sun Niagara2, STI Cell, as well as the single core Intel Itanium2. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 14x improvement compared with the original code. Additionally, we present detailed analysis of each optimization, which reveals surprising hardware bottlenecks and software challenges for future multicore systems and applications.

  19. Verification of the MOTIF code version 3.0

    International Nuclear Information System (INIS)

    Chan, T.; Guvanasen, V.; Nakka, B.W.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.

    1996-12-01

    As part of the Canadian Nuclear Fuel Waste Management Program (CNFWMP), AECL has developed a three-dimensional finite-element code, MOTIF (Model Of Transport In Fractured/porous media), for detailed modelling of groundwater flow, heat transport and solute transport in a fractured rock mass. The code solves the transient and steady-state equations of groundwater flow, solute (including one-species radionuclide) transport, and heat transport in variably saturated fractured/porous media. The initial development was completed in 1985 (Guvanasen 1985) and version 3.0 was completed in 1986. This version is documented in detail in Guvanasen and Chan (in preparation). This report describes a series of fourteen verification cases which have been used to test the numerical solution techniques and coding of MOTIF, as well as to demonstrate some of the MOTIF analysis capabilities. For each case the MOTIF solution has been compared with a corresponding analytical or independently developed alternative numerical solution. Several of the verification cases were included in Level 1 of the International Hydrologic Code Intercomparison Project (HYDROCOIN). The MOTIF results for these cases were also described in the HYDROCOIN Secretariat's compilation and comparison of results submitted by the various project teams (Swedish Nuclear Power Inspectorate 1988). It is evident from the graphical comparisons presented that the MOTIF solutions for the fourteen verification cases are generally in excellent agreement with known analytical or numerical solutions obtained from independent sources. This series of verification studies has established the ability of the MOTIF finite-element code to accurately model the groundwater flow and solute and heat transport phenomena for which it is intended. (author). 20 refs., 14 tabs., 32 figs

  20. Smoothing-Based Relative Navigation and Coded Aperture Imaging

    Science.gov (United States)

    Saenz-Otero, Alvar; Liebe, Carl Christian; Hunter, Roger C.; Baker, Christopher

    2017-01-01

    This project will develop efficient smoothing software for incremental estimation of the relative poses and velocities between multiple small spacecraft in a formation, and a small, long-range depth sensor based on coded aperture imaging that is capable of identifying other spacecraft in the formation. The smoothing algorithm will obtain the maximum a posteriori estimate of the relative poses between the spacecraft by using all available sensor information in the spacecraft formation. This algorithm will be portable between different satellite platforms that possess different sensor suites and computational capabilities, and will be adaptable in the case that one or more satellites in the formation become inoperable. It will obtain a solution that approaches an exact solution, as opposed to one with the linearization approximation that is typical of filtering algorithms. Thus, the algorithms developed and demonstrated as part of this program will enhance the applicability of small spacecraft to multi-platform operations, such as precisely aligned constellations and fractionated satellite systems.
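    The contrast between smoothing and filtering can be shown with a toy 1D pose chain: a prior on the first pose plus relative-motion measurements, stacked into one linear system and solved jointly over all states, which is what a smoother does. This is a hypothetical scalar sketch, not the project's actual (nonlinear, 6-DOF) estimator.

```python
import numpy as np

def smooth_chain(z_rel, prior):
    """MAP estimate of n scalar 'poses' from a prior on pose 0 and
    relative measurements z_rel[k] ~ x[k+1] - x[k] (unit noise).
    All measurements are stacked into one linear system J x = r and
    solved jointly, rather than processed sequentially as a filter would."""
    n = len(z_rel) + 1
    J = np.zeros((n, n))
    r = np.zeros(n)
    J[0, 0] = 1.0                # prior factor on the first pose
    r[0] = prior
    for k, z in enumerate(z_rel):
        J[k + 1, k] = -1.0       # odometry factor: x[k+1] - x[k] = z
        J[k + 1, k + 1] = 1.0
        r[k + 1] = z
    x, *_ = np.linalg.lstsq(J, r, rcond=None)
    return x

# Noise-free relative motions recover the exact trajectory.
print(smooth_chain([1.0, 2.0, -0.5], prior=0.0))
```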

  1. Operational reactor physics analysis codes (ORPAC)

    International Nuclear Information System (INIS)

    Kumar, Jainendra; Singh, K.P.; Singh, Kanchhi

    2007-07-01

    For efficient, smooth and safe operation of a nuclear research reactor, many reactor physics evaluations are regularly required. As part of reactor core management, the important activities are maintaining the core reactivity status, core power distribution, xenon estimations, safety evaluation of in-pile irradiation samples and experimental assemblies, and assessment of nuclear safety in fuel handling/storage. In-pile irradiation of samples requires a prior estimation of the reactivity load due to the sample, the heating rate and the activity developed in it during irradiation. For the safety of personnel handling irradiated samples, the dose rate at the surface of the shielded flask housing the irradiated sample should be less than 200 mR/h. Therefore, proper shielding and radioactive cooling of the irradiated sample are required to meet this requirement. Knowledge of the xenon load variation with time (startup curve) helps in estimating the xenon override time. Monitoring of power in individual fuel channels during reactor operation is essential to detect any abnormal power distribution and avoid unsafe situations. The complexity of estimating the above-mentioned reactor parameters and their frequent requirement compel one to use computer codes to avoid possible human errors. For efficient and quick evaluation of parameters related to reactor operations, such as xenon load, critical moderator height, and nuclear heating and reactivity load of isotope samples/experimental assemblies, a computer code ORPAC (Operational Reactor Physics Analysis Codes) has been developed. This code is being used for regular assessment of reactor physics parameters in Dhruva and Cirus. The code ORPAC, written in the Visual Basic 6.0 environment, incorporates several important operational reactor physics aspects on a single platform with graphical user interfaces (GUI) to make it more user-friendly and presentable. (author)

  2. A dual-color fluorescence-based platform to identify selective inhibitors of Akt signaling.

    Directory of Open Access Journals (Sweden)

    Aranzazú Rosado

    Full Text Available BACKGROUND: Inhibition of Akt signaling is considered one of the most promising therapeutic strategies for many cancers. However, rational target-orientated approaches to cell based drug screens for anti-cancer agents have historically been compromised by the notorious absence of suitable control cells. METHODOLOGY/PRINCIPAL FINDINGS: In order to address this fundamental problem, we have developed BaFiso, a live-cell screening platform to identify specific inhibitors of this pathway. BaFiso relies on the co-culture of isogenic cell lines that have been engineered to sustain interleukin-3 independent survival of the parental Ba/F3 cells, and that are individually tagged with different fluorescent proteins. Whilst in the first of these two lines cell survival in the absence of IL-3 is dependent on the expression of activated Akt, the cells expressing constitutively-activated Stat5 signaling display IL-3 independent growth and survival in an Akt-independent manner. Small molecules can then be screened in these lines to identify inhibitors that rescue IL-3 dependence. CONCLUSIONS/SIGNIFICANCE: BaFiso measures differential cell survival using multiparametric live cell imaging and permits selective inhibitors of Akt signaling to be identified. BaFiso is a platform technology suitable for the identification of small molecule inhibitors of IL-3 mediated survival signaling.

  3. Validation of the AZTRAN 1.1 code with problems Benchmark of LWR reactors; Validacion del codigo AZTRAN 1.1 con problemas Benchmark de reactores LWR

    Energy Technology Data Exchange (ETDEWEB)

    Vallejo Q, J. A.; Bastida O, G. E.; Francois L, J. L. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Ciudad Universitaria, 04510 Ciudad de Mexico (Mexico); Xolocostli M, J. V.; Gomez T, A. M., E-mail: amhed.jvq@gmail.com [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    The AZTRAN module is a computational program that is part of the AZTLAN platform (Mexican modeling platform for the analysis and design of nuclear reactors) and that solves the steady-state neutron transport equation in three-dimensional Cartesian geometry using the discrete ordinates method S{sub N}. As part of the activities of Working Group 4 (users group) of the AZTLAN project, this work validates the AZTRAN code using the 2002 Yamamoto Benchmark for LWR reactors. For comparison, the commercial code CASMO-4 and the free code Serpent-2 are used; in addition, the results are compared with the data reported in an article of the PHYSOR 2002 conference. The benchmark consists of fuel pin cells, two with UO{sub 2} and two with MOX; there is one problem per cell for each reactor type, PWR and BWR. Although the AZTRAN code is at an early stage of development, the results obtained are encouraging and close to those reported with other internationally accepted codes and methodologies. (Author)
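    The discrete ordinates idea behind AZTRAN can be illustrated in one dimension. The following toy sketch (not AZTRAN code; it assumes S{sub 2} quadrature, diamond differencing, vacuum boundaries and invented cross sections) solves a slab problem by source iteration:

```python
# Minimal 1-D slab analogue of an S_N solver: two discrete directions (S2),
# diamond differencing, and source iteration with isotropic scattering.
# Cross sections and the uniform source are illustrative values only.

MU = 0.5773502691896258          # |mu| for S2 (1/sqrt(3)); weights are 1 each

def sweep(nx, dx, sig_t, sig_s, q, n_iter=200):
    phi = [0.0] * nx             # scalar flux
    for _ in range(n_iter):
        src = [0.5 * (sig_s * p + q) for p in phi]   # isotropic source
        new_phi = [0.0] * nx
        # Sweep left -> right (mu > 0), vacuum boundary: psi_in = 0
        psi = 0.0
        for i in range(nx):
            # diamond difference: cell-average flux from the incoming edge flux
            psi_avg = (src[i] * dx + 2.0 * MU * psi) / (2.0 * MU + sig_t * dx)
            psi = 2.0 * psi_avg - psi                # outgoing edge flux
            new_phi[i] += psi_avg                    # quadrature weight 1
        # Sweep right -> left (mu < 0)
        psi = 0.0
        for i in reversed(range(nx)):
            psi_avg = (src[i] * dx + 2.0 * MU * psi) / (2.0 * MU + sig_t * dx)
            psi = 2.0 * psi_avg - psi
            new_phi[i] += psi_avg
        phi = new_phi
    return phi

phi = sweep(nx=20, dx=0.5, sig_t=1.0, sig_s=0.5, q=1.0)
```

    Real S{sub N} codes extend the same sweep to three dimensions and many angles per octant, and accelerate the source iteration.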

  4. Development of a Modular Research Platform to Create Medical Observational Studies for Mobile Devices.

    Science.gov (United States)

    Zens, Martin; Grotejohann, Birgit; Tassoni, Adrian; Duttenhoefer, Fabian; Südkamp, Norbert P; Niemeyer, Philipp

    2017-05-23

    Observational studies have proven to be a valuable resource in medical research, especially when performed on a large scale. Recently, mobile device-based observational studies have been discovered by an increasing number of researchers as a promising new source of information. However, the development and deployment of app-based studies is not trivial and requires profound programming skills. The aim of this project was to develop a modular online research platform that allows researchers to create medical studies for mobile devices without extensive programming skills. The modular research platform consists of three major components. A Web-based platform forms the researchers' main workplace. This platform communicates via a shared database with a platform-independent mobile app. Furthermore, a separate Web-based login platform for physicians and other health care professionals is outlined and completes the concept. A prototype of the research platform has been developed and is currently in beta testing. Simple questionnaire studies can be created within minutes and published for testing purposes. Screenshots of an example study are provided, and the general working principle is displayed. In this project, we have created a basis for a novel research platform. The necessity and implications of a modular approach were displayed and an outline for future development given. International researchers are invited and encouraged to participate in this ongoing project. ©Martin Zens, Birgit Grotejohann, Adrian Tassoni, Fabian Duttenhoefer, Norbert P Südkamp, Philipp Niemeyer. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 23.05.2017.

  5. Platform-based production development

    DEFF Research Database (Denmark)

    Bossen, Jacob; Brunoe, Thomas Ditlev; Nielsen, Kjeld

    2015-01-01

    Platforms as a means for applying modular thinking in product development are relatively well studied, but platforms in the production system have until now not been given much attention. With the emerging concept of platform-based co-development the importance of production platforms is though...

  6. User's manual for the G.T.M.-1 computer code

    International Nuclear Information System (INIS)

    Prado-Herrero, P.

    1992-01-01

    This document describes the GTM-1 (Geosphere Transport Model, release 1) computer code and is intended to provide the reader with enough detailed information to use the code. GTM-1 was developed for the assessment of radionuclide migration by ground water through geologic deposits whose properties can change along the pathway. GTM-1 solves the transport equation by the finite difference method (Crank-Nicolson scheme). It was developed for specific use within Probabilistic System Assessment (PSA) Monte Carlo method codes; in this context the first application of GTM-1 was within the LISA (Long Term Isolation System Assessment) code. GTM-1 is also available as an independent model, which includes various submodels simulating a multi-barrier disposal system. The code has been tested with the PSACOIN (Probabilistic System Assessment Codes Intercomparison) benchmark exercises from the PSAC User Group (OECD/NEA). 10 refs., 6 Annex., 2 tabs
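    As a sketch of the numerical scheme named above (a toy with pure dispersion, zero-concentration boundaries and invented grid values, not GTM-1 itself), one Crank-Nicolson step amounts to solving a tridiagonal system, here with the Thomas algorithm:

```python
# Crank-Nicolson solution of a 1-D dispersion (diffusion) equation,
# du/dt = D d2u/dx2, as a stand-in for the transport solver described
# above. Grid size, time step and the initial pulse are illustrative.

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main-, c = super-diagonal."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def crank_nicolson_step(u, r):
    """One CN step with zero-concentration boundaries; r = D*dt/dx**2."""
    n = len(u)
    a = [-r / 2.0] * n           # implicit side: (1+r) on the diagonal,
    b = [1.0 + r] * n            # -r/2 on the off-diagonals
    c = [-r / 2.0] * n
    a[0] = c[-1] = 0.0
    d = []
    for i in range(n):           # explicit side of the CN average
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        d.append(u[i] + (r / 2.0) * (left - 2.0 * u[i] + right))
    return thomas(a, b, c, d)

u = [0.0] * 41
u[20] = 1.0                      # initial concentration pulse
for _ in range(50):
    u = crank_nicolson_step(u, r=0.5)
```

    The pulse spreads symmetrically and mass leaves only through the absorbing boundaries, which is easy to check numerically.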

  7. Implementation of Wireless Communications Systems on FPGA-Based Platforms

    Directory of Open Access Journals (Sweden)

    Voros NS

    2007-01-01

    Full Text Available Wireless communications are a very popular application domain. The efficient implementation of their components (access points and mobile terminals/network interface cards) in terms of hardware cost and design time is of great importance. This paper describes the design and implementation of the HIPERLAN/2 WLAN system on a platform including general purpose microprocessors and FPGAs. Detailed implementation results (performance, code size, and FPGA resource utilization) are presented. The main goal of the design case presented is to provide insight into the design aspects of a complex system based on FPGAs. The results prove that an implementation based on microprocessors and FPGAs is adequate for the access point part of the system, where the expected volumes are rather small. At the same time, such an implementation serves as a prototype of an integrated implementation (System-on-Chip), which is necessary for the mobile terminals of a HIPERLAN/2 system. Finally, firmware upgrades were developed allowing the implementation of an outdoor wireless communication system on the same platform.

  8. Evaluation of the General Atomic codes TAP and RECA for HTGR accident analyses

    International Nuclear Information System (INIS)

    Ball, S.J.; Cleveland, J.C.; Sanders, J.P.

    1978-01-01

    The General Atomic codes TAP (Transient Analysis Program) and RECA (Reactor Emergency Cooling Analysis) are evaluated with respect to their capability for predicting the dynamic behavior of high-temperature gas-cooled reactors (HTGRs) under postulated accident conditions. Several apparent modeling problems are noted, and the susceptibility of the codes to misuse and input errors is discussed. A critique of code verification plans is also included. The several cases where direct comparisons could be made between TAP/RECA calculations and those based on other independently developed codes indicated generally good agreement, thus contributing to the credibility of the codes.

  9. Anser EMT: the first open-source electromagnetic tracking platform for image-guided interventions.

    Science.gov (United States)

    Jaeger, Herman Alexander; Franz, Alfred Michael; O'Donoghue, Kilian; Seitel, Alexander; Trauzettel, Fabian; Maier-Hein, Lena; Cantillon-Murphy, Pádraig

    2017-06-01

    Electromagnetic tracking is the gold standard for instrument tracking and navigation in the clinical setting without line of sight. Whilst clinical platforms exist for interventional bronchoscopy and neurosurgical navigation, the limited flexibility and high costs of electromagnetic tracking (EMT) systems for research investigations militate against a better understanding of the technology's characterisation and limitations. The Anser project provides an open-source implementation for EMT with particular application to image-guided interventions. This work provides implementation schematics for our previously reported EMT system, which relies on low-cost acquisition and demodulation techniques using both National Instruments and Arduino hardware alongside MATLAB support code. The system performance is objectively compared to other commercial tracking platforms using the Hummel assessment protocol. Positional accuracy of 1.14 mm and angular rotation accuracy of [Formula: see text] are reported. Like other EMT platforms, Anser is susceptible to tracking errors due to eddy current and ferromagnetic distortion. The system is compatible with commercially available EMT sensors as well as the Open Network Interface for image-guided therapy (OpenIGTLink) for easy communication with visualisation and medical imaging toolkits such as MITK and 3D Slicer. By providing an open-source platform for research investigations, we believe that novel and collaborative approaches can overcome the limitations of current EMT technology.

  10. Applications of the ARGUS code in accelerator physics

    International Nuclear Information System (INIS)

    Petillo, J.J.; Mankofsky, A.; Krueger, W.A.; Kostas, C.; Mondelli, A.A.; Drobot, A.T.

    1993-01-01

    ARGUS is a three-dimensional, electromagnetic, particle-in-cell (PIC) simulation code that is being distributed to U.S. accelerator laboratories in a collaboration between SAIC and the Los Alamos Accelerator Code Group. It uses a modular architecture that allows multiple physics modules to share common utilities for grid and structure input, memory management, disk I/O, and diagnostics. Physics modules are in place for electrostatic and electromagnetic field solutions, frequency-domain (eigenvalue) solutions, time-dependent PIC, and steady-state PIC simulations. All of the modules are implemented with a domain-decomposition architecture that allows large problems to be broken up into pieces that fit in core and that facilitates the adaptation of ARGUS for parallel processing. ARGUS operates on either Cray or workstation platforms, and a MOTIF-based user interface is available for X-windows terminals. Applications of ARGUS in accelerator physics and design are described in this paper.
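    The domain-decomposition pattern described above can be reduced to a few lines. In this sketch (illustrative names and a 1-D grid; ARGUS itself decomposes 3-D problems), each subdomain carries ghost cells that are refreshed from its neighbours' edge values before a local update:

```python
# Sketch of domain decomposition: a 1-D grid is split into subdomains,
# each padded with one ghost (halo) cell per side, and the halos are
# refreshed from the neighbours' edge cells before each local update.

def split(grid, n_parts):
    """Partition a list into contiguous subdomains with ghost cells."""
    size = len(grid) // n_parts
    parts = []
    for p in range(n_parts):
        chunk = grid[p * size:(p + 1) * size]
        parts.append([0.0] + chunk + [0.0])   # ghost cells at both ends
    return parts

def exchange_halos(parts):
    """Copy each neighbour's edge value into the adjacent ghost cell."""
    for p in range(len(parts) - 1):
        parts[p][-1] = parts[p + 1][1]        # right ghost <- right neighbour
        parts[p + 1][0] = parts[p][-2]        # left ghost  <- left neighbour

grid = [float(i) for i in range(12)]
parts = split(grid, 3)
exchange_halos(parts)
```

    In a parallel setting the halo exchange becomes a message pass between processors, while each subdomain's update remains purely local.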

  11. Platform decommissioning costs

    International Nuclear Information System (INIS)

    Rodger, David

    1998-01-01

    There are over 6500 platforms worldwide contributing to the offshore oil and gas production industry. In the North Sea there are around 500 platforms in place. There are many factors to be considered in planning for platform decommissioning and the evaluation of options for removal and disposal. The environmental impact, technical feasibility, safety and cost factors all have to be considered. This presentation considers what information is available about the overall decommissioning costs for the North Sea and the costs of different removal and disposal options for individual platforms. 2 figs., 1 tab

  12. CONTAIN code analyses of direct containment heating experiments

    International Nuclear Information System (INIS)

    Williams, D.C.; Griffith, R.O.; Tadios, E.L.; Washington, K.E.

    1995-01-01

    In some nuclear reactor core-melt accidents, a potential exists for molten core debris to be dispersed into the containment under high pressure. The resulting energy transfer to the containment atmosphere can pressurize the containment. This process, known as direct containment heating (DCH), has been the subject of extensive experimental and analytical programs sponsored by the U.S. Nuclear Regulatory Commission (NRC). DCH modeling has been an important focus of the development of the CONTAIN code. Results of a detailed independent peer review of the CONTAIN code were published recently. This paper summarizes work performed in support of the peer review, in which the CONTAIN code was applied to analyze DCH experiments. The goals of this work were comparison of calculated and experimental results, assessment of the CONTAIN DCH models, and development of guidance for code users, including a standardized input prescription for DCH analysis.

  13. Peer-review Platform for Astronomy Education Activities

    Science.gov (United States)

    Heenatigala, Thilina; Russo, Pedro; Gomez, Edward; Strubbe, Linda

    2015-08-01

    Astronomy educators and teachers worldwide commonly request and search for high-quality astronomy activities to do with their students. Hundreds of astronomy education activities exist, as well as many resource repositories in which to find them. However, the quality of such resources is highly variable, as they are often not updated regularly or lack content review. Since its launch in 2013, astroEDU has been addressing these issues and more by following a peer-review process. Each activity submitted is reviewed by an educator and a professional astronomer, balancing both the scientific and educational value of the content. Moreover, the majority of the reviewers are invited from IAU commissions related to the field of the activity, as an effort to get IAU members actively involved in the project. The website code, activities and layout design are open access in order to make them accessible and adoptable for educators around the world. Furthermore, the platform harnesses the OAD volunteer database to develop existing astronomy education activities into the astroEDU activity format. Published activities are also pushed to partner repositories, and each activity is registered for a DOI, allowing authors to cite their work. To further test the activities and improve the platform, the astroEDU editorial team organises workshops.

  14. Product Platform Replacements

    DEFF Research Database (Denmark)

    Sköld, Martin; Karlsson, Christer

    2012-01-01

    . To shed light on this unexplored and growing managerial concern, the purpose of this explorative study is to identify operational challenges to management when product platforms are replaced. Design/methodology/approach – The study uses a longitudinal field-study approach. Two companies, Gamma and Omega...... replacement was chosen in each company. Findings – The study shows that platform replacements primarily challenge managers' existing knowledge about platform architectures. A distinction can be made between “width” and “height” in platform replacements, and it is crucial that managers observe this in order...... to challenge their existing knowledge about platform architectures. Issues on technologies, architectures, components and processes as well as on segments, applications and functions are identified. Practical implications – Practical implications are summarized and discussed in relation to a framework...

  15. Reduction of product platform complexity by vectorial Euclidean algorithm

    International Nuclear Information System (INIS)

    Navarrete, Israel Aguilera; Guzman, Alejandro A. Lozano

    2013-01-01

    In the traditional design of machines, equipment and devices, technical solutions are developed practically independently of one another, which increases design cost and complexity. Overcoming this situation has usually been tackled using only the designer's experience. In this work, a reduction of product platform complexity is presented, based on a matrix representation of technical solutions versus product properties. This matrix represents the product platform. From this matrix, the Euclidean distances among technical solutions are obtained. Thus, the vectorial distances among technical solutions are identified in a new matrix whose order is the number of technical solutions identified. This new matrix can be reorganized into groups with a hierarchical structure, in such a way that modular design of products becomes more tractable. As a result of this procedure, the minimum vector distances are found, making it possible to identify the best technical solutions for the design problem raised. Application of these concepts is shown with two examples.
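    The procedure reads directly as code. The sketch below (with invented property values; the original work's hierarchy construction may differ) builds the distance matrix and groups solutions whose vectorial distance falls under a threshold:

```python
# Technical solutions as rows of a property matrix; pairwise Euclidean
# distances; single-linkage grouping by a distance threshold, done with
# a simple union-find. Property values are invented for illustration.

import math

def distance_matrix(solutions):
    n = len(solutions)
    return [[math.dist(solutions[i], solutions[j]) for j in range(n)]
            for i in range(n)]

def group_by_threshold(dist, threshold):
    n = len(dist)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if dist[i][j] <= threshold:
                parent[find(i)] = find(j)   # union the two groups
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Four hypothetical technical solutions scored on three product properties
solutions = [(1.0, 0.0, 2.0), (1.1, 0.2, 2.1), (5.0, 4.0, 0.0), (5.2, 4.1, 0.1)]
dist = distance_matrix(solutions)
groups = group_by_threshold(dist, threshold=1.0)
```

    Lowering the threshold splits the hierarchy into finer groups, which is the lever for reducing platform complexity.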

  16. arrayCGHbase: an analysis platform for comparative genomic hybridization microarrays

    Directory of Open Access Journals (Sweden)

    Moreau Yves

    2005-05-01

    Full Text Available Abstract Background The availability of the human genome sequence as well as the large number of physically accessible oligonucleotides, cDNA, and BAC clones across the entire genome has triggered and accelerated the use of several platforms for analysis of DNA copy number changes, among them microarray-based comparative genomic hybridization (arrayCGH). One of the challenges inherent to this new technology is the management and analysis of the large numbers of data points generated in each individual experiment. Results We have developed arrayCGHbase, a comprehensive analysis platform for arrayCGH experiments consisting of a MIAME (Minimal Information About a Microarray Experiment)-supportive database using MySQL underlying a data mining web tool, to store, analyze, interpret, compare, and visualize arrayCGH results in a uniform and user-friendly format. Following its flexible design, arrayCGHbase is compatible with all existing and forthcoming arrayCGH platforms. Data can be exported in a multitude of formats, including BED files to map copy number information on the genome using the Ensembl or UCSC genome browser. Conclusion ArrayCGHbase is a web-based and platform-independent arrayCGH data analysis tool that allows users to access the analysis suite through the internet or a local intranet after installation on a private server. ArrayCGHbase is available at http://medgen.ugent.be/arrayCGHbase/.
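    The BED export mentioned in the conclusion is simple enough to sketch. This is not arrayCGHbase code; the column layout beyond the required first three BED fields, the use of the score column for the log2 ratio, and the example segment are assumptions:

```python
# Sketch of exporting copy-number segments as BED lines: tab-separated
# chrom, start, end, name, value. BED coordinates are 0-based and
# half-open, so a 1-based inclusive start is shifted down by one.

def segments_to_bed(segments):
    """segments: iterable of (chrom, start_1based, end, name, log2_ratio)."""
    lines = []
    for chrom, start, end, name, log2 in segments:
        lines.append("\t".join([chrom, str(start - 1), str(end), name,
                                f"{log2:.3f}"]))
    return "\n".join(lines)

# One hypothetical gained segment, written in genome-browser-ready form
bed = segments_to_bed([("chr8", 128_747_680, 128_753_674, "MYC_gain", 1.2)])
```

    The resulting text can be loaded directly as a custom track in the UCSC or Ensembl genome browser.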

  17. On the structure of Lattice code WIMSD-5B

    International Nuclear Information System (INIS)

    Kim, Won Young; Min, Byung Joo

    2004-03-01

    The WIMS-D code is a freely available thermal reactor physics lattice code used widely for thermal research and power reactor calculations. The code WIMS-AECL, developed on the basis of WIMS-D, has been used as one of the lattice codes for cell calculations in Canada, and in 1998 the latest version, WIMSD-5B, was released to the OECD/NEA Data Bank. WIMS-KAERI, also derived from WIMS-D, was developed and has been used in Korea, but it was adapted for cell calculations of the research reactor HANARO and therefore does not conform to CANDU reactors. Hence, the development of a code applicable to cell calculations of CANDU reactors is necessary, not only for technological independence but also for the establishment of a CANDU safety analysis system. The lattice code WIMSD-5B was analyzed in order to set up the system of reactor physics computer codes to be used in the assessment of the void reactivity effect. In order to improve and validate the WIMSD-5B code, an analysis of its structure was made, and its structure, algorithms and subroutines are presented for the cluster type and the pij method modelling the CANDU-6 fuel.

  18. Structural code benchmarking for the analysis of impact response of nuclear material shipping casks

    International Nuclear Information System (INIS)

    Glass, R.E.

    1984-01-01

    The Transportation Technology Center at Sandia National Laboratories has initiated a program to benchmark thermal and structural codes that are available to the nuclear material transportation community. The program consists of the following five phases: (1) code inventory and review, (2) development of a cask-like set of problems, (3) multiple independent numerical analyses of the problems, (4) transfer of information, and (5) performance of experiments to obtain data for comparison with the numerical analyses. This paper summarizes the results obtained by the independent numerical analyses. The analyses indicate the variability that can be expected both from differences in user-controlled parameters and from code-to-code differences. The results show that in purely elastic analyses, differences can be attributed to user-controlled parameters. Model problems involving elastic/plastic material behavior and large deformations, however, show greater variability, with significant differences reported between implicit and explicit integration schemes in finite element programs. This variability demonstrates the need to obtain experimental data to properly benchmark codes utilizing elastic/plastic material models and large deformation capability.

  19. A method for scientific code coupling in a distributed environment; Une methodologie pour le couplage de codes scientifiques en environnement distribue

    Energy Technology Data Exchange (ETDEWEB)

    Caremoli, C; Beaucourt, D; Chen, O; Nicolas, G; Peniguel, C; Rascle, P; Richard, N; Thai Van, D; Yessayan, A

    1994-12-01

    This guide book deals with the coupling of big scientific codes. First, the context is introduced: big scientific codes devoted to a specific discipline are coming to maturity, and there are more and more needs in terms of multidisciplinary studies. Then we describe different kinds of code coupling and an example: the 3D thermal-hydraulic code THYC coupled with the 3D neutronics code COCCINELLE. With this example we identify the problems to be solved to realize a coupling. We present the different numerical methods usable for the resolution of coupling terms. This leads to the definition of two kinds of coupling: with weak coupling, explicit methods can be used, whereas strong coupling requires implicit methods. In both cases, we analyze the link with the way the codes are parallelized. For the translation of data from one code to another, we define the notion of a Standard Coupling Interface based on a general structure for data. This general structure constitutes an intermediary between the codes, thus allowing a relative independence of the codes from a specific coupling. The proposed method for the implementation of a coupling leads to a simultaneous run of the different codes while they exchange data. Two kinds of data communication with message exchange are proposed: direct communication between codes using the PVM (Parallel Virtual Machine) product, and indirect communication through a coupling tool. This second way, with a general code coupling tool, is based on a coupling method, and we strongly recommend its use. The method rests on two principles: re-usability, meaning few modifications to existing codes, and the definition of a coupling-ready code, which separates the design of a code usable for coupling from the realization of a specific coupling. This coupling tool, available from the beginning of 1994, is described in general terms. (authors). figs., tabs.
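    The distinction between weak and strong coupling can be illustrated with a toy fixed-point iteration. The two "codes" below are trivial stand-ins (invented functions, not THYC or COCCINELLE); each explicit pass exchanges interface data, as in the weak-coupling scheme described above:

```python
# Toy illustration of weak (explicit) coupling: two "codes" run in turn
# and exchange interface values until a fixed point is reached.

def code_a(b):
    """Stand-in solver A: computes its interface value from B's output."""
    return 0.5 * b + 1.0

def code_b(a):
    """Stand-in solver B: computes its interface value from A's output."""
    return 0.5 * a + 1.0

def couple(tol=1e-10, max_iter=100):
    a, b = 0.0, 0.0
    for _ in range(max_iter):
        a_new = code_a(b)          # explicit update: uses the last known b
        b_new = code_b(a_new)
        if abs(a_new - a) < tol and abs(b_new - b) < tol:
            return a_new, b_new
        a, b = a_new, b_new
    return a, b

a, b = couple()   # the fixed point of this particular pair is a = b = 2
```

    When the exchanged quantities interact too strongly for such an explicit loop to converge, the coupling terms must be treated inside the solvers, which is the implicit (strong) coupling case.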

  20. Semi-device-independent security of one-way quantum key distribution

    International Nuclear Information System (INIS)

    Pawlowski, Marcin; Brunner, Nicolas

    2011-01-01

    By testing nonlocality, the security of entanglement-based quantum key distribution (QKD) can be enhanced to being ''device-independent.'' Here we ask whether such a strong form of security could also be established for one-way (prepare and measure) QKD. While fully device-independent security is impossible, we show that security can be guaranteed against individual attacks in a semi-device-independent scenario. In the latter, the devices used by the trusted parties are noncharacterized, but the dimensionality of the quantum systems used in the protocol is assumed to be bounded. Our security proof relies on the analogies between one-way QKD, dimension witnesses, and random-access codes.
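    The random-access codes invoked in the security proof have a simple classical baseline that can be verified exhaustively. In a 2 -> 1 random-access code, Alice holds two bits, may send only one, and Bob must recover a randomly chosen one of them; the best classical strategy reaches an average success probability of 3/4, while the optimal one-qubit strategy reaches cos^2(pi/8) ≈ 0.854, a gap that dimension witnesses exploit. A sketch of the classical baseline:

```python
# Best classical 2 -> 1 random-access code: transmit the first bit.
# Bob recovers b0 perfectly and can only guess b1, giving an average
# success probability of 3/4 over uniform inputs and queries.

def encode(b0, b1):
    return b0                     # one classical bit on the channel

def decode(message, which):
    # Bob wants bit `which`; for b1 he can only guess (fixed guess: 0)
    return message if which == 0 else 0

def average_success():
    wins = total = 0
    for b0 in (0, 1):
        for b1 in (0, 1):
            for which in (0, 1):
                total += 1
                wins += (decode(encode(b0, b1), which) == (b0, b1)[which])
    return wins / total

p = average_success()             # exhaustive average over all 8 cases
```

    The quantum advantage over this 3/4 bound is what certifies, in the semi-device-independent setting, that the transmitted system was genuinely of bounded dimension.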

  1. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    Science.gov (United States)

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

    Quick Response (QR) codes are standard in supply management and seen with increasing frequency in advertisements. They are now present regularly in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which links the device with the information represented by the QR code (a uniform resource locator or URL, online video, text, vCalendar entries, short message service [SMS] messages, and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits and how QR codes differ from plain Web links, and shows how QR codes facilitate the distribution of educational content.
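    The payload types listed above are plain strings with conventional prefixes, which is what makes QR generation easy to script. A sketch (the "SMSTO:" and "MECARD:" conventions are widely recognised by scanner apps, but reader support varies, so treat the exact field layout as an assumption; the phone number and message are invented):

```python
# Building QR payload strings for common content types. The generated
# string is what a QR-code generator would then render as an image.

def qr_url(url):
    return url                          # plain URLs are encoded as-is

def qr_sms(number, message):
    # SMSTO convention: "SMSTO:<number>:<message>"
    return f"SMSTO:{number}:{message}"

def qr_contact(name, tel, email):
    # MECARD contact format; fields end with ';', the payload with ';;'
    return f"MECARD:N:{name};TEL:{tel};EMAIL:{email};;"

payload = qr_sms("+15551234567", "Tumor board moved to 4pm")
```

    Pasting such a payload into any free QR generator produces an image that can be dropped onto a presentation slide.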

  2. Computer codes in nuclear safety, radiation transport and dosimetry; Les codes de calcul en radioprotection, radiophysique et dosimetrie

    Energy Technology Data Exchange (ETDEWEB)

    Bordy, J M; Kodeli, I; Menard, St; Bouchet, J L; Renard, F; Martin, E; Blazy, L; Voros, S; Bochud, F; Laedermann, J P; Beaugelin, K; Makovicka, L; Quiot, A; Vermeersch, F; Roche, H; Perrin, M C; Laye, F; Bardies, M; Struelens, L; Vanhavere, F; Gschwind, R; Fernandez, F; Quesne, B; Fritsch, P; Lamart, St; Crovisier, Ph; Leservot, A; Antoni, R; Huet, Ch; Thiam, Ch; Donadille, L; Monfort, M; Diop, Ch; Ricard, M

    2006-07-01

    The purpose of this conference was to describe the present state of computer codes dedicated to radiation transport, radiation source assessment or dosimetry. The presentations were divided into two sessions: 1) methodology and 2) applications in industrial, medical or research domains. It appears that two different calculation strategies prevail, both based on preliminary Monte-Carlo calculations with data storage: first, quick simulations made from a database of particle histories built through a previous Monte-Carlo simulation, and second, a neural-network approach involving a learning platform generated through a previous Monte-Carlo simulation. This document gathers the slides of the presentations.
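    The first strategy (quick simulations from a stored Monte-Carlo database) can be caricatured in a few lines. Here the "expensive" Monte-Carlo step is a crude exponential-attenuation model of transmission through a slab, sampled once into a table; later queries only interpolate. All physics, names and parameter values are invented for illustration:

```python
# Precompute a Monte-Carlo table once, then answer queries by fast
# interpolation instead of re-running the simulation.

import bisect
import random

def mc_transmission(thickness, mfp=1.0, n=20_000, seed=1):
    """Fraction of particles whose sampled free path exceeds the slab."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(1.0 / mfp) > thickness for _ in range(n))
    return hits / n

# One-off, expensive precomputation (the stored "database")
thicknesses = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
table = [mc_transmission(t) for t in thicknesses]

def quick_transmission(t):
    """Fast query: linear interpolation in the precomputed table."""
    i = bisect.bisect_left(thicknesses, t)
    if i == 0:
        return table[0]
    if i == len(table):
        return table[-1]
    x0, x1 = thicknesses[i - 1], thicknesses[i]
    frac = (t - x0) / (x1 - x0)
    return table[i - 1] + frac * (table[i] - table[i - 1])
```

    The second strategy replaces the interpolation table with a neural network trained on the same precomputed Monte-Carlo results.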

  3. Methodology, status, and plans for development and assessment of the RELAP5 code

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, G.W.; Riemke, R.A. [Idaho National Engineering Laboratory, Idaho Falls, ID (United States)

    1997-07-01

    RELAP/MOD3 is a computer code used for the simulation of transients and accidents in light-water nuclear power plants. The objective of the program to develop and maintain RELAP5 was and is to provide the U.S. Nuclear Regulatory Commission with an independent tool for assessing reactor safety. This paper describes code requirements, models, solution scheme, language and structure, user interface, validation, and documentation. The paper also describes the current and near-term development program and provides an assessment of the code's strengths and limitations.

  4. Promoting Independent Performance of Transition-Related Tasks Using a Palmtop PC-Based Self-Directed Visual and Auditory Prompting System

    Science.gov (United States)

    Riffel, Laura A.; Wehmeyer, Michael L.; Turnbull, Ann P.; Lattimore, Jennifer; Davies, Daniel; Stock, Steven; Fisher, Sherilyn

    2005-01-01

    This study examined the use of a palmtop computer running a software program by transition-age students with cognitive disabilities to increase independence on vocational and independent living tasks. The purpose of this research was to test the hypotheses that a palmtop computer utilizing a Windows CE platform with touch screen capabilities and…

  5. Product Platform Modeling

    DEFF Research Database (Denmark)

    Pedersen, Rasmus

    for customisation of products. In many companies these changes in the business environment have created a controversy between the need for a wide variety of products offered to the marketplace and a desire to reduce variation within the company in order to increase efficiency. Many companies use the concept...... other. These groups can be varied and combined to form different product variants without increasing the internal variety in the company. Based on the Theory of Domains, the concept of encapsulation in the organ domain is introduced, and organs are formulated as platform elements. Included......This PhD thesis has the title Product Platform Modelling. The thesis is about product platforms and visual product platform modelling. Product platforms have gained an increasing attention in industry and academia in the past decade. The reasons are many, yet the increasing globalisation...

  6. Review of the atmospheric propagation in the SPC codes. A progress report

    International Nuclear Information System (INIS)

    Wuebbles, D.J.; Connell, P.S.; Ipser, J.R.; Porch, W.M.; Rosen, L.C.; Knox, J.B.

    1986-10-01

    This is an initial progress report describing findings from a critical analysis and evaluation of the atmospheric propagation submodels in the SPC1 and SPC2 models. These systems performance codes were developed by United Technologies Research Center as general-purpose, end-to-end models for determining the overall effects on propagation of a laser beam from its source, either from the earth's surface or from an airborne platform, to a target. The SPC1 model is a trimmed-down version of SPC2, but includes the same coding for atmospheric propagation effects. As with other system models, the SPC codes attempt to include all essential processes to an accuracy commensurate with the use of the models for overall systems analysis and examination of system deployment scenarios. A basic conclusion of our study is that the SPC codes do appear to provide an appropriate framework for end-to-end model studies determining the overall impact of atmospheric effects on laser beam propagation. Nonetheless, our preliminary analysis has discovered a number of errors and limitations in the existing models. The modular structure of the codes will be an important benefit in making necessary improvements. 30 refs., 15 figs., 4 tabs

  7. Introducing Platform Interactions Model for Studying Multi-Sided Platforms

    DEFF Research Database (Denmark)

    Staykova, Kalina; Damsgaard, Jan

    2018-01-01

    Multi-Sided Platforms (MSPs) function as socio-technical entities that facilitate direct interactions between the various constituencies affiliated with them by developing and managing an IT architecture. In this paper, we aim to explain the nature of the platform interactions as key characteristic o...

  8. Online Independent Vocabulary Learning Experience of Hong Kong University Students

    Directory of Open Access Journals (Sweden)

    Eunice Tang

    2016-03-01

    Full Text Available In response to the limited vocabulary size of its undergraduates, an independent vocabulary learning platform, VLearn, was designed and launched in a university in Hong Kong. As an e-learning environment that supports self-directed vocabulary learning of Chinese learners, the primary aim of VLearn is to equip users with appropriate knowledge and skills for vocabulary expansion. This paper introduces the contents of VLearn and the theoretical underpinnings of its design. It also reports on the vocabulary learning experience of its users during an eight-week evaluation study. Suggestions are made on how independent vocabulary building at higher education, as well as comprehensive vocabulary instruction in the early years, could be supported by means of technology.

  9. Is Cognitive Activity of Speech Based On Statistical Independence?

    DEFF Research Database (Denmark)

    Feng, Ling; Hansen, Lars Kai

    2008-01-01

    This paper explores the generality of COgnitive Component Analysis (COCA), which is defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. The hypothesis of COCA is ecological......: the essentially independent features in a context-defined ensemble can be efficiently coded using a sparse independent component representation. Our devised protocol aims at comparing the performance of supervised learning (invoking cognitive activity) and unsupervised learning (statistical regularities) based...... on similar representations, and the only difference lies in the human-inferred labels. Inspired by the previous research on COCA, we introduce a new pair of models, which directly employ the independence hypothesis. Statistical regularities are revealed at multiple time scales on phoneme, gender, age...
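
    The sparse independent-component idea invoked above can be illustrated with a toy separation problem. The following numpy sketch (our illustration, not the COCA protocol; the sources, mixing matrix and minimal FastICA routine are all assumptions) mixes two independent signals and recovers them from the mixtures alone:

```python
import numpy as np

# Two independent sources, linearly mixed, then recovered with a
# minimal FastICA (tanh contrast, Gram-Schmidt deflation).
t = np.linspace(0, 8, 4000)
s1 = np.sin(2 * np.pi * t)                  # smooth source
s2 = np.sign(np.sin(3 * np.pi * t))         # square-wave source
S = np.vstack([s1, s2])
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S  # observed mixtures

# Center and whiten the mixtures.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / X.shape[1])
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

def extract(Z, previous):
    """One FastICA unit, decorrelated against already-found components."""
    w = np.random.default_rng(0).normal(size=Z.shape[0])
    for _ in range(200):
        wz = w @ Z
        w_new = (Z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
        for p in previous:                   # deflation step
            w_new -= (w_new @ p) * p
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1) < 1e-12:  # converged (up to sign)
            return w_new
        w = w_new
    return w

w1 = extract(Z, [])
w2 = extract(Z, [w1])
recovered = np.vstack([w1, w2]) @ Z          # source estimates, up to sign/order
```

In the whitened space the two unmixing directions are orthogonal, so the deflation step pins down the second component once the first is found; each recovered row matches one of the original sources up to sign and scale.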

  10. MODEL OF COLLABORATIVE COURSES DEVELOPMENT IN DISTANCE LEARNING PLATFORMS

    Directory of Open Access Journals (Sweden)

    Dmytro S. Morozov

    2015-02-01

    Full Text Available The paper addresses the problem of organizing the collaboration of a group of users in creating distance learning courses. The article contains an analysis of the course data structure. Based on the proposed structure, a model of developer collaboration in creating distance learning courses, built on the basic principles of source code management, is proposed. The article also presents the results of research on the tools necessary for the collaborative development of courses on distance learning platforms. To meet the requirements of flexibility and simplicity of access for educational institutions of any level, technological decisions are proposed for granting permissions to perform basic operations on course elements and for giving users moderation privileges.

  11. USING THE GOOGLE APP ENGINE PLATFORM FOR TEACHING PROGRAMMING

    Directory of Open Access Journals (Sweden)

    Mariusz Dzieńkowski

    2012-12-01

    Full Text Available The article outlines the present situation connected with teaching programming to students of different levels of education in Polish schools. The observed negative trend towards marginalization of programming can be successfully reversed in education thanks to using the latest IT achievements such as cloud computing (CC. The paper presents ways in which the cloud computing technology can be used to teach how to develop and code Internet applications by means of the Google App Engine platform. The final part focuses on practical examples of programming problems involving cloud computing applications which may be solved in IT classes with students of different levels of education.

  12. Analysis of the anisotropy effects with the AZTRAN code

    International Nuclear Information System (INIS)

    Xolocostli, V.; Vargas, S.; Gomez, A.; Del Valle, E.

    2017-09-01

    Among the improvements being made to the deterministic codes with which nuclear reactors are analyzed is the implementation of the anisotropic scattering cross section, which can yield better results. With current computing technology these implementations are feasible, since computation time is no longer the considerable problem it was in the past. In this paper we analyze some effects of anisotropy in the AZTRAN code, a code that solves the Boltzmann transport equation in one, two and three dimensions at steady state, using the multigroup technique, the nodal method RTN-0 and discrete ordinates, and which is part of the AZTLAN platform for the analysis of nuclear reactors, currently under development. The implementation of anisotropy in the AZTRAN code is one of the latest improvements made to the code, leading to different tests and analyses of anisotropic scattering, such as tests with homogeneous fuel assemblies. In the case presented here, a benchmark problem for a BWR-type fuel assembly is analyzed, which is part of the Benchmark problem suite for reactor physics study of LWR next generation fuels, proposed by the Committee on Reactor Physics organized by the Japan Atomic Energy Research Institute (JAERI). In this problem the behavior of the infinite multiplication factor (k-inf) is analyzed, as well as the behavior of odd and even anisotropy approximations with respect to the symmetry in the radial power of the assembly. (Author)

  13. Implementation of the kinetics in the transport code AZTRAN

    International Nuclear Information System (INIS)

    Duran G, J. A.; Del Valle G, E.; Gomez T, A. M.

    2017-09-01

    This paper shows the implementation of time dependence in the three-dimensional transport code AZTRAN (AZtlan TRANsport), which belongs to the AZTLAN platform for the analysis of nuclear reactors (currently under development). With this implementation the AZTRAN code is able to numerically solve the time-dependent transport equation in XYZ geometry for several energy groups, using the discrete ordinates method Sn for the discretization of the angular variable, the nodal method RTN-0 for spatial discretization and method 0 for the discretization in time. Initially, the code only solved the neutron transport equation in steady state, so the temporal part was implemented by integrating the neutron transport equation with respect to time together with the balance equations for the delayed neutron precursor concentrations, to which method 0 was applied. After the direct kinetics had been implemented in the code, the improved quasi-static method was added as a tool for reducing computation time: the angular flux is factored as the product of two functions, a shape function and an amplitude function, where the first is calculated over long time steps, called macro-steps, and the second is resolved over small time steps, called micro-steps. With the new version of AZTRAN, several benchmark problems taken from the literature were simulated; the problems used are two- and three-dimensional, which allowed the accuracy and stability of the code to be corroborated, showing in general good behavior in the reference tests. (Author)
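
    The amplitude function of the improved quasi-static method obeys point-kinetics-like equations. A minimal sketch of that amplitude/precursor system for one delayed-neutron group, integrated with implicit-Euler micro-steps (all constants illustrative, not AZTRAN's):

```python
import numpy as np

# One-delayed-group amplitude equations (illustrative constants):
#   dn/dt = (rho - beta)/Lambda * n + lam * C
#   dC/dt = beta/Lambda * n - lam * C
beta, Lambda, lam = 0.0065, 1e-4, 0.08

def advance(n, C, rho, dt, steps):
    """Advance (n, C) over `steps` implicit-Euler micro-steps of size dt."""
    A = np.array([[(rho - beta) / Lambda, lam],
                  [beta / Lambda, -lam]])
    for _ in range(steps):
        n, C = np.linalg.solve(np.eye(2) - dt * A, np.array([n, C]))
    return n, C

# Start from equilibrium (rho = 0): precursors balance the neutron level.
n0 = 1.0
C0 = beta * n0 / (Lambda * lam)
n_crit, _ = advance(n0, C0, rho=0.0, dt=1e-3, steps=1000)   # stays at 1.0
n_up, _ = advance(n0, C0, rho=0.001, dt=1e-3, steps=1000)   # power rises
```

At zero reactivity the implicit scheme preserves the equilibrium exactly; a small positive insertion makes the amplitude rise, which is the fast dynamic the micro-steps resolve while the shape function is only recomputed over macro-steps.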

  14. AECL's advanced code program

    Energy Technology Data Exchange (ETDEWEB)

    McGee, G.; Ball, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2012-07-01

    This paper discusses the advanced code project at AECL. The current suite of Analytical, Scientific and Design (ASD) computer codes in use by the Canadian nuclear power industry was mostly developed 20 or more years ago. It is increasingly difficult to develop and maintain, and it consists of many independent tools, so integrated analysis is difficult, time-consuming and error-prone. The objectives of this project are to demonstrate that nuclear facility systems, structures and components meet their design objectives in terms of function, cost, and safety; to demonstrate that the nuclear facility meets licensing requirements in terms of the consequences of off-normal events, dose to the public and workers, and impact on the environment; and to demonstrate that the nuclear facility meets operational requirements with respect to on-power fuelling and outage management.

  15. The definitive guide to Jython Python for the Java platform

    CERN Document Server

    Juneau, Josh; Ng, Victor; Soto, Leo; Wierzbicki, Frank

    2010-01-01

    Jython is an open source implementation of the high-level, dynamic, object-oriented scripting language Python seamlessly integrated with the Java platform. The predecessor to Jython, JPython, is certified as 100% Pure Java. Jython is freely available for both commercial and noncommercial use and is distributed with source code. Jython is complementary to Java. The Definitive Guide to Jython, written by the official Jython team leads, covers the latest Jython 2.5 (or 2.5.x) from the basics to the advanced features. This book begins with a brief introduction to the language and then journeys thr

  16. WCDMA Uplink Interference Assessment from Multiple High Altitude Platform Configurations

    Directory of Open Access Journals (Sweden)

    A. Mohammed

    2008-06-01

    Full Text Available We investigate the possibility of multiple high altitude platform (HAP) coverage of a common cell area using a wideband code division multiple access (WCDMA) system. In particular, we study the uplink performance of the system. The results show that, depending on the traffic demand and the type of service used, there is a possibility of deploying 3–6 HAPs covering the same cell area. The results also show the effect of cell radius on performance, and the positions of the multiple HAP base stations which give the worst performance.
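
    The trade-off between traffic demand, service type and the number of supportable users can be illustrated with the standard WCDMA uplink load-factor (pole capacity) formula from textbook dimensioning (all parameter values illustrative, not taken from the paper):

```python
# Rough WCDMA uplink pole capacity:
#   N_pole = (1 + W / (R * ebno)) / (v * (1 + i))
# where W/R is the processing gain for the service.
W = 3.84e6               # chip rate [chips/s]
R = 12.2e3               # voice service bit rate [bit/s]
ebno = 10 ** (5.0 / 10)  # required Eb/N0 of 5 dB (linear)
v = 0.67                 # voice activity factor
i = 0.55                 # other-cell-to-own-cell interference ratio
n_pole = (1 + W / (R * ebno)) / (v * (1 + i))  # ~97 simultaneous voice users
```

Higher-rate services shrink the processing gain W/R, which is why the supportable number of HAPs and users falls as the traffic demand per user grows.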

  17. Mixture block coding with progressive transmission in packet video. Appendix 1: Item 2. M.S. Thesis

    Science.gov (United States)

    Chen, Yun-Chung

    1989-01-01

    Video transmission will become an important part of future multimedia communication because of dramatically increasing user demand for video, and rapid evolution of coding algorithm and VLSI technology. Video transmission will be part of the broadband-integrated services digital network (B-ISDN). Asynchronous transfer mode (ATM) is a viable candidate for implementation of B-ISDN due to its inherent flexibility, service independency, and high performance. According to the characteristics of ATM, the information has to be coded into discrete cells which travel independently in the packet switching network. A practical realization of an ATM video codec called Mixture Block Coding with Progressive Transmission (MBCPT) is presented. This variable bit rate coding algorithm shows how a constant quality performance can be obtained according to user demand. Interactions between codec and network are emphasized including packetization, service synchronization, flow control, and error recovery. Finally, some simulation results based on MBCPT coding with error recovery are presented.

  18. Space-Time Trellis Coded 8PSK Schemes for Rapid Rayleigh Fading Channels

    Directory of Open Access Journals (Sweden)

    Salam A. Zummo

    2002-05-01

    Full Text Available This paper presents the design of 8PSK space-time (ST) trellis codes suitable for rapid fading channels. The proposed codes utilize the design criteria of ST codes over rapid fading channels. Two different approaches have been used. The first approach maximizes the symbol-wise Hamming distance (HD) between signals leaving or entering the same encoder state. In the second approach, set partitioning based on maximizing the sum of squared Euclidean distances (SSED) between the ST signals is performed; then, the branch-wise HD is maximized. The proposed codes were simulated over independent and correlated Rayleigh fading channels. Coding gains up to 4 dB have been observed over other ST trellis codes of the same complexity.
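
    The two design metrics named above are straightforward to compute for a pair of 8PSK symbol sequences. A small sketch (the example codewords are ours, not the paper's codes):

```python
import numpy as np

# Symbol-wise Hamming distance = number of positions where the symbols
# differ; SSED = sum of squared Euclidean distances between the mapped
# constellation points.
def psk8(symbols):
    """Map symbol indices 0..7 to unit-energy 8PSK constellation points."""
    return np.exp(1j * 2 * np.pi * np.asarray(symbols) / 8)

def symbol_hamming(a, b):
    return int(np.sum(np.asarray(a) != np.asarray(b)))

def ssed(a, b):
    return float(np.sum(np.abs(psk8(a) - psk8(b)) ** 2))

c1 = [0, 2, 5, 1]
c2 = [0, 6, 1, 1]
dH = symbol_hamming(c1, c2)   # 2 differing positions
dE = ssed(c1, c2)             # 8.0 for this pair
```

Over rapid fading, the symbol-wise HD governs the diversity order, so it is maximized first; the SSED (or the product distance) is then used to rank codes with equal HD.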

  19. The use of diagnostic coding in chiropractic practice

    DEFF Research Database (Denmark)

    Testern, Cecilie D; Hestbæk, Lise; French, Simon D

    2015-01-01

    BACKGROUND: Diagnostic coding has several potential benefits, including improving the feasibility of data collection for research and clinical audits and providing a common language to improve interdisciplinary collaboration. The primary aim of this study was to determine the views and perspectives......-2 PLUS) provided the 14 chiropractors with some experience in diagnostic coding, followed by an interview on the topic. The interviews were analysed thematically. The participating chiropractors and an independent coder applied ICPC-2 PLUS terms to the diagnoses of 10 patients. Then the level...... of agreement between the chiropractors and the coder was determined and Cohen's Kappa was used to determine the agreement beyond that expected by chance. RESULTS: From the interviews the three emerging themes were: 1) Advantages and disadvantages of using a clinical coding system in chiropractic practice, 2...
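
    Cohen's Kappa, used above to measure agreement beyond chance, has a short closed form. A minimal sketch with hypothetical ICPC-2 PLUS-style labels (the codes below are invented for illustration):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters coding the same items."""
    assert len(r1) == len(r2)
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n      # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2  # chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned by a chiropractor and an independent coder:
coder_a = ["L03", "L03", "L84", "L86", "L84", "L03"]
coder_b = ["L03", "L84", "L84", "L86", "L84", "L03"]
kappa = cohens_kappa(coder_a, coder_b)   # 17/23, roughly 0.74
```

Kappa of 1 means perfect agreement and 0 means agreement no better than chance given each coder's label frequencies.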

  20. Structured Low-Density Parity-Check Codes with Bandwidth Efficient Modulation

    Science.gov (United States)

    Cheng, Michael K.; Divsalar, Dariush; Duy, Stephanie

    2009-01-01

    In this work, we study the performance of structured Low-Density Parity-Check (LDPC) codes together with bandwidth-efficient modulations. We consider protograph-based LDPC codes that facilitate high-speed hardware implementations and have minimum distances that grow linearly with block size. We cover various higher-order modulations such as 8-PSK, 16-APSK, and 16-QAM. During demodulation, a demapper transforms the received in-phase and quadrature samples into reliability information that feeds the binary LDPC decoder. We compare various low-complexity demappers and provide simulation results for assorted coded-modulation combinations on the additive white Gaussian noise and independent Rayleigh fading channels.
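
    The demapper step described above can be sketched as a max-log LLR computation over a labeled constellation. The following toy uses Gray-mapped 8PSK (our construction; the paper's demappers and labelings may differ):

```python
import numpy as np

# Max-log demapper: per bit position,
#   LLR = (min_{b=1} |r - s|^2 - min_{b=0} |r - s|^2) / N0,
# so LLR > 0 favours bit 0. These LLRs feed the binary LDPC decoder.
BITS = 3
labels = [0, 1, 3, 2, 6, 7, 5, 4]               # Gray labels around the circle
points = np.exp(1j * 2 * np.pi * np.arange(8) / 8)

def demap(r, n0):
    d2 = np.abs(r - points) ** 2                # squared distances to all points
    llrs = []
    for i in range(BITS):
        bit = (np.array(labels) >> (BITS - 1 - i)) & 1
        llrs.append((d2[bit == 1].min() - d2[bit == 0].min()) / n0)
    return llrs

# Noise-free check: received sample = the point carrying label 5 (0b101).
r = points[labels.index(5)]
llr = demap(r, n0=0.5)
hard = [0 if l > 0 else 1 for l in llr]         # -> [1, 0, 1]
```

The max-log form replaces the exact log-sum with the nearest point per bit hypothesis, which is the usual low-complexity variant compared against the full demapper.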

  1. Validation of the reactor dynamics code TRAB

    International Nuclear Information System (INIS)

    Raety, H.; Kyrki-Rajamaeki, R.; Rajamaeki, M.

    1991-05-01

    The one-dimensional reactor dynamics code TRAB (Transient Analysis code for BWRs) developed at VTT was originally designed for BWR analyses, but in its present version it can be used for various modelling purposes. The core model of TRAB can be used separately for LWR calculations. For PWR modelling, the core model of TRAB has been coupled to the circuit model SMABRE to form the SMATRA code. The versatile modelling capabilities of TRAB have also been utilized in analyses of e.g. the SECURE heating reactor and the RBMK-type reactor (Chernobyl). The report summarizes the extensive validation of TRAB. TRAB has been validated with benchmark problems, comparative calculations against independent analyses, analyses of start-up experiments of nuclear power plants and real plant transients. Comparative RBMK-type reactor calculations have been made against Soviet simulations, and the initial power excursion of the Chernobyl reactor accident has also been calculated with TRAB.

  2. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    International Nuclear Information System (INIS)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X

    2015-01-01

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical applications. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia’s CUDA environment. This limited their code portability to other platforms, hindering the introduction of GPU-based MC dose engines to clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with standard analogue simulation scheme for photon transport and Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on NVidia GPU platform, for a 15MeV electron beam and a 6MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreements were observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences at regions within 10% isodose line for electron beam cases and ∼0.15%–0.17% for photon beam cases. It took oclMC 3–4 sec to perform transport simulation for electron beam on NVidia Titan GPU and 35–51 sec for photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to the differences in both physics model and development environment, which is considered not significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully runs on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross-platform

  3. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2015-06-15

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical applications. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia’s CUDA environment. This limited their code portability to other platforms, hindering the introduction of GPU-based MC dose engines to clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with standard analogue simulation scheme for photon transport and Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on NVidia GPU platform, for a 15MeV electron beam and a 6MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreements were observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences at regions within 10% isodose line for electron beam cases and ∼0.15%–0.17% for photon beam cases. It took oclMC 3–4 sec to perform transport simulation for electron beam on NVidia Titan GPU and 35–51 sec for photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to the differences in both physics model and development environment, which is considered not significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully runs on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross-platform

  4. Coding theory on the m-extension of the Fibonacci p-numbers

    International Nuclear Information System (INIS)

    Basu, Manjusri; Prasad, Bandhu

    2009-01-01

    In this paper, we introduce a new Fibonacci G_{p,m} matrix for the m-extension of the Fibonacci p-numbers, where p (≥ 0) is an integer and m (> 0). Thereby, we discuss various properties of the G_{p,m} matrix and the coding theory that follows from it. We establish the relations among the code elements for all values of p (a nonnegative integer) and m (> 0). We also show that the relation among the code matrix elements for all values of p and m = 1 coincides with the relation among the code matrix elements for all values of p [Basu M, Prasad B. The generalized relations among the code elements for Fibonacci coding theory. Chaos, Solitons and Fractals (2008). doi: 10.1016/j.chaos.2008.09.030]. In general, the correcting ability of the method increases as p increases, but it is independent of m.
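
    The Fibonacci p-numbers underlying the matrix satisfy the recurrence F_p(n) = F_p(n-1) + F_p(n-p-1). A small sketch under one common indexing convention (the paper's indexing may differ):

```python
def fib_p(n, p):
    """n-th Fibonacci p-number under one common convention:
    F(n) = 0 for n <= 0, F(1) = 1, and
    F(n) = F(n-1) + F(n-p-1) for n >= 2.
    (The paper's indexing may differ.)"""
    F = {k: 0 for k in range(-p, 1)}   # zeros for n <= 0
    F[1] = 1
    for k in range(2, n + 1):
        F[k] = F[k - 1] + F[k - p - 1]
    return F[n]

# p = 1 reduces to the classic Fibonacci numbers; larger p stretches the
# sequence, which is what raises the correcting ability of the code.
classic = [fib_p(n, 1) for n in range(1, 8)]   # [1, 1, 2, 3, 5, 8, 13]
p2 = [fib_p(n, 2) for n in range(1, 8)]        # [1, 1, 1, 2, 3, 4, 6]
```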

  5. Task Characterisation and Cross-Platform Programming Through System Identification

    Directory of Open Access Journals (Sweden)

    Theocharis Kyriacou

    2005-12-01

    Full Text Available Developing robust and reliable control code for autonomous mobile robots is difficult, because the interaction between a physical robot and the environment is highly complex, subject to noise and variation, and therefore partly unpredictable. This means that to date it is not possible to predict robot behaviour based on theoretical models. Instead, current methods to develop robot control code still require a substantial trial-and-error component in the software design process. Such iterative refinement could be reduced, we argue, if a more profound theoretical understanding of robot-environment interaction existed. In this paper, we therefore present a modelling method that generates a faithful model of a robot's interaction with its environment, based on data logged while observing a physical robot's behaviour. Because this modelling method (nonlinear modelling using polynomials) is commonly used in the engineering discipline of system identification, we refer to it here as “robot identification”. We show in this paper that using robot identification to obtain a computer model of robot-environment interaction offers several distinct advantages: (i) very compact representations (one-line programs) of the robot control program are generated; (ii) the model can be analysed, for example through sensitivity analysis, leading to a better understanding of the essential parameters underlying the robot's behaviour; and (iii) the generated, compact robot code can be used for cross-platform robot programming, allowing fast transfer of robot code from one type of robot to another. We demonstrate these points through experiments with a Magellan Pro and a Nomad 200 mobile robot.
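
    The core of robot identification, fitting a polynomial model to logged data, can be sketched with ordinary least squares. The data and model below are synthetic stand-ins for a logged sensor-to-command mapping (real system identification also performs structure/term selection):

```python
import numpy as np

# Toy "robot identification": recover a polynomial control law from
# logged sensor/command pairs using least squares.
rng = np.random.default_rng(1)
sensor = rng.uniform(-1, 1, size=200)                 # logged sensor readings
true_coeffs = [0.5, -1.2, 0.8]                        # bias, linear, quadratic
command = (true_coeffs[0]
           + true_coeffs[1] * sensor
           + true_coeffs[2] * sensor ** 2
           + rng.normal(scale=0.01, size=sensor.size))  # measurement noise

# Design matrix of polynomial terms [1, x, x^2]; solve min ||Xc - y||.
X = np.vander(sensor, 3, increasing=True)
coeffs, *_ = np.linalg.lstsq(X, command, rcond=None)
# coeffs ~ [0.5, -1.2, 0.8]: a compact, transferable "one-line" model
```

The fitted coefficient vector is exactly the kind of compact, analysable representation the abstract describes: it can be inspected term by term and re-used as control code on a different robot platform.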

  6. Development of non-orthogonal and 2-dimensional numerical code TFC2D-BFC for fluid flow

    International Nuclear Information System (INIS)

    Park, Ju Yeop; In, Wang Kee; Chun, Tae Hyun; Oh, Dong Seok

    2000-09-01

    An algorithm for a three-dimensional non-orthogonal coordinate system has been developed. The algorithm adopts a non-staggered grid system, Cartesian velocity components as the independent variables of the momentum equations, and the SIMPLER algorithm for the pressure correction equation. Except for the pressure correction method, the selected grid system and independent variables for the momentum equations have been widely used in commercial codes. It is well known that SIMPLER is superior to the SIMPLE algorithm in terms of convergence rate. Using this algorithm, a two-dimensional non-orthogonal numerical code has been completed. The code adopts a structured single square block in the computational domain with a uniform mesh interval. Consequently, any solid body existing in the flow field can be implemented in the numerical code through the blocked-off method devised by Patankar.

  7. Benchmark Simulation for the Development of the Regulatory Audit Subchannel Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, G. H.; Song, C.; Woo, S. W. [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-05-15

    For the safe and reliable operation of a reactor, it is important to accurately predict the flow and temperature distributions in the thermal-hydraulic design of a reactor core. A subchannel approach can give reasonable flow and temperature distributions with a short computing time. The Korea Institute of Nuclear Safety (KINS) is presently reviewing a new subchannel code, THALES, which will substitute for both the THINC-IV and TORC codes. To assess the prediction performance of THALES, KINS is developing a subchannel analysis code for independent audit calculations. The code is based on the workstation version of COBRA-IV-I. The main objective of the present study is to assess the performance of the COBRA-IV-I code by comparing its simulation results with experimental ones for sample problems.

  8. Android platform based smartphones for a logistical remote association repair framework.

    Science.gov (United States)

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-06-25

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use.
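
    The LPT calibration step amounts to estimating a planar projective transform from the four detected QR-code corners. A numpy sketch of the standard four-point (DLT-style) estimate (our illustration, not necessarily the paper's exact algorithm):

```python
import numpy as np

# Estimate the 3x3 projective transform H mapping four source points
# (e.g. detected QR-code corners) to four destination points (the square
# reference frame) by solving the standard 8-unknown linear system.
def homography(src, dst):
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)    # fix the scale with H[2,2] = 1

def apply_h(H, pt):
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# A perspective-distorted quadrilateral mapped back to the unit square:
corners = [(0.1, 0.2), (0.9, 0.1), (1.1, 0.8), (0.0, 1.0)]
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
H = homography(corners, square)
# apply_h(H, (0.1, 0.2)) -> approximately (0.0, 0.0)
```

Once H is known, every pixel of the skewed QR-code image can be resampled into the square frame, which is the fast calibration the abstract attributes to the LPT.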

  9. Android Platform Based Smartphones for a Logistical Remote Association Repair Framework

    Directory of Open Access Journals (Sweden)

    Shao-Fan Lien

    2014-06-01

    Full Text Available The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS, a Maintenance Support Center (MSC and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use.

  10. Tracking code patterns over multiple software versions with Herodotos

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Lawall, Julia; Muller, Gilles

    2010-01-01

    An important element of understanding a software code base is to identify the repetitive patterns of code it contains and how these evolve over time. Some patterns are useful to the software, and may be modularized. Others are detrimental to the software, such as patterns that represent defects...... pattern occurrences over multiple versions of a software project, independent of other changes in the source files. Guided by a user-provided configuration file, Herodotos builds various graphs showing the evolution of the pattern occurrences and computes some statistics. We have evaluated this approach...
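
    The core idea, following a pattern's occurrences across versions, can be miniaturized as counting matches of a pattern in successive file versions. The pattern and versions below are invented; Herodotos itself correlates occurrences across diffs rather than merely counting:

```python
import re

# Toy pattern history: tally occurrences of a (hypothetical) defect
# pattern, an unchecked kmalloc call, in each version of a file.
pattern = re.compile(r"\bkmalloc\([^)]*\)(?!\s*;?\s*if)")

versions = {
    "v1": "p = kmalloc(8); use(p);",                    # defect present
    "v2": "p = kmalloc(8); if (!p) return; use(p);",    # defect fixed
    "v3": "p = kzalloc(8); if (!p) return; use(p);",    # call replaced
}
history = {v: len(pattern.findall(src)) for v, src in versions.items()}
# history -> {"v1": 1, "v2": 0, "v3": 0}: the occurrence disappears in v2
```

A graph of such per-version counts is the simplest form of the evolution graphs the abstract describes; the hard part Herodotos solves is recognizing the *same* occurrence across versions when unrelated lines move around it.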

  11. IVSPlat 1.0: an integrated virtual screening platform with a molecular graphical interface.

    Science.gov (United States)

    Sun, Yin Xue; Huang, Yan Xin; Li, Feng Li; Wang, Hong Yan; Fan, Cong; Bao, Yong Li; Sun, Lu Guo; Ma, Zhi Qiang; Kong, Jun; Li, Yu Xin

    2012-01-05

    The virtual screening (VS) of lead compounds using molecular docking and pharmacophore detection is now an important tool in drug discovery. VS tasks typically require a combination of several software tools and a molecular graphics system. Thus, the integration of all the requisite tools in a single operating environment could reduce the complexity of running VS experiments. However, only a few freely available integrated software platforms have been developed. A free open-source platform, IVSPlat 1.0, was developed in this study for the management and automation of VS tasks. We integrated several VS-related programs into a molecular graphics system to provide a comprehensive platform for the solution of VS tasks based on molecular docking, pharmacophore detection, and a combination of both methods. This tool can be used to visualize intermediate and final results of the VS execution, while also providing a clustering tool for the analysis of VS results. A case study was conducted to demonstrate the applicability of this platform. IVSPlat 1.0 provides a plug-in-based solution for the management, automation, and visualization of VS tasks. IVSPlat 1.0 is an open framework that allows the integration of extra software to extend its functionality and modified versions can be freely distributed. The open source code and documentation are available at http://kyc.nenu.edu.cn/IVSPlat/.

  12. BUILDING A BILLION SPATIO-TEMPORAL OBJECT SEARCH AND VISUALIZATION PLATFORM

    Directory of Open Access Journals (Sweden)

    D. Kakkar

    2017-10-01

    Full Text Available With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC, an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  13. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    Science.gov (United States)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  14. Computer code development plan for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred to the Korean nuclear industry through technology transfer programs with the major pressurized water reactor suppliers worldwide or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and as a result the related design technologies have been satisfactorily accumulated. However, activities to develop native codes to replace certain important computer codes whose use is restricted by the original owners of the technology have been carried out rather poorly. Thus, it is of the highest priority to secure native expertise in the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (System-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, a helical steam generator, and a passive residual heat removal system. Considering these peculiar design characteristics, part of the SMART design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those codes are not directly applicable to the design of an integral reactor such as SMART and should be modified to deal with its peculiar design characteristics. In addition to these modification efforts, new codes should be developed in several design areas. Furthermore, the reliability of modified or newly developed codes should be verified through benchmarking or through tests for the target design. Thus, it is necessary to proceed with the design according to the

  15. Computer code development plan for SMART design

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred to the Korean nuclear industry through technology transfer programs with the major pressurized water reactor suppliers worldwide or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and as a result the related design technologies have been satisfactorily accumulated. However, activities to develop native codes to replace certain important computer codes whose use is restricted by the original owners of the technology have been carried out rather poorly. Thus, it is of the highest priority to secure native expertise in the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (System-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, a helical steam generator, and a passive residual heat removal system. Considering these peculiar design characteristics, part of the SMART design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those codes are not directly applicable to the design of an integral reactor such as SMART and should be modified to deal with its peculiar design characteristics. In addition to these modification efforts, new codes should be developed in several design areas. Furthermore, the reliability of modified or newly developed codes should be verified through benchmarking or through tests for the target design. Thus, it is necessary to proceed with the design according to the

  16. ZENO: N-body and SPH Simulation Codes

    Science.gov (United States)

    Barnes, Joshua E.

    2011-02-01

    The ZENO software package integrates N-body and SPH simulation codes with a large array of programs to generate initial conditions and analyze numerical simulations. Written in C, the ZENO system is portable between Mac, Linux, and Unix platforms. It is in active use at the Institute for Astronomy (IfA), at NRAO, and possibly elsewhere. ZENO programs can perform a wide range of simulation and analysis tasks. While many of these programs were first created for specific projects, they embody algorithms of general applicability and embrace a modular design strategy, so existing code is easily applied to new tasks. Major elements of the system include: structured data file utilities, which facilitate basic operations on binary data, including import/export of ZENO data to other systems; snapshot generation routines, which create particle distributions with various properties (systems with user-specified density profiles can be realized in collisionless or gaseous form, and multiple spherical and disk components may be set up in mutual equilibrium); snapshot manipulation routines, which permit the user to sift, sort, and combine particle arrays, translate and rotate particle configurations, and assign new values to data fields associated with each particle; simulation codes, comprising both pure N-body and combined N-body/SPH programs (pure N-body codes are available in both uniprocessor and parallel versions, while SPH codes offer a wide range of options for gas physics, including isothermal, adiabatic, and radiating models); snapshot analysis programs, which calculate temporal averages, evaluate particle statistics, measure shapes and density profiles, compute kinematic properties, and identify and track objects in particle distributions; and visualization programs, which generate interactive displays and produce still images and videos of particle distributions (the user may specify arbitrary color schemes and viewing transformations).

  17. ADMS Evaluation Platform

    Energy Technology Data Exchange (ETDEWEB)

    2018-01-23

    Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.

  18. US/JAERI calculational benchmarks for nuclear data and codes intercomparison. Article 8

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Jung, J.; Sawan, M.E.; Nakagawa, M.; Mori, T.; Kosako, K.

    1986-01-01

    Prior to analyzing the integral experiments performed at the FNS facility at JAERI, analysts from both the US and JAERI agreed upon four calculational benchmark problems proposed by JAERI to intercompare results based on the various codes and data bases used independently by the two countries. To compare codes, the same data base (ENDF/B-IV) was used; to compare nuclear data libraries, common codes were applied. Some of the benchmarks chosen were geometrically simple and consisted of a single material, to clearly identify sources of discrepancy and thus help in analysing the integral experiments.

  19. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  20. Colombian Artists and Digital Music Platforms: Some Difficulties

    Directory of Open Access Journals (Sweden)

    Marcela Palacio Puerta

    2017-12-01

    Full Text Available The Internet provides new business opportunities for the music industry, especially for both independent artists and record companies, owing to the great proliferation and growth of digital music platforms. However, contrary to what the statistics suggest, artists have not been able to benefit from such opportunities in the expected manner. Academic work on this subject is in its beginnings, especially with respect to the Colombian panorama; therefore, for the first time in the literature, this paper outlines some of the difficulties that Colombian artists face in the world of digital music.

  1. Homemade Buckeye-Pi: A Learning Many-Node Platform for High-Performance Parallel Computing

    Science.gov (United States)

    Amooie, M. A.; Moortgat, J.

    2017-12-01

    We report on the "Buckeye-Pi" cluster, a supercomputer developed in The Ohio State University School of Earth Sciences from 128 inexpensive Raspberry Pi (RPi) 3 Model B single-board computers. Each RPi is equipped with a fast quad-core 1.2 GHz ARMv8 64-bit processor, 1 GB of RAM, and a 32 GB microSD card for local storage. The cluster therefore has a total of 128 GB of RAM distributed over the individual nodes and a flash capacity of 4 TB across 512 processor cores, while benefiting from low power consumption, easy portability, and low total cost. The cluster uses the Message Passing Interface protocol to manage communication between nodes. These features render our platform the most powerful RPi supercomputer to date and suitable for educational applications in high-performance computing (HPC) and the handling of large datasets. In particular, we use the Buckeye-Pi to implement optimized parallel codes in our in-house simulator for subsurface media flows, with the goal of achieving a massively parallelized scalable code. We present benchmarking results for the computational performance across varying numbers of RPi nodes. We believe our project could inspire scientists and students to consider the proposed unconventional cluster architecture as a mainstream and feasible learning platform for challenging engineering and scientific problems.

  2. The Karlsruhe code MODINA for model independent analysis of elastic scattering of spinless particles

    International Nuclear Information System (INIS)

    Gils, H.J.

    1983-12-01

    The Karlsruhe code MODINA (KfK 3063, published November 1980) has been extended, in particular with respect to new approximations in the folding models and to the calculation of errors in the Fourier-Bessel potentials. The corresponding subroutines replacing previous ones are compiled in this first supplement. The listings of the fit-routine package FITEX, missing in the first publication of MODINA, are also included now. (orig.) [de

  3. Comparing Different Strategies in Directed Evolution of Enzyme Stereoselectivity: Single- versus Double-Code Saturation Mutagenesis.

    Science.gov (United States)

    Sun, Zhoutong; Lonsdale, Richard; Li, Guangyue; Reetz, Manfred T

    2016-10-04

    Saturation mutagenesis at sites lining the binding pockets of enzymes constitutes a viable protein engineering technique for enhancing or inverting stereoselectivity. Statistical analysis shows that the oversampling required in the screening step (the bottleneck) increases astronomically as the number of residues in the randomization site increases, which is why reduced amino acid alphabets have been employed, in addition to splitting large sites into smaller ones. Limonene epoxide hydrolase (LEH) has previously served as the experimental platform in these methodological efforts, enabling comparisons between single-code saturation mutagenesis (SCSM) and triple-code saturation mutagenesis (TCSM), which employ only one or three amino acids, respectively, as building blocks. In this study the comparative platform is extended by exploring the efficacy of double-code saturation mutagenesis (DCSM), in which the reduced amino acid alphabet consists of two members, chosen according to the principles of rational design on the basis of structural information. The hydrolytic desymmetrization of cyclohexene oxide is used as the model reaction, with formation of either (R,R)- or (S,S)-cyclohexane-1,2-diol. DCSM proves to be clearly superior to the likewise tested SCSM, affording both R,R- and S,S-selective mutants. These variants are also good catalysts in reactions of further substrates. Docking computations reveal the basis of enantioselectivity. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
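The screening statistics behind the SCSM/DCSM/TCSM comparison can be illustrated numerically. Assuming uniform sampling and ignoring codon degeneracy (a simplification of the published oversampling analysis), a common approximation for the number of clones to screen for ~95% library coverage is V * ln(1/(1 - 0.95)), roughly 3V for V variants:

```python
import math

def library_size(alphabet_size, positions):
    """Distinct variants when each randomized position draws from the alphabet."""
    return alphabet_size ** positions

def transformants_for_coverage(variants, coverage=0.95):
    """Clones to screen so every variant is seen with probability ~coverage,
    assuming uniform sampling: T = V * ln(1 / (1 - coverage))."""
    return math.ceil(variants * math.log(1.0 / (1.0 - coverage)))

# Screening burden at a 3-residue randomization site for a full 20-amino-acid
# alphabet versus the reduced TCSM (3) and DCSM (2) alphabets:
for aa in (20, 3, 2):
    v = library_size(aa, 3)
    print(aa, v, transformants_for_coverage(v))
```

The reduction from 20 to 2 or 3 building blocks shrinks the screening effort by roughly three orders of magnitude at a 3-residue site, which is the practical motivation for reduced alphabets.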

  4. Development of a surface plasmon resonance and nanomechanical biosensing hybrid platform for multiparametric reading.

    Science.gov (United States)

    Alvarez, Mar; Fariña, David; Escuela, Alfonso M; Sendra, Jose Ramón; Lechuga, Laura M

    2013-01-01

    We have developed a hybrid platform that combines two well-known biosensing technologies based on quite different transducer principles: surface plasmon resonance and nanomechanical sensing. The new system allows the simultaneous and real-time detection of two independent parameters, refractive index change (Δn), and surface stress change (Δσ) when a biomolecular interaction takes place. Both parameters have a direct relation with the mass coverage of the sensor surface. The core of the platform is a common fluid cell, where the solution arrives to both sensor areas at the same time and under the same conditions (temperature, velocity, diffusion, etc.). The main objective of this integration is to achieve a better understanding of the physical behaviour of the transducers during sensing, increasing the information obtained in real time in one single experiment. The potential of the hybrid platform is demonstrated by the detection of DNA hybridization.

  5. Exoskeleton-Based Robotic Platform Applied in Biomechanical Modelling of the Human Upper Limb

    Directory of Open Access Journals (Sweden)

    Andres F. Ruiz

    2009-01-01

    Full Text Available One of the approaches to studying the human motor system, and specifically the motor strategies employed during postural tasks of the upper limbs, is to manipulate the mechanical conditions of each joint of the upper limb independently. At the same time, it is essential to pick up the biomechanical signals and bio-potentials generated while the human motor system adapts to the new condition. The aim of this paper is two-fold: first, to describe the design, development and validation of an experimental platform designed to modify or perturb the mechanics of human movement while simultaneously acquiring, processing, displaying and quantifying bioelectric and biomechanical signals; second, to characterise the dynamics of the elbow joint during postural control. A main goal of the study was to determine the feasibility of estimating human elbow joint dynamics from EMG data during maintained posture. In particular, the experimental robotic platform provides data to correlate electromyographic (EMG) activity with kinetic and kinematic information from upper limb motion. The platform consists of an upper limb powered exoskeleton, an EMG acquisition module, a control unit and a software system. Important concerns such as dependability and safety were addressed during development. The platform was evaluated with 4 subjects to identify, using system identification methods, the human joint dynamics, i.e. visco-elasticity. Results obtained in the simulation and experimental phases are presented.

  6. Promoter Analysis Reveals Globally Differential Regulation of Human Long Non-Coding RNA and Protein-Coding Genes

    KAUST Repository

    Alam, Tanvir

    2014-10-02

    Transcriptional regulation of protein-coding genes is increasingly well-understood on a global scale, yet no comparable information exists for long non-coding RNA (lncRNA) genes, which were recently recognized to be as numerous as protein-coding genes in mammalian genomes. We performed a genome-wide comparative analysis of the promoters of human lncRNA and protein-coding genes, finding global differences in specific genetic and epigenetic features relevant to transcriptional regulation. These two groups of genes are hence subject to separate transcriptional regulatory programs, including distinct transcription factor (TF) proteins that significantly favor lncRNA, rather than coding-gene, promoters. We report a specific signature of promoter-proximal transcriptional regulation of lncRNA genes, including several distinct transcription factor binding sites (TFBS). Experimental DNase I hypersensitive site profiles are consistent with active configurations of these lncRNA TFBS sets in diverse human cell types. TFBS ChIP-seq datasets confirm the binding events that we predicted using computational approaches for a subset of factors. For several TFs known to be directly regulated by lncRNAs, we find that their putative TFBSs are enriched at lncRNA promoters, suggesting that the TFs and the lncRNAs may participate in a bidirectional feedback loop regulatory network. Accordingly, cells may be able to modulate lncRNA expression levels independently of mRNA levels via distinct regulatory pathways. Our results also raise the possibility that, given the historical reliance on protein-coding gene catalogs to define the chromatin states of active promoters, a revision of these chromatin signature profiles to incorporate expressed lncRNA genes is warranted in the future.

  7. The use of best estimate codes to improve the simulation in real time

    International Nuclear Information System (INIS)

    Rivero, N.; Esteban, J. A.; Lenhardt, G.

    2007-01-01

    Best estimate codes are assumed to be the technology solution providing the most realistic and accurate response. Best estimate technology provides a complementary solution to the conservative simulation technology usually applied to determine plant safety margins and perform safety-related studies. In the early 1990s, within the MAS project, Tecnatom pioneered the initiative of implementing best estimate codes in its training simulators. The result of this project was the implementation of the first six-equation thermal-hydraulic code worldwide (TRAC-RT) running in a training environment. To meet real-time and other specific training requirements, important difficulties had to be overcome. Tecnatom has just adapted the Global Nuclear Fuel core design code PANAC11, and is about to complete the adaptation of the General Electric TRACG04 thermal-hydraulic code. This technology constitutes a unique solution for nuclear plants aiming to provide the highest fidelity in simulation, enabling the simulator to serve as a multipurpose simulation platform for both engineering and training. In addition, a visual environment designed to optimize the model life cycle, covering both pre- and post-processing activities, is in its late development phase. (Author)

  8. Remote and Virtual Instrumentation Platform for Distance Learning

    Directory of Open Access Journals (Sweden)

    Tom Eppes

    2010-08-01

    Full Text Available This paper presents distance learning using the National Instruments ELVIS II and shows how Multisim can be combined with ELVIS II for distance learning. National Instruments' ELVIS II is a new version that can easily be used for e-learning. It features 12 of the instruments commonly used in engineering and science laboratories, including an oscilloscope, a function generator, a variable power supply, and an isolated digital multimeter, in a low-cost and easy-to-use platform, and it completes the integration with Multisim software for SPICE simulation, which simplifies the teaching of circuit design. As NI ELVIS II is based on LabVIEW, designers can easily customize the 12 instruments or create their own using the provided instrument source code.

  9. Adaptive variable-length coding for efficient compression of spacecraft television data.

    Science.gov (United States)

    Rice, R. F.; Plaunt, J. R.

    1971-01-01

    An adaptive variable length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample to sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
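A minimal sketch of the two ideas at the heart of such a coder, sample-to-sample prediction and per-block selection among candidate codes, is given below. The 21-pixel block length matches the abstract, but the two candidate codes and the error remapping are simplified stand-ins for the concatenated codes of the actual Basic Compressor:

```python
# Toy sketch in the spirit of the adaptive coder described above: predict
# each pixel from its predecessor, remap the signed prediction error to a
# non-negative integer, then, for each 21-pixel block, spend one flag bit to
# select whichever of two candidate codes (unary, or fixed 8-bit for errors
# below 256) is cheaper. The real Basic Compressor selects among three
# concatenated codes per block; this simplification is only illustrative.

def remap(d):
    """Map signed error to non-negative: 0, 1, -1, 2, -2 -> 0, 1, 2, 3, 4."""
    return 2 * d - 1 if d > 0 else -2 * d

def unary(n):
    """Fundamental sequence: n ones followed by a terminating zero."""
    return "1" * n + "0"

def encode(pixels, block=21):
    bits, prev = [], 0
    for i in range(0, len(pixels), block):
        errs = []
        for p in pixels[i:i + block]:
            errs.append(remap(p - prev))
            prev = p
        u = "".join(unary(e) for e in errs)
        f = "".join(format(e, "08b") for e in errs)
        # one flag bit records which code the decoder should use
        bits.append("0" + u if len(u) <= len(f) else "1" + f)
    return "".join(bits)

smooth = [10, 10, 11, 10, 12]    # small errors after the first: unary wins
print(len(encode(smooth)))       # -> 31
```

On smooth data the unary branch approaches the small difference entropy, while on noisy blocks the coder falls back to the fixed-length branch, which is the adaptation mechanism the abstract describes at block granularity.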

  10. Additional extensions to the NASCAP computer code, volume 3

    Science.gov (United States)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  11. Development of Testing Platform for Digital I and C System in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. Y.; Kim, Y. M.; Jeong, C. H. [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2013-10-15

    With the digitalization of the NPP (Nuclear Power Plant) I and C (Instrumentation and Control) system, cyber threats against the I and C system have increased. Moreover, the complexity of the I and C system has increased owing to the adoption of up-to-date technologies (i.e., smart sensors, wireless networks, and Field Programmable Gate Arrays / Complex Programmable Logic Devices) into NPP I and C systems. For example, new issues such as cyber threats are introduced by the digitalized I and C systems and components that replace obsolete analog equipment in existing NPPs. Furthermore, the use of wireless communication, FPGA/CPLD, and smart sensors could introduce new considerations such as Defense-in-Depth and Diversity. Therefore, proof testing of digital I and C systems is required to verify the adverse effects of using up-to-date digital technologies and to identify the criteria to resolve and mitigate (or prevent) the possible effects. The objective of this study is to develop a testing platform for such proof testing. The digital I and C System Test Platform is implemented using test platform hardware, component software, and an architectural design. The digital I and C testing platform includes the safety-related PLC and relevant ladder logic, and Windows-based C++ code for the host PC. For the software, there are seven spike models to confirm each module's functionality and to generate/monitor the signals to/from the PLCs. In future work, the digital I and C System Test Platform architecture will be implemented using the spike models, and a set of acceptance tests covering cyber security, smart sensors, wireless networks, and FPGA/CPLD will be conducted using the digital I and C System Test Platform.

  12. Development of Testing Platform for Digital I and C System in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Park, G. Y.; Kim, Y. M.; Jeong, C. H.

    2013-01-01

    With the digitalization of the NPP (Nuclear Power Plant) I and C (Instrumentation and Control) system, cyber threats against the I and C system have increased. Moreover, the complexity of the I and C system has increased owing to the adoption of up-to-date technologies (i.e., smart sensors, wireless networks, and Field Programmable Gate Arrays / Complex Programmable Logic Devices) into NPP I and C systems. For example, new issues such as cyber threats are introduced by the digitalized I and C systems and components that replace obsolete analog equipment in existing NPPs. Furthermore, the use of wireless communication, FPGA/CPLD, and smart sensors could introduce new considerations such as Defense-in-Depth and Diversity. Therefore, proof testing of digital I and C systems is required to verify the adverse effects of using up-to-date digital technologies and to identify the criteria to resolve and mitigate (or prevent) the possible effects. The objective of this study is to develop a testing platform for such proof testing. The digital I and C System Test Platform is implemented using test platform hardware, component software, and an architectural design. The digital I and C testing platform includes the safety-related PLC and relevant ladder logic, and Windows-based C++ code for the host PC. For the software, there are seven spike models to confirm each module's functionality and to generate/monitor the signals to/from the PLCs. In future work, the digital I and C System Test Platform architecture will be implemented using the spike models, and a set of acceptance tests covering cyber security, smart sensors, wireless networks, and FPGA/CPLD will be conducted using the digital I and C System Test Platform.

  13. Towards Holography via Quantum Source-Channel Codes

    Science.gov (United States)

    Pastawski, Fernando; Eisert, Jens; Wilming, Henrik

    2017-07-01

    While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.
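The entropic quantity invoked above can be made concrete in a classical toy setting. The sketch below computes the conditional mutual information I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C) for a small, invented joint distribution; the paper's setting is quantum, where these Shannon entropies are replaced by von Neumann entropies:

```python
# Classical toy illustration of the entropic quantity mentioned above:
# conditional mutual information I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C).
# The paper works with von Neumann entropies of quantum states; this sketch
# only shows the classical bookkeeping on an invented joint distribution.

from math import log2
from itertools import product

def H(dist):
    """Shannon entropy in bits of a {outcome: probability} dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, keep):
    """Marginalize a {(a, b, c): p} dict onto the index positions in keep."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def cmi(joint):
    """I(A;B|C) for a joint distribution over (a, b, c) outcomes."""
    return (H(marginal(joint, (0, 2))) + H(marginal(joint, (1, 2)))
            - H(joint) - H(marginal(joint, (2,))))

# When C = 0 (probability 1/2), A is a perfect copy of B; when C = 1,
# A and B are uniform and independent. Hence I(A;B|C) = 0.5 * 1 + 0.5 * 0.
p = {abc: 0.0 for abc in product((0, 1), repeat=3)}
for b in (0, 1):
    p[(b, b, 0)] = 0.25
for a, b in product((0, 1), repeat=2):
    p[(a, b, 1)] = 0.125
print(cmi(p))    # -> 0.5
```

A small conditional mutual information is what underwrites the approximate recovery maps: when I(A;B|C) is near zero, A can be reconstructed from C alone, which is the classical shadow of the erasure-decoding guarantee.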

  14. SolTrace: A Ray-Tracing Code for Complex Solar Optical Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wendelin, Tim [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lewandowski, Allan [Allan Lewandowski Solar Consulting LLC, Evergreen, CO (United States)

    2013-10-01

    SolTrace is an optical simulation tool designed to model optical systems used in concentrating solar power (CSP) applications. The code was first written in early 2003, but has seen significant modifications and changes since its inception, including conversion from a Pascal-based software development platform to C++. SolTrace is unique in that it can model virtually any optical system utilizing the sun as the source. It has been made available for free and as such is in use worldwide by industry, universities, and research laboratories. The fundamental design of the code is discussed, including enhancements and improvements over the earlier version. Comparisons are made with other optical modeling tools, both non-commercial and commercial in nature. Finally, modeled results are shown for some typical CSP systems and, in one case, compared to measured optical data.

  15. Mr.CAS-A minimalistic (pure) Ruby CAS for fast prototyping and code generation

    Science.gov (United States)

    Ragni, Matteo

    There are Computer Algebra System (CAS) products on the market offering complete solutions for the manipulation of analytical models. But exporting a model that implements specific algorithms on specific platforms, for particular target languages or numerical libraries, is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separate from the code generation rules, in which best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.

  16. Mr.CAS—A minimalistic (pure) Ruby CAS for fast prototyping and code generation

    Directory of Open Access Journals (Sweden)

    Matteo Ragni

    2017-01-01

    Full Text Available There are Computer Algebra System (CAS) products on the market offering complete solutions for the manipulation of analytical models. But exporting a model that implements specific algorithms on specific platforms, for particular target languages or numerical libraries, is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separate from the code generation rules, in which best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.
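Mr.CAS itself is a Ruby library, but the core capabilities it exposes (substitution, simplification, and code generation from an expression tree) can be sketched language-agnostically. The following Python toy illustrates the concepts only; it is not Mr.CAS's API, and all class and method names are invented:

```python
# Illustrative sketch (not Mr.CAS, which is Ruby) of the three core CAS
# capabilities named above: an expression tree supporting substitution,
# constant-folding simplification, and C-style code generation.

class Expr:
    def __init__(self, op, *args):
        self.op, self.args = op, args

    def subs(self, env):
        """Replace variables by expressions from env, recursively."""
        if self.op == "var":
            return env.get(self.args[0], self)
        if self.op == "const":
            return self
        return Expr(self.op, *(a.subs(env) for a in self.args))

    def simplify(self):
        """Fold subtrees whose operands are all constants."""
        if self.op in ("var", "const"):
            return self
        args = [a.simplify() for a in self.args]
        if all(a.op == "const" for a in args):
            fold = {"+": lambda x, y: x + y, "*": lambda x, y: x * y}
            return Expr("const", fold[self.op](args[0].args[0], args[1].args[0]))
        return Expr(self.op, *args)

    def to_c(self):
        """Emit a C-style infix expression string."""
        if self.op == "var":
            return self.args[0]
        if self.op == "const":
            return str(self.args[0])
        return "(%s %s %s)" % (self.args[0].to_c(), self.op, self.args[1].to_c())

x, two = Expr("var", "x"), Expr("const", 2)
e = Expr("+", Expr("*", two, x), Expr("const", 3))        # 2*x + 3
print(e.to_c())                                           # -> ((2 * x) + 3)
print(e.subs({"x": Expr("const", 5)}).simplify().to_c())  # -> 13
```

Keeping the tree separate from `to_c` is the design point the abstract emphasizes: the same expression can feed several code generation rule sets, one per target language or numerical library.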

  17. Data Platforms and Cities

    DEFF Research Database (Denmark)

    Blok, Anders; Courmont, Antoine; Hoyng, Rolien

    2017-01-01

    This section offers a series of joint reflections on (open) data platforms drawn from a variety of cases, from cycling, traffic and mapping to activism, environment and data brokering. Data platforms play a key role in contemporary urban governance. Linked to open data initiatives, such platforms are of

  18. Integrated Production-Distribution Scheduling Problem with Multiple Independent Manufacturers

    Directory of Open Access Journals (Sweden)

    Jianhong Hao

    2015-01-01

    Full Text Available We consider the nonstandard parts supply chain with a public service platform for machinery integration in China. The platform assigns orders placed by a machinery enterprise to multiple independent manufacturers who produce nonstandard parts, and makes a production schedule and a batch delivery schedule for each manufacturer in a coordinated manner. Each manufacturer has only one plant, with parallel machines, located far away from the other manufacturers. Orders are first processed at the plants and then shipped directly from the plants to the enterprise so as to be finished before a given deadline. We study this integrated production-distribution scheduling problem with multiple manufacturers, maximizing a weighted sum of the manufacturers' profits under the constraints that all orders are finished before the deadline and that no manufacturer's profit is negative. Based on an analysis of the optimality conditions, we formulate the problem as a mixed integer programming model and use CPLEX to solve it.
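The structure of the optimization can be illustrated with a brute-force toy; the paper itself formulates a mixed integer program and solves it with CPLEX. In the sketch below each order goes to one manufacturer with a single machine (a simplification of the parallel-machine setting), and we maximize the weighted profit sum subject to the deadline and nonnegative-profit constraints. All data and names are invented:

```python
# Brute-force toy of the assignment core of the integrated scheduling
# problem: assign every order to one of the independent manufacturers,
# maximizing the weighted sum of manufacturer profits, subject to all
# orders finishing by the deadline and no manufacturer running at a loss.
# Single-machine processing and all numbers are illustrative assumptions.

from itertools import product

def best_assignment(orders, mfrs, deadline):
    """orders: list of (processing_time, revenue); mfrs: list of
    (weight, cost_per_unit_time). Each manufacturer processes its
    assigned orders sequentially on one machine."""
    best_obj, best_plan = None, None
    for plan in product(range(len(mfrs)), repeat=len(orders)):
        load = [0.0] * len(mfrs)
        profit = [0.0] * len(mfrs)
        for o, m in enumerate(plan):
            t, revenue = orders[o]
            load[m] += t
            profit[m] += revenue - mfrs[m][1] * t
        if max(load) > deadline or min(profit) < 0:
            continue                      # infeasible: deadline or loss
        obj = sum(w * p for (w, _), p in zip(mfrs, profit))
        if best_obj is None or obj > best_obj:
            best_obj, best_plan = obj, plan
    return best_obj, best_plan

orders = [(3, 10), (2, 8), (4, 9)]        # (processing time, revenue)
mfrs = [(1.0, 1.0), (1.0, 2.0)]           # (weight, cost per unit time)
result = best_assignment(orders, mfrs, deadline=5)
print(result)                             # -> (14.0, (0, 0, 1))
```

Enumeration grows as (number of manufacturers)^(number of orders), which is exactly why the paper resorts to a MIP formulation and a commercial solver for realistic instances.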

  19. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    Science.gov (United States)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  20. Mobile Platforms and Development Environments

    CERN Document Server

    Helal, Sumi; Li, Wengdong

    2012-01-01

    Mobile platform development has lately become a technological war zone with extremely dynamic and fluid movement, especially in the smart phone and tablet market space. This Synthesis lecture is a guide to the latest developments of the key mobile platforms that are shaping the mobile platform industry. The book covers the three currently dominant native platforms -- iOS, Android and Windows Phone -- along with the device-agnostic HTML5 mobile web platform. The lecture also covers location-based services (LBS) which can be considered as a platform in its own right. The lecture utilizes a sampl

  1. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and application experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of the most important input variables of a code that has many (tens, hundreds) of input variables with uncertainties, and to do this without relying on judgment or exhaustive sensitivity studies. The purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other; e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has been selected by other methods. Also, although SCREEN can select the cases to be run (by random sampling), a user can select cases by other methods and still use the rest of SCREEN for identifying important input variables.
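
The record does not detail SCREEN's algorithm, so the following is only a generic sketch of sampling-based screening in the same spirit: run the model on randomly sampled inputs, then rank inputs by the magnitude of their correlation with the output. The toy "safety analysis code" is invented.

```python
# Generic sampling-based screening of important input variables
# (illustration only; not SCREEN's actual algorithm).
import random

def model(x):
    # Hypothetical analysis code: output dominated by x[0] and x[2].
    return 10.0 * x[0] + 0.1 * x[1] + 5.0 * x[2] ** 2

def screen(model, n_inputs, n_runs=200, seed=1):
    """Rank inputs by |correlation| between sampled input and output."""
    rng = random.Random(seed)
    xs = [[rng.uniform(0.0, 1.0) for _ in range(n_inputs)] for _ in range(n_runs)]
    ys = [model(x) for x in xs]
    ybar = sum(ys) / n_runs
    scores = []
    for i in range(n_inputs):
        xi = [x[i] for x in xs]
        xbar = sum(xi) / n_runs
        cov = sum((a - xbar) * (b - ybar) for a, b in zip(xi, ys))
        sx = sum((a - xbar) ** 2 for a in xi) ** 0.5
        sy = sum((b - ybar) ** 2 for b in ys) ** 0.5
        scores.append(abs(cov / (sx * sy)))
    ranking = sorted(range(n_inputs), key=lambda i: -scores[i])
    return ranking, scores

ranking, scores = screen(model, n_inputs=4)
print(ranking)   # inputs 0 and 2 should rank above 1 and 3
```

The same set of sampled runs could then be reused to fit a response surface and propagate uncertainties, which is the division of labour between SCREEN and PROSA-2 that the record describes.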

  2. Developing an Australian code of construction ethics

    Directory of Open Access Journals (Sweden)

    Sean Francis McCarthy

    2012-05-01

    Full Text Available This article looks at the increasing need to consider the role of ethics in construction. The industry, historically, has been challenged by allegations of a serious shortfall in ethical standards. Only limited attempts have been made to date in Australia to address that concern. Any ethical analysis should consider the definition of ethics and its historical development. This paper considers major historical developments in ethical thinking as well as contemporary thinking on ethics for professional sub-sets. A code could be developed specific to construction. Current methods of addressing ethics in construction and in other industries are also reviewed. This paper argues that developing a code of ethics, supported by other measures, is the way forward. The author’s aim is to promote further discussion and the drafting of a code. This paper includes a summary of other ethical codes that may provide a starting point. The time for reform is upon us, and there is an urgent need for an independent body to take the lead, for fear of floundering and having only found ‘another debating topic’ (Uff 2006).

  3. TACO: a finite element heat transfer code

    International Nuclear Information System (INIS)

    Mason, W.E. Jr.

    1980-02-01

    TACO is a two-dimensional implicit finite element code for heat transfer analysis. It can perform both linear and nonlinear analyses and can be used to solve either transient or steady state problems. Either plane or axisymmetric geometries can be analyzed. TACO has the capability to handle time- or temperature-dependent material properties, and materials may be either isotropic or orthotropic. A variety of time- and temperature-dependent loadings and boundary conditions are available, including temperature, flux, convection, and radiation boundary conditions and internal heat generation. Additionally, TACO has some specialized features such as internal surface conditions (e.g., contact resistance), bulk nodes, enclosure radiation with view factor calculations, and chemical reaction kinetics. A user subprogram feature allows for any type of functional representation of any independent variable. A bandwidth and profile minimization option is also available in the code. Graphical representation of data generated by TACO is provided by a companion post-processor named POSTACO. The theory on which TACO is based is outlined, the capabilities of the code are explained, and the input data required to perform an analysis with TACO are described. Some simple examples are provided to illustrate the use of the code.
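
TACO itself is a 2-D finite element code; as a deliberately smaller illustration of the class of problem it solves, the following sketch integrates 1-D transient heat conduction implicitly (backward Euler with a tridiagonal solve), with fixed-temperature boundary conditions. All parameters are invented.

```python
# Implicit (backward Euler) 1-D transient heat conduction sketch,
# illustrating the transient/steady-state problems described above.
# TACO is 2-D finite element; this toy is 1-D finite difference.

def solve_tridiag(a, b, c, d):
    """Thomas algorithm: a = sub-, b = main, c = super-diagonal, d = rhs."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def step(T, alpha, dx, dt, T_left, T_right):
    """One backward-Euler step of dT/dt = alpha * d2T/dx2 on interior nodes."""
    n = len(T) - 2
    r = alpha * dt / dx ** 2
    a = [-r] * n
    b = [1.0 + 2.0 * r] * n
    c = [-r] * n
    d = list(T[1:-1])
    d[0] += r * T_left       # fold fixed boundary temperatures into the rhs
    d[-1] += r * T_right
    return [T_left] + solve_tridiag(a, b, c, d) + [T_right]

# Bar initially at 0, ends held at 100 and 0: marches to a linear profile.
T = [100.0] + [0.0] * 20
for _ in range(2000):
    T = step(T, alpha=1.0, dx=0.05, dt=0.01, T_left=100.0, T_right=0.0)
print([round(t, 1) for t in T[:5]])
```

Being implicit, the scheme is unconditionally stable, so large time steps can be taken toward steady state; this is the usual motivation for implicit formulations like TACO's.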

  4. ITS Platform North Denmark

    DEFF Research Database (Denmark)

    Lahrmann, Harry; Agerholm, Niels; Juhl, Jens

    2012-01-01

    This paper presents the project entitled “ITS Platform North Denmark” which is used as a test platform for Intelligent Transportation System (ITS) solutions. The platform consists of a newly developed GNSS/GPRS On Board Unit (OBU) to be installed in 500 cars, a backend server and a specially...

  5. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.
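
As a concrete toy instance of the toric-code construction referred to above: fix a lattice polytope P, take the monomials x^a y^b with (a, b) in P, and evaluate them at every point of the algebraic torus (F_q*)^2. The choice q = 5 and P = the unit square below is mine, picked small enough to brute-force the minimum distance.

```python
# Toy toric code: evaluate the monomials of a lattice polytope at all
# points of (F_q*)^2. Parameters (q = 5, P = unit square) are chosen
# for illustration; the brute-force distance check is feasible here.
from itertools import product

q = 5
torus = [(x, y) for x in range(1, q) for y in range(1, q)]   # (F_q*)^2
monomials = [(0, 0), (1, 0), (0, 1), (1, 1)]                 # P = unit square

# Generator matrix: one row per monomial, one column per torus point.
G = [[pow(x, a, q) * pow(y, b, q) % q for (x, y) in torus]
     for (a, b) in monomials]

# Brute-force minimum distance over all q^k - 1 nonzero codewords.
k, n = len(G), len(torus)
d = n
for coeffs in product(range(q), repeat=k):
    if any(coeffs):
        w = sum(1 for j in range(n)
                if sum(c * G[i][j] for i, c in enumerate(coeffs)) % q != 0)
        d = min(d, w)
print(n, k, d)
```

For this polytope a polynomial a + bx + cy + dxy vanishes on at most 7 torus points (a vertical plus a horizontal line minus their intersection), so the code has parameters [16, 4, 9] over F_5, which the brute force confirms.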

  6. Platform development supported by gaming

    DEFF Research Database (Denmark)

    Mikkola, Juliana Hsuan; Hansen, Poul H. Kyvsgård

    2007-01-01

    The challenge of implementing industrial platforms in practice can be described as a configuration problem caused by a high number of variables, which often have contradictory influences on the total performance of the firm. Consequently, the specific platform decisions become extremely complex, possibly increasing the strategic risks for the firm. This paper reports preliminary findings on the platform management process at LEGO, a Danish toy company. Specifically, we report the process of applying games combined with simulations and workshops in the platform development. We also propose a framework

  7. CBP Phase I Code Integration

    International Nuclear Information System (INIS)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-01-01

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown and Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown and Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. 
As part of the CBP project, a general Dynamic Link Library (DLL) interface

  8. CBP PHASE I CODE INTEGRATION

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. 
As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  9. Progress on DART code optimization

    International Nuclear Information System (INIS)

    Taboada, Horacio; Solis, Diego; Rest, Jeffrey

    1999-01-01

    This work describes progress on the design and development of a new optimized version of the DART code (DART-P), a mechanistic computer model for the performance calculation and assessment of aluminum dispersion fuel. It is part of a collaboration agreement between CNEA and ANL in the area of Low Enriched Uranium Advanced Fuels, conducted under the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy, signed on October 16, 1997 between the US DOE and the National Atomic Energy Commission of the Argentine Republic. DART optimization is a two-year program; it has been operative since February 8, 1999 and has the following goals: 1. Design and develop a new DART calculation kernel for implementation within a parallel processing architecture. 2. Design and develop new user-friendly I/O routines to be resident on a Personal Computer (PC)/Workstation (WS) platform. 2.1. The new input interface will be designed and developed as a visual interface, able to guide the user in the construction of the problem to be analyzed with the aid of a new database (described in item 3, below). The new input interface will include input data checks in order to avoid corrupted input data. 2.2. The new output interface will be designed and developed by means of graphical tools, able to translate numeric output into on-line graphic information. 3. Design and develop a new irradiated materials database, to be resident on the PC/WS platform, so as to facilitate the analysis of the behavior of different fuel and meat compositions with DART-P. Currently, a different version of DART is used for oxide, silicide, and advanced alloy fuels. 4. Develop rigorous general inspection algorithms in order to provide valuable DART-P benchmarks. 5. Design and develop new models, such as superplasticity, elastoplastic feedback, improved models for the calculation of fuel deformation and the evolution of the fuel microstructure for

  10. Accuracy of clinical coding for procedures in oral and maxillofacial surgery.

    Science.gov (United States)

    Khurram, S A; Warner, C; Henry, A M; Kumar, A; Mohammed-Ali, R I

    2016-10-01

    Clinical coding has important financial implications, and discrepancies in the assigned codes can directly affect the funding of a department and hospital. Over the last few years, numerous oversights have been noticed in the coding of oral and maxillofacial (OMF) procedures. To establish the accuracy and completeness of coding, we retrospectively analysed the records of patients during two time periods: March to May 2009 (324 patients), and January to March 2014 (200 patients). Two investigators independently collected and analysed the data to ensure accuracy and remove bias. A large proportion of operations were not assigned all the relevant codes, and only 32% - 33% were correct in both cycles. To our knowledge, this is the first reported audit of clinical coding in OMFS, and it highlights serious shortcomings that have substantial financial implications. Better input by the surgical team and improved communication between the surgical and coding departments will improve accuracy. Copyright © 2016 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  11. Vulnerability assessment of a space based weapon platform electronic system exposed to a thermonuclear weapon detonation

    Science.gov (United States)

    Perez, C. L.; Johnson, J. O.

    Rapidly changing world events, the increased number of nations with intercontinental ballistic missile capability, and the proliferation of nuclear weapon technology will increase the number of nuclear threats facing the world today. Monitoring these nations' activities and providing an early warning and/or intercept system via reconnaissance and surveillance satellites and space based weapon platforms is a viable deterrent against a surprise nuclear attack. However, the deployment of satellite and weapon platform assets in space will subject the sensitive electronic equipment to a variety of natural and man-made radiation environments. These include Van Allen Belt protons and electrons; galactic and solar flare protons; and neutrons, gamma rays, and x-rays from intentionally detonated fission and fusion weapons. In this paper, the MASH v1.0 code system is used to estimate the dose to the critical electronics components of an idealized space based weapon platform from neutron and gamma-ray radiation emitted from a thermonuclear weapon detonation in space. Fluence and dose assessments were performed for the platform fully loaded, and in several stages representing limited engagement scenarios. The results indicate vulnerabilities of the Command, Control, and Communication bay instruments to radiation damage from a nuclear weapon detonation for certain source/platform orientations. The distance at which damage occurs will depend on the weapon yield (n,γ/kiloton) and size (kilotons).

  12. Evolution in public procurement and the impact of e-procurement platforms: A case study

    OpenAIRE

    Leal, Pedro Manuel Mariano

    2010-01-01

    A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics. The aim of this project is to understand the evolution that occurred in public procurement due to the new Public Contracts Code, which changed the procurement procedures and forced public entities to use electronic procurement platforms. This work is a contribution to the understanding of the main procurement changes with this ...

  13. The APS SASE FEL: modeling and code comparison

    International Nuclear Information System (INIS)

    Biedron, S. G.

    1999-01-01

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  14. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the Foxbase language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data-processing program was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adaptation to other programs. Therefore, this program can be used for automation of routine work in the department of radiology.
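
The two-stage lookup the record describes (pick an organ code, then use its first digit to select the matching pathology dictionary) can be sketched as follows. The mini-dictionaries and all codes below are invented, not real ACR codes; only the '131.3661' format follows the record's example.

```python
# Hypothetical two-stage ACR-style lookup: organ code first, then a
# pathology dictionary selected by the organ code's first digit.
# All codes are invented for illustration.
ORGANS = {"chest": "131", "skull": "112"}        # organ name -> organ code
PATHOLOGY = {                                    # first digit -> {name: code}
    "1": {"fracture": "4561", "mass": "3661"},
}

def acr_code(organ, pathology):
    oc = ORGANS[organ]
    pc = PATHOLOGY[oc[0]][pathology]             # file chosen by first digit
    return f"{oc}.{pc}"

print(acr_code("chest", "mass"))                 # -> 131.3661
```

Decoding is the same lookup in reverse, which is why the record notes that one program handles both directions.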

  15. Structure and operation of the ITS code system

    International Nuclear Information System (INIS)

    Halbleib, J.

    1988-01-01

    The TIGER series of time-independent coupled electron-photon Monte Carlo transport codes is a group of multimaterial and multidimensional codes designed to provide a state-of-the-art description of the production and transport of the electron-photon cascade by combining microscopic photon transport with a macroscopic random walk for electron transport. Major contributors to its evolution are listed. The author and his associates are primarily code users rather than code developers, and have borrowed freely from existing work wherever possible. Nevertheless, their efforts have resulted in various software packages for describing the production and transport of the electron-photon cascade that they found sufficiently useful to warrant dissemination through the Radiation Shielding Information Center (RSIC) at Oak Ridge National Laboratory. The ITS system (Integrated TIGER Series) represents the organization and integration of this combined software, along with much additional capability from previously unreleased work, into a single convenient package of exceptional user friendliness and portability. Emphasis is on simplicity and flexibility of application without sacrificing the rigor or sophistication of the physical model

  16. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK
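
The infrastructure above instruments running binaries via Dyninst and reads hardware counters via PAPI; as a much simpler analogue of the idea of runtime instrumentation, the sketch below wraps functions at run time and accumulates per-function call counts and wall-clock time. It is a conceptual illustration only, unrelated to the project's actual APIs.

```python
# Toy runtime instrumentation: wrap a function at run time and collect
# call counts and elapsed wall-clock time, the simplest analogue of the
# backend/front-end split described above.
import time
from collections import defaultdict

profile = defaultdict(lambda: [0, 0.0])   # name -> [calls, seconds]

def instrument(fn):
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            rec = profile[fn.__name__]
            rec[0] += 1                            # call count
            rec[1] += time.perf_counter() - t0     # accumulated time
    return wrapper

@instrument
def work(n):
    return sum(i * i for i in range(n))

for _ in range(3):
    work(10000)
calls, secs = profile["work"]
print(calls, secs > 0.0)
```

Real infrastructures do this without source changes, by rewriting the executable image (Dyninst) and reading hardware counters (PAPI) rather than wall-clock time.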

  17. SpecPad: device-independent NMR data visualization and processing based on the novel DART programming language and Html5 Web technology.

    Science.gov (United States)

    Guigas, Bruno

    2017-09-01

    SpecPad is a new device-independent software program for the visualization and processing of one-dimensional and two-dimensional nuclear magnetic resonance (NMR) time domain (FID) and frequency domain (spectrum) data. It is the result of a project to investigate whether the novel programming language DART, in combination with Html5 Web technology, forms a suitable base to write an NMR data evaluation software which runs on modern computing devices such as Android, iOS, and Windows tablets as well as on Windows, Linux, and Mac OS X desktop PCs and notebooks. Another topic of interest is whether this technique also effectively supports the required sophisticated graphical and computational algorithms. SpecPad is device-independent because DART's compiled executable code is JavaScript and can, therefore, be run by the browsers of PCs and tablets. Because of Html5 browser cache technology, SpecPad may be operated off-line. Network access is only required during data import or export, e.g. via a Cloud service, or for software updates. A professional and easy to use graphical user interface consistent across all hardware platforms supports touch screen features on mobile devices for zooming and panning and for NMR-related interactive operations such as phasing, integration, peak picking, or atom assignment. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Galen-In-Use: using artificial intelligence terminology tools to improve the linguistic coherence of a national coding system for surgical procedures.

    Science.gov (United States)

    Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F

    1998-01-01

    GALEN has developed a language-independent common reference model based on a medically oriented ontology, together with practical tools and techniques for managing healthcare terminology, including natural language processing. GALEN-IN-USE is the current phase, which applies the modelling and the tools to the development or updating of coding systems for surgical procedures in different national coding centres co-operating within the European Federation of Coding Centre (EFCC), to create a language-independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools, the CLAssification Manager workbench, to process French professional medical language rubrics into intermediate dissections and into the Grail reference ontology model representation. From this language-independent concept model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians with the French generated controlled vocabulary, and to finalize the linguistic labels of the coding system in relation to the meanings of the conceptual system structure.

  19. First experience with particle-in-cell plasma physics code on ARM-based HPC systems

    Science.gov (United States)

    Sáez, Xavier; Soba, Alejandro; Sánchez, Edilberto; Mantsinen, Mervi; Mateo, Sergi; Cela, José M.; Castejón, Francisco

    2015-09-01

    In this work, we explore the feasibility of porting a particle-in-cell code (EUTERPE) to an ARM multi-core platform from the Mont-Blanc project. The prototype used is based on a Samsung Exynos 5 system-on-chip with an integrated GPU. It is the first such prototype that could be used for High-Performance Computing (HPC), since it supports double precision and parallel programming languages.
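
The record does not describe EUTERPE's internals (it is a gyrokinetic code), so the following is only a minimal 1-D electrostatic sketch of the particle-in-cell cycle itself: deposit charge to a grid, solve for the field, gather it back to the particles, and push them. All parameters are invented.

```python
# Minimal 1-D electrostatic particle-in-cell cycle, illustrating the
# method's structure (deposit -> field solve -> gather -> push).
# A toy sketch, unrelated to EUTERPE's actual formulation.
import random

L, Ng, Np, dt, qm = 1.0, 32, 1000, 0.05, -1.0
dx = L / Ng
rng = random.Random(0)
xs = [rng.uniform(0.0, L) for _ in range(Np)]
vs = [rng.gauss(0.0, 0.1) for _ in range(Np)]
w = L / Np                          # particle weight: mean density is 1

def deposit(xs):
    """Cloud-in-cell charge deposition onto a periodic grid."""
    rho = [0.0] * Ng
    for x in xs:
        s = x / dx
        i = int(s) % Ng
        f = s - int(s)
        rho[i] += w * (1 - f) / dx
        rho[(i + 1) % Ng] += w * f / dx
    return rho

def efield(rho):
    """Integrate Gauss's law E' = rho - <rho> on the periodic grid."""
    avg = sum(rho) / Ng
    E = [0.0] * Ng
    for i in range(1, Ng):
        E[i] = E[i - 1] + (rho[i - 1] - avg) * dx
    m = sum(E) / Ng
    return [e - m for e in E]       # zero-mean field (no external drive)

for _ in range(20):                 # main PIC loop
    E = efield(deposit(xs))
    for p in range(Np):
        i = int(xs[p] / dx) % Ng
        vs[p] += qm * E[i] * dt             # gather (nearest cell) and kick
        xs[p] = (xs[p] + vs[p] * dt) % L    # move with periodic wrap
print(round(sum(deposit(xs)) * dx, 3))      # total charge is conserved
```

The per-particle loops are the part that parallelizes across cores, which is what makes PIC codes natural candidates for porting to many-core platforms such as the one in the record.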

  20. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in a temporal discrimination task, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, multiple-coding and single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.