WorldWideScience

Sample records for platform independent code

  1. A platform independent framework for Statecharts code generation

    International Nuclear Information System (INIS)

    Andolfato, L.; Chiozzi, G.; Migliorini, N.; Morales, C.

    2012-01-01

    Control systems for telescopes and their instruments are reactive systems that are very well suited to being modelled using the Statecharts formalism. The World Wide Web Consortium is working on a new standard called SCXML that specifies an XML notation for describing Statecharts and provides a well-defined operational semantics for run-time interpretation of SCXML models. This paper presents a generic application framework for reactive non-real-time systems based on interpreted Statecharts. The framework consists of a model-to-text transformation tool and an SCXML interpreter. From UML state machine models, the tool generates the SCXML representation of the state machines as well as application skeletons for the supported software platforms. An abstraction layer propagates events from the middleware to the SCXML interpreter, facilitating support for different software platforms. This project benefits from the positive experience gained in several years of development of coordination and monitoring applications for the telescope control software domain using Model Driven Development technologies. (authors)
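    As a rough illustration of the run-time interpretation idea described above (a generic Python sketch, not the authors' framework or the full SCXML semantics), a state machine can be driven by a transition table and a queue of events delivered by a middleware abstraction layer:

```python
# Minimal sketch of an interpreted state machine: transitions are data,
# events are queued by a middleware layer and processed at run time.
from collections import deque

class StateMachine:
    def __init__(self, initial, transitions):
        # transitions: {(state, event): (target_state, optional action callable)}
        self.state = initial
        self.transitions = transitions
        self.queue = deque()

    def post(self, event):
        """Queue an event coming from the middleware abstraction layer."""
        self.queue.append(event)

    def run(self):
        """Process queued events one at a time, firing matching transitions."""
        while self.queue:
            event = self.queue.popleft()
            key = (self.state, event)
            if key in self.transitions:
                target, action = self.transitions[key]
                if action:
                    action()
                self.state = target

# Hypothetical telescope-like example: OFF -> STANDBY -> ONLINE
sm = StateMachine("OFF", {
    ("OFF", "init"): ("STANDBY", lambda: print("initialising")),
    ("STANDBY", "online"): ("ONLINE", lambda: print("going online")),
    ("ONLINE", "standby"): ("STANDBY", None),
})
sm.post("init")
sm.post("online")
sm.run()
print(sm.state)  # ONLINE
```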

  2. Designing platform independent mobile apps and services

    CERN Document Server

    Heckman, Rocky

    2016-01-01

    This book explains how to create an innovative and future-proof architecture for mobile apps by introducing practical approaches to increase the value and flexibility of their service layers and reduce their delivery time. Designing Platform Independent Mobile Apps and Services begins by describing the mobile computing landscape and previous attempts at cross-platform development. Platform-independent mobile technologies and development strategies are described in chapters two and three. Communication protocols, details of a recommended five-layer architecture, service layers, and the data abstraction layer are also introduced in these chapters. Cross-platform languages and multi-client development tools for the User Interface (UI) layer, as well as message processing patterns and message routing of the Service Interface (SI) layer, are explained in chapters four and five. Ways to design the service layer for mobile computing, using Command Query Responsibility Segregation (CQRS) and the Data Abstraction La...
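    CQRS, one of the patterns named in the description, separates state-changing commands from read-only queries so the two sides can evolve and scale independently. A minimal, hypothetical sketch follows (not taken from the book; all class and method names are illustrative):

```python
# Minimal CQRS sketch: commands mutate a write model, queries read from a
# separate read model. Names are illustrative, not from the book.

class OrderWriteModel:
    def __init__(self):
        self.orders = {}

    def handle_place_order(self, order_id, item):
        """Command handler: validates and mutates state; returns nothing."""
        if order_id in self.orders:
            raise ValueError("duplicate order")
        self.orders[order_id] = item

class OrderReadModel:
    def __init__(self, write_model):
        self._write_model = write_model

    def get_order(self, order_id):
        """Query handler: reads state, never mutates it."""
        return self._write_model.orders.get(order_id)

write_side = OrderWriteModel()
read_side = OrderReadModel(write_side)
write_side.handle_place_order("A-1", "coffee")
print(read_side.get_order("A-1"))  # coffee
```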

  3. A platform independent communication library for distributed computing

    NARCIS (Netherlands)

    Groen, D.; Rieder, S.; Grosso, P.; de Laat, C.; Portegies Zwart, S.

    2010-01-01

    We present MPWide, a platform independent communication library for performing message passing between supercomputers. Our library couples several local MPI applications through a long-distance network using, for example, optical links. The implementation is deliberately kept light-weight, platform...

  4. Multimedia distribution using network coding on the iphone platform

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Pedersen, Morten Videbæk; Fitzek, Frank

    2010-01-01

    This paper looks into the implementation details of random linear network coding on the Apple iPhone and iPod Touch mobile platforms for multimedia distribution. Previous implementations of network coding on this platform failed to achieve a throughput which is sufficient to saturate the WLAN...
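    As background on the technique named in the title, random linear network coding transmits random linear combinations of the original packets so that any sufficiently large set of coded packets can be decoded. The sketch below is a simplification over the binary field GF(2), where coefficients are single bits and combination reduces to XOR; it is illustrative only and not the paper's implementation:

```python
# Illustrative random linear network coding over GF(2): coded packets are
# XOR combinations of source packets selected by random bit coefficients.
import random

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(packets, num_coded):
    """Return (coefficient_vector, payload) pairs for num_coded coded packets."""
    coded = []
    size = len(packets[0])
    for _ in range(num_coded):
        coeffs = [random.randint(0, 1) for _ in packets]
        if not any(coeffs):          # avoid the useless all-zero combination
            coeffs[random.randrange(len(coeffs))] = 1
        payload = bytes(size)
        for c, p in zip(coeffs, packets):
            if c:
                payload = xor_bytes(payload, p)
        coded.append((coeffs, payload))
    return coded

generation = [b"AAAA", b"BBBB", b"CCCC"]
for coeffs, payload in encode(generation, 4):
    print(coeffs, payload)
```

    A receiver collects coefficient vectors and payloads and recovers the original packets by Gaussian elimination over the same field.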

  5. Platform-independent method for computer aided schematic drawings

    Science.gov (United States)

    Vell, Jeffrey L [Slingerlands, NY]; Siganporia, Darius M [Clifton Park, NY]; Levy, Arthur J [Fort Lauderdale, FL]

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  6. Porting of serial molecular dynamics code on MIMD platforms

    International Nuclear Information System (INIS)

    Celino, M.

    1995-05-01

    A molecular dynamics (MD) code, used for the study of atomistic models of metallic systems, has been parallelized for MIMD (Multiple Instructions Multiple Data) parallel platforms by means of the Parallel Virtual Machine (PVM) message passing library. Since the parallelization requires modifications of the sequential algorithms, these are described from the point of view of statistical mechanics. Furthermore, the techniques and parallelization strategies adopted, as well as the parallel MD code itself, are described in detail. Benchmarks on several MIMD platforms (IBM SP1 and SP2, Cray T3D, cluster of workstations) allow evaluation of the code's performance against the different characteristics of the parallel platforms

  7. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    Science.gov (United States)

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  8. Data exchange between zero dimensional code and physics platform in the CFETR integrated system code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Guoliang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 (China)]; Shi, Nan [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China)]; Zhou, Yifu; Mao, Shifeng [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 (China)]; Jian, Xiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, School of Electrical and Electronics Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)]; Chen, Jiale [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China)]; Liu, Li; Chan, Vincent [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 (China)]; Ye, Minyou, E-mail: yemy@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 (China)]

    2016-11-01

    Highlights: • The workflow of the zero dimensional code and the multi-dimension physics platform of the CFETR integrated system code is introduced. • The iteration process among the codes in the physics platform is described. • The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification for performance parameters, is discussed. - Abstract: The China Fusion Engineering Test Reactor (CFETR) integrated system code contains three parts: a zero dimensional code, a physics platform and an engineering platform. We use the zero dimensional code to identify a set of preliminary physics and engineering parameters for CFETR, which is used as input to initiate multi-dimension studies using the physics and engineering platform for design, verification and validation. Effective data exchange between the zero dimensional code and the physics platform is critical for the optimization of the CFETR design. For example, in evaluating the impact of impurity radiation on core performance, an open field line code is used to calculate the impurity transport from the first-wall boundary to the pedestal. The impurity particles at the pedestal are used as boundary conditions in a transport code for calculating impurity transport in the core plasma and the impact of core radiation on core performance. Comparison of the results from the multi-dimensional study to those from the zero dimensional code is used to further refine the controlled radiation model. The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification for performance parameters will be presented in this paper.
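    The data exchange described above is essentially a fixed-point iteration between the zero dimensional code and the physics platform. A schematic sketch follows, with purely hypothetical stub functions standing in for the actual codes:

```python
# Schematic coupling loop between a 0D systems code and a multi-dimensional
# physics platform. run_zero_d() and run_physics_platform() are hypothetical
# placeholders, not the actual CFETR codes.

def run_zero_d(impurity_fraction):
    """Stub 0D code: global parameters given an assumed impurity fraction."""
    return {"P_fusion": 1000.0 * (1.0 - impurity_fraction)}

def run_physics_platform(params):
    """Stub multi-D calculation: returns a refined impurity fraction."""
    return {"impurity_fraction": 0.02}

impurity = 0.01                      # initial guess used by the 0D code
for iteration in range(10):
    zero_d = run_zero_d(impurity)
    multi_d = run_physics_platform(zero_d)
    if abs(multi_d["impurity_fraction"] - impurity) < 1e-4:
        break                        # 0D assumptions and multi-D result agree
    impurity = multi_d["impurity_fraction"]

print(iteration, impurity, zero_d["P_fusion"])
```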

  9. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines

  10. Modular turbine airfoil and platform assembly with independent root teeth

    Science.gov (United States)

    Campbell, Christian X; Davies, Daniel O; Eng, Darryl

    2013-07-30

    A turbine airfoil (22E-H) extends from a shank (23E-H). A platform (30E-H) brackets or surrounds a first portion of the shank (23E-H). Opposed teeth (33, 35) extend laterally from the platform (30E-H) to engage respective slots (50) in a disk. Opposed teeth (25, 27) extend laterally from a second portion of the shank (29) that extends below the platform (30E-H) to engage other slots (52) in the disk. Thus the platform (30E-H) and the shank (23E-H) independently support their own centrifugal loads via their respective teeth. The platform may be formed in two portions (32E-H, 34E-H), that are bonded to each other at matching end-walls (37) and/or via pins (36G) passing through the shank (23E-H). Coolant channels (41, 43) may pass through the shank beside the pins (36G).

  11. Porting of a serial molecular dynamics code on MIMD platforms

    Energy Technology Data Exchange (ETDEWEB)

    Celino, M. [ENEA Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). HPCN Project

    1999-07-01

    A molecular dynamics (MD) code, used for the study of atomistic models of metallic systems, has been parallelized for MIMD (multiple instructions multiple data) parallel platforms by means of the parallel virtual machine (PVM) message passing library. Since the parallelization requires modifications of the sequential algorithms, these are described from the point of view of statistical mechanics. Furthermore, the techniques and parallelization strategies adopted and the parallel MD code are described in detail. Benchmarks on several MIMD platforms (IBM SP1, SP2, Cray T3D, cluster of workstations) allow performance evaluation of the code versus the different characteristics of the parallel platforms. [Italian] A serial molecular dynamics (MD) code used for the study of atomistic models of metallic materials has been parallelized for MIMD (multiple instructions multiple data) parallel platforms using the parallel virtual machine (PVM) libraries. Since the parallelization required modifying the serial algorithms of the code, these are described by revisiting the fundamental concepts of statistical mechanics. The parallelization techniques and strategies employed are also presented, with a detailed description of the parallel MD code. Benchmark results on several MIMD platforms (IBM SP1, SP2, Cray T3D, cluster of workstations) make it possible to analyze the performance of the code as a function of the different characteristics of the parallel platforms.

  12. Independent peer review of nuclear safety computer codes

    International Nuclear Information System (INIS)

    Boyack, B.E.; Jenks, R.P.

    1993-01-01

    A structured, independent computer code peer-review process has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper describes the structured process of independent code peer review, the benefits associated with an independent code peer review, and the authors' recent peer-review experience. The NRC adheres to the principle that safety of plant design, construction, and operation are the responsibility of the licensee. Nevertheless, NRC staff must have the ability to independently assess plant designs and safety analyses submitted by license applicants. According to Ref. 1, "this requires that a sound understanding be obtained of the important physical phenomena that may occur during transients in operating power plants." The NRC concluded that computer codes are the principal products to "understand and predict plant response to deviations from normal operating conditions" and has developed several codes for that purpose. However, codes cannot be used blindly; they must be assessed and found adequate for the purposes for which they are intended. A key part of the qualification process can be accomplished through code peer reviews; this approach has been adopted by the NRC

  13. Independent rate and temporal coding in hippocampal pyramidal cells.

    Science.gov (United States)

    Huxter, John; Burgess, Neil; O'Keefe, John

    2003-10-23

    In the brain, hippocampal pyramidal cells use temporal as well as rate coding to signal spatial aspects of the animal's environment or behaviour. The temporal code takes the form of a phase relationship to the concurrent cycle of the hippocampal electroencephalogram theta rhythm. These two codes could each represent a different variable. However, this requires the rate and phase to vary independently, in contrast to recent suggestions that they are tightly coupled, both reflecting the amplitude of the cell's input. Here we show that the time of firing and firing rate are dissociable, and can represent two independent variables: respectively the animal's location within the place field, and its speed of movement through the field. Independent encoding of location together with actions and stimuli occurring there may help to explain the dual roles of the hippocampus in spatial and episodic memory, or may indicate a more general role of the hippocampus in relational/declarative memory.

  14. PAC++: Object-oriented platform for accelerator codes

    International Nuclear Information System (INIS)

    Malitsky, N.; Reshetov, A.; Bourianoff, G.

    1994-06-01

    Software packages in accelerator physics have relatively long life cycles. They have been developed and used for a wide range of accelerators in the past as well as for current projects. For example, the basic algorithms implemented in the first accelerator program, TRANSPORT, are still relevant for the design of most magnet systems. Most of these packages were implemented in Fortran. However, this language is rather inconvenient as a base language for large integrated projects that could include real-time data acquisition, database access, graphical user interface (GUI) modules, and other features. Some later accelerator programs have been based on object-oriented tools (primarily the C++ language). These range from systems for advanced theoretical studies to control system software. For the new generations of accelerators it would be desirable to have an integrated platform in which all simulation and control tasks are considered from a single point of view. In this report the basic principles of an object-oriented platform for accelerator research software (PAC++) are suggested and analyzed. The primary objectives of this work are to enable an efficient, self-explanatory realization of accelerator concepts and to provide an integrated environment for updating and developing the code
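    To make the object-oriented idea concrete, beamline elements can be modelled as classes sharing a common transfer-map interface. The sketch below is illustrative only (written in Python rather than C++, and not taken from PAC++); it uses the standard 2x2 linear transfer matrices of a drift space and a thin quadrupole acting on the (x, x') phase-space coordinates:

```python
# Object-oriented accelerator-element sketch (illustrative only, not PAC++).
# Each element exposes a 2x2 linear transfer matrix acting on (x, x').

class Drift:
    def __init__(self, length):
        self.length = length
    def matrix(self):
        return [[1.0, self.length], [0.0, 1.0]]

class ThinQuadrupole:
    def __init__(self, focal_length):
        self.f = focal_length
    def matrix(self):
        return [[1.0, 0.0], [-1.0 / self.f, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

class Beamline:
    def __init__(self, elements):
        self.elements = elements
    def matrix(self):
        m = [[1.0, 0.0], [0.0, 1.0]]
        for element in self.elements:
            m = matmul(element.matrix(), m)  # later elements act on the left
        return m

line = Beamline([Drift(2.0), ThinQuadrupole(5.0), Drift(2.0)])
print(line.matrix())
```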

  15. Evidence for modality-independent order coding in working memory.

    Science.gov (United States)

    Depoorter, Ann; Vandierendonck, André

    2009-03-01

    The aim of the present study was to investigate the representation of serial order in working memory, more specifically whether serial order is coded by means of a modality-dependent or a modality-independent order code. This was investigated by means of a series of four experiments based on a dual-task methodology in which one short-term memory task was embedded between the presentation and recall of another short-term memory task. Two aspects were varied in these memory tasks--namely, the modality of the stimulus materials (verbal or visuo-spatial) and the presence of an order component in the task (an order or an item memory task). The results of this study showed impaired primary-task recognition performance when both the primary and the embedded task included an order component, irrespective of the modality of the stimulus materials. If one or both of the tasks did not contain an order component, less interference was found. The results of this study support the existence of a modality-independent order code.

  16. Cross-Platform JavaScript Coding: Shifting Sand Dunes and Shimmering Mirages.

    Science.gov (United States)

    Merchant, David

    1999-01-01

    Most libraries don't have the resources to cross-platform and cross-version test all of their JavaScript coding. Many turn to WYSIWYG; however, WYSIWYG editors don't generally produce optimized coding. Web developers should: test their coding on at least one 3.0 browser, code by hand using tools to help speed that process up, and include a simple…

  17. Practical Salesforce.com development without code customizing salesforce on the Force.com platform

    CERN Document Server

    Weinmeister, Philip

    2014-01-01

    Are you facing a challenging Salesforce.com problem - say, relating to customization, configuration, reporting, dashboards, or formulation - that you can't quite crack? Or maybe you are hoping to infuse some creativity into your solution design strategy to solve problems faster or make solutions more efficient? Practical Salesforce.com Development Without Code shows you how to unlock the power of the Force.com platform to solve real business problems - and all without writing a line of code. Adhering to Salesforce.com's "Clicks, not code" mantra, Salesforce.com expert Phil Weinmeister walks you t

  18. A platform-independent method for detecting errors in metagenomic sequencing data: DRISEE.

    Directory of Open Access Journals (Sweden)

    Kevin P Keegan

    We provide a novel method, DRISEE (duplicate read inferred sequencing error estimation), to assess sequencing quality (alternatively referred to as "noise" or "error") within and/or between sequencing samples. DRISEE provides positional error estimates that can be used to inform read trimming within a sample. It also provides global (whole-sample) error estimates that can be used to identify samples with high or varying levels of sequencing error that may confound downstream analyses, particularly in the case of studies that utilize data from multiple sequencing samples. For shotgun metagenomic data, we believe that DRISEE provides estimates of sequencing error that are more accurate and less constrained by technical limitations than existing methods that rely on reference genomes or the use of scores (e.g. Phred). Here, DRISEE is applied to (non-amplicon) data sets from both the 454 and Illumina platforms. The DRISEE error estimate is obtained by analyzing sets of artifactual duplicate reads (ADRs), a known by-product of both sequencing platforms. We present DRISEE as an open-source, platform-independent method to assess sequencing error in shotgun metagenomic data, and utilize it to discover previously uncharacterized error in de novo sequence data from the 454 and Illumina sequencing platforms.
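    A toy sketch of the underlying idea (not the DRISEE implementation itself): reads sharing an identical prefix are treated as artifactual duplicates of the same template, so positions where they disagree with their cluster consensus give a rough per-position error estimate:

```python
# Toy duplicate-read error estimate: cluster reads by a shared prefix,
# build a per-cluster consensus, and count disagreements by position.
# Illustrative only; DRISEE itself is considerably more sophisticated.
from collections import Counter, defaultdict

def error_profile(reads, prefix_len=10):
    clusters = defaultdict(list)
    for read in reads:
        clusters[read[:prefix_len]].append(read)

    mismatches, coverage = Counter(), Counter()
    for members in clusters.values():
        if len(members) < 2:
            continue  # need duplicates to compare against a consensus
        length = min(len(r) for r in members)
        for pos in range(length):
            bases = [r[pos] for r in members]
            consensus = Counter(bases).most_common(1)[0][0]
            coverage[pos] += len(bases)
            mismatches[pos] += sum(b != consensus for b in bases)
    return {pos: mismatches[pos] / coverage[pos] for pos in coverage}

reads = ["ACGTACGTACGTTT", "ACGTACGTACGTTA", "ACGTACGTACGATT"]
print(error_profile(reads, prefix_len=12))
```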

  19. Application of the SALOME platform to the loose coupling of the CATHENA and ELOCA codes

    International Nuclear Information System (INIS)

    Zhuchkova, A.

    2012-01-01

    Use of coupled codes for the safety analysis of nuclear power plants is highly desirable, as it permits multi-disciplinary studies of complex reactor behaviors and, in particular, accident simulations. The present work demonstrates the potential of the SALOME platform as an interface for creating integrated, multi-disciplinary simulations of reactor scenarios. For this purpose two codes currently in use within the Canadian nuclear industry, CATHENA and ELOCA, were coupled by means of SALOME. The coupled codes were used to model the Power Burst Facility (PBF)-CANDU Test, which was designed to test the thermal-mechanical behavior of PHWR (pressurized heavy water reactor) fuel during a simulated Large Loss-Of-Coolant Accident (LLOCA). The results of the SALOME-coupled simulations are compared with a previous analysis in which the two codes were coupled using a package of scripts. (author)

  20. An Efficient Platform for the Automatic Extraction of Patterns in Native Code

    Directory of Open Access Journals (Sweden)

    Javier Escalada

    2017-01-01

    Different software tools, such as decompilers, code quality analyzers, recognizers of packed executable files, authorship analyzers, and malware detectors, search for patterns in binary code. The use of machine learning algorithms, trained with programs taken from the huge number of applications in the existing open source code repositories, allows finding patterns not detected with the manual approach. To this end, we have created a versatile platform for the automatic extraction of patterns from native code, capable of processing big binary files. Its implementation has been parallelized, providing important runtime performance benefits for multicore architectures. Compared to single-processor execution, the average performance improvement obtained with the best configuration is a factor of 3.5, against a maximum theoretical gain of a factor of 4.
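    As a rough illustration of this kind of parallel pattern extraction (not the authors' platform), the sketch below counts byte n-grams in binary files across a pool of worker processes; the file paths and n-gram length are arbitrary:

```python
# Illustrative parallel extraction of byte n-grams from binary files.
# Not the authors' platform; file paths and n-gram length are arbitrary.
import sys
from collections import Counter
from multiprocessing import Pool

def ngram_counts(path, n=4):
    counts = Counter()
    with open(path, "rb") as f:
        data = f.read()
    for i in range(len(data) - n + 1):
        counts[data[i:i + n]] += 1
    return counts

def extract(paths, workers=4):
    total = Counter()
    with Pool(workers) as pool:
        for partial in pool.map(ngram_counts, paths):
            total.update(partial)
    return total

if __name__ == "__main__":
    patterns = extract(sys.argv[1:])
    for gram, count in patterns.most_common(10):
        print(gram.hex(), count)
```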

  1. Validation of tumor protein marker quantification by two independent automated immunofluorescence image analysis platforms

    Science.gov (United States)

    Peck, Amy R; Girondo, Melanie A; Liu, Chengbao; Kovatich, Albert J; Hooke, Jeffrey A; Shriver, Craig D; Hu, Hai; Mitchell, Edith P; Freydin, Boris; Hyslop, Terry; Chervoneva, Inna; Rui, Hallgeir

    2016-01-01

    Protein marker levels in formalin-fixed, paraffin-embedded tissue sections traditionally have been assayed by chromogenic immunohistochemistry and evaluated visually by pathologists. Pathologist scoring of chromogen staining intensity is subjective and generates low-resolution ordinal or nominal data rather than continuous data. Emerging digital pathology platforms now allow quantification of chromogen or fluorescence signals by computer-assisted image analysis, providing continuous immunohistochemistry values. Fluorescence immunohistochemistry offers greater dynamic signal range than chromogen immunohistochemistry, and combined with image analysis holds the promise of enhanced sensitivity and analytic resolution, and consequently more robust quantification. However, commercial fluorescence scanners and image analysis software differ in features and capabilities, and claims of objective quantitative immunohistochemistry are difficult to validate as pathologist scoring is subjective and there is no accepted gold standard. Here we provide the first side-by-side validation of two technologically distinct commercial fluorescence immunohistochemistry analysis platforms. We document highly consistent results by (1) concordance analysis of fluorescence immunohistochemistry values and (2) agreement in outcome predictions, both for objective, data-driven cutpoint dichotomization with Kaplan–Meier analyses and for employment of continuous marker values to compute receiver operating characteristic curves. The two platforms examined rely on distinct fluorescence immunohistochemistry imaging hardware, microscopy vs line scanning, and functionally distinct image analysis software. Fluorescence immunohistochemistry values for nuclear-localized and tyrosine-phosphorylated Stat5a/b computed by each platform on a cohort of 323 breast cancer cases revealed high concordance after linear calibration, a finding confirmed on an independent 382 case cohort, with concordance correlation coefficients >0
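    Concordance between two continuous measurement platforms is commonly summarized with Lin's concordance correlation coefficient. The sketch below shows that statistic in general form, offered as background rather than as the authors' exact analysis pipeline:

```python
# Lin's concordance correlation coefficient between two sets of measurements
# of the same samples (e.g., marker values from two imaging platforms).
import numpy as np

def concordance_correlation(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()               # population variances
    sxy = ((x - mx) * (y - my)).mean()      # population covariance
    return 2.0 * sxy / (vx + vy + (mx - my) ** 2)

platform_a = [1.0, 2.1, 3.0, 4.2, 5.1]
platform_b = [1.1, 2.0, 3.2, 4.0, 5.3]
print(round(concordance_correlation(platform_a, platform_b), 3))
```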

  2. FASTQSim: platform-independent data characterization and in silico read generation for NGS datasets.

    Science.gov (United States)

    Shcherbina, Anna

    2014-08-15

    High-throughput next generation sequencing technologies have enabled rapid characterization of clinical and environmental samples. Consequently, the largest bottleneck to actionable data has become sample processing and bioinformatics analysis, creating a need for accurate and rapid algorithms to process genetic data. Perfectly characterized in silico datasets are a useful tool for evaluating the performance of such algorithms. Background contaminating organisms are observed in sequenced mixtures of organisms, whereas in silico samples provide exact truth. To create the best value for evaluating algorithms, in silico data should mimic actual sequencer data as closely as possible. FASTQSim is a tool that provides the dual functionality of NGS dataset characterization and metagenomic data generation. FASTQSim is sequencing platform-independent, and computes distributions of read length, quality scores, indel rates, single point mutation rates, indel size, and similar statistics for any sequencing platform. To create training or testing datasets, FASTQSim has the ability to convert target sequences into in silico reads with specific error profiles obtained in the characterization step. FASTQSim enables users to assess the quality of NGS datasets. The tool provides information about read length, read quality, repetitive and non-repetitive indel profiles, and single base pair substitutions. FASTQSim allows the user to simulate individual read datasets that can be used as standardized test scenarios for planning sequencing projects or for benchmarking metagenomic software. In this regard, in silico datasets generated with the FASTQSim tool hold several advantages over natural datasets: they are sequencing platform independent, extremely well characterized, and less expensive to generate. Such datasets are valuable in a number of applications, including the training of assemblers for multiple platforms, benchmarking bioinformatics algorithm performance, and creating challenge
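    A minimal sketch of the characterization half of such a tool: parse a FASTQ file and collect read-length and mean-quality statistics. This is illustrative only; FASTQSim computes far richer profiles, including indel and substitution rates, and the input file name here is hypothetical:

```python
# Minimal FASTQ characterization: read-length and mean Phred-quality
# distributions. Illustrative sketch, not FASTQSim itself.
from collections import Counter

def characterize(fastq_path, phred_offset=33):
    lengths, mean_quals = Counter(), []
    with open(fastq_path) as f:
        while True:
            header = f.readline()
            if not header:
                break
            seq = f.readline().strip()
            f.readline()                      # '+' separator line
            qual = f.readline().strip()
            lengths[len(seq)] += 1
            scores = [ord(c) - phred_offset for c in qual]
            if scores:
                mean_quals.append(sum(scores) / len(scores))
    return lengths, mean_quals

lengths, mean_quals = characterize("reads.fastq")  # hypothetical input file
print("length distribution:", dict(lengths))
print("overall mean quality:", sum(mean_quals) / len(mean_quals))
```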

  3. Benchmark testing and independent verification of the VS2DT computer code

    International Nuclear Information System (INIS)

    McCord, J.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation

  4. A PLC platform-independent structural analysis on FBD programs for digital reactor protection systems

    International Nuclear Information System (INIS)

    Jung, Sejin; Yoo, Junbeom; Lee, Young-Jun

    2017-01-01

    Highlights: • FBD has been widely used to implement safety-critical software for PLC-based systems. • Such safety-critical software should be developed strictly according to safety programming guidelines. • Existing rules lack specific, PLC platform-independent links to the higher-level guidelines in NUREG/CR-6463. • This paper proposes a set of rules on the structure of FBD programs with specific links to the higher-level guidelines. • This paper also provides the CASE tool 'FBD Checker' for analyzing the structure of FBD programs. - Abstract: FBD (function block diagram) has been widely used to implement safety-critical software for PLC (programmable logic controller)-based digital nuclear reactor protection systems. The software should be developed strictly in accordance with safety programming guidelines such as NUREG/CR-6463. PLC vendors' software engineering tools can perform structural analyses of FBD programs, but the specific rules pertaining to the guidelines are enclosed within the commercial tools, and specific links to the guidelines are not clearly communicated. This paper proposes a set of rules on the structure of FBD programs in accordance with the guidelines, and we develop an automatic analysis tool for FBD programs written in the PLCopen TC6 format. With the proposed tool, any FBD program transformed into this open format can be analyzed in a PLC platform-independent manner. We consider a case study on FBD programs obtained from a preliminary version of a Korean nuclear power plant, and we demonstrate the effectiveness and potential of the proposed rules and analysis tool.

  5. Performance evaluations of advanced massively parallel platforms based on gyrokinetic toroidal five-dimensional Eulerian code GT5D

    International Nuclear Information System (INIS)

    Idomura, Yasuhiro; Jolliet, Sebastien

    2010-01-01

    A gyrokinetic toroidal five-dimensional Eulerian code, GT5D, is ported to six advanced massively parallel platforms and comprehensive benchmark tests are performed. A parallelisation technique based on physical properties of the gyrokinetic equation is presented. By extending the parallelisation technique with a hybrid parallel model, the scalability of the code is improved on platforms with multi-core processors. In the benchmark tests, good scalability is confirmed up to several thousand cores on every platform, and a maximum sustained performance of ∼18.6 Tflops is achieved using 16384 cores of BX900. (author)

  6. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed by evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies
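    The quantitative comparison mentioned above reduces to a relative root-mean-square difference between the code output and an analytical solution. A minimal sketch of such a metric follows, in a generic formulation that is not necessarily the exact definition used in the report:

```python
# Relative RMS difference between a numerical solution and an analytical
# reference, as a simple quantitative verification metric.
import math

def relative_rms(numerical, analytical):
    if len(numerical) != len(analytical):
        raise ValueError("solutions must be sampled at the same points")
    num = sum((n - a) ** 2 for n, a in zip(numerical, analytical))
    den = sum(a ** 2 for a in analytical)
    return math.sqrt(num / den)

analytical = [1.00, 0.80, 0.55, 0.30, 0.10]   # e.g., a pressure-head profile
numerical  = [1.01, 0.79, 0.56, 0.29, 0.11]
print(f"relative RMS error: {relative_rms(numerical, analytical):.3%}")
```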

  7. An integrated development framework for rapid development of platform-independent and reusable satellite on-board software

    Science.gov (United States)

    Ziemke, Claas; Kuwahara, Toshinori; Kossev, Ivan

    2011-09-01

    Even in the field of small satellites, the on-board data handling subsystem has become complex and powerful. With the introduction of powerful CPUs and the availability of considerable amounts of memory on board a small satellite, it has become possible to utilize the flexibility and power of contemporary platform-independent real-time operating systems. The non-commercial sector in particular, such as university institutes and community projects like AMSAT or SSETI, is characterized by an inherent lack of financial as well as manpower resources. The opportunity to utilize such real-time operating systems will contribute significantly to achieving a successful mission. Nevertheless, the on-board software of a satellite is much more than just an operating system. It has to fulfill a multitude of functional requirements, such as telecommand interpretation and execution, execution of control loops, generation of telemetry data and frames, failure detection, isolation and recovery, communication with peripherals, and so on. Most of the aforementioned tasks are of a generic nature and have to be conducted on any satellite with only minor modifications. A general set of functional requirements, as well as a protocol for communication, is defined in the ESA ECSS-E-70-41A standard "Telemetry and telecommand packet utilization". This standard not only defines the communication protocol of the satellite-ground link but also defines a set of so-called services which have to be available on board every compliant satellite and which are of a generic nature. In this paper, a platform-independent and reusable framework is described which implements not only the ECSS-E-70-41A standard but also functionality for interprocess communication, scheduling and a multitude of tasks commonly performed on board a satellite. By making use of the capabilities of the high-level programming language C/C++, the powerful open source library BOOST, the real-time operating system RTEMS and

  8. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were used, ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions

  9. Development of a platform-independent receiver control system for SISIFOS

    Science.gov (United States)

    Lemke, Roland; Olberg, Michael

    1998-05-01

    Up to now, receiver control software has been a time-consuming development, usually written by receiver engineers who had mainly the hardware in mind. We present a low-cost and very flexible system which uses a minimal interface to the real hardware and which makes it easy to adapt to new receivers. Our system uses Tcl/Tk as a graphical user interface (GUI), SpecTcl as a GUI builder, Pgplot as plotting software, a Structured Query Language (SQL) database for information storage and retrieval, Ethernet socket-to-socket communication, and SCPI as a command control language. The complete system is in principle platform independent, but for cost-saving reasons we are currently running it on a PC486 under Linux 2.0.30, which is a copylefted Unix. The only hardware-dependent parts are the digital input/output boards and the analog-to-digital and digital-to-analog converters. In the case of the Linux PC we are using a device driver development kit to integrate the boards fully into the kernel of the operating system, which makes them look like ordinary devices. The advantages of this system are, firstly, the low price and, secondly, the clear separation between the different software components, which are available for many operating systems. If it is not possible, due to CPU performance limitations, to run all the software on a single machine, the SQL database or the graphical user interface could be installed on separate computers.
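    SCPI commands such as those mentioned above are plain text exchanged over a socket. A minimal sketch of a client query follows; the host address, port, and command are placeholders for illustration only:

```python
# Minimal SCPI-over-TCP query: send a text command, read the reply.
# Host, port, and command are placeholders for illustration only.
import socket

def scpi_query(host, port, command, timeout=2.0):
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        return sock.recv(4096).decode("ascii").strip()

# Example: ask a (hypothetical) instrument to identify itself.
print(scpi_query("192.168.0.42", 5025, "*IDN?"))
```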

  10. Synchronized Multimedia Streaming on the iPhone Platform with Network Coding

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Fitzek, Frank; Pedersen, Morten Videbæk

    2011-01-01

    This work presents the implementation of synchronized multimedia streaming for the Apple iPhone platform. The idea is to stream multimedia content from a single source to multiple receivers with direct or multihop connections to the source. First we look into existing solutions for video streaming on the iPhone that use point-to-point architectures. After acknowledging their limitations, we propose a solution based on network coding to efficiently and reliably deliver the multimedia content to many devices in a synchronized manner. Then we introduce an application that implements this technique on the iPhone. We also present our testbed, which consists of 16 iPod Touch devices, to showcase the capabilities of our application.

  11. RTE - Compliance with the code of good practices and Independence of RTE. 2013 Annual Report

    International Nuclear Information System (INIS)

    2013-01-01

    RTE Reseau de Transport d'Electricite (Electricity Transmission System Operator) is referred to in Article L111-40 of the French Energy Code as the company in charge of managing France's public electricity transmission grid. For this purpose, RTE must comply with all the rules and obligations that apply to transmission grid management companies as defined by the Energy Code. More particularly, the articles concerning Transmission System Operators (TSOs) belonging to a Vertically Integrated Undertaking (VIU) apply to RTE, a wholly-owned subsidiary of Electricite de France. The purpose of these provisions is to establish and maintain over time the independence of the transmission grid operator vis-a-vis the VIU. The Commission de Regulation de l'Energie (CRE - Energy Regulation Board) certified RTE in its deliberation of January 26, 2012. To maintain this certification, RTE is required to comply with the commitments made within the framework of the certification process and maintain the conditions of independence that were approved by the CRE. Among the obligations that RTE is required to comply with as an independent transmission manager is the need to bring together 'in a code of good practices approved by the Energy Regulation Board, the organisational measures taken to prevent any risks of discriminatory practices in terms of access to the grid' (Article L111-22). RTE is also required to put in place 'a Compliance Officer in charge of ensuring [...] the conformity of its methods with the obligations of independence incumbent on it with regard to other companies belonging to the VIU', 'to verify the application [...] of the commitments appearing in the code of good practices' and 'to draw up an annual report [...] which it sends on to the Energy Regulation Board' on the subject (Article L111-34). This document is the report regarding compliance with the code of good practices for 2013 by the RTE Compliance Officer. It is destined for the CRE and is intended to

  12. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate a considerable amount of data, which often requires bioinformatic expertise to analyze. Here we present the High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering, and converting data produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next-generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction, and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in terms of input file handling provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp).
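    The kind of criteria-based filtering such a tool performs can be pictured with a few lines of scripting. The sketch below is illustrative only: HTDP itself exposes these operations through a GUI, and the file names, column name, and threshold here are made up:

```python
# Rough sketch of criteria-based filtering of character-delimited column data.
# File names, column names, and thresholds are illustrative only.
import csv

def filter_rows(in_path, out_path, column, threshold):
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin, delimiter="\t")
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames, delimiter="\t")
        writer.writeheader()
        for row in reader:
            if float(row[column]) >= threshold:
                writer.writerow(row)

filter_rows("variants.tsv", "filtered.tsv", column="quality", threshold=30.0)
```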

  13. NASA Glenn Steady-State Heat Pipe Code GLENHP: Compilation for 64- and 32-Bit Windows Platforms

    Science.gov (United States)

    Tower, Leonard K.; Geng, Steven M.

    2016-01-01

    A new version of the NASA Glenn Steady State Heat Pipe Code, designated "GLENHP," is introduced here. This represents an update to the disk operating system (DOS) version LERCHP reported in NASA/TM-2000-209807. The new code operates on 32- and 64-bit Windows-based platforms from within the 32-bit command prompt window. An additional evaporator boundary condition and other features are provided.

  14. DC Brushless Motor Control Design and Preliminary Testing for Independent 4-Wheel Drive Rev-11 Robotic Platform

    Directory of Open Access Journals (Sweden)

    Roni Permana Saputra

    2012-03-01

    This paper discusses the design of a control system for a brushless DC motor, using an ATmega16 microcontroller, to be applied to the independent 4-wheel-drive mobile robot LIPI version 2 (REV-11). The control system consists of two parts: a brushless DC motor control module and a supervisory control module that passes the desired commands to the motor control module. To control the REV-11 platform, the supervisory controller transmits speed and direction reference data to control the speed and direction of each actuator on the REV-11 platform. From the test results it is concluded that the designed control system works properly to coordinate and control the speed and direction of motion of the REV-11 platform's actuator motors

  15. Press touch code: A finger press based screen size independent authentication scheme for smart devices

    Science.gov (United States)

    Ranak, M. S. A. Noman; Nor, Nur Nadiah Hanim Binti Mohd; Zamli, Kamal Z.

    2017-01-01

    Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing rapidly. In parallel, security-related threats and attacks on these devices are also increasing at an even greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full sized keyboards. In this paper, we propose a new screen size independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT, also known as Force Touch in Apple's MacBook, Apple Watch, and ZTE's Axon 7 phone, and 3D Touch in iPhone 6 and 7) is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme. PMID:29084262
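    The general idea of turning press force into a code can be pictured as quantizing a sequence of pressure readings into discrete symbols and comparing them against an enrolled template. The following is a loose, hypothetical illustration, not the authors' PTC algorithm; the thresholds and readings are invented:

```python
# Loose illustration of a press-based code: quantize pressure readings into
# discrete levels and compare against an enrolled template. Thresholds and
# values are made up; this is not the authors' PTC algorithm.

LEVELS = [(0.33, "L"), (0.66, "M"), (1.01, "H")]   # light / medium / hard

def to_code(pressures):
    code = []
    for p in pressures:
        for threshold, symbol in LEVELS:
            if p < threshold:
                code.append(symbol)
                break
    return "".join(code)

def verify(enrolled_code, attempt_pressures):
    return to_code(attempt_pressures) == enrolled_code

enrolled = to_code([0.2, 0.9, 0.5, 0.9])            # enrollment -> "LHMH"
print(verify(enrolled, [0.25, 0.85, 0.60, 0.95]))   # True
print(verify(enrolled, [0.25, 0.25, 0.60, 0.95]))   # False
```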

  16. Press touch code: A finger press based screen size independent authentication scheme for smart devices.

    Science.gov (United States)

    Ranak, M S A Noman; Azad, Saiful; Nor, Nur Nadiah Hanim Binti Mohd; Zamli, Kamal Z

    2017-01-01

    Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing rapidly. In parallel, security-related threats and attacks on these devices are also increasing at an even greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full sized keyboards. In this paper, we propose a new screen size independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT, also known as Force Touch in Apple's MacBook, Apple Watch, and ZTE's Axon 7 phone, and 3D Touch in iPhone 6 and 7) is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme.

  17. Press touch code: A finger press based screen size independent authentication scheme for smart devices.

    Directory of Open Access Journals (Sweden)

    M S A Noman Ranak

    Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing rapidly. In parallel, security-related threats and attacks on these devices are also increasing at an even greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full sized keyboards. In this paper, we propose a new screen size independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT, also known as Force Touch in Apple's MacBook, Apple Watch, and ZTE's Axon 7 phone, and 3D Touch in iPhone 6 and 7) is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme.

  18. LeARN: a platform for detecting, clustering and annotating non-coding RNAs

    Directory of Open Access Journals (Sweden)

    Schiex Thomas

    2008-01-01

    Background: In the last decade, sequencing projects have led to the development of a number of annotation systems dedicated to the structural and functional annotation of protein-coding genes. These annotation systems manage the annotation of the non-protein-coding genes (ncRNAs) in a very crude way, allowing neither the editing of secondary structures nor the clustering of ncRNA genes into families, both of which are crucial for appropriate annotation of these molecules. Results: LeARN is a flexible software package which handles the complete process of ncRNA annotation by integrating the layers of automatic detection and human curation. Conclusion: This software provides the infrastructure to deal properly with ncRNAs in the framework of any annotation project. It fills the gap between existing prediction software, which detects independent ncRNA occurrences, and public ncRNA repositories, which do not offer the flexibility and interactivity required for annotation projects. The software is freely available from the download section of the website http://bioinfo.genopole-toulouse.prd.fr/LeARN

  19. RELAP5/MOD3 code manual: Summaries and reviews of independent code assessment reports. Volume 7, Revision 1

    International Nuclear Information System (INIS)

    Moore, R.L.; Sloan, S.M.; Schultz, R.R.; Wilson, G.E.

    1996-10-01

    Summaries of RELAP5/MOD3 code assessments, a listing of the assessment matrix, and a chronology of the various versions of the code are given. Results from these code assessments have been used to formulate a compilation of some of the strengths and weaknesses of the code. These results are documented in the report. Volume 7 was designed to be updated periodically and to include the results of the latest code assessments as they become available. Consequently, users of Volume 7 should ensure that they have the latest revision available

  20. Platforms.

    Science.gov (United States)

    Josko, Deborah

    2014-01-01

    The advent of DNA sequencing technologies and the various applications that can be performed will have a dramatic effect on medicine and healthcare in the near future. There are several DNA sequencing platforms available on the market for research and clinical use. Based on the medical laboratory scientist's or researcher's needs, and taking into consideration laboratory space and budget, one can choose which platform will be beneficial to their institution and their patient population. Although some of the instrument costs seem high, diagnosing a patient quickly and accurately will save hospitals money with fewer hospital stays and targeted treatment based on an individual's genetic make-up. By determining the type of disease an individual has based on the mutations present, or by having the ability to prescribe the appropriate antimicrobials based on knowledge of the organism's resistance patterns, the clinician will be better able to diagnose and treat a patient, which ultimately will improve patient outcomes and prognosis.

  1. FENICIA: a generic plasma simulation code using a flux-independent field-aligned coordinate approach

    International Nuclear Information System (INIS)

    Hariri, Farah

    2013-01-01

    The primary thrust of this work is the development and implementation of a new approach to the problem of field-aligned coordinates in magnetized plasma turbulence simulations called the FCI approach (Flux-Coordinate Independent). The method exploits the elongated nature of micro-instability driven turbulence which typically has perpendicular scales on the order of a few ion gyro-radii, and parallel scales on the order of the machine size. Mathematically speaking, it relies on local transformations that align a suitable coordinate to the magnetic field to allow efficient computation of the parallel derivative. However, it does not rely on flux coordinates, which permits discretizing any given field on a regular grid in the natural coordinates such as (x, y, z) in the cylindrical limit. The new method has a number of advantages over methods constructed starting from flux coordinates, allowing for more flexible coding in a variety of situations including X-point configurations. In light of these findings, a plasma simulation code FENICIA has been developed based on the FCI approach with the ability to tackle a wide class of physical models. The code has been verified on several 3D test models. The accuracy of the approach is tested in particular with respect to the question of spurious radial transport. Tests on 3D models of the drift wave propagation and of the Ion Temperature Gradient (ITG) instability in cylindrical geometry in the linear regime demonstrate again the high quality of the numerical method. Finally, the FCI approach is shown to be able to deal with an X-point configuration such as one with a magnetic island with good convergence and conservation properties. (author) [fr

  2. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    International Nuclear Information System (INIS)

    Szplet, R; Jachna, Z; Kwiatkowski, P; Rozyc, K

    2013-01-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device, the Spartan-6 (Xilinx). To obtain both high precision and a wide measurement range, the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCLs). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter, the coarse resolution of the counting method, equal to 2 ns (the period of the 500 MHz reference clock), is first improved fourfold in the FIS and then by a further factor of more than 400 in the SIS. The proposed solution allows us to overcome the technological limitation in achievable resolution and improve the precision of conversion of integrated interpolators based on tapped delay lines. (paper)

  3. Independent assessment of TRAC and RELAP5 codes through separate effects tests

    International Nuclear Information System (INIS)

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.; Pu, J.

    1983-01-01

    Independent assessment of TRAC-PF1 (Version 7.0), TRAC-BD1 (Version 12.0) and RELAP5/MOD1 (Cycle 14), which was initiated at BNL in FY 1982, was completed in FY 1983. As in previous years, the emphasis at Brookhaven has been on simulating various separate-effects tests with these advanced codes and identifying the areas where further thermal-hydraulic modeling improvements are needed. The following six categories of tests were simulated with the above codes: (1) critical flow tests (Moby-Dick nitrogen-water, BNL flashing flow, Marviken Test 24); (2) Counter-Current Flow Limiting (CCFL) tests (University of Houston, Dartmouth College single and parallel tube tests); (3) level swell tests (G.E. large vessel test); (4) steam generator tests (B and W 19-tube model S.G. tests, FLECHT-SEASET U-tube S.G. tests); (5) natural circulation tests (FRIGG loop tests); and (6) post-CHF tests (Oak Ridge steady-state test)

  4. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    Science.gov (United States)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limitations in their ability to integrate with modern software and hardware and to leverage parallel computing, which have left voids in optimization, pre-, and post-processing tools. Advances in technology and our scientific understanding of environmental processes that have occurred over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premier tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing (hpc) language; and (2) convert the model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented, hpc language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and shown
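
    The two conversion tasks described above can be illustrated with a short, hedged sketch: a Numba-jitted time-stepping loop (a toy linear-reservoir stand-in, not the actual HSPF HYDR/PERLND code) and storage of the resulting series in an HDF5 file via h5py. Dataset names and parameters are assumptions.

        import numpy as np
        import h5py
        from numba import njit

        @njit
        def linear_reservoir(rain, k, storage0):
            # Toy water-balance loop: outflow proportional to storage.
            outflow = np.empty(rain.size)
            s = storage0
            for t in range(rain.size):
                s += rain[t]
                q = k * s
                s -= q
                outflow[t] = q
            return outflow

        rain = np.random.rand(8760)                       # one year of hourly forcing
        q = linear_reservoir(rain, k=0.05, storage0=0.0)  # compiled on first call

        with h5py.File("results.h5", "w") as f:           # open, portable container
            f.create_dataset("RCHRES_1/OUTFLOW", data=q, compression="gzip")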

  5. Combining independent de novo assemblies optimizes the coding transcriptome for nonconventional model eukaryotic organisms.

    Science.gov (United States)

    Cerveau, Nicolas; Jackson, Daniel J

    2016-12-09

    Next-generation sequencing (NGS) technologies are arguably the most revolutionary technical development to join the list of tools available to molecular biologists since PCR. For researchers working with nonconventional model organisms one major problem with the currently dominant NGS platform (Illumina) stems from the obligatory fragmentation of nucleic acid material that occurs prior to sequencing during library preparation. This step creates a significant bioinformatic challenge for accurate de novo assembly of novel transcriptome data. This challenge becomes apparent when a variety of modern assembly tools (of which there is no shortage) are applied to the same raw NGS dataset. With the same assembly parameters these tools can generate markedly different assembly outputs. In this study we present an approach that generates an optimized consensus de novo assembly of eukaryotic coding transcriptomes. This approach does not represent a new assembler, rather it combines the outputs of a variety of established assembly packages, and removes redundancy via a series of clustering steps. We test and validate our approach using Illumina datasets from six phylogenetically diverse eukaryotes (three metazoans, two plants and a yeast) and two simulated datasets derived from metazoan reference genome annotations. All of these datasets were assembled using three currently popular assembly packages (CLC, Trinity and IDBA-tran). In addition, we experimentally demonstrate that transcripts unique to one particular assembly package are likely to be bioinformatic artefacts. For all eight datasets our pipeline generates more concise transcriptomes that in fact possess more unique annotatable protein domains than any of the three individual assemblers we employed. Another measure of assembly completeness (using the purpose built BUSCO databases) also confirmed that our approach yields more information. Our approach yields coding transcriptome assemblies that are more likely to be
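
    A highly simplified, hedged illustration of the redundancy-removal idea described above: contigs from several assemblers are pooled and any sequence contained verbatim in a longer one is dropped. Real pipelines cluster by percent identity with dedicated tools; exact substring containment and the toy sequences below are stand-ins only.

        def merge_assemblies(assemblies):
            """assemblies: {assembler name: {contig id: sequence}}."""
            pooled = [(f"{name}|{cid}", seq.upper())
                      for name, contigs in assemblies.items()
                      for cid, seq in contigs.items()]
            pooled.sort(key=lambda item: len(item[1]), reverse=True)  # longest first
            kept = []
            for cid, seq in pooled:
                if not any(seq in longer for _, longer in kept):      # redundancy filter
                    kept.append((cid, seq))
            return kept

        consensus = merge_assemblies({
            "clc":     {"c1": "ATGGCCATTGTAATGGGCCGCTG"},
            "trinity": {"t1": "GCCATTGTAATGGGCC"},   # contained in c1 -> removed
        })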

  6. Remembering to learn: independent place and journey coding mechanisms contribute to memory transfer.

    Science.gov (United States)

    Bahar, Amir S; Shapiro, Matthew L

    2012-02-08

    The neural mechanisms that integrate new episodes with established memories are unknown. When rats explore an environment, CA1 cells fire in place fields that indicate locations. In goal-directed spatial memory tasks, some place fields differentiate behavioral histories ("journey-dependent" place fields) while others do not ("journey-independent" place fields). To investigate how these signals inform learning and memory for new and familiar episodes, we recorded CA1 and CA3 activity in rats trained to perform a "standard" spatial memory task in a plus maze and in two new task variants. A "switch" task exchanged the start and goal locations in the same environment; an "altered environment" task contained unfamiliar local and distal cues. In the switch task, performance was mildly impaired, new firing maps were stable, but the proportion and stability of journey-dependent place fields declined. In the altered environment, overall performance was strongly impaired, new firing maps were unstable, and stable proportions of journey-dependent place fields were maintained. In both tasks, memory errors were accompanied by a decline in journey codes. The different dynamics of place and journey coding suggest that they reflect separate mechanisms and contribute to distinct memory computations. Stable place fields may represent familiar relationships among environmental features that are required for consistent memory performance. Journey-dependent activity may correspond with goal-directed behavioral sequences that reflect expectancies that generalize across environments. The complementary signals could help link current events with established memories, so that familiarity with either a behavioral strategy or an environment can inform goal-directed learning.

  7. DEVELOPMENT OF SALES APPLICATION OF PREPAID ELECTRICITY VOUCHER BASED ON ANDROID PLATFORM USING QUICK RESPONSE CODE (QR CODE)

    Directory of Open Access Journals (Sweden)

    Ricky Akbar

    2017-09-01

    Full Text Available Perusahaan Listrik Negara (PLN) has implemented a smart electricity system, or prepaid electricity. The customers pay for an electricity voucher before using the electricity. The token contained in the electricity voucher purchased by the customer is entered into the Meter Prabayar (MPB) installed at the customer's location. When a customer purchases a voucher, they receive a receipt that contains all of the customer's identity and the 20-digit voucher code (token) to be entered into the MPB as a substitute for electrical energy credit. The receipt obtained by the customer is vulnerable to loss or misuse by unauthorized parties. In this study, the authors designed and developed an Android-based application that uses QR code technology as a replacement for the receipt of prepaid electricity credit, containing the identity of the customer and the 20-digit voucher code. The application is developed following the waterfall methodology. The implementation steps of the waterfall method used are (1) analysis of the functional requirements of the system by conducting a preliminary study and data collection based on field studies and literature, (2) system design using UML diagrams, Business Process Model and Notation (BPMN) and an Entity Relationship Diagram (ERD), (3) design implementation using Object Oriented Programming (OOP) techniques, where the web application is developed using the Laravel PHP framework and a MySQL database while the mobile application is developed using B4A, and (4) testing of the developed system using the black-box method. The final result of this research is a web and mobile application for the sale of electricity vouchers using QR code technology.
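
    The central step, replacing the paper receipt with a QR image that encodes the customer identity and the 20-digit token, can be sketched with the widely used Python qrcode package. The paper's own implementation uses Laravel/PHP and B4A, so this is only an illustration; the payload fields and values are assumptions.

        import json
        import qrcode   # pip install qrcode[pil]

        # Hypothetical payload standing in for the printed receipt contents.
        payload = json.dumps({
            "customer_id": "521310012345",
            "customer_name": "Budi",
            "token": "1234-5678-9012-3456-7890",   # 20-digit voucher code
        })

        img = qrcode.make(payload)     # returns a PIL image of the QR code
        img.save("voucher_qr.png")     # later scanned by the Android client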

  8. Memory for pictures and sounds: independence of auditory and visual codes.

    Science.gov (United States)

    Thompson, V A; Paivio, A

    1994-09-01

    Three experiments examined the mnemonic independence of auditory and visual nonverbal stimuli in free recall. Stimulus lists consisted of (1) pictures, (2) the corresponding environmental sounds, or (3) picture-sound pairs. In Experiment 1, free recall was tested under three learning conditions: standard intentional, intentional with a rehearsal-inhibiting distracter task, or incidental with the distracter task. In all three groups, recall was best for the picture-sound items. In addition, recall for the picture-sound stimuli appeared to be additive relative to pictures or sounds alone when the distracter task was used. Experiment 2 included two additional groups: In one, two copies of the same picture were shown simultaneously; in the other, two different pictures of the same concept were shown. There was no difference in recall among any of the picture groups; in contrast, recall in the picture-sound condition was greater than recall in either single-modality condition. However, doubling the exposure time in a third experiment resulted in additively higher recall for repeated pictures with different exemplars than ones with identical exemplars. The results are discussed in terms of dual coding theory and alternative conceptions of the memory trace.

  9. Linear-time non-malleable codes in the bit-wise independent tampering model

    NARCIS (Netherlands)

    R.J.F. Cramer (Ronald); I.B. Damgård (Ivan); N.M. Döttling (Nico); I. Giacomelli (Irene); C. Xing (Chaoping)

    2017-01-01

    textabstractNon-malleable codes were introduced by Dziembowski et al. (ICS 2010) as coding schemes that protect a message against tampering attacks. Roughly speaking, a code is non-malleable if decoding an adversarially tampered encoding of a message m produces the original message m or a value m′

  10. prfectBLAST: a platform-independent portable front end for the command terminal BLAST+ stand-alone suite.

    Science.gov (United States)

    Santiago-Sotelo, Perfecto; Ramirez-Prado, Jorge Humberto

    2012-11-01

    prfectBLAST is a multiplatform graphical user interface (GUI) for the stand-alone BLAST+ suite of applications. It allows researchers to do nucleotide or amino acid sequence similarity searches against public (or user-customized) databases that are locally stored. It does not require any dependencies or installation and can be used from a portable flash drive. prfectBLAST is implemented in Java version 6 (SUN) and runs on all platforms that support Java and for which National Center for Biotechnology Information has made available stand-alone BLAST executables, including MS Windows, Mac OS X, and Linux. It is free and open source software, made available under the GNU General Public License version 3 (GPLv3) and can be downloaded at www.cicy.mx/sitios/jramirez or http://code.google.com/p/prfectblast/.
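
    What a front end such as prfectBLAST ultimately does is assemble and run a stand-alone BLAST+ command against a locally stored database. prfectBLAST itself is written in Java; the Python call below only mirrors the command line, and the file paths and database name are placeholders.

        import subprocess

        cmd = [
            "blastn",
            "-query", "my_sequences.fasta",
            "-db", "local_nt",        # database built beforehand with makeblastdb
            "-outfmt", "6",           # tabular output
            "-out", "hits.tsv",
        ]
        subprocess.run(cmd, check=True)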

  11. ON CODE REFACTORING OF THE DIALOG SUBSYSTEM OF CDSS PLATFORM FOR THE OPEN-SOURCE MIS OPENMRS

    Directory of Open Access Journals (Sweden)

    A. V. Semenets

    2016-08-01

    The developer tools and software API of the open-source MIS OpenMRS are reviewed. The results of code refactoring of the dialog subsystem of the CDSS platform, which is implemented as a module for the open-source MIS OpenMRS, are presented. The structure of the information model of the CDSS dialog subsystem database was updated in accordance with MIS OpenMRS requirements. The Model-View-Controller (MVC) based approach to the CDSS dialog subsystem architecture was re-implemented in the Java programming language using the Spring and Hibernate frameworks. The MIS OpenMRS Encounter portlet form for integration of the CDSS dialog subsystem is developed as an extension. The administrative module of the CDSS platform is recreated. The data exchange formats and methods for interaction between the OpenMRS CDSS dialog subsystem module and the DecisionTree GAE service are re-implemented with the help of AJAX technology via the jQuery library.

  12. Signal-independent timescale analysis (SITA) and its application for neural coding during reaching and walking.

    Science.gov (United States)

    Zacksenhouse, Miriam; Lebedev, Mikhail A; Nicolelis, Miguel A L

    2014-01-01

    What are the relevant timescales of neural encoding in the brain? This question is commonly investigated with respect to well-defined stimuli or actions. However, neurons often encode multiple signals, including hidden or internal, which are not experimentally controlled, and thus excluded from such analysis. Here we consider all rate modulations as the signal, and define the rate-modulations signal-to-noise ratio (RM-SNR) as the ratio between the variance of the rate and the variance of the neuronal noise. As the bin-width increases, RM-SNR increases while the update rate decreases. This tradeoff is captured by the ratio of RM-SNR to bin-width, and its variations with the bin-width reveal the timescales of neural activity. Theoretical analysis and simulations elucidate how the interactions between the recovery properties of the unit and the spectral content of the encoded signals shape this ratio and determine the timescales of neural coding. The resulting signal-independent timescale analysis (SITA) is applied to investigate timescales of neural activity recorded from the motor cortex of monkeys during: (i) reaching experiments with Brain-Machine Interface (BMI), and (ii) locomotion experiments at different speeds. Interestingly, the timescales during BMI experiments did not change significantly with the control mode or training. During locomotion, the analysis identified units whose timescale varied consistently with the experimentally controlled speed of walking, though the specific timescale reflected also the recovery properties of the unit. Thus, the proposed method, SITA, characterizes the timescales of neural encoding and how they are affected by the motor task, while accounting for all rate modulations.
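
    A hedged sketch of the quantity defined above: bin a spike train at several bin widths, estimate the rate-modulation variance by subtracting a Poisson-like count noise from the total count variance, and track the ratio of RM-SNR to bin width. The Poisson noise model is a simplifying assumption; the published method treats the unit's recovery properties more carefully.

        import numpy as np

        def rm_snr_over_binwidth(spike_times, duration, bin_widths):
            """Return the RM-SNR / bin-width curve for one unit (simplified)."""
            curve = []
            for w in bin_widths:
                edges = np.arange(0.0, duration + w, w)
                counts, _ = np.histogram(spike_times, bins=edges)
                noise_var = counts.mean()                  # Poisson assumption: var = mean
                rate_var = max(counts.var() - noise_var, 0.0)
                rm_snr = rate_var / noise_var if noise_var > 0 else 0.0
                curve.append(rm_snr / w)
            return np.array(curve)

        # The bin width at which this curve peaks or saturates indicates the
        # dominant timescale of the unit's rate modulations.
        widths = np.array([0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0])   # seconds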

  13. Signal-Independent Timescale Analysis (SITA) and its Application for Neural Coding during Reaching and Walking

    Directory of Open Access Journals (Sweden)

    Miriam eZacksenhouse

    2014-08-01

    Full Text Available What are the relevant timescales of neural encoding in the brain? This question is commonly investigated with respect to well-defined stimuli or actions. However, neurons often encode multiple signals, including hidden or internal, which are not experimentally controlled, and thus excluded from such analysis. Here we consider all rate modulations as the signal, and define the rate-modulations signal-to-noise ratio (RM-SNR) as the ratio between the variance of the rate and the variance of the neuronal noise. As the bin-width increases, RM-SNR increases while the update rate decreases. This tradeoff is captured by the ratio of RM-SNR to bin-width, and its variations with the bin-width reveal the timescales of neural activity. Theoretical analysis and simulations elucidate how the interactions between the recovery properties of the unit and the spectral content of the encoded signals shape this ratio and determine the timescales of neural coding. The resulting signal-independent timescale analysis (SITA) is applied to investigate timescales of neural activity recorded from the motor cortex of monkeys during: (i) reaching experiments with Brain-Machine Interface (BMI), and (ii) locomotion experiments at different speeds. Interestingly, the timescales during BMI experiments did not change significantly with the control mode or training. During locomotion, the analysis identified units whose timescale varied consistently with the experimentally controlled speed of walking, though the specific timescale reflected also the recovery properties of the unit. Thus, the proposed method, SITA, characterizes the timescales of neural encoding and how they are affected by the motor task, while accounting for all rate modulations.

  14. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  15. Development of platform to compare different wall heat transfer packages for system analysis codes

    International Nuclear Information System (INIS)

    Kim, Min-Gil; Lee, Won Woong; Lee, Jeong Ik; Shin, Sung Gil

    2016-01-01

    A system thermal hydraulic (STH) analysis code is used for analyzing and evaluating the safety of a designed nuclear system. The system thermal hydraulic analysis code typically solves mass, momentum and energy conservation equations for multiple phases with sets of selected empirical constitutive equations to close the problem. Several STH codes are utilized by academia, industry and regulators, such as MARS-KS, SPACE, RELAP5, COBRA-TF, TRACE, and so on. Each system thermal hydraulic code consists of different sets of governing equations and correlations. However, the packages and sets of correlations of each code have not yet been compared quantitatively. The wall heat transfer mode transition maps of SPACE and MARS-KS differ slightly for the transition from the wall nucleate heat transfer mode to the wall film heat transfer mode. Both codes have the same heat transfer packages and correlations in most regions except for the wall film heat transfer mode. Most of the heat transfer coefficients calculated for the range of selected variables in SPACE are the same as those of MARS-KS. For wall temperatures between 500 K and 540 K, MARS-KS selects the wall film heat transfer mode and the Bromley correlation, whereas SPACE selects the wall nucleate heat transfer mode and the Chen correlation. This is because the transition from nucleate boiling to film boiling occurs earlier in MARS-KS than in SPACE. More detailed analysis of the heat transfer package and the flow regime package will follow in the near future.

  16. Linear-Time Non-Malleable Codes in the Bit-Wise Independent Tampering Model

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Döttling, Nico

    Non-malleable codes were introduced by Dziembowski et al. (ICS 2010) as coding schemes that protect a message against tampering attacks. Roughly speaking, a code is non-malleable if decoding an adversarially tampered encoding of a message m produces the original message m or a value m' (eventually abort) completely unrelated with m. It is known that non-malleability is possible only for restricted classes of tampering functions. Since their introduction, a long line of works has established feasibility results of non-malleable codes against different families of tampering functions. However... The construction presented builds on the non-malleable codes of Agrawal et al. (TCC 2015) and of Cheraghchi and Guruswami (TCC 2014) and improves the previous result in the bit-wise tampering model: it builds the first non-malleable codes with linear-time complexity and optimal rate (i.e. rate 1 - o(1)).

  17. Performance awareness: execution performance of HEP codes on RISC platforms, issues and solutions

    CERN Document Server

    Yaari, R; Yaari, Refael; Jarp, Sverre

    1995-01-01

    The work described in this paper was started during the migration of Aleph's production jobs from the IBM mainframe/CRAY supercomputer to several RISC/Unix workstation platforms. The aim was to understand why Aleph did not obtain the performance on the RISC platforms that was "promised" after a CERN Unit comparison between these RISC platforms and the IBM mainframe. Remedies were also sought. Since the work with the Aleph jobs in turn led to the related task of understanding compilers and their options, the conditions under which the CERN benchmarks (and other benchmarks) were run, kernel routines and frequently used CERNLIB routines, the whole undertaking expanded to try to look at all the factors that influence the performance of High Energy Physics (HEP) jobs in general. Finally, key performance issues were reviewed against the programs of one of the LHC collaborations (Atlas) with the hope that the conclusions would be of long- term interest during the establishment of their simulation, reconstruction and...

  18. Multitasking the three-dimensional transport code TORT on CRAY platforms

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    The multitasking options in the three-dimensional neutral particle transport code TORT, originally implemented for Cray's CTSS operating system, are revived and extended to run on Cray Y/MP and C90 computers using the UNICOS operating system. These include two coarse-grained domain decompositions: across octants, and across directions within an octant, termed Octant Parallel (OP) and Direction Parallel (DP), respectively. Parallel performance of the DP is significantly enhanced by increasing the task grain size and reducing load imbalance via dynamic scheduling of the discrete angles among the participating tasks. Substantial wall-clock speedup factors, approaching 4.5 using 8 tasks, have been measured in a time-sharing environment, and generally depend on the test problem specifications, number of tasks, and machine loading during execution.
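
    The Direction Parallel option with dynamic scheduling can be illustrated with a small, hedged sketch: the discrete angles of an octant are handed out to whichever worker is free, so faster workers pick up more angles and load imbalance shrinks. The sweep routine, the number of angles and the worker count below are placeholders, not the TORT implementation.

        from concurrent.futures import ProcessPoolExecutor

        def sweep_angle(angle_index):
            # Placeholder for the transport sweep over the spatial mesh for one
            # discrete direction; it would return that direction's flux moments.
            return angle_index

        angles = range(48)   # quadrature directions within one octant (illustrative)

        if __name__ == "__main__":
            # chunksize=1 hands out one angle at a time: dynamic scheduling.
            with ProcessPoolExecutor(max_workers=8) as pool:
                partial_fluxes = list(pool.map(sweep_angle, angles, chunksize=1))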

  19. The Karlsruhe code MODINA for model independent analysis of elastic scattering of spinless particles

    International Nuclear Information System (INIS)

    Gils, H.J.

    1983-12-01

    The Karlsruhe code MODINA (KfK 3063, published November 1980) has been extended, in particular with respect to new approximations in the folding models and to the calculation of errors in the Fourier-Bessel potentials. The corresponding subroutines replacing previous ones are compiled in this first supplement. The listings of the fit-routine package FITEX missing from the first publication of MODINA are also included now. (orig.) [de

  20. High Angular Momentum Halo Gas: A Feedback and Code-independent Prediction of LCDM

    Science.gov (United States)

    Stewart, Kyle R.; Maller, Ariyeh H.; Oñorbe, Jose; Bullock, James S.; Joung, M. Ryan; Devriendt, Julien; Ceverino, Daniel; Kereš, Dušan; Hopkins, Philip F.; Faucher-Giguère, Claude-André

    2017-07-01

    We investigate angular momentum acquisition in Milky Way-sized galaxies by comparing five high resolution zoom-in simulations, each implementing identical cosmological initial conditions but utilizing different hydrodynamic codes: Enzo, Art, Ramses, Arepo, and Gizmo-PSPH. Each code implements a distinct set of feedback and star formation prescriptions. We find that while many galaxy and halo properties vary between the different codes (and feedback prescriptions), there is qualitative agreement on the process of angular momentum acquisition in the galaxy’s halo. In all simulations, cold filamentary gas accretion to the halo results in ∼4 times more specific angular momentum in cold halo gas (λ_cold ≳ 0.1) than in the dark matter halo. At z > 1, this inflow takes the form of inspiraling cold streams that are co-directional in the halo of the galaxy and are fueled, aligned, and kinematically connected to filamentary gas infall along the cosmic web. Due to the qualitative agreement among disparate simulations, we conclude that the buildup of high angular momentum halo gas and the presence of these inspiraling cold streams are robust predictions of Lambda Cold Dark Matter galaxy formation, though the detailed morphology of these streams is significantly less certain. A growing body of observational evidence suggests that this process is borne out in the real universe.

  1. High Angular Momentum Halo Gas: A Feedback and Code-independent Prediction of LCDM

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Kyle R. [Department of Mathematical Sciences, California Baptist University, 8432 Magnolia Ave., Riverside, CA 92504 (United States); Maller, Ariyeh H. [Department of Physics, New York City College of Technology, 300 Jay St., Brooklyn, NY 11201 (United States); Oñorbe, Jose [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Bullock, James S. [Center for Cosmology, Department of Physics and Astronomy, The University of California at Irvine, Irvine, CA 92697 (United States); Joung, M. Ryan [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Devriendt, Julien [Department of Physics, University of Oxford, The Denys Wilkinson Building, Keble Rd., Oxford OX1 3RH (United Kingdom); Ceverino, Daniel [Zentrum für Astronomie der Universität Heidelberg, Institut für Theoretische Astrophysik, Albert-Ueberle-Str. 2, D-69120 Heidelberg (Germany); Kereš, Dušan [Department of Physics, Center for Astrophysics and Space Sciences, University of California at San Diego, 9500 Gilman Dr., La Jolla, CA 92093 (United States); Hopkins, Philip F. [California Institute of Technology, 1200 E. California Blvd., Pasadena, CA 91125 (United States); Faucher-Giguère, Claude-André [Department of Physics and Astronomy and CIERA, Northwestern University, 2145 Sheridan Rd., Evanston, IL 60208 (United States)

    2017-07-01

    We investigate angular momentum acquisition in Milky Way-sized galaxies by comparing five high resolution zoom-in simulations, each implementing identical cosmological initial conditions but utilizing different hydrodynamic codes: Enzo, Art, Ramses, Arepo, and Gizmo-PSPH. Each code implements a distinct set of feedback and star formation prescriptions. We find that while many galaxy and halo properties vary between the different codes (and feedback prescriptions), there is qualitative agreement on the process of angular momentum acquisition in the galaxy’s halo. In all simulations, cold filamentary gas accretion to the halo results in ∼4 times more specific angular momentum in cold halo gas (λ_cold ≳ 0.1) than in the dark matter halo. At z > 1, this inflow takes the form of inspiraling cold streams that are co-directional in the halo of the galaxy and are fueled, aligned, and kinematically connected to filamentary gas infall along the cosmic web. Due to the qualitative agreement among disparate simulations, we conclude that the buildup of high angular momentum halo gas and the presence of these inspiraling cold streams are robust predictions of Lambda Cold Dark Matter galaxy formation, though the detailed morphology of these streams is significantly less certain. A growing body of observational evidence suggests that this process is borne out in the real universe.

  2. Qualification of the APOLLO2 lattice physics code of the NURISP platform for WWER hexagonal lattices

    International Nuclear Information System (INIS)

    Hegyi, G.; Kereszturi, A.; Tota, A.

    2011-01-01

    The experiments performed at the ZR-6 zero critical reactor by the Temporary International Collective and a numerical assembly burnup benchmark specified for depletion calculation of a WWER-440 assembly containing gadolinium burnable poison were used to qualify the APOLLO2 (APOLLO2.8-E3) code as a part of its ongoing validation activity. The work is part of the NURISP project, in which the KFKI Atomic Energy Research Institute undertook to develop and qualify calculation schemes for hexagonal problems. Concerning the ZR-6 measurements, single cell, macro cell and two-dimensional calculations of selected regular and perturbed experiments are being used for the validation. In the two-dimensional cases the radial leakage is also taken into account in the calculations, together with the axial leakage represented by the measured axial buckling. Criticality parameter and reaction rate comparisons are presented. Although various sets of experiments have been selected for the validation, good agreement between the measured and calculated parameters could be found by using the different options offered by APOLLO2. An additional mathematical benchmark, presented in the paper, also attests to the reliability of APOLLO2. All the test results prove the reliability of APOLLO2 for WWER core calculations. (Authors)

  3. REMEMBERING TO LEARN: INDEPENDENT PLACE AND JOURNEY CODING MECHANISMS CONTRIBUTE TO MEMORY TRANSFER

    OpenAIRE

    Bahar, Amir S.; Shapiro, Matthew L.

    2012-01-01

    The neural mechanisms that integrate new episodes with established memories are unknown. When rats explore an environment, CA1 cells fire in place fields that indicate locations. In goal-directed spatial memory tasks, some place fields differentiate behavioral histories (journey-dependent place fields) while others do not (journey-independent place fields). To investigate how these signals inform learning and memory for new and familiar episodes, we recorded CA1 and CA3 activity in rats train...

  4. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    Science.gov (United States)

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  5. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    Directory of Open Access Journals (Sweden)

    Stefan Bosse

    2015-02-01

    Full Text Available Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  6. Independent assessment of TRAC-PD2 and RELAP5/MOD1 codes at BNL in FY 1981

    International Nuclear Information System (INIS)

    Saha, P.; Jo, J.H.; Neymotin, L.; Rohatgi, U.S.; Slovik, G.

    1982-12-01

    This report documents the independent assessment calculations performed with the TRAC-PD2 and RELAP5/MOD1 codes at Brookhaven National Laboratory (BNL) during Fiscal Year 1981. A large variety of separate-effects experiments dealing with (1) steady-state and transient critical flow, (2) level swell, (3) flooding and entrainment, (4) steady-state flow boiling, (5) integral economizer once-through steam generator (IEOTSG) performance, (6) bottom reflood, and (7) two-dimensional phase separation of two-phase mixtures were simulated with TRAC-PD2. In addition, the early part of an overcooling transient which occurred at the Rancho Seco nuclear power plant on March 20, 1978 was also computed with an updated version of TRAC-PD2. Three separate-effects tests dealing with (1) transient critical flow, (2) steady-state flow boiling, and (3) IEOTSG performance were also simulated with the RELAP5/MOD1 code. Comparisons between the code predictions and the test data are presented.

  7. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding

    Directory of Open Access Journals (Sweden)

    Charlotte D’Hulst

    2016-07-01

    Full Text Available Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs) in the main olfactory epithelium express the same odorant receptor (OR) in a singular fashion and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these “MouSensors.” In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines, expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction.

  8. MouSensor: A Versatile Genetic Platform to Create Super Sniffer Mice for Studying Human Odor Coding.

    Science.gov (United States)

    D'Hulst, Charlotte; Mina, Raena B; Gershon, Zachary; Jamet, Sophie; Cerullo, Antonio; Tomoiaga, Delia; Bai, Li; Belluscio, Leonardo; Rogers, Matthew E; Sirotin, Yevgeniy; Feinstein, Paul

    2016-07-26

    Typically, ∼0.1% of the total number of olfactory sensory neurons (OSNs) in the main olfactory epithelium express the same odorant receptor (OR) in a singular fashion and their axons coalesce into homotypic glomeruli in the olfactory bulb. Here, we have dramatically increased the total number of OSNs expressing specific cloned OR coding sequences by multimerizing a 21-bp sequence encompassing the predicted homeodomain binding site sequence, TAATGA, known to be essential in OR gene choice. Singular gene choice is maintained in these "MouSensors." In vivo synaptopHluorin imaging of odor-induced responses by known M71 ligands shows functional glomerular activation in an M71 MouSensor. Moreover, a behavioral avoidance task demonstrates that specific odor detection thresholds are significantly decreased in multiple transgenic lines, expressing mouse or human ORs. We have developed a versatile platform to study gene choice and axon identity, to create biosensors with great translational potential, and to finally decode human olfaction. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Growth platform-dependent and -independent phenotypic and metabolic responses of Arabidopsis and its halophytic relative, Eutrema salsugineum, to salt stress.

    Science.gov (United States)

    Kazachkova, Yana; Batushansky, Albert; Cisneros, Aroldo; Tel-Zur, Noemi; Fait, Aaron; Barak, Simon

    2013-07-01

    Comparative studies of the stress-tolerant Arabidopsis (Arabidopsis thaliana) halophytic relative, Eutrema salsugineum, have proven a fruitful approach to understanding natural stress tolerance. Here, we performed comparative phenotyping of Arabidopsis and E. salsugineum vegetative development under control and salt-stress conditions, and then compared the metabolic responses of the two species on different growth platforms in a defined leaf developmental stage. Our results reveal both growth platform-dependent and -independent phenotypes and metabolic responses. Leaf emergence was affected in a similar way in both species grown in vitro but the effects observed in Arabidopsis occurred at higher salt concentrations in E. salsugineum. No differences in leaf emergence were observed on soil. A new effect of a salt-mediated reduction in E. salsugineum leaf area was unmasked. On soil, leaf area reduction in E. salsugineum was mainly due to a fall in cell number, whereas both cell number and cell size contributed to the decrease in Arabidopsis leaf area. Common growth platform-independent leaf metabolic signatures such as high raffinose and malate, and low fumarate contents that could reflect core stress tolerance mechanisms, as well as growth platform-dependent metabolic responses were identified. In particular, the in vitro growth platform led to repression of accumulation of many metabolites including sugars, sugar phosphates, and amino acids in E. salsugineum compared with the soil system where these same metabolites accumulated to higher levels in E. salsugineum than in Arabidopsis. The observation that E. salsugineum maintains salt tolerance despite growth platform-specific phenotypes and metabolic responses suggests a considerable degree of phenotypic and metabolic adaptive plasticity in this extremophile.

  10. IP-MLI: An Independency of Learning Materials from Platforms in a Mobile Learning using Intelligent Method

    Directory of Open Access Journals (Sweden)

    Mohammed Abdallh Otair

    2006-06-01

    Full Text Available Attempting to deliver a monolithic mobile learning system is too inflexible in view of the heterogeneous mixture of hardware and services available and the desirability of facilitating blended approaches to learning delivery, as well as the question of how to build learning materials that run on all platforms [1]. This paper proposes a framework for a mobile learning system using an intelligent method (IP-MLI). A fuzzy matching method is used to find a suitable learning material design, providing the best match for each learner's specific platform type. The main contribution of the proposed method is to use a software layer to insulate learning materials from device-specific features. Consequently, many versions of learning materials can be designed to work on many platform types.

  11. Evaluation of data discretization methods to derive platform independent isoform expression signatures for multi-class tumor subtyping.

    Science.gov (United States)

    Jung, Segun; Bi, Yingtao; Davuluri, Ramana V

    2015-01-01

    Many supervised learning algorithms have been applied in deriving gene signatures for patient stratification from gene expression data. However, transferring the multi-gene signatures from one analytical platform to another without loss of classification accuracy is a major challenge. Here, we compared three unsupervised data discretization methods--Equal-width binning, Equal-frequency binning, and k-means clustering--in accurately classifying the four known subtypes of glioblastoma multiforme (GBM) when the classification algorithms were trained on the isoform-level gene expression profiles from exon-array platform and tested on the corresponding profiles from RNA-seq data. We applied an integrated machine learning framework that involves three sequential steps; feature selection, data discretization, and classification. For models trained and tested on exon-array data, the addition of data discretization step led to robust and accurate predictive models with fewer number of variables in the final models. For models trained on exon-array data and tested on RNA-seq data, the addition of data discretization step dramatically improved the classification accuracies with Equal-frequency binning showing the highest improvement with more than 90% accuracies for all the models with features chosen by Random Forest based feature selection. Overall, SVM classifier coupled with Equal-frequency binning achieved the best accuracy (> 95%). Without data discretization, however, only 73.6% accuracy was achieved at most. The classification algorithms, trained and tested on data from the same platform, yielded similar accuracies in predicting the four GBM subgroups. However, when dealing with cross-platform data, from exon-array to RNA-seq, the classifiers yielded stable models with highest classification accuracies on data transformed by Equal frequency binning. The approach presented here is generally applicable to other cancer types for classification and identification of
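
    The three unsupervised discretizations compared above can be sketched in a few lines; applied independently to each isoform's expression vector, they turn platform-specific continuous values into ordinal bins. The bin count and the synthetic data are assumptions for illustration.

        import numpy as np
        from sklearn.cluster import KMeans

        def equal_width(x, n_bins=3):
            edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]
            return np.digitize(x, edges)

        def equal_frequency(x, n_bins=3):
            edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))[1:-1]
            return np.digitize(x, edges)

        def kmeans_bins(x, n_bins=3, seed=0):
            labels = KMeans(n_clusters=n_bins, n_init=10,
                            random_state=seed).fit_predict(x.reshape(-1, 1))
            order = np.argsort([x[labels == k].mean() for k in range(n_bins)])
            return np.argsort(order)[labels]      # relabel bins by increasing expression

        x = np.random.lognormal(size=100)         # one isoform across samples
        print(equal_width(x), equal_frequency(x), kmeans_bins(x))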

  12. Modality independence of order coding in working memory: Evidence from cross-modal order interference at recall.

    Science.gov (United States)

    Vandierendonck, André

    2016-01-01

    Working memory researchers do not agree on whether order in serial recall is encoded by dedicated modality-specific systems or by a more general modality-independent system. Although previous research supports the existence of autonomous modality-specific systems, it has been shown that serial recognition memory is prone to cross-modal order interference by concurrent tasks. The present study used a serial recall task, which was performed in a single-task condition and in a dual-task condition with an embedded memory task in the retention interval. The modality of the serial task was either verbal or visuospatial, and the embedded tasks were in the other modality and required either serial or item recall. Care was taken to avoid modality overlaps during presentation and recall. In Experiment 1, visuospatial but not verbal serial recall was more impaired when the embedded task was an order than when it was an item task. Using a more difficult verbal serial recall task, verbal serial recall was also more impaired by another order recall task in Experiment 2. These findings are consistent with the hypothesis of modality-independent order coding. The implications for views on short-term recall and the multicomponent view of working memory are discussed.

  13. Evaluation of Independent Audit and Corporate Governance Practices in Turkey Under The Turkish Commercial Code No. 6102: A Qualitative Research

    Directory of Open Access Journals (Sweden)

    Yasin Karadeniz

    2015-12-01

    Full Text Available The purpose of this study is twofold: to explain the new dimension that corporate governance practices, which have faced difficulties for years in Turkey, have acquired with the Turkish Commercial Code, and, in doing so, to reveal the importance of independent auditing, which has not yet become fully functional and has likewise gone through many problems in practice in Turkey, both at present and in the future, in light of the relationship between the Turkish Commercial Code and corporate governance. Interviews, as a qualitative research method, were conducted face to face with at least one chief auditor (mostly CPAs) working in independent auditing firms in the cities of İzmir and Çanakkale. The interviews with the auditors revealed that the Turkish Commercial Code and corporate governance in Turkey would contribute positively to the development of independent auditing.

  14. microRNA dependent and independent deregulation of long non-coding RNAs by an oncogenic herpesvirus.

    Directory of Open Access Journals (Sweden)

    Sunantha Sethuraman

    2017-07-01

    Full Text Available Kaposi's sarcoma (KS) is a highly prevalent cancer in AIDS patients, especially in sub-Saharan Africa. Kaposi's sarcoma-associated herpesvirus (KSHV) is the etiological agent of KS and other cancers like Primary Effusion Lymphoma (PEL). In KS and PEL, all tumors harbor latent KSHV episomes and express latency-associated viral proteins and microRNAs (miRNAs). The exact molecular mechanisms by which latent KSHV drives tumorigenesis are not completely understood. Recent developments have highlighted the importance of aberrant long non-coding RNA (lncRNA) expression in cancer. Deregulation of lncRNAs by miRNAs is a newly described phenomenon. We hypothesized that KSHV-encoded miRNAs deregulate human lncRNAs to drive tumorigenesis. We performed lncRNA expression profiling of endothelial cells infected with wt and miRNA-deleted KSHV and identified 126 lncRNAs as putative viral miRNA targets. Here we show that KSHV deregulates host lncRNAs in both a miRNA-dependent fashion by direct interaction and in a miRNA-independent fashion through latency-associated proteins. Several lncRNAs that were previously implicated in cancer, including MEG3, ANRIL and UCA1, are deregulated by KSHV. Our results also demonstrate that KSHV-mediated UCA1 deregulation contributes to increased proliferation and migration of endothelial cells.

  15. A New Cyber-enabled Platform for Scale-independent Interoperability of Earth Observations with Hydrologic Models

    Science.gov (United States)

    Rajib, A.; Zhao, L.; Merwade, V.; Shin, J.; Smith, J.; Song, C. X.

    2017-12-01

    Despite the significant potential of remotely sensed earth observations, their application is still not full-fledged in water resources research, management and education. Inconsistent storage structures, data formats and spatial resolution among different platforms/sources of earth observations hinder the use of these data. Available web-services can help bulk data downloading and visualization, but they are not sufficiently tailored to meet the degree of interoperability required for direct application of earth observations in hydrologic modeling at user-defined spatio-temporal scales. Similarly, the least ambiguous way for educators and watershed managers is to instantaneously obtain a time-series at any watershed of interest without spending time and computational resources on data download and post-processing activities. To address this issue, an open access, online platform, named HydroGlobe, is developed that minimizes all these processing tasks and delivers ready-to-use data from different earth observation sources. HydroGlobe can provide spatially-averaged time series of earth observations by using the following inputs: (i) data source, (ii) temporal extent in the form of start/end date, and (iii) geographic units (e.g., grid cell or sub-basin boundary) and extent in the form of GIS shapefile. In its preliminary version, HydroGlobe simultaneously handles five data sources including the surface and root zone soil moisture from SMAP (Soil Moisture Active Passive Mission), actual and potential evapotranspiration from MODIS (Moderate Resolution Imaging Spectroradiometer), and precipitation from GPM (Global Precipitation Measurements). This presentation will demonstrate the HydroGlobe interface and its applicability using few test cases on watersheds from different parts of the globe.
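
    A self-contained, hedged illustration of the reduction HydroGlobe performs: for each time step, average a gridded field over the cells that fall inside a user-supplied geographic unit, producing one time series per unit. In the real service the grid comes from SMAP/MODIS/GPM granules and the mask from a rasterized shapefile; here both are synthetic, and area weighting and missing-data handling are omitted.

        import numpy as np

        nt, nlat, nlon = 30, 50, 60
        soil_moisture = np.random.rand(nt, nlat, nlon)   # stand-in (time, lat, lon) field

        inside = np.zeros((nlat, nlon), dtype=bool)      # cells inside the watershed polygon
        inside[20:35, 10:40] = True

        series = soil_moisture[:, inside].mean(axis=1)   # spatially averaged time series
        print(series.shape)                              # (30,)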

  16. ImagePy: an open-source, Python-based and platform-independent software package for bioimage analysis.

    Science.gov (United States)

    Wang, Anliang; Yan, Xiaolong; Wei, Zhijun

    2018-04-27

    This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution concentrates on facilitating the extensibility and interoperability of the software by decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.

  17. Physical, taxonomic code, and other data from current meter and other instruments in New York Bight from DOLPHIN and other platforms; 14 March 1971 to 03 August 1975 (NODC Accession 7601385)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Physical, taxonomic code, and other data were collected using current meter and other instruments from DOLPHIN and other platforms in New York Bight. Data were...

  18. Taxonomic code, physical, and other data collected from NOAA Ship DELAWARE II and other platforms in New York Bight from net casts and other instruments; 1973-02-20 to 1975-12-16 (NODC Accession 7601402)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Taxonomic Code, physical, and other data were collected using net casts and other instruments in the New York Bight from NOAA Ship DELAWARE II and other platforms....

  19. Independent Evaluation of the integrated Community Case Management of Childhood Illness Strategy in Malawi Using a National Evaluation Platform Design.

    Science.gov (United States)

    Amouzou, Agbessi; Kanyuka, Mercy; Hazel, Elizabeth; Heidkamp, Rebecca; Marsh, Andrew; Mleme, Tiope; Munthali, Spy; Park, Lois; Banda, Benjamin; Moulton, Lawrence H; Black, Robert E; Hill, Kenneth; Perin, Jamie; Victora, Cesar G; Bryce, Jennifer

    2016-03-01

    We evaluated the impact of integrated community case management of childhood illness (iCCM) on careseeking for childhood illness and child mortality in Malawi, using a National Evaluation Platform dose-response design with 27 districts as units of analysis. "Dose" variables included density of iCCM providers, drug availability, and supervision, measured through a cross-sectional cellular telephone survey of all iCCM-trained providers. "Response" variables were changes between 2010 and 2014 in careseeking and mortality in children aged 2-59 months, measured through household surveys. iCCM implementation strength was not associated with changes in careseeking or mortality. There were fewer than one iCCM-ready provider per 1,000 under-five children per district. About 70% of sick children were taken outside the home for care in both 2010 and 2014. Careseeking from iCCM providers increased over time from about 2% to 10%; careseeking from other providers fell by a similar amount. Likely contributors to the failure to find impact include low density of iCCM providers, geographic targeting of iCCM to "hard-to-reach" areas although women did not identify distance from a provider as a barrier to health care, and displacement of facility careseeking by iCCM careseeking. This suggests that targeting iCCM solely based on geographic barriers may need to be reconsidered. © The American Society of Tropical Medicine and Hygiene.

  20. Independent assessment of the TRAC-BD1/MOD1 computer code at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wilson, G.E.; Charboneau, B.L.; Dallman, R.J.; Kullberg, C.M.; Wagner, K.C.; Wheatley, P.D.

    1984-01-01

    Under auspices of the United States Nuclear Regulatory Commission, their primary boiling water reactor safety analysis code (TRAC-BWR) is being assessed with simulations of a wide range of experimental data. The FY-1984 assessment activities were associated with the latest version (TRAC-BD1/MOD1) of this code. Typical results of the assessment studies are given. Conclusions formulated from these results are presented. These calculations relate to the overall applicability of the current code to safety analysis, and to future work which would further enhance the code's quality and ease of use

  1. Coupling of the neutron-kinetic core model DYN3D with the thermal hydraulic code FLICA-4 within the NURESIM platform

    International Nuclear Information System (INIS)

    Gommlich, A.; Kliem, S.; Rohde, U.; Gomez, A.; Sanchez, V.

    2010-01-01

    Within the FP7 Collaborative Project NURISP (NUclear Reactor Integrated Simulation Project), new and significant steps will be taken towards a European Reference Simulation Platform for applications relevant to present PWRs and BWRs and to future reactors. The first step towards this target was made during the FP6 NURESIM Integrated Project, where the common and well-proven NURESIM informatics platform was developed. This platform is based on the open-source software SALOME. The 3D neutron kinetic core model DYN3D developed at Forschungszentrum Dresden-Rossendorf is part of the NURESIM platform. Within the NURESIM project, a SALOME-based pre-processor for the creation of DYN3D input data sets via a GUI has been developed. DYN3D has been implemented into SALOME as a black box, which allows independent execution. A conversion of the DYN3D result file into the SALOME format was developed, which opened the possibility of using SALOME tools to visualize DYN3D results. (orig.)

  2. An imprinted non-coding genomic cluster at 14q32 defines clinically relevant molecular subtypes in osteosarcoma across multiple independent datasets

    OpenAIRE

    Hill, Katherine E.; Kelly, Andrew D.; Kuijjer, Marieke L.; Barry, William; Rattani, Ahmed; Garbutt, Cassandra C.; Kissick, Haydn; Janeway, Katherine; Perez-Atayde, Antonio; Goldsmith, Jeffrey; Gebhardt, Mark C.; Arredouani, Mohamed S.; Cote, Greg; Hornicek, Francis; Choy, Edwin

    2017-01-01

    Background: A microRNA (miRNA) collection on the imprinted 14q32 MEG3 region has been associated with outcome in osteosarcoma. We assessed the clinical utility of this miRNA set and their association with methylation status. Methods: We integrated coding and non-coding RNA data from three independent annotated clinical osteosarcoma cohorts (n = 65, n = 27, and n = 25) and miRNA and methylation data from one in vitro (19 cell lines) and one clinical (NCI Therapeutically Applicable Research to ...

  3. A multiplexed miRNA and transgene expression platform for simultaneous repression and expression of protein coding sequences.

    Science.gov (United States)

    Seyhan, Attila A

    2016-01-01

    Knockdown of single or multiple gene targets by RNA interference (RNAi) is necessary to overcome escape mutants or isoform redundancy. It is also necessary to use multiple RNAi reagents to knockdown multiple targets. It is also desirable to express a transgene or positive regulatory elements and inhibit a target gene in a coordinated fashion. This study reports a flexible multiplexed RNAi and transgene platform using endogenous intronic primary microRNAs (pri-miRNAs) as a scaffold located in the green fluorescent protein (GFP) as a model for any functional transgene. The multiplexed intronic miRNA - GFP transgene platform was designed to co-express multiple small RNAs within the polycistronic cluster from a Pol II promoter at more moderate levels to reduce potential vector toxicity. The native intronic miRNAs are co-transcribed with a precursor GFP mRNA as a single transcript and presumably cleaved out of the precursor-(pre) mRNA by the RNA splicing machinery, spliceosome. The spliced intron with miRNA hairpins will be further processed into mature miRNAs or small interfering RNAs (siRNAs) capable of triggering RNAi effects, while the ligated exons become a mature messenger RNA for the translation of the functional GFP protein. Data show that this approach led to robust RNAi-mediated silencing of multiple Renilla Luciferase (R-Luc)-tagged target genes and coordinated expression of functional GFP from a single transcript in transiently transfected HeLa cells. The results demonstrated that this design facilitates the coordinated expression of all mature miRNAs either as individual miRNAs or as multiple miRNAs and the associated protein. The data suggest that, it is possible to simultaneously deliver multiple negative (miRNA or shRNA) and positive (transgene) regulatory elements. Because many cellular processes require simultaneous repression and activation of downstream pathways, this approach offers a platform technology to achieve that dual manipulation efficiently

  4. The Transcriptome Analysis and Comparison Explorer--T-ACE: a platform-independent, graphical tool to process large RNAseq datasets of non-model organisms.

    Science.gov (United States)

    Philipp, E E R; Kraemer, L; Mountfort, D; Schilhabel, M; Schreiber, S; Rosenstiel, P

    2012-03-15

    Next generation sequencing (NGS) technologies allow a rapid and cost-effective compilation of large RNA sequence datasets in model and non-model organisms. However, the storage and analysis of transcriptome information from different NGS platforms is still a significant bottleneck, leading to a delay in data dissemination and subsequent biological understanding. In particular, database interfaces with transcriptome analysis modules that go beyond mere read counts are missing. Here, we present the Transcriptome Analysis and Comparison Explorer (T-ACE), a tool designed for the organization and analysis of large sequence datasets, and especially suited for transcriptome projects of non-model organisms with little or no a priori sequence information. T-ACE offers a TCL-based interface, which accesses a PostgreSQL database via a php-script. Within T-ACE, information belonging to single sequences or contigs, such as annotation or read coverage, is linked to the respective sequence and immediately accessible. Sequences and assigned information can be searched via keyword- or BLAST-search. Additionally, T-ACE provides within- and between-transcriptome analysis modules on the level of expression, GO terms, KEGG pathways and protein domains. Results are visualized and can be easily exported for external analysis. We developed T-ACE for laboratory environments with only a limited amount of bioinformatics support, and for collaborative projects in which different partners work on the same dataset from different locations or platforms (Windows/Linux/MacOS). For laboratories with some experience in bioinformatics and programming, the low complexity of the database structure and open-source code provides a framework that can be customized according to the different needs of the user and transcriptome project.

  5. Long non-coding RNA HOTAIR is an independent prognostic marker of metastasis in estrogen receptor-positive primary breast cancer

    DEFF Research Database (Denmark)

    Sørensen, Kristina P; Thomassen, Mads; Tan, Qihua

    2013-01-01

    Expression of HOX transcript antisense intergenic RNA (HOTAIR)-a long non-coding RNA-has been examined in a variety of human cancers, and overexpression of HOTAIR is correlated with poor survival among breast, colon, and liver cancer patients. In this retrospective study, we examine HOTAIR......-negative tumor samples, we are not able to detect a prognostic value of HOTAIR expression, probably due to the limited sample size. These results are successfully validated in an independent dataset with similar associations (P = 0.018, HR 1.825). In conclusion, our findings suggest that HOTAIR expression may...

  6. Decoding the encoding of functional brain networks: An fMRI classification comparison of non-negative matrix factorization (NMF), independent component analysis (ICA), and sparse coding algorithms.

    Science.gov (United States)

    Xie, Jianwen; Douglas, Pamela K; Wu, Ying Nian; Brody, Arthur L; Anderson, Ariana E

    2017-04-15

    Brain networks in fMRI are typically identified using spatial independent component analysis (ICA), yet other mathematical constraints provide alternate biologically-plausible frameworks for generating brain networks. Non-negative matrix factorization (NMF) would suppress negative BOLD signal by enforcing positivity. Spatial sparse coding algorithms (L1 Regularized Learning and K-SVD) would impose local specialization and a discouragement of multitasking, where the total observed activity in a single voxel originates from a restricted number of possible brain networks. The assumptions of independence, positivity, and sparsity to encode task-related brain networks are compared; the resulting brain networks within scan for different constraints are used as basis functions to encode observed functional activity. These encodings are then decoded using machine learning, by using the time series weights to predict within scan whether a subject is viewing a video, listening to an audio cue, or at rest, in 304 fMRI scans from 51 subjects. The sparse coding algorithm of L1 Regularized Learning outperformed 4 variations of ICA in classification. Holding constant the effect of the extraction algorithm, encodings using sparser spatial networks (containing more zero-valued voxels) had higher classification accuracy. The better performance of the sparse coding algorithms suggests that algorithms which enforce sparsity, discourage multitasking, and promote local specialization may capture better the underlying source processes than those which allow inexhaustible local processes such as ICA. Negative BOLD signal may capture task-related activations. Copyright © 2017 Elsevier B.V. All rights reserved.
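
    The abstract outlines a pipeline of decomposing each scan into spatial networks and then classifying the per-network time-series weights. A minimal sketch of that pipeline on synthetic data, using scikit-learn's FastICA, NMF and DictionaryLearning (sparse coding) together with a logistic-regression decoder, might look as follows; the data, labels and parameter choices are invented for illustration and do not reproduce the study.

        import numpy as np
        from sklearn.decomposition import FastICA, NMF, DictionaryLearning
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = np.abs(rng.normal(size=(200, 500)))   # 200 "time points" x 500 "voxels" (synthetic)
        y = rng.integers(0, 3, size=200)          # three fake conditions: video / audio / rest

        decompositions = {
            "ICA":    FastICA(n_components=10, random_state=0, max_iter=1000),
            "NMF":    NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0),
            "sparse": DictionaryLearning(n_components=10, alpha=1.0, random_state=0),
        }

        for name, model in decompositions.items():
            # The time-series weights of each spatial component become the features
            # used to decode the condition, as in the encoding/decoding comparison above.
            weights = model.fit_transform(X)
            acc = cross_val_score(LogisticRegression(max_iter=1000), weights, y, cv=5).mean()
            print(f"{name}: mean cross-validated accuracy = {acc:.2f}")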

  7. Design and Implementation of a Malicious Code Detection Platform for the iOS System

    Institute of Scientific and Technical Information of China (English)

    田庆宜

    2013-01-01

    With the increasing popularity of the Apple iPhone, Apple terminals have become an important target for hackers, and crimes in which malicious code is used to steal personal information and money emerge in an endless stream. Law enforcement departments, however, currently lack a platform for detecting malicious code on the Apple iOS mobile platform. Based on the iOS platform security model and on an analysis of attack methods reported at home and abroad, this paper designs a malicious code detection framework for Apple devices and, on this basis, builds a malicious code detection platform for iOS app applications. Practical testing shows that the platform is able to detect the communication traffic of malicious code and file changes on the iOS system. The overall cost of the platform is relatively low, which makes it suitable for equipping front-line law enforcement departments.

  8. Evaluation of the influence of a postulated lubrication oil fire on safety related cables in the top shield platform of PFBR RCB by using FDS Code

    International Nuclear Information System (INIS)

    Mangarjuna Rao, P.; Jayasuriya, C.; Nashine, B.K.; Chellapandi, P.; Velusamy, K.

    2010-01-01

    The top deck of the Prototype Fast Breeder Reactor (PFBR) primary system houses redundant safety related systems like the Control and Safety Rod Drive Mechanisms (CSRDM), the Diverse Safety Rod Drive Mechanism (DSRDM), the subassembly outlet sodium temperature measurement system and the central canal plug. These systems protrude out from the reactor through the Control Plug (CP), which is supported on the Top Shield (TS) of PFBR. Control and instrumentation signal cables and power cables of these safety related systems that come out from the CP are routed through the Top Shield Platform (TSP, which is concentric with the Reactor Vault (RV) at EL 34.1 m above the TS) to the peripheral local instrumentation control centers via the cable junction boxes supported on the TS. An influence-approach fire hazard analysis (FHA) has been carried out to evaluate the condition of the redundant safety related cables under the scenario of a postulated oil fire in the TSP using the Fire Dynamics Simulator code (FDS, Version 5). FDS is a computational fluid dynamics (CFD) based fire analysis code developed by the National Institute of Standards and Technology (NIST), USA. In this paper the details of the model developed and the results of the analysis carried out are discussed. In the TSP, a postulated oil fire scenario with the complete inventory of a primary sodium pump (PSP) lubrication oil leak (200 litres) has been considered at the 30 m elevation on the TS. A computational model with the geometry of the TSP and other important structural components on the TS, such as the PSPs, intermediate heat exchangers (IHXs), large rotating plug (LRP), small rotating plug (SRP) and CP, has been developed, along with a fire of 1800 kW/m² heat release rate in the vicinity of PSP1. A numerical simulation has been carried out to evaluate the influence of this oil fire on the typical safety related cables routed at the 34 m elevation. It has been found that the surface temperature of the cables that are routed directly above the fire only crosses the ignition

  9. Nuclear Criticality Safety Assessment Using the SCALE Computer Code Package. A demonstration based on an independent review of a real application

    International Nuclear Information System (INIS)

    Mennerdahl, Dennis

    1998-06-01

    The purpose of this project was to instruct a young scientist from the Lithuanian Energy Institute (LEI) on how to carry out an independent review of a safety report. In particular, emphasis was to be put on how to use the personal computer version of the calculation system SCALE 4.3 in this process. Nuclear criticality safety together with radiation shielding from gamma and neutron sources were areas of interest. This report concentrates on nuclear criticality safety aspects while a separate report covers radiation shielding. The application was a proposed storage cask for irradiated fuel assemblies from the Ignalina RBMK reactors in Lithuania. The safety report contained various documents involving many design and safety considerations. A few other documents describing the Ignalina reactors and their operation were available. The time for the project was limited to approximately one month, starting 'clean' with a SCALE 4.3 CD-ROM, a thick safety report and a fast personal computer. The results should be of general interest to Swedish authorities, in particular related to shielding where experience in using advanced computer codes like those available in SCALE is limited. It has been known for many years that criticality safety is very complicated, and that independent reviews are absolutely necessary to reduce the risk from quite common errors in the safety assessments. Several important results were obtained during the project. Concerning the use of SCALE 4.3, it was confirmed that a young scientist, without extensive previous experience in the code system, can learn to use essentially all options. During the project, it was obvious that familiarity with personal computers, operating systems (including network system) and office software (word processing, spreadsheet and Internet browser software) saved a lot of time. Some of the Monte Carlo calculations took several hours. Experience is valuable in quickly picking out input or source document errors. Understanding

  10. Annual report on compliance with the codes of good conduct and independence of electricity grid and natural gas network operators. November 2005

    International Nuclear Information System (INIS)

    2005-11-01

    In France, system operators belong to groups that also conduct business in the energy sector, in fields governed by competition rules. They could therefore be tempted to use their privileged position to their group's benefit, which would disadvantage end consumers. Non-discriminatory access to electricity and gas transmission and distribution networks is at the core of the market opening to competition approach implemented by the European Union since the end of the 1990's. EU and national enactments in force highlight two tools to ensure nondiscrimination: compliance programmes and independence of system operators with regard to their parent companies. Firstly, compliance programs contain measures taken to ensure that discrimination is completely excluded and that their application is subject to appropriate monitoring. Secondly, system operator independence plays a part in preventing discrimination against competitors with other business activities (generation, supply, etc.) within the same group. In application of these enactments, every electricity or natural gas transmission or distribution system operator serving more than 100,000 customers provided CRE, the Energy Regulatory Commission, with their annual reports on the application of their compliance programs. This document is CRE's November 2005 report about compliance programmes and independence of electricity and natural gas system operators. It has been prepared using the codes of good conduct and the annual reports supplied by network operators. CRE also launched a public consultation of the market players in October 2005 and listened to what the network operators had to say. Moreover, it carried out a certain number of checks on operators' practices

  11. Cross-Platform Technologies

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2017-04-01

    Full Text Available Cross-platform is a concept that has become increasingly used in recent years, especially in the development of mobile apps, but it has also been applied consistently over time in the development of conventional desktop applications. The notion of cross-platform software (multi-platform or platform-independent) refers to a software application that can run on more than one operating system or computing architecture. Thus, a cross-platform application can operate independently of the software or hardware platform on which it is executed. Since such a generic definition covers a wide range of meanings, for the purposes of this paper we narrow it and use the following working definition: a cross-platform application is a software application that can run on more than one operating system (desktop or mobile) in an identical or similar way.
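
    As a minimal illustration of the working definition above (one application running on more than one operating system in an identical or similar way), the following Python sketch keeps the application logic identical everywhere and isolates the platform-specific part in a single small branch; the directory conventions and application name are assumptions for the example.

        import platform
        import sys
        from pathlib import Path

        def config_dir(app):
            """Return the conventional per-user configuration directory for this OS.

            The application logic stays identical on every platform; only this small
            branch differs (the directory conventions shown are common defaults).
            """
            if sys.platform.startswith("win"):
                return Path.home() / "AppData" / "Roaming" / app
            if sys.platform == "darwin":
                return Path.home() / "Library" / "Application Support" / app
            return Path.home() / ".config" / app      # Linux and other Unix-likes

        print(platform.system(), "->", config_dir("demo-app"))   # "demo-app" is a made-up name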

  12. CAS (CHEMICAL ABSTRACTS SOCIETY) PARAMETER CODES and Other Data from FIXED STATIONS and Other Platforms from 19890801 to 19891130 (NODC Accession 9700156)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Tissue, sediment, and Chemical Abstracts Society (CAS) parameter codes were collected from Columbia River Basin and other locations from 01 August 1989 to 30...

  13. Omnidirectional holonomic platforms

    International Nuclear Information System (INIS)

    Pin, F.G.; Killough, S.M.

    1994-01-01

    This paper presents the concepts for a new family of wheeled platforms which feature full omnidirectionality with simultaneous and independently controlled rotational and translational motion capabilities. The authors first present the orthogonal-wheels concept and the two major wheel assemblies on which these platforms are based. They then describe how a combination of these assemblies with appropriate control can be used to generate an omnidirectional capability for mobile robot platforms. The design and control of two prototype platforms are then presented and their respective characteristics with respect to rotational and translational motion control are discussed
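
    The record gives no equations, but the behaviour it describes (independently controlled translation and rotation produced by a set of wheel assemblies) can be illustrated with the standard kinematics of a symmetric omnidirectional platform. The Python sketch below maps a commanded body twist (vx, vy, omega) to individual wheel drive speeds; the wheel count and geometry are assumptions, not the prototypes described in the paper.

        import math

        def wheel_speeds(vx, vy, omega, radius=0.2, n_wheels=3):
            """Map a desired body twist to wheel drive speeds for an omni platform.

            Standard kinematics for wheels placed symmetrically at distance `radius`
            from the centre, each driving tangentially; the geometry is assumed here
            and is not taken from the paper.
            """
            speeds = []
            for i in range(n_wheels):
                angle = 2.0 * math.pi * i / n_wheels          # wheel mounting angle
                # Project the body velocity onto the wheel's drive direction and add
                # the contribution of the platform's rotation about its centre.
                speeds.append(-math.sin(angle) * vx + math.cos(angle) * vy + radius * omega)
            return speeds

        # Translation and rotation can be commanded simultaneously and independently.
        print(wheel_speeds(0.5, 0.0, 0.0))   # pure translation along x
        print(wheel_speeds(0.0, 0.0, 1.0))   # pure rotation in place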

  14. Empirical validation of the triple-code model of numerical processing for complex math operations using functional MRI and group Independent Component Analysis of the mental addition and subtraction of fractions.

    Science.gov (United States)

    Schmithorst, Vincent J; Brown, Rhonda Douglas

    2004-07-01

    The suitability of a previously hypothesized triple-code model of numerical processing, involving analog magnitude, auditory verbal, and visual Arabic codes of representation, was investigated for the complex mathematical task of the mental addition and subtraction of fractions. Functional magnetic resonance imaging (fMRI) data from 15 normal adult subjects were processed using exploratory group Independent Component Analysis (ICA). Separate task-related components were found with activation in bilateral inferior parietal, left perisylvian, and ventral occipitotemporal areas. These results support the hypothesized triple-code model corresponding to the activated regions found in the individual components and indicate that the triple-code model may be a suitable framework for analyzing the neuropsychological bases of the performance of complex mathematical tasks. Copyright 2004 Elsevier Inc.

  15. Morphology Independent Learning in Modular Robots

    DEFF Research Database (Denmark)

    Christensen, David Johan; Bordignon, Mirko; Schultz, Ulrik Pagh

    2009-01-01

    Hand-coding locomotion controllers for modular robots is difficult due to their polymorphic nature. Instead, we propose to use a simple and distributed reinforcement learning strategy. ATRON modules with identical controllers can be assembled in any configuration. To optimize the robot’s locomotion...... speed its modules independently and in parallel adjust their behavior based on a single global reward signal. In simulation, we study the learning strategy’s performance on different robot configurations. On the physical platform, we perform learning experiments with ATRON robots learning to move as fast...
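
    The abstract describes the learning strategy only in outline: identical module controllers adapt independently and in parallel from a single global reward. The toy Python sketch below implements that idea as a simple parallel stochastic hill-climb on one behaviour parameter per module; the reward function and update rule are stand-ins, not the authors' actual ATRON learning rule.

        import random

        class Module:
            """One module of the robot: identical controller, its own parameter."""

            def __init__(self):
                self.param = random.uniform(-1.0, 1.0)   # e.g. a gait phase offset
                self.trial = self.param

            def propose(self, step=0.1):
                self.trial = self.param + random.uniform(-step, step)

            def feedback(self, reward, best_reward):
                # Keep the perturbation only if the *global* reward improved.
                if reward > best_reward:
                    self.param = self.trial

        def global_reward(params):
            """Stand-in for the measured locomotion speed (hypothetical objective)."""
            return -sum(p * p for p in params)

        random.seed(0)
        modules = [Module() for _ in range(8)]
        best = global_reward([m.param for m in modules])

        for _ in range(200):
            for m in modules:              # every module adapts independently, in parallel
                m.propose()
            reward = global_reward([m.trial for m in modules])
            for m in modules:              # the same scalar reward is broadcast to all modules
                m.feedback(reward, best)
            best = max(best, reward)

        print("final reward:", round(best, 4))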

  16. Payment Platform

    DEFF Research Database (Denmark)

    Hjelholt, Morten; Damsgaard, Jan

    2012-01-01

    thoroughly and substitute current payment standards in the decades to come. This paper portrays how digital payment platforms evolve in socio-technical niches and how various technological platforms aim for institutional attention in their attempt to challenge earlier platforms and standards. The paper...... applies a co-evolutionary multilevel perspective to model the interplay and processes between technology and society wherein digital payment platforms potentially will substitute other payment platforms just like the credit card negated the check. On this basis this paper formulate a multilevel conceptual...

  17. A Cross-Platform Tactile Capabilities Interface for Humanoid Robots

    Directory of Open Access Journals (Sweden)

    Jie eMa

    2016-04-01

    Full Text Available This article presents the core elements of a cross-platform tactile capabilities interface (TCI) for humanoid arms. The aim of the interface is to reduce the cost of developing humanoid robot capabilities by supporting reuse through cross-platform deployment. The article presents a comparative analysis of existing robot middleware frameworks, as well as the technical details of the TCI framework that builds on the existing YARP platform. The TCI framework currently includes robot arm actuators with robot skin sensors. It presents such hardware in a platform independent manner, making it possible to write robot control software that can be executed on different robots through the TCI framework. The TCI framework supports multiple humanoid platforms, and this article also presents a case study of a cross-platform implementation of a set of tactile protective withdrawal reflexes that have been realised on both the Nao and iCub humanoid robot platforms using the same high-level source code.
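
    A common way to present different robot arms "in a platform independent manner", as the abstract describes, is an abstract interface with one adapter per robot, so the same high-level reflex code runs on every backend. The Python sketch below illustrates that pattern; the class and method names (TactileArm, NaoArm, ICubArm, read_skin, retract) are hypothetical stand-ins and not the TCI or YARP API.

        from abc import ABC, abstractmethod

        class TactileArm(ABC):
            """Platform-independent view of an arm with skin sensors (hypothetical API)."""

            @abstractmethod
            def read_skin(self):          # returns the maximum contact pressure (arbitrary units)
                ...

            @abstractmethod
            def retract(self):
                ...

        class NaoArm(TactileArm):         # adapter for one robot platform
            def read_skin(self):
                return 0.2                # a real adapter would call the robot middleware here
            def retract(self):
                print("Nao: retracting arm")

        class ICubArm(TactileArm):        # adapter for another robot platform
            def read_skin(self):
                return 0.9
            def retract(self):
                print("iCub: retracting arm")

        def withdrawal_reflex(arm, threshold=0.5):
            """The same high-level behaviour runs unchanged on every platform."""
            if arm.read_skin() > threshold:
                arm.retract()

        for arm in (NaoArm(), ICubArm()):
            withdrawal_reflex(arm)        # only the iCub adapter reports contact here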

  18. Platform Constellations

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    This research paper presents an initial attempt to introduce and explain the emergence of a new phenomenon, which we refer to as platform constellations. Functioning as highly modular systems, the platform constellations are collections of highly connected platforms which co-exist in parallel and a......' acquisition and users' engagement rates as well as unlock new sources of value creation and diversify revenue streams.

  19. Wireless sensor platform

    Science.gov (United States)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    2017-08-08

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to a central computer(s) for use in monitoring an area associated with the sensors.
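
    The record states that sensor data are encoded into a spread spectrum code sequence before transmission. As a minimal, generic illustration of direct-sequence spreading (not the platform's actual encoder), the Python sketch below XOR-spreads each data bit over a pseudo-noise chip sequence and recovers it by correlation; the sequence length and data are arbitrary.

        import random

        random.seed(1)
        PN = [random.randint(0, 1) for _ in range(16)]   # pseudo-noise chip sequence (arbitrary)

        def spread(bits):
            """Direct-sequence spreading: each data bit is XORed over the whole PN sequence."""
            return [b ^ c for b in bits for c in PN]

        def despread(chips):
            """Correlate each chip block against the PN sequence to recover the data bit."""
            bits = []
            for i in range(0, len(chips), len(PN)):
                block = chips[i:i + len(PN)]
                agreements = sum(1 for x, c in zip(block, PN) if x == c)
                bits.append(0 if agreements > len(PN) // 2 else 1)
            return bits

        data = [1, 0, 1, 1]
        tx = spread(data)
        print(len(tx), "chips for", len(data), "bits, recovered:", despread(tx) == data)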

  20. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2010-01-01

    The Azure Services Platform is a brand-new cloud-computing technology from Microsoft. It is composed of four core components (Windows Azure, .NET Services, SQL Services, and Live Services), each with a unique role in the functioning of your cloud service. It is the goal of this book to show you how to use these components, both separately and together, to build flawless cloud services. At its heart Windows Azure Platform is a down-to-earth, code-centric book. This book aims to show you precisely how the components are employed and to demonstrate the techniques and best practices you need to know

  1. Brake for rollable platform

    Science.gov (United States)

    Morris, A. L.

    1974-01-01

    A frame-mounted brake is independent of the wheels and consists of a simple lever-actuated foot. The brake makes good contact with the surface even when the foot pad is at a higher or lower level than the wheels; this is particularly important when a rollable platform is used on an irregular surface.

  2. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical

  3. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  4. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  5. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and, since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques, and that is often used interchangeably with speech coding, is the term voice coding. This term is more generic in the sense that the

  6. On the notion of abstract platform in MDA development

    NARCIS (Netherlands)

    Andrade Almeida, João; Dijkman, R.M.; van Sinderen, Marten J.; Ferreira Pires, Luis

    2004-01-01

    Although platform-independence is a central property in MDA models, the study of platform-independence has been largely overlooked in MDA. As a consequence, there is a lack of guidelines to select abstraction criteria and modelling concepts for platform-independent design. In addition, there is

  7. Hooke: an open software platform for force spectroscopy.

    Science.gov (United States)

    Sandal, Massimo; Benedetti, Fabrizio; Brucale, Marco; Gomez-Casado, Alberto; Samorì, Bruno

    2009-06-01

    Hooke is an open source, extensible software intended for analysis of atomic force microscope (AFM)-based single molecule force spectroscopy (SMFS) data. We propose it as a platform on which published and new algorithms for SMFS analysis can be integrated in a standard, open fashion, as a general solution to the current lack of a standard software for SMFS data analysis. Specific features and support for file formats are coded as independent plugins. Any user can code new plugins, extending the software capabilities. Basic automated dataset filtering and semi-automatic analysis facilities are included. Software and documentation are available at (http://code.google.com/p/hooke). Hooke is a free software under the GNU Lesser General Public License.
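
    Hooke codes its features and file-format support as independent plugins. The following Python sketch shows a generic plugin-registry pattern of that kind (it is not Hooke's actual plugin API): analysis steps register themselves under a name and the host applies whichever plugins are loaded to a force curve; the plugin names and the curve are invented.

        # A generic plugin-registry sketch (not Hooke's actual API): analysis steps
        # register themselves by name and the host applies whichever plugins are loaded.
        PLUGINS = {}

        def plugin(name):
            """Decorator registering a force-curve analysis step under a name."""
            def register(func):
                PLUGINS[name] = func
                return func
            return register

        @plugin("baseline")
        def subtract_baseline(curve):
            offset = sum(curve) / len(curve)
            return [f - offset for f in curve]

        @plugin("peak")
        def max_force(curve):
            return max(curve)

        def run_pipeline(curve, steps):
            result = curve
            for name in steps:
                result = PLUGINS[name](result)   # any user-coded plugin can slot in here
            return result

        forces = [0.1, 0.2, 1.5, 0.3, 0.1]       # a made-up force curve
        print(run_pipeline(forces, ["baseline", "peak"]))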

  8. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  9. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: 1) analysis of thermal experiments on a water loop at high or low pressure, in steady state or transient behavior; 2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow distribution between parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, which may or may not vary with time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  10. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  11. Network Coding Applications and Implementations on Mobile Devices

    DEFF Research Database (Denmark)

    Fitzek, Frank; Pedersen, Morten Videbæk; Heide, Janus

    2010-01-01

    Network coding has attracted a lot of attention lately. The goal of this paper is to demonstrate that the implementation of network coding is feasible on mobile platforms. The paper will guide the reader through some examples and demonstrate uses for network coding. Furthermore the paper will also...... show that the implementation of network coding is feasible today on commercial mobile platforms....
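
    As a toy illustration of the random linear network coding idea referred to in this and the earlier multimedia-distribution record, the Python sketch below combines source packets with random GF(2) coefficients (i.e. XOR) and decodes at the receiver by Gaussian elimination once enough linearly independent combinations have arrived. Packet sizes, counts and the field choice are simplifications for illustration, not the implementation discussed in the paper.

        import random

        random.seed(2)
        PACKETS = [[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]   # three source packets of 4 bits

        def encode():
            """Emit one coded packet: a random GF(2) combination of all source packets."""
            coeffs = [random.randint(0, 1) for _ in PACKETS]
            payload = [0] * len(PACKETS[0])
            for c, pkt in zip(coeffs, PACKETS):
                if c:
                    payload = [a ^ b for a, b in zip(payload, pkt)]
            return coeffs, payload

        def decode(coded):
            """Gaussian elimination over GF(2) on [coefficients | payload] rows."""
            rows = [c + p for c, p in coded]
            n = len(PACKETS)
            rank = 0
            for col in range(n):
                pivot = next((r for r in range(rank, len(rows)) if rows[r][col]), None)
                if pivot is None:
                    return None                   # not enough independent packets yet
                rows[rank], rows[pivot] = rows[pivot], rows[rank]
                for r in range(len(rows)):
                    if r != rank and rows[r][col]:
                        rows[r] = [a ^ b for a, b in zip(rows[r], rows[rank])]
                rank += 1
            return [row[n:] for row in rows[:n]]

        coded, recovered = [], None
        while recovered is None:                  # keep collecting coded packets until decodable
            coded.append(encode())
            recovered = decode(coded)
        print(recovered == PACKETS, "after", len(coded), "coded packets")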

  12. [Orange Platform].

    Science.gov (United States)

    Toba, Kenji

    2017-07-01

    The Organized Registration for the Assessment of dementia on Nationwide General consortium toward Effective treatment in Japan (ORANGE platform) is a recently established nationwide clinical registry for dementia. This platform consists of multiple registries of patients with dementia stratified by the following clinical stages: preclinical, mild cognitive impairment, early-stage, and advanced-stage dementia. Patients will be examined in a super-longitudinal fashion, and their lifestyle, social background, genetic risk factors, and required care process will be assessed. This project is also notable because the care registry includes information on the successful, comprehensive management of patients with dementia. Therefore, this multicenter prospective cohort study will contribute participants to all clinical trials for Alzheimer's disease as well as improve the understanding of individuals with dementia.

  13. YARP: Yet Another Robot Platform

    Directory of Open Access Journals (Sweden)

    Lorenzo Natale

    2008-11-01

    Full Text Available We describe YARP, Yet Another Robot Platform, an open-source project that encapsulates lessons from our experience in building humanoid robots. The goal of YARP is to minimize the effort devoted to infrastructure-level software development by facilitating code reuse, modularity and so maximize research-level development and collaboration. Humanoid robotics is a "bleeding edge" field of research, with constant flux in sensors, actuators, and processors. Code reuse and maintenance is therefore a significant challenge. We describe the main problems we faced and the solutions we adopted. In short, the main features of YARP include support for inter-process communication, image processing as well as a class hierarchy to ease code reuse across different hardware platforms. YARP is currently used and tested on Windows, Linux and QNX6 which are common operating systems used in robotics.

  14. Are Independent Probes Truly Independent?

    Science.gov (United States)

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  15. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
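
    The levelization property described above (dependencies form a directed acyclic graph, and each package sits one level above the packages it uses) can be checked mechanically. The Python sketch below assigns levels by recursion and reports a cycle if the package set cannot be levelized; the package names are invented and are not the EAP code base.

        from functools import lru_cache

        # Hypothetical package -> packages it uses; a levelized set must form a DAG.
        USES = {
            "eos": [],
            "mesh": [],
            "hydro": ["mesh", "eos"],
            "transport": ["mesh"],
            "driver": ["hydro", "transport"],
        }

        def levels(uses):
            """Assign each package a level = 1 + the maximum level of its dependencies.

            Raises ValueError if the 'uses' relation contains a cycle, i.e. if the
            package set cannot be levelized.
            """
            visiting = set()

            @lru_cache(maxsize=None)
            def level(pkg):
                if pkg in visiting:
                    raise ValueError("dependency cycle through " + repr(pkg))
                visiting.add(pkg)
                result = 1 + max((level(dep) for dep in uses[pkg]), default=0)
                visiting.discard(pkg)
                return result

            return {pkg: level(pkg) for pkg in uses}

        print(levels(USES))   # {'eos': 1, 'mesh': 1, 'hydro': 2, 'transport': 2, 'driver': 3}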

  16. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  17. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  18. 2009 Analysis Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Analysis platform review meeting, held on February 18, 2009, at the Marriott Residence Inn, National Harbor, Maryland.

  19. 2009 Feedstocks Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Feedstock platform review meeting, held on April 8–10, 2009, at the Grand Hyatt Washington, Washington, D.C.

  20. 2009 Infrastructure Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Infrastructure platform review meeting, held on February 19, 2009, at the Marriott Residence Inn, National Harbor, Maryland.

  1. 2011 Biomass Program Platform Peer Review. Sustainability

    Energy Technology Data Exchange (ETDEWEB)

    Eng, Alison Goss [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Sustainability Platform Review meeting.

  2. 2011 Biomass Program Platform Peer Review: Feedstock

    Energy Technology Data Exchange (ETDEWEB)

    McCann, Laura [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Feedstock Platform Review meeting.

  3. 2011 Biomass Program Platform Peer Review. Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Lindauer, Alicia [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Infrastructure Platform Review meeting.

  4. 2011 Biomass Program Platform Peer Review: Algae

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Joyce [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Algae Platform Review meeting.

  5. 2011 Biomass Program Platform Peer Review: Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haq, Zia [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Analysis Platform Review meeting.

  6. Speech coding: code-excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides a scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It provides a clear connection between the whys, hows, and whats, thus enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource - Information: What information is available, and how can it be useful? Resource - Platform: What kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory, and acoustic properties and the transmission capacity of the devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  7. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.

  8. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  9. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows to recover the ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to introduce the canonical decomposition of a code in at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover we conjecture that the canonical partition satisfies such a hypothesis. Finally we consider also some relationships between coding partitions and varieties of codes.

  10. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  11. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method
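
    The abstract notes that the CONTAIN coupling module is a Windows dynamic load library (DLL) and that a similar mechanism is needed on Linux. As a generic illustration of hiding that difference behind one call (not the actual MARS build system or its libraries), the Python sketch below uses ctypes to load whichever native shared-library format the host platform provides; the library names are examples only.

        import ctypes
        import ctypes.util
        import sys

        def load_native(name):
            """Load a shared library by base name, independent of the host platform.

            On Windows this resolves '<name>.dll'; on Unix-like systems it asks the
            loader for the platform's own format (.so on Linux, .dylib on macOS).
            The names used by a real application are assumptions.
            """
            if sys.platform.startswith("win"):
                return ctypes.CDLL(name + ".dll")
            found = ctypes.util.find_library(name)   # searches the standard locations
            return ctypes.CDLL(found or "lib" + name + ".so")

        if not sys.platform.startswith("win"):
            libm = load_native("m")                  # the C math library on Unix-likes
            libm.cos.restype = ctypes.c_double
            libm.cos.argtypes = [ctypes.c_double]
            print(libm.cos(0.0))                     # -> 1.0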

  12. Independent preferences

    DEFF Research Database (Denmark)

    Vind, Karl

    1991-01-01

    A simple mathematical result characterizing a subset of a product set is proved and used to obtain additive representations of preferences. The additivity consequences of independence assumptions are obtained for preferences which are not total or transitive. This means that most of the economic ...... theory based on additive preferences - expected utility, discounted utility - has been generalized to preferences which are not total or transitive. Other economic applications of the theorem are given...

  13. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  14. A Dual Coding View of Vocabulary Learning

    Science.gov (United States)

    Sadoski, Mark

    2005-01-01

    A theoretical perspective on acquiring sight vocabulary and developing meaningful vocabulary is presented. Dual Coding Theory assumes that cognition occurs in two independent but connected codes: a verbal code for language and a nonverbal code for mental imagery. The mixed research literature on using pictures in teaching sight vocabulary is…

  15. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant and are spread across different institutions or private companies. Time-consuming operations like data processing and information harmonisation represent the main reason the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground parkings, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. These activities, however, provide a large quantity of data, so aquifer modelling and behaviour prediction can be carried out using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages like GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania) an integrated platform for groundwater geospatial data management is being developed within the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) - financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis

  16. Independent Directors

    DEFF Research Database (Denmark)

    Ringe, Wolf-Georg

    2013-01-01

    This paper re-evaluates the corporate governance concept of ‘board independence’ against the disappointing experiences during the 2007-08 financial crisis. Independent or outside directors had long been seen as an essential tool to improve the monitoring role of the board. Yet the crisis revealed...... that they did not prevent firms' excessive risk taking; further, these directors sometimes showed serious deficits in understanding the business they were supposed to control, and remained passive in addressing structural problems. A closer look reveals that under the surface of seemingly unanimous consensus...

  17. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.
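
    Code Contracts itself is a .NET library whose specifications are written in C# or Visual Basic. As a language-neutral illustration of the runtime-checking half of the idea only, the sketch below adds pre- and postconditions to a Python function through a small decorator; this is an analogy, not the Code Contracts API, and the function and conditions are invented for the example.

        import functools

        def contract(requires=None, ensures=None):
            """Attach a precondition and a postcondition to a function.

            An analogy to Code Contracts' Requires/Ensures checked at run time;
            this is not the actual .NET API.
            """
            def wrap(func):
                @functools.wraps(func)
                def checked(*args, **kwargs):
                    if requires is not None:
                        assert requires(*args, **kwargs), "precondition violated"
                    result = func(*args, **kwargs)
                    if ensures is not None:
                        assert ensures(result), "postcondition violated"
                    return result
                return checked
            return wrap

        @contract(requires=lambda x: x >= 0, ensures=lambda r: r >= 0)
        def integer_sqrt(x):
            """Largest integer whose square does not exceed x (invented example)."""
            return int(x ** 0.5)

        print(integer_sqrt(10))   # 3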

  18. Latest improvements on TRACPWR six-equations thermohydraulic code

    International Nuclear Information System (INIS)

    Rivero, N.; Batuecas, T.; Martinez, R.; Munoz, J.; Lenhardt, G.; Serrano, P.

    1999-01-01

    The paper presents the latest improvements on TRACPWR aimed at adapting the code to present trends in computer platforms, architectures and training requirements, as well as extending the scope of the code itself and its applicability to technologies other than the Westinghouse PWR one. Firstly, the major features of TRACPWR as a best estimate and real time simulation code are summarized; then the areas where TRACPWR is being improved are presented. These areas comprise: (1) Architecture: integrating the TRACPWR and RELAP5 codes, (2) Code scope enhancement: modelling the Mid-Loop operation, (3) Code speed-up: applying parallelization techniques, (4) Code platform downswing: porting to the Windows NT platform, (5) On-line performance: allowing simulation initialisation from a Plant Process Computer, and (6) Code scope extension: using the code for modelling VVER and PHWR technology. (author)

  19. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
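
    As one classic example of the general-purpose techniques such a monograph covers, the Python sketch below draws independent samples by rejection sampling from an unnormalized target density on [0, 1]; the particular target, proposal and bound are chosen only for illustration and are not taken from the book.

```python
# Rejection sampling sketch: independent draws from an unnormalized target
# density on [0, 1] using a uniform proposal. Target and bound are illustrative.
import math
import random

def target(x):
    """Unnormalized target density on [0, 1]."""
    return math.exp(-5.0 * (x - 0.3) ** 2)

M = 1.0  # upper bound on target(x) over [0, 1]; its maximum is 1 at x = 0.3

def rejection_sample(n):
    samples = []
    while len(samples) < n:
        x = random.random()          # proposal: Uniform(0, 1)
        u = random.random()          # acceptance test
        if u * M <= target(x):
            samples.append(x)        # accepted draws are i.i.d. from the target
    return samples

draws = rejection_sample(1000)
print(sum(draws) / len(draws))       # empirical mean, roughly 0.3-0.4 for this target
```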

  20. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  1. The Definitive Guide to NetBeans Platform

    CERN Document Server

    Bock, Heiko

    2009-01-01

    The Definitive Guide to NetBeans(t) Platform is a thorough and definitive introduction to the NetBeans Platform, covering all its major APIs in detail, with relevant code examples used throughout. The original German book on which this title is based was well received. The NetBeans Platform Community has put together this English translation, which author Heiko Bock updated to cover the latest NetBeans Platform 6.5 APIs. With an introduction by known NetBeans Platform experts Jaroslav Tulach, Tim Boudreau, and Geertjan Wielenga, this is the most up-to-date book on this topic at the moment. All

  2. JBoss Weld CDI for Java platform

    CERN Document Server

    Finnegan, Ken

    2013-01-01

    This book is a mini tutorial with plenty of code examples and strategies to give you numerous options when building your own applications. "JBoss Weld CDI for Java Platform" is written for developers who are new to dependency injection. A rudimentary knowledge of Java is required.

  3. Product Platform Performance

    DEFF Research Database (Denmark)

    Munk, Lone

    The aim of this research is to improve understanding of platform-based product development by studying platform performance in relation to internal effects in companies. Platform-based product development makes it possible to deliver product variety and at the same time reduce the needed resources...... engaging in platform-based product development. Similarly platform assessment criteria lack empirical verification regarding relevance and sufficiency. The thesis focuses on • the process of identifying and estimating internal effects, • verification of performance of product platforms, (i...... experienced representatives from the different life systems phase systems of the platform products. The effects are estimated and modeled within different scenarios, taking into account financial and real option aspects. The model illustrates and supports estimation and quantification of internal platform...

  4. Feasibility analysis of the modified ATHLET code for supercritical water cooled systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhou Chong, E-mail: ch.zhou@sjtu.edu.cn [School of Nuclear Science and Engineering, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai 200240 (China); Institute of Fusion and Reactor Technology, Karlsruhe Institute of Technology, Vincenz-Priessnitz-Str. 3, 76131 Karlsruhe (Germany); Yang Yanhua [School of Nuclear Science and Engineering, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai 200240 (China); Cheng Xu [Institute of Fusion and Reactor Technology, Karlsruhe Institute of Technology, Vincenz-Priessnitz-Str. 3, 76131 Karlsruhe (Germany)

    2012-09-15

    Highlights: ► Modification of system code ATHLET for supercritical water application. ► Development and assessment of a heat transfer package for supercritical water. ► Validation of the modified code at supercritical pressures with the theoretical point-hydraulics model and the SASC code. ► Application of the modified code to LOCA analysis of a supercritical water cooled in-pile fuel qualification test loop. - Abstract: Since the existing thermal-hydraulic computer codes for light water reactors are not applicable to supercritical water cooled reactors (SCWRs) owing to the limitation of physical models and numerical treatments, the development of a reliable thermal-hydraulic computer code is very important to design analysis and safety assessment of SCWRs. Based on earlier modification of ATHLET for SCWR, a general interface is implemented to the code, which serves as the platform for information exchange between ATHLET and the external independent physical modules. A heat transfer package containing five correlations for supercritical water is connected to the ATHLET code through the interface. The correlations are assessed with experimental data. To verify the modified ATHLET code, the Edwards-O'Brian blow-down test is simulated. As first validation at supercritical pressures, a simplified supercritical water cooled loop is modeled and its stability behavior is analyzed. Results are compared with that of the theoretical model and SASC code in the reference and show good agreement. To evaluate its feasibility, the modified ATHLET code is applied to a supercritical water cooled in-pile fuel qualification test loop. Loss of coolant accidents (LOCAs) due to break of coolant supply lines are calculated for the loop. Sensitivity analysis of some safety system parameters is performed to get further knowledge about their influence on the function of the

  5. Mobile platform security

    CERN Document Server

    Asokan, N; Dmitrienko, Alexandra

    2013-01-01

    Recently, mobile security has garnered considerable interest in both the research community and industry due to the popularity of smartphones. The current smartphone platforms are open systems that allow application development, also for malicious parties. To protect the mobile device, its user, and other mobile ecosystem stakeholders such as network operators, application execution is controlled by a platform security architecture. This book explores how such mobile platform security architectures work. We present a generic model for mobile platform security architectures: the model illustrat

  6. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  7. Users and Programmers Guide for HPC Platforms in CIEMAT

    International Nuclear Information System (INIS)

    Munoz Roldan, A.

    2003-01-01

    This Technical Report presents a description of the High Performance Computing platforms available to researchers at CIEMAT and dedicated mainly to scientific computing. It targets users and programmers and aims to help in the processes of developing new code and porting code across platforms. A brief review of the historical evolution of HPC, i.e. the programming paradigms and underlying architectures, is also presented. (Author) 32 refs

  8. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  9. Data Platforms and Cities

    DEFF Research Database (Denmark)

    Blok, Anders; Courmont, Antoine; Hoyng, Rolien

    2017-01-01

    This section offers a series of joint reflections on (open) data platform from a variety of cases, from cycling, traffic and mapping to activism, environment and data brokering. Data platforms play a key role in contemporary urban governance. Linked to open data initiatives, such platforms are of...

  10. Dynamic Gaming Platform (DGP)

    Science.gov (United States)

    2009-04-01

    Final report on the Dynamic Gaming Platform (DGP), Lockheed Martin Corporation, covering July 2007 – March 2009. The harvested text consists only of report-documentation-page fields and an acronym list (CMU: Carnegie Mellon University; DGP: Dynamic Gaming Platform; GA: Genetic Algorithm; IARPA: Intelligence Advanced Research Projects Activity; LM ATL: Lockheed Martin Advanced Technology Laboratories; PAINT: ProActive INTelligence).

  11. ITS Platform North Denmark

    DEFF Research Database (Denmark)

    Lahrmann, Harry; Agerholm, Niels; Juhl, Jens

    2012-01-01

    This paper presents the project entitled “ITS Platform North Denmark” which is used as a test platform for Intelligent Transportation System (ITS) solutions. The platform consists of a newly developed GNSS/GPRS On Board Unit (OBU) to be installed in 500 cars, a backend server and a specially...

  12. Cross platform SCA component using C++ builder and KYLIX

    International Nuclear Information System (INIS)

    Nishimura, Hiroshi; Timossi, Chiris; McDonald, James L.

    2003-01-01

    A cross-platform component for EPICS Simple Channel Access (SCA) has been developed. EPICS client programs with GUIs become portable at the C++ source-code level on both Windows and Linux by using Borland C++ Builder 6 and Kylix 3 on these platforms, respectively

  13. Error floor behavior study of LDPC codes for concatenated codes design

    Science.gov (United States)

    Chen, Weigang; Yin, Liuguo; Lu, Jianhua

    2007-11-01

    Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small when the quantized sum-product (SP) algorithm is used. Therefore, an LDPC code may serve as the inner code in a concatenated coding system with a high-rate outer code, and thus an ultra-low error floor can be achieved. This conclusion is also verified by the experimental results.
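
    A toy numerical illustration of this concatenation argument is sketched below in Python: inner decoding failures are assumed to leave only a few residual bit errors, which an idealized outer code correcting up to T symbols per frame then removes. All rates, distributions and parameters are invented for the sketch and are not the paper's measurements.

```python
# Toy illustration of the concatenation argument: if LDPC decoding failures
# leave only a handful of residual bit errors, a modest outer code (here an
# idealized RS-like code correcting up to T symbol errors) removes them.
import random

FRAME_BITS = 4096          # bits per inner LDPC codeword (illustrative)
SYMBOL_BITS = 8            # the outer code works on 8-bit symbols
T = 8                      # outer code corrects up to T wrong symbols per frame
INNER_FAILURE_RATE = 1e-2  # probability an inner codeword fails to decode

def residual_bit_errors():
    """Mostly a handful of residual errors, with a rare heavier tail (illustrative)."""
    return 1 + int(random.expovariate(0.4))

def frame_fails():
    if random.random() > INNER_FAILURE_RATE:
        return False                        # inner decoder succeeded
    bad_bits = random.sample(range(FRAME_BITS), residual_bit_errors())
    bad_symbols = {b // SYMBOL_BITS for b in bad_bits}
    return len(bad_symbols) > T             # is the outer code overwhelmed?

trials = 200_000
failures = sum(frame_fails() for _ in range(trials))
print(f"post-outer-code frame error rate: {failures / trials:.2e}")
```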

  14. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  15. Continuous Platform Development

    DEFF Research Database (Denmark)

    Nielsen, Ole Fiil

    low risks and investments but also with relatively fuzzy results. When looking for new platform projects, it is important to make sure that the company and market is ready for the introduction of platforms, and to make sure that people from marketing and sales, product development, and downstream......, but continuous product family evolution challenges this strategy. The concept of continuous platform development is based on the fact that platform development should not be a one-time experience but rather an ongoing process of developing new platforms and updating existing ones, so that product family...

  16. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high-frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed decision-making method. The decision-making process will be described by a formal grammar. As there are many CEP solutions, we take the following characteristics into consideration: processing in real time, the ability to process high volumes of data from multiple sources, platform independence, the ability to integrate with a user's own solution, and an open license. First we discuss existing CEP tools and their specific uses in practice. Then we describe the design of a method for formalizing the business rules used for decision making. Afterwards, we focus on the two platforms that seem to be the best fit for integrating our solution and list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated decision-support method.
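
    As a minimal illustration of the kind of rule such an engine evaluates over a high-frequency stream, the Python sketch below fires a decision event when a sliding-window average crosses a threshold; the rule, window length and data are invented, and real CEP platforms express such rules in their own languages.

```python
# Minimal complex-event-processing style rule: raise a decision event when the
# average of the last N ticks exceeds a threshold. Rule and data are illustrative.
from collections import deque

def sliding_average_rule(stream, window=5, threshold=100.0):
    """Yield (index, average) whenever the windowed average crosses the threshold."""
    buf = deque(maxlen=window)
    for i, tick in enumerate(stream):
        buf.append(tick)
        if len(buf) == window:
            avg = sum(buf) / window
            if avg > threshold:
                yield i, avg

prices = [98, 99, 101, 102, 103, 104, 97, 96, 95, 94]
for index, avg in sliding_average_rule(prices, window=3, threshold=101.0):
    print(f"decision event at tick {index}: window average {avg:.1f}")
```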

  17. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  18. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  19. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)

  20. NESTLE: A nodal kinetics code

    International Nuclear Information System (INIS)

    Al-Chalabi, R.M.; Turinsky, P.J.; Faure, F.-X.; Sarsour, H.N.; Engrand, P.R.

    1993-01-01

    The NESTLE nodal kinetics code has been developed for utilization as a stand-alone code for steady-state and transient reactor neutronic analysis and for incorporation into system transient codes, such as TRAC and RELAP. The latter is desirable to increase the simulation fidelity over that obtained from currently employed zero- and one-dimensional neutronic models and now feasible due to advances in computer performance and efficiency of nodal methods. As a stand-alone code, requirements are that it operate on a range of computing platforms from memory-limited personal computers (PCs) to supercomputers with vector processors. This paper summarizes the features of NESTLE that reflect the utilization and requirements just noted

  1. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, is a mix of a computer programming syntax and human language. In this sense queer code can...... be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  2. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and the formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault and their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases

  3. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  4. ICP (ITER Collaborative Platform)

    Energy Technology Data Exchange (ETDEWEB)

    Capuano, C.; Carayon, F.; Patel, V. [ITER, 13 - St. Paul-Lez Durance (France)

    2009-07-01

    The ITER organization needs to manage a massive amount of data and processes. Each team requires different processes and databases, often interconnected with those of other teams. ICP is the current central ITER repository of structured and unstructured data. All data in ICP is served and managed via a web interface that provides global accessibility with a common, user-friendly interface. This paper will explain the model used by ICP and how it serves the ITER project by providing a robust and agile platform. ICP is developed in ASP.NET using MSSQL Server for data storage. It currently houses 15 data-driven applications, 150 different types of record, 500 k objects and 2.5 M references. During European working hours the system averages 150 concurrent users and 20 requests per second. ICP connects to external database applications to provide a single entry point to ITER data and a safe shared storage place to maintain this data long-term. The core model provides an easy-to-extend framework to meet the future needs of the Organization. ICP follows a multi-tier architecture, providing logical separation of processes. The standard three-tier architecture is expanded, with the data layer separated into data storage, data structure, and data access components. The business or application logic layer is broken up into a common business functionality layer, a type-specific logic layer, and a detached work-flow layer. Finally, the presentation tier comprises a presentation adapter layer and an interface layer. Each layer is built up from small blocks which can be combined to create a wide range of more complex functionality. Each new object type developed gains access to a wealth of existing code functionality, while remaining free to adapt and extend it. The hardware structure is designed to provide complete redundancy, high availability and the capacity to handle high load. This document is composed of an abstract followed by the presentation transparencies. (authors)

  5. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association...... Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS), Institut for Skole og Læring, Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design......, design thinking and design pedagogy, Forskningslab: It og Læringsdesign (ILD-LAB), Department of Communication and Psychology, Aalborg University in Copenhagen. We followed, evaluated and documented the Coding Class project in the period from November 2016 to May 2017...

  6. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  7. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. The temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined

  8. Network Coding

    Indian Academy of Sciences (India)

    Rashmi, K. V.; Shah, Nihar B.; Kumar, P. Vijay. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp. 604-621. Full text: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  9. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States that is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids
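
    As a toy illustration of the Monte Carlo transport idea (not of MCNP itself), the Python sketch below samples exponential free paths for mono-energetic particles crossing a purely absorbing 1-D slab and compares the tallied transmission with the analytic value; the cross section and geometry are invented for the example.

```python
# Toy Monte Carlo transport sketch (not MCNP): mono-energetic particles in a
# homogeneous 1-D slab with absorption only. Numbers are illustrative.
import math
import random

SIGMA_T = 0.5     # total macroscopic cross section [1/cm] (illustrative)
THICKNESS = 4.0   # slab thickness [cm]
HISTORIES = 100_000

def transmitted():
    """Sample one history: does the particle cross the slab without a collision?"""
    path = -math.log(1.0 - random.random()) / SIGMA_T   # exponential free path
    return path > THICKNESS

count = sum(transmitted() for _ in range(HISTORIES))
estimate = count / HISTORIES
print(f"Monte Carlo transmission:       {estimate:.4f}")
print(f"Analytic exp(-Sigma_t * x):     {math.exp(-SIGMA_T * THICKNESS):.4f}")
```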

  10. Expander Codes

    Indian Academy of Sciences (India)

    Shankar, Priti. Expander Codes - The Sipser–Spielman Construction. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  11. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms, the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learnt. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory, under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  12. ADMS Evaluation Platform

    Energy Technology Data Exchange (ETDEWEB)

    2018-01-23

    Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.

  13. Coupling methodology within the software platform alliances

    Energy Technology Data Exchange (ETDEWEB)

    Montarnal, Ph; Deville, E; Adam, E; Bengaouer, A [CEA Saclay, Dept. de Modelisation des Systemes et Structures 91 - Gif-sur-Yvette (France); Dimier, A; Gaombalet, J; Loth, L [Agence Nationale pour la Gestion des Dechets Radioactifs (ANDRA), 92 - Chatenay Malabry (France); Chavant, C [Electricite de France (EDF), 92 - Clamart (France)

    2005-07-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES, whose aim is to produce a tool for the simulation of nuclear waste storage and disposal repositories. This type of simulation deals with highly coupled thermo-hydro-mechanical and chemical (T-H-M-C) processes. A key objective of ALLIANCES is to provide the capability to develop coupling algorithms between existing codes. The aim of this paper is to present the coupling methodology used in the context of this software platform. (author)

  14. Coupling methodology within the software platform alliances

    International Nuclear Information System (INIS)

    Montarnal, Ph.; Deville, E.; Adam, E.; Bengaouer, A.; Dimier, A.; Gaombalet, J.; Loth, L.; Chavant, C.

    2005-01-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES, whose aim is to produce a tool for the simulation of nuclear waste storage and disposal repositories. This type of simulation deals with highly coupled thermo-hydro-mechanical and chemical (T-H-M-C) processes. A key objective of ALLIANCES is to provide the capability to develop coupling algorithms between existing codes. The aim of this paper is to present the coupling methodology used in the context of this software platform. (author)

  15. Platform development supported by gaming

    DEFF Research Database (Denmark)

    Mikkola, Juliana Hsuan; Hansen, Poul H. Kyvsgård

    2007-01-01

    The challenge of implementing industrial platforms in practice can be described as a configuration problem caused by high number of variables, which often have contradictory influences on the total performance of the firm. Consequently, the specific platform decisions become extremely complex......, possibly increasing the strategic risks for the firm. This paper reports preliminary findings on platform management process at LEGO, a Danish toy company.  Specifically, we report the process of applying games combined with simulations and workshops in the platform development. We also propose a framework...

  16. Platform decommissioning costs

    International Nuclear Information System (INIS)

    Rodger, David

    1998-01-01

    There are over 6500 platforms worldwide contributing to the offshore oil and gas production industry. In the North Sea there are around 500 platforms in place. There are many factors to be considered in planning for platform decommissioning and the evaluation of options for removal and disposal. The environmental impact, technical feasibility, safety and cost factors all have to be considered. This presentation considers what information is available about the overall decommissioning costs for the North Sea and the costs of different removal and disposal options for individual platforms. 2 figs., 1 tab

  17. The Accurate Particle Tracer Code

    OpenAIRE

    Wang, Yulei; Liu, Jian; Qin, Hong; Yu, Zhi

    2016-01-01

    The Accurate Particle Tracer (APT) code is designed for large-scale particle simulations on dynamical systems. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and non-linear problems. Under the well-designed integrated and modularized framework, APT serves as a universal platform for researchers from different fields, such as plasma physics, accelerator physics, space science, fusio...

  18. 2009 Integrated Biorefinery Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program‘s Integrated Biorefinery (IBR) platform review meeting, held on February 18–19, 2009, at the Westin National Harbor, National Harbor, Maryland.

  19. 2009 Biochemical Conversion Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Biochemical Conversion platform review meeting, held on April 14-16, 2009, at the Sheraton Denver Downtown, Denver, Colorado.

  20. A Platform for Simulating Language Evolution

    Science.gov (United States)

    Vogel, Carl; Woods, Justin

    A platform for conducting experiments in the simulation of natural language evolution is presented. The system is parameterized for independent specification of important features such as: number of agents, communication attempt frequency, agent short-term memory capacity, communicative urgency, etc. Representative experiments are demonstrated.

  1. 2009 Thermochemical Conversion Platform Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, John [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2009-12-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the U.S. Department of Energy Biomass Program’s Thermochemical Conversion platform review meeting, held on April 14-16, 2009, at the Sheraton Denver Downtown, Denver, Colorado.

  2. Integration of the TNXYZ computer program inside the platform Salome

    International Nuclear Information System (INIS)

    Chaparro V, F. J.

    2014-01-01

    The present work shows the procedure carried out to integrate the TNXYZ code as a calculation tool in the graphical simulation platform Salome. The TNXYZ code provides a numerical solution of the neutron transport equation in several energy groups, steady state and three-dimensional geometry. In order to discretize the variables of the transport equation, the code uses the method of discrete ordinates for the angular variable and a nodal method for the spatial dependence. The Salome platform is a graphical environment designed for building, editing and simulating mechanical models, mainly focused on industry, and, unlike other software, it can integrate and control an external source code in order to form a complete scheme of pre- and post-processing of information. Before the integration into the Salome platform, the TNXYZ code was upgraded. TNXYZ was programmed in the 90s using a Fortran 77 compiler; for this reason the code was adapted to the characteristics of current Fortran compilers. In addition, with the intention of extracting partial results along the process sequence, the original structure of the program underwent a modularization process, i.e. the main program was divided into sections where the code performs major operations. This procedure is controlled by the YACS module of the Salome platform, and it could be useful for a subsequent coupling with thermal-hydraulics codes. Finally, with the help of the Monte Carlo code Serpent, several study cases were defined in order to check the integration process; the verification consisted of comparing the results obtained with the code executed stand-alone against those obtained after it was modernized, integrated and controlled by the Salome platform. (Author)
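
    The modularization described above (splitting the main program into stages whose partial results an orchestrator such as YACS can extract) can be illustrated with a generic Python driver; the stage names and the data they exchange below are invented for this sketch and do not correspond to actual TNXYZ routines or Salome APIs.

```python
# Generic sketch of driving a modularized solver stage by stage, in the spirit of
# an orchestration module calling code sections and extracting partial results.
# Stage names and the data they exchange are invented for illustration.

def read_input(path):
    # Stub: the path is unused; a real stage would parse the input deck.
    return {"groups": 2, "mesh": (10, 10, 10), "source": 1.0}

def assemble_system(model):
    return {"model": model, "matrix": "A", "rhs": "b"}

def solve_transport(system):
    return {"keff": 1.0023, "flux": [0.1] * 8}   # placeholder partial results

def write_output(solution):
    print(f"k-eff = {solution['keff']:.4f}")

def run_pipeline(path):
    """Each stage returns intermediate data that a supervisor could inspect,
    log, or hand to another code (e.g. a thermal-hydraulics module)."""
    model = read_input(path)
    system = assemble_system(model)
    solution = solve_transport(system)
    write_output(solution)
    return solution

run_pipeline("core_model.inp")
```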

  3. Autonomous platform for distributed sensing and actuation over bluetooth

    OpenAIRE

    Carvalhal, Paulo; Coelho, Ezequiel T.; Ferreira, Manuel João Oliveira; Afonso, José A.; Santos, Cristina

    2006-01-01

    This paper presents a short-range wireless network platform based on Bluetooth technology and on a Round Robin scheduling algorithm. The main goal is to provide an application-independent platform in order to support a distributed data acquisition and control system used to control a model of a greenhouse. This platform provides the advantages of wireless communications while assuring low weight, small energy consumption and reliable communication.

  4. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts and move in a restricted transversal area; terminal connectors may be added, images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  5. Groundwater Assessment Platform

    OpenAIRE

    Podgorski, Joel; Berg, Michael

    2018-01-01

    The Groundwater Assessment Platform is a free, interactive online GIS platform for the mapping, sharing and statistical modeling of groundwater quality data. The modeling allows users to take advantage of publicly available global datasets of various environmental parameters to produce prediction maps of their contaminant of interest.

  6. EURESCOM Services Platform

    NARCIS (Netherlands)

    Nieuwenhuis, Lambertus Johannes Maria; van Halteren, Aart

    1999-01-01

    This paper presents the results of the EURESCOM Project 715. In February 1999, a large team of researchers from six European public network operators completed a two year period of cooperative experiments on a TINA-based environment, called the EURESCOM Services Platform (ESP). This platform

  7. 2011 Biomass Program Platform Peer Review: Biochemical Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Pezzullo, Leslie [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Biochemical Conversion Platform Review meeting.

  8. 2011 Biomass Program Platform Peer Review. Integrated Biorefineries

    Energy Technology Data Exchange (ETDEWEB)

    Rossmeissl, Neil [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s IBR Platform Review meeting.

  9. 2011 Biomass Program Platform Peer Review. Thermochemical Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Grabowski, Paul E. [Office of Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)

    2012-02-01

    This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2011 U.S. Department of Energy Biomass Program’s Thermochemical Conversion Platform Review meeting.

  10. Product Platform Modeling

    DEFF Research Database (Denmark)

    Pedersen, Rasmus

    for customisation of products. In many companies these changes in the business environment have created a controversy between the need for a wide variety of products offered to the marketplace and a desire to reduce variation within the company in order to increase efficiency. Many companies use the concept...... other. These groups can be varied and combined to form different product variants without increasing the internal variety in the company. Based on the Theory of Domains, the concept of encapsulation in the organ domain is introduced, and organs are formulated as platform elements. Included......This PhD thesis has the title Product Platform Modelling. The thesis is about product platforms and visual product platform modelling. Product platforms have gained an increasing attention in industry and academia in the past decade. The reasons are many, yet the increasing globalisation...

  11. Product Platform Replacements

    DEFF Research Database (Denmark)

    Sköld, Martin; Karlsson, Christer

    2012-01-01

    . To shed light on this unexplored and growing managerial concern, the purpose of this explorative study is to identify operational challenges to management when product platforms are replaced. Design/methodology/approach – The study uses a longitudinal field-study approach. Two companies, Gamma and Omega...... replacement was chosen in each company. Findings – The study shows that platform replacements primarily challenge managers' existing knowledge about platform architectures. A distinction can be made between “width” and “height” in platform replacements, and it is crucial that managers observe this in order...... to challenge their existing knowledge about platform architectures. Issues on technologies, architectures, components and processes as well as on segments, applications and functions are identified. Practical implications – Practical implications are summarized and discussed in relation to a framework...

  12. User's manual for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the User's Manual for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to make it easier to interpret moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, also can be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code then will interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data
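
    The interpolation idea can be sketched as follows in Python: given a benchmark table of detector responses indexed by the two independent variables (anomaly size and moisture content), bilinear interpolation returns the response for intermediate values. The grid and response numbers below are invented for the example and are not TMAD library data.

```python
# Sketch of interpolating a detector response over a benchmark library indexed
# by two independent variables (anomaly size, moisture content). The grid and
# response values below are invented for illustration.
import bisect

SIZES = [1.0, 2.0, 4.0]          # anomaly size (arbitrary units)
MOISTURES = [5.0, 10.0, 20.0]    # moisture content (wt%)
RESPONSE = [                     # RESPONSE[i][j] for SIZES[i], MOISTURES[j]
    [120.0, 150.0, 210.0],
    [110.0, 140.0, 195.0],
    [ 95.0, 125.0, 180.0],
]

def bilinear(size, moisture):
    """Bilinear interpolation of detector response inside the benchmark grid."""
    i = min(max(bisect.bisect_right(SIZES, size) - 1, 0), len(SIZES) - 2)
    j = min(max(bisect.bisect_right(MOISTURES, moisture) - 1, 0), len(MOISTURES) - 2)
    tx = (size - SIZES[i]) / (SIZES[i + 1] - SIZES[i])
    ty = (moisture - MOISTURES[j]) / (MOISTURES[j + 1] - MOISTURES[j])
    top = RESPONSE[i][j] * (1 - ty) + RESPONSE[i][j + 1] * ty
    bottom = RESPONSE[i + 1][j] * (1 - ty) + RESPONSE[i + 1][j + 1] * ty
    return top * (1 - tx) + bottom * tx

print(bilinear(1.5, 7.5))   # response between the four surrounding benchmark points
```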

  13. Implementation of Online Veterinary Hospital on Cloud Platform.

    Science.gov (United States)

    Chen, Tzer-Shyong; Chen, Tzer-Long; Chung, Yu-Fang; Huang, Yao-Min; Chen, Tao-Chieh; Wang, Huihui; Wei, Wei

    2016-06-01

    Pet markets hold great commercial possibilities, which boost the thriving development of veterinary hospital businesses. The service faces intensive competition and a diversified channel environment. Information technology is integrated to develop the veterinary hospital cloud service platform. The platform contains not only pet medical services but also veterinary hospital management and services. In this study, QR Code and cloud technology are applied to establish the veterinary hospital cloud service platform for pet search by labeling a pet's identification with a QR Code. This technology can remove the restriction of veterinary hospital inspection to different areas and allows veterinary hospitals to receive medical records and information through the exclusive QR Code for more effective inspection. As an interactive platform, the veterinary hospital cloud service platform allows pet owners to gain knowledge of pet diseases and healthcare. Moreover, pet owners can enquire and communicate with veterinarians through the platform. Also, veterinary hospitals can periodically send reminders of relevant points and introduce exclusive marketing information via the platform to promote service items and establish individualized marketing. Consequently, veterinary hospitals can increase their profits through information sharing and create the best solution in such a competitive veterinary market with industry alliances.
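
    As an illustration of the QR Code labeling idea, the snippet below encodes a hypothetical pet identifier (pointing at an invented record URL) into a QR image with the open-source Python qrcode package; the URL scheme and identifier are made up and are not part of the platform described in the paper.

```python
# Sketch: encode a pet's record identifier as a QR code image.
# Requires the open-source 'qrcode' package (pip install qrcode[pil]).
# The URL and identifier below are invented for illustration.
import qrcode

pet_id = "PET-2016-000123"
record_url = f"https://example-vet-cloud.invalid/records/{pet_id}"

img = qrcode.make(record_url)      # returns an image of the QR code
img.save(f"{pet_id}.png")
print(f"QR label written for {pet_id}")
```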

  14. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  15. Morphology Independent Learning in Modular Robots

    DEFF Research Database (Denmark)

    Christensen, David Johan; Bordignon, Mirko; Schultz, Ulrik Pagh

    2009-01-01

    speed its modules independently and in parallel adjust their behavior based on a single global reward signal. In simulation, we study the learning strategy’s performance on different robot configurations. On the physical platform, we perform learning experiments with ATRON robots learning to move as fast...
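
    A minimal sketch of the "independent learners sharing one global reward" idea is given below in Python: each module perturbs its own control parameter and keeps the change only if the shared reward did not decrease. The reward function and update rule are simplified stand-ins rather than the ATRON controller, and the modules are updated one at a time rather than truly in parallel.

```python
# Sketch of independent module learning driven by a single global reward.
# The reward function and update rule are simplified illustrations.
import random

NUM_MODULES = 4

def global_reward(params):
    """Stand-in for measured robot speed: peaks when every parameter is near 0.7."""
    return -sum((p - 0.7) ** 2 for p in params)

params = [random.random() for _ in range(NUM_MODULES)]
best = global_reward(params)

for step in range(500):
    module = random.randrange(NUM_MODULES)        # each module adapts its own parameter
    old = params[module]
    params[module] = min(1.0, max(0.0, old + random.gauss(0.0, 0.1)))
    reward = global_reward(params)
    if reward >= best:
        best = reward                             # keep the change
    else:
        params[module] = old                      # revert on a worse global reward

print([round(p, 2) for p in params], round(best, 4))
```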

  16. Morphology Independent Learning in Modular Robots

    DEFF Research Database (Denmark)

    Christensen, David Johan; Bordignon, Mirko; Schultz, Ulrik Pagh

    2009-01-01

    speed its modules independently and in parallel adjust their behavior based on a single global reward signal. In simulation, we study the learning strategy's performance on different robot configurations. On the physical platform, we perform learning experiments with ATRON robots learning to move as fast...

  17. The vacuum platform

    Science.gov (United States)

    McNab, A.

    2017-10-01

    This paper describes GridPP’s Vacuum Platform for managing virtual machines (VMs), which has been used to run production workloads for WLCG and other HEP experiments. The platform provides a uniform interface between VMs and the sites they run at, whether the site is organised as an Infrastructure-as-a-Service cloud system such as OpenStack, or an Infrastructure-as-a-Client system such as Vac. The paper describes our experience in using this platform, in developing and operating VM lifecycle managers Vac and Vcycle, and in interacting with VMs provided by LHCb, ATLAS, ALICE, CMS, and the GridPP DIRAC service to run production workloads.

  18. Web Services for Telegeriatric and Independent Living of the Elderly ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... models. The platform design follows a patient centric philosophy along with the ... aging population in the World). ... independent living environment for older people at home ...... impact scope. .... Configuring a Trusted Cloud.

  19. The universal modular platform

    International Nuclear Information System (INIS)

    North, R.B.

    1995-01-01

    A new and patented design for offshore wellhead platforms has been developed to meet a 'fast track' requirement for increased offshore production from field locations not yet identified. The new design uses modular construction to allow for radical changes in the water depth of the final location and assembly-line efficiency in fabrication. By utilizing high strength steels and structural support from the well conductors, the new design accommodates all planned production requirements on a support structure significantly lighter and less expensive than the conventional design it replaces. Twenty-two platforms based on the new design were ready for installation within 18 months of the project start. Installation of the new platforms began in 1992 for drilling support and 1993 for production support. The new design has become the Company standard for all future production platforms. Large savings in construction costs have been realized through its light weight, flexibility in both positioning and water depth, and its modular construction

  20. Identification of platform levels

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik

    2005-01-01

    reduction, ability to launch a wider product portfolio without increasing resources and reduction of complexity within the whole company. To support the multiple product development process, platform based product development has in many companies such as Philips, VW, Ford etc. proven to be a very effective...... product development in one step and therefore the objective of this paper is to identify levels of platform based product development. The structure of this paper is as follows. First the applied terminology for platforms will be briefly explained and then characteristics between single and multi product...... development will be examined. Based on the identification of the above characteristics five platform levels are described. The research presented in this paper is a result of MSc, Ph.D projects at the Technical University of Denmark and consultancy projects within the organisation of Institute of Product...

  1. Paper based electronics platform

    KAUST Repository

    Nassar, Joanna Mohammad; Sevilla, Galo Andres Torres; Hussain, Muhammad Mustafa

    2017-01-01

    A flexible and non-functionalized low cost paper-based electronic system platform fabricated from common paper, such as paper based sensors, and methods of producing paper based sensors, and methods of sensing using the paper based sensors

  2. USA Hire Testing Platform

    Data.gov (United States)

    Office of Personnel Management — The USA Hire Testing Platform delivers tests used in hiring for positions in the Federal Government. To safeguard the integrity of the hiring processes and ensure...

  3. Dual Coding and Bilingual Memory.

    Science.gov (United States)

    Paivio, Allan; Lambert, Wallace

    1981-01-01

    Describes a study which tested a dual coding approach to bilingual memory using tasks that permit comparison of the effects of bilingual encoding with verbal-nonverbal dual encoding of items. Results provide strong support for a version of the independent or separate stores view of bilingual memory. (Author/BK)

  4. National Community Solar Platform

    Energy Technology Data Exchange (ETDEWEB)

    Rupert, Bart [Clean Energy Collective, Louisville, CO (United States)

    2016-06-30

    This project was created to provide a National Community Solar Platform (NCSP) portal, known as Community Solar Hub, that is available to any entity or individual who wants to develop community solar. This has been done by providing a comprehensive portal that makes CEC's solutions, and other proven community solar solutions, externally available for everyone to access – making the process easy through proven platforms that protect subscribers, developers and utilities. The successful completion of this project provides these tools via a web platform and integration APIs, a wide spectrum of community solar projects included in the platform, multiple groups of customers (utilities, EPCs, and advocates) using the platform to develop community solar, and open access to anyone interested in community solar. CEC's Incubator project includes web-based informational resources, integrated systems for project information and billing, and engagement with customers and users by community solar experts. The combined effort externalizes much of Clean Energy Collective's industry-leading expertise, allowing third parties to develop community solar without duplicating expensive start-up efforts. The availability of this platform creates community solar projects that are cheaper to build and cheaper to participate in, furthering the goals of DOE's SunShot Initiative.

  5. Polar Codes

    Science.gov (United States)

    2014-12-01

    independently has a 10% chance of being flipped. Then the decoder should use the majority vote rule: if y is (0, 0, 0), (0, 0, 1), (0, 1, 0), or (1, 0, 0)... tensor power, and B_N is a square matrix called the bit-reversal operator. Therefore G_N^{-1} = (F^{⊗n})^{-1} B_N^{-1}. Section VII.B of [1] shows that B_N^{-1}...B_N. Also we see by direct computation that F F = I_2. Using the tensor product identity (AC) ⊗ (BD) = (A ⊗ B)(C ⊗ D), we get that (F ⊗ F)(F ⊗ F) = I_2 ...
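
    Cleaning up the last fragment: assuming F is the standard 2 x 2 polarization kernel (the snippet does not show it) and arithmetic is taken modulo 2, the direct computation and the mixed-product identity referred to above read:

```latex
% Worked check, assuming F is the usual polar kernel and arithmetic is mod 2.
F = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix},
\qquad
F F = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix} \equiv I_2 \pmod{2},
\qquad
(F \otimes F)(F \otimes F) = (F F) \otimes (F F) = I_2 \otimes I_2 = I_4 .
```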

  6. Online Crowdfunding Campaign for an Independent Video Game

    OpenAIRE

    Kivikangas, Inessa

    2014-01-01

    Over the past several years online reward-model crowdfunding platforms have become a popular tool for raising funds among independent game developers. Big success of several brilliant indie titles brought to the online crowdfunding platforms Kickstarter and Indiegogo hundreds of hopeful independent developers. However, apart from creating an excellent game indie developers have to be able to reach out to their audience and capture attention of potential supporters and gaming media. Time and e...

  7. AZTLAN: Mexican platform for analysis and design of nuclear reactors - 15493

    International Nuclear Information System (INIS)

    Gomez Torres, A.M.; Puente Espel, F.; Valle Gallegos, E. del; Francois, J.L.; Martin-del-Campo, C.; Espinosa-Paredes, G.

    2015-01-01

    The AZTLAN platform is presented in this paper. The project aims at modernizing, improving and incorporating the neutron transport codes such as AZTLAN, AZKIND and AZNHEX, thermo-hydraulics codes like AZTHECA, and thermo-mechanical codes developed in Mexican institutions of higher education as well as in the Mexican nuclear research institute into an integrated platform, established and maintained for the benefit of Mexican nuclear knowledge. An important part of the project is to develop a coupling methodology between neutron transport codes and thermal-hydraulics codes in order to obtain an accurate 3-dimensional simulation of a reactor core.

  8. Vaccine platform recombinant measles virus.

    Science.gov (United States)

    Mühlebach, Michael D

    2017-10-01

    The classic development of vaccines is lengthy, tedious, and may not necessarily be successful as demonstrated by the case of HIV. This is especially a problem for emerging pathogens that are newly introduced into the human population and carry the inherent risk of pandemic spread in a naïve population. For such situations, a considerable number of different platform technologies are under development. These are also under development for pathogens, where directly derived vaccines are regarded as too complicated or even dangerous due to the induction of inefficient or unwanted immune responses causing considerable side-effects, as for dengue virus. Among platform technologies are plasmid-based DNA vaccines, RNA replicons, single-round infectious vector particles, or replicating vaccine-based vectors encoding (a) critical antigen(s) of the target pathogens. Among the latter, recombinant measles viruses derived from vaccine strains have been tested. Measles vaccines are among the most effective and safest live-attenuated vaccines known. Therefore, the development of Schwarz-, Moraten-, or AIK-C-strain derived recombinant vaccines against a wide range of mostly viral, but also bacterial pathogens was quite straightforward. These vaccines generally induce powerful humoral and cellular immune responses in appropriate animal models, i.e., transgenic mice or non-human primates. Also in the recent first clinical phase I trial, the results have been quite encouraging. The trial indicated the expected safety and efficacy also in human patients, interestingly independently of the level of prevalent anti-measles immunity before the trial. Thereby, recombinant measles vaccines expressing additional antigens are a promising platform for future vaccines.

  9. The Platformization of the Web: Making Web Data Platform Ready

    NARCIS (Netherlands)

    Helmond, A.

    2015-01-01

    In this article, I inquire into Facebook’s development as a platform by situating it within the transformation of social network sites into social media platforms. I explore this shift with a historical perspective on, what I refer to as, platformization, or the rise of the platform as the dominant

  10. Scoping review and evaluation of SMS/text messaging platforms for mHealth projects or clinical interventions.

    Science.gov (United States)

    Iribarren, Sarah J; Brown, William; Giguere, Rebecca; Stone, Patricia; Schnall, Rebecca; Staggers, Nancy; Carballo-Diéguez, Alex

    2017-05-01

    Mobile technology supporting text messaging interventions (TMIs) continues to evolve, presenting challenges for researchers and healthcare professionals who need to choose software solutions to best meet their program needs. The objective of this review was to systematically identify and compare text messaging platforms and to summarize their advantages and disadvantages as described in peer-reviewed literature. A scoping review was conducted using four steps: 1) identify currently available platforms through online searches and in mHealth repositories; 2) expand evaluation criteria of an mHealth mobile messaging toolkit and integrate prior user experiences as researchers; 3) evaluate each platform's functions and features based on the expanded criteria and a vendor survey; and 4) assess the documentation of platform use in the peer-review literature. Platforms meeting inclusion criteria were assessed independently by three reviewers and discussed until consensus was reached. The PRISMA guidelines were followed to report findings. Of the 1041 potentially relevant search results, 27 platforms met inclusion criteria. Most were excluded because they were not platforms (e.g., guides, toolkits, reports, or SMS gateways). Of the 27 platforms, only 12 were identified in existing mHealth repositories, 10 from Google searches, while five were found in both. The expanded evaluation criteria included 22 items. Results indicate no uniform presentation of platform features and functions, often making these difficult to discern. Fourteen of the platforms were reported as open source, 10 focused on health care and 16 were tailored to meet needs of low resource settings (not mutually exclusive). Fifteen platforms had do-it-yourself setup (programming not required) while the remainder required coding/programming skills or setups could be built to specification by the vendor. Frequently described features included data security and access to the platform via cloud-based systems. Pay

  11. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
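
    To make the distinction above concrete, here is a small, hedged numpy sketch (illustrative only; the toy genotypes and the plain cross-product relationship matrix are assumptions, not the paper's data or model) contrasting the raw 0/1/2 allele coding with the centered coding.

        import numpy as np

        # Toy genotype matrix: rows are individuals, columns are markers,
        # coded 0/1/2 = copies of the second allele (raw allele coding).
        M = np.array([[0, 1, 2],
                      [1, 1, 0],
                      [2, 0, 1]], dtype=float)

        # Centered allele coding: subtract a per-marker value so the mean
        # regression coefficient is zero within each marker.
        Z = M - M.mean(axis=0)

        # Genomic relationship matrices built from the two codings differ
        # (plain cross-products here; a VanRaden-style scaling is one common choice).
        G_raw = M @ M.T
        G_centered = Z @ Z.T

        print(np.allclose(Z.mean(axis=0), 0.0))  # True: each marker is zero-mean
        print(np.allclose(G_raw, G_centered))    # False: the matrices differ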

  12. QUIL: a chemical equilibrium code

    International Nuclear Information System (INIS)

    Lunsford, J.L.

    1977-02-01

    A chemical equilibrium code QUIL is described, along with two support codes FENG and SURF. QUIL is designed to allow calculations on a wide range of chemical environments, which may include surface phases. QUIL was written specifically to calculate distributions associated with complex equilibria involving fission products in the primary coolant loop of the high-temperature gas-cooled reactor. QUIL depends upon an energy-data library called ELIB. This library is maintained by FENG and SURF. FENG enters into the library all reactions having standard free energies of reaction that are independent of concentration. SURF enters all surface reactions into ELIB. All three codes are interactive codes written to be used from a remote terminal, with paging control provided. Plotted output is also available

  13. Balanced distributed coding of omnidirectional images

    Science.gov (United States)

    Thirumalai, Vijayaraghavan; Tosic, Ivana; Frossard, Pascal

    2008-01-01

    This paper presents a distributed coding scheme for the representation of 3D scenes captured by stereo omni-directional cameras. We consider a scenario where images captured from two different viewpoints are encoded independently, with a balanced rate distribution among the different cameras. The distributed coding is built on multiresolution representation and partitioning of the visual information in each camera. The encoder transmits one partition after entropy coding, as well as the syndrome bits resulting from the channel encoding of the other partition. The decoder exploits the intra-view correlation and attempts to reconstruct the source image by combination of the entropy-coded partition and the syndrome information. At the same time, it exploits the inter-view correlation using motion estimation between images from different cameras. Experiments demonstrate that the distributed coding solution performs better than a scheme where images are handled independently, and that the coding rate stays balanced between encoders.

  14. The 1996 ENDF pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1996-01-01

    The codes are named 'the Pre-processing' codes, because they are designed to pre-process ENDF/B data, for later, further processing for use in applications. This is a modular set of computer codes, each of which reads and writes evaluated nuclear data in the ENDF/B format. Each code performs one or more independent operations on the data, as described below. These codes are designed to be computer independent, and are presently operational on every type of computer from large mainframe computer to small personal computers, such as IBM-PC and Power MAC. The codes are available from the IAEA Nuclear Data Section, free of charge upon request. (author)

  15. Analysis of offshore platforms lifting with fixed pile structure type (fixed platform) based on ASD89

    Science.gov (United States)

    Sugianto, Agus; Indriani, Andi Marini

    2017-11-01

    The GTS (Gathering Testing Satellite) platform is an offshore construction of the fixed pile structure type (fixed platform) that supports petroleum exploitation. After fabrication, the platform is moved onto barges and then shipped to the installation site. The move is generally done by pulling or pushing, following the construction design determined during planning. However, when lifting equipment (cranes) is available in the work area, the move can instead be done by lifting, so that the work can be completed more quickly. This paper analyses moving the GTS platform by lifting, a different way from how GTS-type platforms are generally moved; the problem is the structural reinforcement required so that the construction can be moved by lifting. The working stresses that occur during the lifting operation were analysed and checked against the AISC code standard using the SAP2000 structural analysis program. The analysis showed that in its existing condition the platform cannot be moved by lifting, because the stress ratio exceeds the maximum allowable value of 0.950 (AISC-ASD89). Overstress occurs in members 295 and 324, with stress ratios of 0.97 and 0.95, so structural reinforcement is required. Applying box plates at both members produces stress ratios of 0.78 at member 295 and 0.77 at member 324. These results indicate that, with this reinforcement, the construction qualifies for being moved by lifting.

  16. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in a department of radiology. The program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary consisted of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which were simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because the program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data processing program was possible. The program had the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, it can be used for automation of routine work in the department of radiology.

  17. ARC Code TI: Self-Healing Independent File Transfer (Shift)

    Data.gov (United States)

    National Aeronautics and Space Administration — Shift is a lightweight framework for high performance local and remote file transfers that provides resiliency across a wide variety of failure scenarios through...

  18. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  19. Transactional Network Platform: Applications

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Lutes, Robert G.; Ngo, Hung; Underhill, Ronald M.

    2013-10-31

    In FY13, Pacific Northwest National Laboratory (PNNL) with funding from the Department of Energy’s (DOE’s) Building Technologies Office (BTO) designed, prototyped and tested a transactional network platform to support energy, operational and financial transactions between any networked entities (equipment, organizations, buildings, grid, etc.). Initially, in FY13, the concept demonstrated transactions between packaged rooftop air conditioners and heat pump units (RTUs) and the electric grid using applications or "agents" that reside on the platform, on the equipment, on a local building controller or in the Cloud. The transactional network project is a multi-lab effort with Oak Ridge National Laboratory (ORNL) and Lawrence Berkeley National Laboratory (LBNL) also contributing to the effort. PNNL coordinated the project and also was responsible for the development of the transactional network (TN) platform and three different applications associated with RTUs. This document describes two applications or "agents" in detail, and also summarizes the platform. The TN platform details are described in another companion document.

  20. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
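
    For orientation (this is not the authors' dynamic algorithm), the sketch below computes the static Shannon code lengths, ceil(-log2 p), that Shannon coding assigns from symbol frequencies; the dynamic variant maintains such a code as the empirical probabilities change.

        import math
        from collections import Counter

        def shannon_code_lengths(text):
            """Static Shannon coding: each symbol gets a codeword of length
            ceil(-log2 p) based on its empirical probability p."""
            freq = Counter(text)
            n = len(text)
            return {s: math.ceil(-math.log2(c / n)) for s, c in freq.items()}

        lengths = shannon_code_lengths("abracadabra")
        # The Kraft inequality guarantees a prefix-free code with these lengths exists.
        print(lengths, sum(2.0 ** -l for l in lengths.values()) <= 1.0)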

  1. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  2. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  3. A photon dominated region code comparison study

    NARCIS (Netherlands)

    Roellig, M.; Abel, N. P.; Bell, T.; Bensch, F.; Black, J.; Ferland, G. J.; Jonkheid, B.; Kamp, I.; Kaufman, M. J.; Le Bourlot, J.; Le Petit, F.; Meijerink, R.; Morata, O.; Ossenkopf, Volker; Roueff, E.; Shaw, G.; Spaans, M.; Sternberg, A.; Stutzki, J.; Thi, W.-F.; van Dishoeck, E. F.; van Hoof, P. A. M.; Viti, S.; Wolfire, M. G.

    Aims. We present a comparison between independent computer codes, modeling the physics and chemistry of interstellar photon dominated regions (PDRs). Our goal was to understand the mutual differences in the PDR codes and their effects on the physical and chemical structure of the model clouds, and

  4. Continuous-variable quantum erasure correcting code

    DEFF Research Database (Denmark)

    Lassen, Mikael Østergaard; Sabuncu, Metin; Huck, Alexander

    2010-01-01

    We experimentally demonstrate a continuous variable quantum erasure-correcting code, which protects coherent states of light against complete erasure. The scheme encodes two coherent states into a bi-party entangled state, and the resulting 4-mode code is conveyed through 4 independent channels...

  5. Platform-based production development

    DEFF Research Database (Denmark)

    Bossen, Jacob; Brunoe, Thomas Ditlev; Nielsen, Kjeld

    2015-01-01

    Platforms as a means for applying modular thinking in product development are relatively well studied, but platforms in the production system have until now not been given much attention. With the emerging concept of platform-based co-development, the importance of production platforms is though...

  6. Application of genotyping-by-sequencing on semiconductor sequencing platforms: a comparison of genetic and reference-based marker ordering in barley.

    Directory of Open Access Journals (Sweden)

    Martin Mascher

    Full Text Available The rapid development of next-generation sequencing platforms has enabled the use of sequencing for routine genotyping across a range of genetics studies and breeding applications. Genotyping-by-sequencing (GBS, a low-cost, reduced representation sequencing method, is becoming a common approach for whole-genome marker profiling in many species. With quickly developing sequencing technologies, adapting current GBS methodologies to new platforms will leverage these advancements for future studies. To test new semiconductor sequencing platforms for GBS, we genotyped a barley recombinant inbred line (RIL population. Based on a previous GBS approach, we designed bar code and adapter sets for the Ion Torrent platforms. Four sets of 24-plex libraries were constructed consisting of 94 RILs and the two parents and sequenced on two Ion platforms. In parallel, a 96-plex library of the same RILs was sequenced on the Illumina HiSeq 2000. We applied two different computational pipelines to analyze sequencing data; the reference-independent TASSEL pipeline and a reference-based pipeline using SAMtools. Sequence contigs positioned on the integrated physical and genetic map were used for read mapping and variant calling. We found high agreement in genotype calls between the different platforms and high concordance between genetic and reference-based marker order. There was, however, paucity in the number of SNP that were jointly discovered by the different pipelines indicating a strong effect of alignment and filtering parameters on SNP discovery. We show the utility of the current barley genome assembly as a framework for developing very low-cost genetic maps, facilitating high resolution genetic mapping and negating the need for developing de novo genetic maps for future studies in barley. Through demonstration of GBS on semiconductor sequencing platforms, we conclude that the GBS approach is amenable to a range of platforms and can easily be modified as new

  7. The Definitive Guide to NetBeans™ Platform 7

    CERN Document Server

    Bock, Heiko

    2011-01-01

    The NetBeans Platform is the world's only modular Swing application framework, used by very large organizations in mission-critical scenarios, such as at Boeing and Northrop Grumman, as well as in the financial sector and in the oil/gas industry. For these large customers in enterprises who are increasingly interested in Maven and OSGi, the book will have particular relevance. The Definitive Guide to NetBeans Platform 7 is a thorough and authoritative introduction to the open-source NetBeans Platform, covering all its major APIs in detail, with relevant code examples used throughout. * Provide

  8. Volttron: An Agent Platform for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Haack, Jereme N.; Akyol, Bora A.; Carpenter, Brandon J.; Tews, Cody W.; Foglesong, Lance W.

    2013-05-06

    The VOLTTRON platform enables the deployment of intelligent sensors and controllers in the smart grid and provides a stable, secure and flexible framework that expands the sensing and control capabilities. The VOLTTRON platform provides services fulfilling the essential requirements of resource management and security for agent operation in the power grid. The facilities provided by the platform allow agent developers to focus on the implementation of their agent system and not on the necessary "plumbing" code. For example, a simple collaborative demand response application was written in less than 200 lines of Python.

  9. Reusable platform concepts

    International Nuclear Information System (INIS)

    Gudmestad, O.T.; Sparby, B.K.; Stead, B.L.

    1993-01-01

    There is an increasing need to reduce costs of offshore production facilities in order to make development of offshore fields profitable. For small fields with short production time there is in particular a need to investigate ways to reduce costs. For such fields the idea of platform reuse is particularly attractive. This paper will review reusable platform concepts and will discuss their range of application. Particular emphasis will be placed on technical limitations. Traditional concepts such as jackups and floating production facilities will be discussed, but major attention will be given to newly developed ideas for reuse of steel jackets and concrete structures. It will be shown how the operator for several fields can obtain considerable savings by applying such reusable platform concepts.

  10. Why comply with a code of ethics?

    Science.gov (United States)

    Spielthenner, Georg

    2015-05-01

    A growing number of professional associations and occupational groups are creating codes of ethics with the goal of guiding their members, protecting service users, and safeguarding the reputation of the profession. There is a great deal of literature dealing with the question to what extent ethical codes can achieve their desired objectives. The present paper does not contribute to this debate. Its aim is rather to investigate how rational it is to comply with codes of conduct. It is natural and virtually inevitable for a reflective person to ask why one should pay any attention to ethical codes, in particular if following a code is not in one's own interest. In order to achieve the aim of this paper, I shall (in "Quasi-reasons for complying with an ethical code" section) discuss reasons that only appear to be reasons for complying with a code. In "Code-independent reasons" section, I shall present genuine practical reasons that, however, turn out to be reasons of the wrong kind. In "Code-dependent reasons" section finally presents the most important reasons for complying with ethical codes. The paper argues that while ethical codes do not necessarily yield reasons for action, professionals can have genuine reasons for complying with a code, which may, however, be rather weak and easily overridden by reasons for deviating from the code.

  11. Modeling report of DYMOND code (DUPIC version)

    International Nuclear Information System (INIS)

    Park, Joo Hwan; Yacout, Abdellatif M.

    2003-04-01

    The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was originally developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and mixed LWR-FBR plants. Since extensive application of the DYMOND code has been requested, the first version has been modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts, the source language platform, input supply and output, although these parts are not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (called the DYMOND-DUPIC version). It is divided into five parts. Part A covers the reactor history model, which includes the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C covers the reprocessing model, which includes recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amount of spent fuel in storage and disposal. Part D covers other fuel cycle models, considering the thorium fuel cycle for the MSR and RTF reactors. Part E covers the economics model, which gives all cost information such as uranium mining cost, reactor operating cost, fuel cost, etc.

  12. Modeling report of DYMOND code (DUPIC version)

    Energy Technology Data Exchange (ETDEWEB)

    Park, Joo Hwan [KAERI, Taejon (Korea, Republic of); Yacout, Abdellatif M [Argonne National Laboratory, Ilinois (United States)

    2003-04-01

    The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was originally developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and mixed LWR-FBR plants. Since extensive application of the DYMOND code has been requested, the first version has been modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts, the source language platform, input supply and output, although these parts are not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (called the DYMOND-DUPIC version). It is divided into five parts. Part A covers the reactor history model, which includes the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C covers the reprocessing model, which includes recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amount of spent fuel in storage and disposal. Part D covers other fuel cycle models, considering the thorium fuel cycle for the MSR and RTF reactors. Part E covers the economics model, which gives all cost information such as uranium mining cost, reactor operating cost, fuel cost, etc.

  13. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2011-01-01

    The Windows Azure Platform has rapidly established itself as one of the most sophisticated cloud computing platforms available. With Microsoft working to continually update their product and keep it at the cutting edge, the future looks bright - if you have the skills to harness it. In particular, new features such as remote desktop access, dynamic content caching and secure content delivery using SSL make the latest version of Azure a more powerful solution than ever before. It's widely agreed that cloud computing has produced a paradigm shift in traditional architectural concepts by providin

  14. Analytical validation of the CACECO containment analysis code

    International Nuclear Information System (INIS)

    Peak, R.D.

    1979-08-01

    The CACECO containment analysis code was developed to predict the thermodynamic responses of LMFBR containment facilities to a variety of accidents. This report covers the verification of the CACECO code by problems that can be solved by hand calculations or by reference to textbook and literature examples. The verification concentrates on the accuracy of the material and energy balances maintained by the code and on the independence of the four cells analyzed by the code so that the user can be assured that the code analyses are numerically correct and independent of the organization of the input data submitted to the code

  15. Independence and Product Systems

    OpenAIRE

    Skeide, Michael

    2003-01-01

    Starting from elementary considerations about independence and Markov processes in classical probability we arrive at the new concept of conditional monotone independence (or operator-valued monotone independence). With the help of product systems of Hilbert modules we show that monotone conditional independence arises naturally in dilation theory.

  16. Targeting multiple heterogeneous hardware platforms with OpenCL

    Science.gov (United States)

    Fox, Paul A.; Kozacik, Stephen T.; Humphrey, John R.; Paolini, Aaron; Kuller, Aryeh; Kelmelis, Eric J.

    2014-06-01

    The OpenCL API allows for the abstract expression of parallel, heterogeneous computing, but hardware implementations have substantial implementation differences. The abstractions provided by the OpenCL API are often insufficiently high-level to conceal differences in hardware architecture. Additionally, implementations often do not take advantage of potential performance gains from certain features due to hardware limitations and other factors. These factors make it challenging to produce code that is portable in practice, resulting in much OpenCL code being duplicated for each hardware platform being targeted. This duplication of effort offsets the principal advantage of OpenCL: portability. The use of certain coding practices can mitigate this problem, allowing a common code base to be adapted to perform well across a wide range of hardware platforms. To this end, we explore some general practices for producing performant code that are effective across platforms. Additionally, we explore some ways of modularizing code to enable optional optimizations that take advantage of hardware-specific characteristics. The minimum requirement for portability implies avoiding the use of OpenCL features that are optional, not widely implemented, poorly implemented, or missing in major implementations. Exposing multiple levels of parallelism allows hardware to take advantage of the types of parallelism it supports, from the task level down to explicit vector operations. Static optimizations and branch elimination in device code help the platform compiler to effectively optimize programs. Modularization of some code is important to allow operations to be chosen for performance on target hardware. Optional subroutines exploiting explicit memory locality allow for different memory hierarchies to be exploited for maximum performance. The C preprocessor and JIT compilation using the OpenCL runtime can be used to enable some of these techniques, as well as to factor in hardware
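
    The closing remark about the C preprocessor and JIT compilation can be illustrated with a short, hedged host-code sketch in Python (using pyopencl; the kernel, the USE_FMA define and the device check are illustrative assumptions, not taken from the paper): the same kernel source is specialized per device at build time instead of being duplicated per platform.

        import numpy as np
        import pyopencl as cl

        KERNEL_SRC = """
        __kernel void scale(__global float *x, const float a) {
            int i = get_global_id(0);
        #ifdef USE_FMA
            x[i] = fma(a, x[i], 0.0f);   /* variant for devices where fma is preferred */
        #else
            x[i] = a * x[i];             /* portable fallback */
        #endif
        }
        """

        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)
        dev = ctx.devices[0]

        # Pick the build option from a (stand-in) capability check; the OpenCL
        # runtime JIT-compiles only the selected variant of the kernel.
        opts = ["-DUSE_FMA"] if dev.type == cl.device_type.GPU else []
        prog = cl.Program(ctx, KERNEL_SRC).build(options=opts)

        x = np.arange(8, dtype=np.float32)
        buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR, hostbuf=x)
        prog.scale(queue, x.shape, None, buf, np.float32(2.0))
        cl.enqueue_copy(queue, x, buf)
        print(x)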

  17. pix2code: Generating Code from a Graphical User Interface Screenshot

    OpenAIRE

    Beltramelli, Tony

    2017-01-01

    Transforming a graphical user interface screenshot created by a designer into computer code is a typical task conducted by a developer in order to build customized software, websites, and mobile applications. In this paper, we show that deep learning methods can be leveraged to train a model end-to-end to automatically generate code from a single input image with over 77% of accuracy for three different platforms (i.e. iOS, Android and web-based technologies).

  18. The Creative Platform

    DEFF Research Database (Denmark)

    Byrge, Christian; Hansen, Søren

    whether you consider third-grade teaching, human-resource development, or radical new thinking in product development in a company. The Creative Platform was developed at Aalborg University through a series of research-and-development activities in collaboration with educational institutions and private...

  19. Creative Platform Learning (CPL)

    DEFF Research Database (Denmark)

    Christensen, Jonna Langeland; Hansen, Søren

    Creative Platform Learning (CPL) is a pedagogical method that produces enterprising and innovative pupils who can use their creativity to learn new things. According to the new school reform, innovation and entrepreneurship must be made explicit in all subjects. In CPL this is an integrated part of the teaching...

  20. Games and Platform Decisions

    DEFF Research Database (Denmark)

    Hansen, Poul H. Kyvsgård; Mikkola, Juliana Hsuan

    2007-01-01

    is the application of on-line games in order to provide training for decision makers and in order to generate overview over the implications of platform decisions. However, games have to be placed in a context with other methods and we argue that a mixture of games, workshops, and simulations can provide improved...

  1. Shot loading platform analysis

    International Nuclear Information System (INIS)

    Norman, B.F.

    1994-01-01

    This document provides the wind/seismic analysis and evaluation for the shot loading platform. Hand calculations were used for the analysis. AISC and UBC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met

  2. CERN Neutrino Platform Hardware

    CERN Document Server

    Nelson, Kevin

    2017-01-01

    My summer research was broadly in CERN's neutrino platform hardware efforts. This project had two main components: detector assembly and data analysis work for ICARUS. Specifically, I worked on assembly for the ProtoDUNE project and monitored the safety of ICARUS as it was transported to Fermilab by analyzing the accelerometer data from its move.

  3. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...
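
    As a toy illustration of the vector setting described above (a minimal sketch over GF(2); it is not the authors' algorithm and it ignores the field-size optimization), the snippet below combines two length-L packets with random L x L coding matrices and checks decodability via the rank of the transfer matrix.

        import numpy as np

        rng = np.random.default_rng(1)
        L = 4  # packets are vectors of length L over GF(2)

        def gf2_rank(A):
            """Rank over GF(2) by Gaussian elimination; full rank means decodable."""
            A = A.copy() % 2
            r = 0
            for c in range(A.shape[1]):
                pivot = next((i for i in range(r, A.shape[0]) if A[i, c]), None)
                if pivot is None:
                    continue
                A[[r, pivot]] = A[[pivot, r]]
                for i in range(A.shape[0]):
                    if i != r and A[i, c]:
                        A[i] ^= A[r]
                r += 1
            return r

        # Two source packets; each received combination multiplies them by
        # L x L coding matrices (the vector analogue of scalar coding coefficients).
        p1, p2 = rng.integers(0, 2, L), rng.integers(0, 2, L)
        received, transfer_rows = [], []
        for _ in range(2):  # two independent combinations reach the receiver
            C1, C2 = rng.integers(0, 2, (L, L)), rng.integers(0, 2, (L, L))
            received.append((C1 @ p1 + C2 @ p2) % 2)
            transfer_rows.append(np.hstack([C1, C2]))

        # Decoding succeeds iff the 2L x 2L transfer matrix is invertible over GF(2).
        T = np.vstack(transfer_rows)
        print("combination:", received[0], "decodable:", gf2_rank(T) == 2 * L)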

  4. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: ► We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. ► We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. ► We find and classify all 2D homological stabilizer codes. ► We find optimal codes among the homological stabilizer codes.

  5. Design Patterns for Sparse-Matrix Computations on Hybrid CPU/GPU Platforms

    Directory of Open Access Journals (Sweden)

    Valeria Cardellini

    2014-01-01

    Full Text Available We apply object-oriented software design patterns to develop code for scientific software involving sparse matrices. Design patterns arise when multiple independent developments produce similar designs which converge onto a generic solution. We demonstrate how to use design patterns to implement an interface for sparse matrix computations on NVIDIA GPUs starting from PSBLAS, an existing sparse matrix library, and from existing sets of GPU kernels for sparse matrices. We also compare the throughput of the PSBLAS sparse matrix–vector multiplication on two platforms exploiting the GPU with that obtained by a CPU-only PSBLAS implementation. Our experiments exhibit encouraging results regarding the comparison between CPU and GPU executions in double precision, obtaining a speedup of up to 35.35 on NVIDIA GTX 285 with respect to AMD Athlon 7750, and up to 10.15 on NVIDIA Tesla C2050 with respect to Intel Xeon X5650.
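
    In the same spirit (though as a much-simplified Python sketch rather than the paper's PSBLAS/CUDA setting; the GPU path via cupyx is an assumption and is optional), the pattern below hides the choice of backend behind a single sparse matrix-vector product interface.

        import numpy as np
        import scipy.sparse as sp

        class SpMV:
            """Interface: callers invoke multiply() without knowing which
            backend (CPU or GPU) actually stores the sparse matrix."""
            def multiply(self, x):
                raise NotImplementedError

        class CpuCsrSpMV(SpMV):
            def __init__(self, A):
                self.A = sp.csr_matrix(A)            # CPU storage (SciPy CSR)
            def multiply(self, x):
                return self.A @ x

        class GpuCsrSpMV(SpMV):
            def __init__(self, A):
                import cupy as cp                    # only present on CUDA machines
                import cupyx.scipy.sparse as cusp
                self.cp = cp
                self.A = cusp.csr_matrix(sp.csr_matrix(A))
            def multiply(self, x):
                return self.cp.asnumpy(self.A @ self.cp.asarray(x))

        def make_spmv(A, prefer_gpu=False):
            if prefer_gpu:
                try:
                    return GpuCsrSpMV(A)
                except ImportError:
                    pass                             # fall back transparently to the CPU
            return CpuCsrSpMV(A)

        A = sp.random(1000, 1000, density=0.01, format="csr", random_state=0)
        x = np.ones(1000)
        print(make_spmv(A).multiply(x)[:5])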

  6. REVEAL: Software Documentation and Platform Migration

    Science.gov (United States)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  7. Low Computational Complexity Network Coding For Mobile Networks

    DEFF Research Database (Denmark)

    Heide, Janus

    2012-01-01

    Network Coding (NC) is a technique that can provide benefits in many types of networks, some examples from wireless networks are: In relay networks, either the physical or the data link layer, to reduce the number of transmissions. In reliable multicast, to reduce the amount of signaling and enable......-flow coding technique. One of the key challenges of this technique is its inherent computational complexity which can lead to high computational load and energy consumption in particular on the mobile platforms that are the target platform in this work. To increase the coding throughput several...

  8. Concatenated quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E.; Laflamme, R.

    1996-07-01

    One main problem for the future of practical quantum computing is to stabilize the computation against unwanted interactions with the environment and imperfections in the applied operations. Existing proposals for quantum memories and quantum channels require gates with asymptotically zero error to store or transmit an input quantum state for arbitrarily long times or distances with fixed error. This report gives a method which has the property that to store or transmit a qubit with maximum error ε requires gates with errors at most cε and storage or channel elements with error at most ε, independent of how long we wish to store the state or how far we wish to transmit it. The method relies on using concatenated quantum codes and hierarchically implemented recovery operations. The overhead of the method is polynomial in the time of storage or the distance of the transmission. Rigorous and heuristic lower bounds for the constant c are given.

  9. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis forms an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising the high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category codes such as those used for harmonic analysis, mechanistic fuel performance codes need not require the parallelisation of individual modules of the codes. The second category of codes such as conventional FEM codes require parallelisation of individual modules. In this category, parallelisation of equation solution module poses major difficulties. Different solution schemes such as domain decomposition method (DDM), parallel active column solver and substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  10. Mobile Platforms and Development Environments

    CERN Document Server

    Helal, Sumi; Li, Wengdong

    2012-01-01

    Mobile platform development has lately become a technological war zone with extremely dynamic and fluid movement, especially in the smart phone and tablet market space. This Synthesis lecture is a guide to the latest developments of the key mobile platforms that are shaping the mobile platform industry. The book covers the three currently dominant native platforms -- iOS, Android and Windows Phone -- along with the device-agnostic HTML5 mobile web platform. The lecture also covers location-based services (LBS) which can be considered as a platform in its own right. The lecture utilizes a sampl

  11. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  12. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  13. Are Independent Fiscal Institutions Really Independent?

    Directory of Open Access Journals (Sweden)

    Slawomir Franek

    2015-08-01

    Full Text Available In the last decade the number of independent fiscal institutions (known also as fiscal councils) has tripled. They play an important oversight role over fiscal policy-making in democratic societies, especially as they seek to restore public finance stability in the wake of the recent financial crisis. Although common functions of such institutions include a role in analysis of fiscal policy, forecasting, monitoring compliance with fiscal rules or costing of spending proposals, their roles, resources and structures vary considerably across countries. The aim of the article is to determine the degree of independence of such institutions based on the analysis of the independence index of independent fiscal institutions. The analysis of these index values may be useful to determine the relations between the degree of independence of fiscal councils and fiscal performance of particular countries. The data used to calculate the index values will be derived from the European Commission and the IMF, which collect sets of information about characteristics of activity of fiscal councils.

  14. Providing Device Independence to Mobile Services

    OpenAIRE

    Nylander, Stina; Bylund, Markus

    2002-01-01

    People want user interfaces to services that are functional and well suited to the device they choose for access. To provide this, services must be able to offer device-specific user interfaces for the wide range of devices available today. We propose to combine the two dominant approaches to platform independence, "Write Once, Run Everywhere™" and "different version for each device", to create multiple device-specific user interfaces for mobile services. This gives possibilities to minimize...

  15. CONTAIN independent peer review

    Energy Technology Data Exchange (ETDEWEB)

    Boyack, B.E. [Los Alamos National Lab., NM (United States); Corradini, M.L. [Univ. of Wisconsin, Madison, WI (United States). Nuclear Engineering Dept.; Denning, R.S. [Battelle Memorial Inst., Columbus, OH (United States); Khatib-Rahbar, M. [Energy Research Inc., Rockville, MD (United States); Loyalka, S.K. [Univ. of Missouri, Columbia, MO (United States); Smith, P.N. [AEA Technology, Dorchester (United Kingdom). Winfrith Technology Center

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code's targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft "Revised Severe Accident Code Strategy" that incorporated revised design objectives and targeted applications for the CONTAIN code. The committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications. However, the revised CONTAIN design objectives and targeted applications were considered by the Committee in assigning priorities to the Committee's recommendations. The Committee determined some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

  16. Analysis of the development of cross-platform mobile applications

    OpenAIRE

    Pinedo Escribano, Diego

    2012-01-01

    The development of mobile phone applications is a huge market nowadays. There are many companies investing a lot of money to develop successful and profitable applications. The problem emerges when trying to develop an application to be used by every user independently of the platform they are using (Android, iOS, BlackBerry OS, Windows Phone, etc.). For this reason, in recent years many different technologies have appeared that make the development of cross-platform applications easier. In...

  17. Platform Performance and Challenges - using Platforms in Lego Company

    DEFF Research Database (Denmark)

    Munk, Lone; Mortensen, Niels Henrik

    2009-01-01

    needs focus on the incentive of using the platform. This problem lacks attention in literature, as well as industry, where assessment criteria do not cover this aspect. Therefore, we recommend including user incentive in platform assessment criteria to these challenges. Concrete solution elements...... ensuring user incentive in platforms is an object for future research...

  18. Determination of current loads of floating platform for special purposes

    Science.gov (United States)

    Ma, Guang-ying; Yao, Yun-long; Zhao, Chen-yao

    2017-08-01

    This article studies a new floating offshore platform for special purposes, which is assembled from standard floating modules. The environmental load calculation of the platform is an important part of ocean platform research, to which engineers have always paid attention. In addition to wave loads, wind loads and current loads are also important environmental factors that affect the dynamic response of an offshore platform, and the current loads on the bottom structure should not be ignored. In this paper the hydrostatic conditions and external current loads of the platform were calculated with the Fluent software. The current force coefficient, which is independent of the current velocity, can be fitted from the computed current loads and used for the subsequent hydrodynamic and mooring analyses.
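
    A brief, hedged sketch of how such a velocity-independent coefficient can be fitted (the speeds, loads, density and reference area below are made-up numbers, and the quadratic drag relation is a standard assumption rather than the paper's exact formulation):

        import numpy as np

        # Hypothetical CFD results: current speed v (m/s) and computed load F (N).
        v = np.array([0.5, 1.0, 1.5, 2.0])
        F = np.array([7.7e3, 3.1e4, 6.9e4, 1.23e5])

        rho, A = 1025.0, 60.0  # assumed seawater density (kg/m^3) and reference area (m^2)

        # With F = 0.5 * rho * Cd * A * v**2, the current force coefficient Cd is
        # independent of velocity, so a least-squares fit of F against v**2 gives a
        # single Cd that can be reused in hydrodynamic and mooring analyses.
        design = (0.5 * rho * A * v**2).reshape(-1, 1)
        Cd = np.linalg.lstsq(design, F, rcond=None)[0][0]
        print("fitted current force coefficient Cd = %.2f" % Cd)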

  19. Development of the DTNTES code

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Morales Dorado, M.D.; Alonso Santos, A.

    1987-01-01

    The DTNTES code has been developed in the Department of Nuclear Technology of the Polytechnical University in Madrid as a part of the Research Program on Quantitative Risk Analysis. DTNTES code calculates several time-dependent probabilistic characteristics of basic events, minimal cut sets and the top event of a fault tree. The code assumes that basic events are statistically independent, and they have failure and repair distributions. It computes the minimal cut upper bound approach for the top event unavailability, and the time-dependent unreliability of the top event by means of different methods, selected by the user. These methods are: expected number of system failures, failure rate, Barlow-Proschan bound, steady-state upper bound, and T* method. (author)
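
    For reference, the minimal cut upper bound named above is the standard approximation shown in the sketch below (the fault tree data is hypothetical; only the formula follows the abstract, which assumes statistically independent basic events):

        from math import prod

        def minimal_cut_upper_bound(cut_sets, q):
            """Minimal cut upper bound on top-event unavailability:
            Q_top <= 1 - prod_k (1 - prod_{i in MCS_k} q_i)."""
            return 1.0 - prod(1.0 - prod(q[i] for i in cut) for cut in cut_sets)

        # Hypothetical basic-event unavailabilities and minimal cut sets.
        q = {"A": 1e-3, "B": 2e-3, "C": 5e-4}
        cut_sets = [("A", "B"), ("C",)]
        print(minimal_cut_upper_bound(cut_sets, q))  # ~5.02e-4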

  20. Available: motorised platform

    CERN Multimedia

    The COMPASS collaboration

    2014-01-01

    The COMPASS collaboration would like to offer to a new owner the following useful and fully operational piece of equipment, which is due to be replaced with better adapted equipment. Please contact Erwin Bielert (erwin.bielert@cern.ch or 160539) for further information. Motorized platform (FOR FREE): Fabricated by ACL (Alfredo Cardoso & Cia Ltd) in Portugal. The model number is MeXs 5-30. Specifications: 5 m wide, 1 m deep, adjustable height (1.5 m if folded). Maximum working floor height: 4 m. Conforms to CERN regulations, number LV158. Type LD500, capacity 500 kg and weight 2000 kg. If no interested party is found before December 2014, the platform will be thrown away.

  1. RemoteLabs Platform

    Directory of Open Access Journals (Sweden)

    Nils Crabeel

    2012-03-01

    Full Text Available This paper reports on a first step towards the implementation of a framework for remote experimentation of electric machines – the RemoteLabs platform. This project was focused on the development of two main modules: the user Web-based interface and the electric machines interface. The Web application provides the user with a front-end and interacts with the back-end – the user and experiment persistent data. The electric machines interface is implemented as a distributed client-server application where the clients, launched by the Web application, interact with the server modules located in platforms physically connected to the electric machine drives. Users can register and authenticate, schedule, specify and run experiments and obtain results in the form of CSV, XML and PDF files. These functionalities were successfully tested with real data, but still without including the electric machines. This inclusion is part of another project scheduled to start soon.

  2. Common tester platform concept.

    Energy Technology Data Exchange (ETDEWEB)

    Hurst, Michael James

    2008-05-01

    This report summarizes the results of a case study on the doctrine of a common tester platform, a concept of a standardized platform that can be applicable across the broad spectrum of testing requirements throughout the various stages of a weapons program, as well as across the various weapons programs. The common tester concept strives to define an affordable, next-generation design that will meet testing requirements with the flexibility to grow and expand; supporting the initial development stages of a weapons program through to the final production and surveillance stages. This report discusses a concept investing key leveraging technologies and operational concepts combined with prototype tester-development experiences and practical lessons learned gleaned from past weapons programs.

  3. Online stock trading platform

    Directory of Open Access Journals (Sweden)

    Ion LUNGU

    2006-01-01

    Full Text Available The Internet is the perfect tool that can assure the market’s transparency for any user who wants to trade on the stock market. The investor can have access to the market news, financial calendar or the press releases of the issuers. A good online trading platform also provides real-time intraday quotes, trading history and technical analysis giving the investor a clearer view of the supply and demand in the market. All this information provides the investor a good image of the market and encourages him to trade. This paper wishes to draft the pieces of an online trading platform and to analyze the impact of developing and implementing one in a brokerage firm.

  4. HPC - Platforms Penta Chart

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, Angelina Michelle [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-08

    Strategy, Planning, Acquiring – very large scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by 3 years. Procurement can take another year or more. Integration – after acquisition, machines must be integrated into the computing environments at LANL, including connection to scalable storage via large scale storage networking and assurance of correct and secure operations. Management and Utilization – ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale is required.

  5. Cloud Robotics Platforms

    Directory of Open Access Journals (Sweden)

    Busra Koken

    2015-01-01

    Full Text Available Cloud robotics is a rapidly evolving field that allows robots to offload computation-intensive and storage-intensive jobs into the cloud. Robots are limited in terms of computational capacity, memory and storage. The cloud provides virtually unlimited computation power, memory and storage, and especially opportunities for collaboration. Cloud-enabled robots are divided into two categories: standalone and networked robots. This article surveys cloud robotic platforms and standalone and networked robotic works such as grasping, simultaneous localization and mapping (SLAM) and monitoring.

  6. RIPE [robot independent programming environment]: A robot independent programming environment

    International Nuclear Information System (INIS)

    Miller, D.J.; Lennox, R.C.

    1990-01-01

    Remote manual operations in radiation environments are typically performed very slowly. Sensor-based computer-controlled robots hold great promise for increasing the speed and safety of remote operations; however, the programming of robotic systems has proven to be expensive and difficult. Generalized approaches to robot programming that reuse available software modules and employ programming languages which are independent of the specific robotic and sensory devices being used are needed to speed software development and increase overall system reliability. This paper discusses the robot independent programming environment (RIPE) developed at Sandia National Laboratories (SNL). The RIPE is an object-oriented approach to robot system architectures; it is a software environment that facilitates rapid design and implementation of complex robot systems for diverse applications. An architecture based on hierarchies of distributed multiprocessors provides the computing platform for a layered programming structure that models applications using software objects. These objects are designed to support model-based automated programming of robotic and machining devices, real-time sensor-based control, error handling, and robust communication

  7. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...
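
    As a flavour of the scripting style the book describes, the snippet below computes simple descriptive statistics in plain Python. It is an illustrative sketch only and deliberately avoids the book's own libraries, so it runs unchanged on CPython or on Jython atop the Java platform; the data values and helper names are invented for the example.

```python
# Minimal descriptive-statistics sketch in plain Python (runs on CPython or Jython).
# Data values and helper names are illustrative, not taken from the book.
def mean(xs):
    return sum(xs) / float(len(xs))

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / float(len(xs) - 1)  # sample variance

samples = [4.1, 3.9, 5.2, 4.8, 4.4]
print("mean     = %.3f" % mean(samples))
print("variance = %.3f" % variance(samples))
```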

  8. The Prodiguer Messaging Platform

    Science.gov (United States)

    Denvil, S.; Greenslade, M. A.; Carenton, N.; Levavasseur, G.; Raciazek, J.

    2015-12-01

    CONVERGENCE is a French multi-partner national project designed to gather HPC and informatics expertise to innovate in the context of running French global climate models with differing grids and at differing resolutions. Efficient and reliable execution of these models and the management and dissemination of model output are some of the complexities that CONVERGENCE aims to resolve. At any one moment in time, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group are running hundreds of global climate simulations. These simulations execute upon a heterogeneous set of French High Performance Computing (HPC) environments. The IPSL's simulation execution runtime libIGCM (library for IPSL Global Climate Modeling group) has recently been enhanced so as to support hitherto impossible realtime use cases such as simulation monitoring, data publication, metrics collection, simulation control, visualizations, etc. At the core of this enhancement is Prodiguer: an AMQP (Advanced Message Queue Protocol) based event-driven asynchronous distributed messaging platform. libIGCM now dispatches copious amounts of information, in the form of messages, to the platform for remote processing by Prodiguer software agents at IPSL servers in Paris. Such processing takes several forms: persisting message content to database(s); launching rollback jobs upon simulation failure; notifying downstream applications; automation of visualization pipelines. We will describe and/or demonstrate the platform's technical implementation, inherent ease of scalability, inherent adaptiveness in respect to supervising simulations, and web portal receiving simulation notifications in realtime.
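
    The abstract does not spell out Prodiguer's message schema or broker topology, but the basic AMQP publish step such a platform relies on can be sketched with the widely used pika client; the queue name and message fields below are placeholders, not the project's actual conventions.

```python
import json
import pika  # AMQP 0-9-1 client; assumes a broker such as RabbitMQ is reachable on localhost

# Hypothetical message describing a simulation event; field names are illustrative only.
message = {"simulation": "IPSL-CM-histo-001", "event": "monitoring", "state": "running"}

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="simulation-events", durable=True)  # placeholder queue name
channel.basic_publish(exchange="",
                      routing_key="simulation-events",
                      body=json.dumps(message))
connection.close()
```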

  9. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...
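
    A toy illustration of the vector coding operation described above, restricted to GF(2) for brevity (the paper treats general finite fields): an intermediate node multiplies each incoming length-L packet by an L x L coding matrix and sums the results. The packet length and random matrices are arbitrary choices for the example.

```python
import numpy as np

L = 4  # packet (vector) length, chosen arbitrarily for the example

def combine(packets, matrices):
    """Combine incoming packet vectors with L x L coding matrices over GF(2)."""
    out = np.zeros(L, dtype=np.uint8)
    for p, A in zip(packets, matrices):
        out = (out + A.dot(p)) % 2  # matrix-vector product and addition modulo 2
    return out

rng = np.random.default_rng(0)
p1 = rng.integers(0, 2, L, dtype=np.uint8)
p2 = rng.integers(0, 2, L, dtype=np.uint8)
A1 = rng.integers(0, 2, (L, L), dtype=np.uint8)
A2 = rng.integers(0, 2, (L, L), dtype=np.uint8)
print(combine([p1, p2], [A1, A2]))  # outgoing coded packet
```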

  10. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  11. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  12. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both...... the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding....
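
    A minimal sketch of the LT-style encoding step the paper builds on: draw a degree from a distribution, then XOR that many randomly chosen source blocks. The degree distribution and source data below are illustrative only and are not the feedback-adapted distributions designed in the paper.

```python
import random

def lt_encode(source_blocks, degree_dist, rng=random.Random(42)):
    """Generate one LT-coded block: sample a degree, XOR that many random source blocks.
    degree_dist is a list of (degree, probability) pairs (an illustrative stand-in)."""
    r, acc = rng.random(), 0.0
    degree = degree_dist[-1][0]
    for d, p in degree_dist:          # inverse-CDF sampling of the degree
        acc += p
        if r <= acc:
            degree = d
            break
    chosen = rng.sample(range(len(source_blocks)), degree)
    block = 0
    for i in chosen:
        block ^= source_blocks[i]     # XOR the selected source blocks together
    return chosen, block

source = [0x12, 0x34, 0x56, 0x78, 0x9A]          # toy source blocks (single bytes)
dist = [(1, 0.1), (2, 0.5), (3, 0.3), (4, 0.1)]  # illustrative degree distribution
print(lt_encode(source, dist))
```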

  13. Central Bank independence

    Directory of Open Access Journals (Sweden)

    Vasile DEDU

    2012-08-01

    Full Text Available In this paper we present the key aspects regarding central bank’s independence. Most economists consider that the factor which positively influences the efficiency of monetary policy measures is the high independence of the central bank. We determined that the National Bank of Romania (NBR has a high degree of independence. NBR has both goal and instrument independence. We also consider that the hike of NBR’s independence played an important role in the significant disinflation process, as headline inflation dropped inside the targeted band of 3% ± 1 percentage point recently.

  14. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  15. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  16. Organizing Independent Student Work

    Directory of Open Access Journals (Sweden)

    Zhadyra T. Zhumasheva

    2015-03-01

    Full Text Available This article addresses issues in organizing independent student work. The author defines the term “independence”, discusses the concepts of independent learner work and independent learner work under the guidance of an instructor, proposes a classification of assignments to be done independently, and provides methodological recommendations as to the organization of independent student work. The article discusses the need for turning the student from a passive consumer of knowledge into an active creator of it, capable of formulating a problem, analyzing the ways of solving it, coming up with an optimum outcome, and proving its correctness. The preparation of highly qualified human resources is the primary condition for boosting Kazakhstan’s competitiveness. Independent student work is a means of fostering the professional competence of future specialists. The primary form of self-education is independent work.

  17. Utilizing platforms in industrialized construction

    DEFF Research Database (Denmark)

    Bonev, Martin; Wörösch, Michael; Hvam, Lars

    2015-01-01

    platform strategies, this research highlights key aspects of adapting platform-based development theory to industrialised construction. Building projects use different layers of product, process and logistics platforms to form the right cost–value ratio for the target market application, while modelling...

  18. CONTAIN independent peer review

    International Nuclear Information System (INIS)

    Boyack, B.E.; Corradini, M.L.; Khatib-Rahbar, M.; Loyalka, S.K.; Smith, P.N.

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code's targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft ''Revised Severe Accident Code Strategy'' that incorporated revised design objectives and targeted applications for the CONTAIN code. The Committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications; however, the revised CONTAIN design objectives and targeted applications were considered by the Committee in assigning priorities to the Committee's recommendations. The Committee determined some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment

  19. Advances in the development of the Mexican platform for analysis and design of nuclear reactors: AZTLAN Platform

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Puente E, F.; Del Valle G, E.; Francois L, J. L.; Espinosa P, G.

    2017-09-01

    The AZTLAN platform project: development of a Mexican platform for the analysis and design of nuclear reactors, financed by the SENER-CONACYT Energy Sustainability Fund, was approved in early 2014 and formally began at the end of that year. It is a national project led by the Instituto Nacional de Investigaciones Nucleares (ININ) and with the collaboration of Instituto Politecnico Nacional (IPN), the Universidad Autonoma Metropolitana (UAM) and Universidad Nacional Autonoma de Mexico (UNAM) as part of the development team and with the participation of the Laguna Verde Nuclear Power Plant, the National Commission of Nuclear Safety and Safeguards, the Ministry of Energy and the Karlsruhe Institute of Technology (KIT, Germany) as part of the user group. The general objective of the project is to modernize, improve and integrate the neutronic, thermo-hydraulic and thermo-mechanical codes, developed in Mexican institutions, in an integrated platform, developed and maintained by Mexican experts for the benefit of Mexican institutions. Two years into the process, important steps have been taken that have consolidated the platform. The main results of these first two years have been presented in different national and international forums. In this congress, some of the most recent results that have been implemented in the platform codes are shown in more detail. The current status of the platform from a more executive viewpoint is summarized in this paper. (Author)

  20. Motion-adaptive intraframe transform coding of video signals

    NARCIS (Netherlands)

    With, de P.H.N.

    1989-01-01

    Spatial transform coding has been widely applied for image compression because of its high coding efficiency. However, in many intraframe systems, in which every TV frame is independently processed, coding of moving objects in the case of interlaced input signals is not addressed. In this paper, we

  1. NORTICA - a new code for cyclotron analysis

    International Nuclear Information System (INIS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-01-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station running the UNIX operating system with an X-Windows graphic interface. A multiple programming language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capability and features of the codes in their present state

  2. Simulator platform for fast reactor operation and safety technology demonstration

    International Nuclear Information System (INIS)

    Vilim, R.B.; Park, Y.S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J.

    2012-01-01

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  3. Simulator platform for fast reactor operation and safety technology demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, R. B.; Park, Y. S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J. (Nuclear Engineering Division)

    2012-07-30

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  4. Integration of the program TNXYZ in the platform SALOME

    International Nuclear Information System (INIS)

    Chaparro V, F. J.; Silva A, L.; Del Valle G, E.; Gomez T, A. M.; Vargas E, S.

    2013-10-01

    This work presents the procedure used to integrate the code TNXYZ as a processing tool into the graphic simulation platform SALOME. The code TNXYZ solves the neutron transport equation in steady state, for several energy groups, discretizing the angular variable by the discrete ordinates method and the spatial variable by nodal methods. The platform SALOME is a graphic environment designed for the construction, editing and simulation of mechanical models aimed at industry and, unlike other software, it allows external source codes to be integrated into the environment to form a complete scheme of execution, supervision, and pre- and post-processing of information. The code TNXYZ was programmed in the 1990s in a Fortran compiler, but to be usable today it had to be updated to the characteristics of current compilers; in addition, a modularization process was carried out on the original scheme, that is, the main program was divided into sections where the code carries out important operations, with the intention of making the data extraction process more flexible along its processing sequence, which can be useful in a later development of coupling. Finally, to verify the integration, a BWR fuel assembly was modeled, as well as a control cell. The cross sections were obtained with the Serpent Monte Carlo code. Some results obtained with Serpent were used to verify and begin the validation of the code, with an acceptable agreement obtained in the infinite multiplication factor. The validation process will be extended and is planned to be presented in a future work. This work is part of the development of the research group formed between the Escuela Superior de Fisica y Matematicas del Instituto Politecnico Nacional (IPN) and the Instituto Nacional de Investigaciones Nucleares (ININ), in which a Mexican simulation platform for nuclear reactors is being developed. (Author)

  5. Novel Biochip Platform for Nucleic Acid Analysis

    Directory of Open Access Journals (Sweden)

    Juan J. Diaz-Mochon

    2012-06-01

    Full Text Available This manuscript describes the use of a novel biochip platform for the rapid analysis/identification of nucleic acids, including DNA and microRNAs, with very high specificity. This approach combines a unique dynamic chemistry approach for nucleic acid testing and analysis developed by DestiNA Genomics with the STMicroelectronics In-Check platform, which comprises two microfluidic optimized and independent PCR reaction chambers, and a sequential microarray area for nucleic acid capture and identification by fluorescence. With its compact bench-top “footprint” requiring only a single technician to operate, the biochip system promises to transform and expand routine clinical diagnostic testing and screening for genetic diseases, cancers, drug toxicology and heart disease, as well as employment in the emerging companion diagnostics market.

  6. Code Generation from Pragmatics Annotated Coloured Petri Nets

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    limited work has been done on transforming CPN model to protocol implementations. The goal of the thesis is to be able to automatically generate high-quality implementations of communication protocols based on CPN models. In this thesis, we develop a methodology for generating implementations of protocols...... third party libraries and the code should be easily usable by third party code. Finally, the code should be readable by developers with expertise on the considered platforms. In this thesis, we show that our code generation approach is able to generate code for a wide range of platforms without altering...... such as games and rich web applications. Finally, we conclude the evaluation of the criteria of our approach by using the WebSocket PA-CPN model to show that we are able to verify fairly large protocols....

  7. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  8. Massively parallel Fokker-Planck code ALLAp

    International Nuclear Information System (INIS)

    Batishcheva, A.A.; Krasheninnikov, S.I.; Craddock, G.G.; Djordjevic, V.

    1996-01-01

    The Fokker-Planck code ALLA, recently developed for workstations, simulates the temporal evolution of 1V, 2V and 1D2V collisional edge plasmas. In this work we present the results of code parallelization on the CRI T3D massively parallel platform (the ALLAp version). Simultaneously we benchmark the 1D2V parallel version against an analytic self-similar solution of the collisional kinetic equation. This test is not trivial as it demands a very strong spatial temperature and density variation within the simulation domain. (orig.)

  9. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64 bit on Mac, Linux and Windows.

  10. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Among the earliest discovered codes that approach the Shannon limit of the channel were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding.
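
    The role of the parity check matrix can be illustrated with a small example (not taken from the article): a received word is a valid codeword exactly when its syndrome H·c is zero over GF(2).

```python
import numpy as np

# A small, illustrative parity check matrix H (not from the article, and not a
# practically useful LDPC code -- it only demonstrates the syndrome computation).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]], dtype=np.uint8)

def syndrome(H, word):
    """A word is a valid codeword iff every parity check is satisfied: H.dot(word) = 0 mod 2."""
    return H.dot(word) % 2

c = np.array([1, 0, 1, 1, 1, 0], dtype=np.uint8)
print(syndrome(H, c))  # all zeros -> c satisfies every parity check of this H
```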

  11. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made

  12. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  13. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  14. FUZZY CONTROLLER FOR THE CONTROL OF THE MOBILE PLATFORM OF THE CORBYS ROBOTIC GAIT REHABILITATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Maria Kyrarini

    2014-12-01

    Full Text Available In this paper, an inverse kinematics based control algorithm for the joystick control of the mobile platform of the novel mobile robot-assisted gait rehabilitation system CORBYS is presented. The mobile platform has four independently steered and driven wheels. Given the linear and angular velocities of the mobile platform, the inverse kinematics algorithm gives as its output the steering angle and the driving angular velocity of each of the four wheels. The paper is focused on the steering control of the platform for which a fuzzy logic controller is developed and implemented. The experimental results of the real-world steering of the platform are presented in the paper.
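
    The inverse-kinematics mapping described above can be sketched as follows; the wheel mounting positions, wheel radius and the simple rigid-body model are assumptions for illustration, and the fuzzy steering controller itself is not reproduced here.

```python
import math

# Illustrative wheel positions (x, y) in metres relative to the platform centre;
# the real CORBYS geometry is not reproduced -- this only shows the inverse-kinematics step.
WHEELS = {"front_left": (0.4, 0.3), "front_right": (0.4, -0.3),
          "rear_left": (-0.4, 0.3), "rear_right": (-0.4, -0.3)}
WHEEL_RADIUS = 0.1  # metres (assumed)

def inverse_kinematics(vx, vy, omega):
    """Map platform linear (vx, vy) and angular (omega) velocity to a steering angle
    [rad] and a driving angular velocity [rad/s] for each independently steered wheel."""
    commands = {}
    for name, (x, y) in WHEELS.items():
        wvx = vx - omega * y            # velocity of the wheel contact point (rigid body)
        wvy = vy + omega * x
        steer = math.atan2(wvy, wvx)    # steering angle
        drive = math.hypot(wvx, wvy) / WHEEL_RADIUS  # wheel angular speed
        commands[name] = (steer, drive)
    return commands

for wheel, (angle, speed) in inverse_kinematics(0.5, 0.0, 0.2).items():
    print("%-11s steer=%6.3f rad  drive=%6.3f rad/s" % (wheel, angle, speed))
```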

  15. The CERN Neutrino Platform

    CERN Document Server

    Bordoni, Stefania

    2018-01-01

    The long-baseline neutrino programme has been classified as one of the four highest-priority scientific objectives in 2013 by the European Strategy for Particle Physics. The Neutrino Platform is the CERN venture to foster and support the next generation of accelerator-based neutrino oscillation experiments. Part of the present CERN Medium-Term Plan, the Neutrino Platform provides facilities to develop and prototype the next generation of neutrino detectors and contributes to unifying the European neutrino community towards the US and Japanese projects. A significant effort is made on R&D for LAr TPC technologies: two big LAr TPC prototypes for the DUNE far detector are under construction at CERN. Those detectors will be exposed in 2018 to an entirely new and NP-dedicated beam-line from the SPS which will provide electron, muon and hadron beams with energies in the range of sub-GeV to a few GeV. Other projects are also presently under development: one can cite the refurbishing and shipping to the US ...

  16. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system
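
    The modular input/output contract described above can be sketched schematically as shown below; the module names, parameters and scaling relations are invented placeholders, not the ETF code's actual models, and only illustrate how modules can be replaced or run alone as long as their interfaces are preserved.

```python
# Each component module exposes a fixed input/output contract (a dict in this sketch),
# so a module can be updated or exercised independently without disturbing the rest.
def tf_coil_module(params):
    """Toy stand-in for a TF coil module: fixed input dict -> fixed output dict."""
    # Magnetic pressure B^2 / (2*mu0), expressed in MPa, as a placeholder "stress" figure.
    stress = params["field_tesla"] ** 2 / (2 * 4e-7 * 3.14159) / 1e6
    return {"coil_stress_mpa": stress}

def cost_module(params):
    """Toy costing module consuming the coil module's output (invented scaling)."""
    return {"coil_cost_musd": 0.5 * params["coil_stress_mpa"]}

def run_system(tokamak_params):
    coil_out = tf_coil_module(tokamak_params)   # modules chained only through their I/O
    return cost_module(coil_out)

print(run_system({"field_tesla": 8.0}))          # or run tf_coil_module alone for a study
```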

  17. Accounting for Independent Schools.

    Science.gov (United States)

    Sonenstein, Burton

    The diversity of independent schools in size, function, and mode of operation has resulted in a considerable variety of accounting principles and practices. This lack of uniformity has tended to make understanding, evaluation, and comparison of independent schools' financial statements a difficult and sometimes impossible task. This manual has…

  18. Research and Design in Unified Coding Architecture for Smart Grids

    Directory of Open Access Journals (Sweden)

    Gang Han

    2013-09-01

    Full Text Available A standardized and shared information platform is the foundation of the Smart Grids. In order to improve the information integration of the power grid dispatching centers and achieve efficient data exchange, sharing and interoperability, a unified coding architecture is proposed. The architecture includes a coding management layer, a coding generation layer, an information models layer and an application system layer. The hierarchical design allows the whole coding architecture to adapt to different application environments, different interfaces and loosely coupled requirements, which realizes the integrated model management function of the power grids. The life cycle of the unified coding architecture and a method for evaluating its viability are also proposed, ensuring the stability and availability of the coding architecture. Finally, the future development direction of coding technology for the Smart Grids is outlined.

  19. Opportunistic Adaptive Transmission for Network Coding Using Nonbinary LDPC Codes

    Directory of Open Access Journals (Sweden)

    Cocco Giuseppe

    2010-01-01

    Full Text Available Network coding makes it possible to exploit the spatial diversity naturally present in mobile wireless networks and can be seen as an example of cooperative communication at the link layer and above. Such a promising technique needs to rely on a suitable physical layer in order to achieve its best performance. In this paper, we present an opportunistic packet scheduling method based on physical layer considerations. We extend the channel adaptation proposed for the broadcast phase of asymmetric two-way bidirectional relaying to a generic number of sinks and apply it to a network context. The method consists of adapting the information rate for each receiving node according to its channel status and independently of the other nodes. In this way, a higher network throughput can be achieved at the expense of a slightly higher complexity at the transmitter. This configuration allows rate adaptation to be performed while fully preserving the benefits of channel and network coding. We carry out an information theoretical analysis of such an approach and of that typically used in network coding. Numerical results based on nonbinary LDPC codes confirm the effectiveness of our approach with respect to previously proposed opportunistic scheduling techniques.
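
    The per-node rate adaptation idea can be sketched as follows; the discrete rate set, SNR values and the simple capacity rule are assumptions for illustration rather than the coded scheme analyzed in the paper.

```python
import math

# Illustrative per-node rate adaptation: for each sink, pick the highest rate from a
# discrete set that its instantaneous channel can support, independently of other sinks.
RATES = [0.25, 0.5, 1.0, 1.5, 2.0]          # bits per channel use (assumed rate set)

def pick_rate(snr_linear):
    capacity = math.log(1.0 + snr_linear, 2)  # Shannon capacity of the point-to-point link
    feasible = [r for r in RATES if r <= capacity]
    return max(feasible) if feasible else 0.0  # 0.0 -> node receives nothing this round

sinks = {"node_a": 3.2, "node_b": 0.4, "node_c": 9.5}   # instantaneous SNR (linear, assumed)
for node, snr in sinks.items():
    print(node, "->", pick_rate(snr), "bits/use")
```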

  20. Linking Training Course Support to Fleet Platforms: An Equipment-Based Approach.

    Science.gov (United States)

    1981-01-01

    Sample records from the report's course-to-platform cross-reference: requirement sponsor: OP-04; resource sponsor: OP-04; course title: ECONOMIC ANAL; activity address: NAVSCOLCECOFF PT HUENEME; find code: 2 - NO SPECIFIC... Resource sponsor: OP-01; course title: DD-963 MPU MAINTENANCE; activity address: COMBATSYSTECHSCOLCOM; find code: 5 - SPECIFIC PLATFORM IN CANTRAC SHIP

  1. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  2. Preparing for a Product Platform

    DEFF Research Database (Denmark)

    Fiil-Nielsen, Ole; Munk, Lone; Mortensen, Niels Henrik

    2005-01-01

    on commonalities and similarities in the product family, and variance should be based on customer demands. To relate these terms and to improve the basis on which decisions are made, we need a way of visualizing the hierarchy of the product family as well as the commonality and variance. This visualization method...... of the platform or ensuring that the platform can meet future demands will be very useful in the preparation process of a platform synthesis as well as in the updating or reengineering of an existing product development platform.......Experience in the industry as well as recent related scientific publications show the benefits of product development platforms. Companies use platforms to develop not a single but multiple products (i.e. a product family) simultaneously. When these product development projects are coordinated...

  3. Adoption of Mobile Payment Platforms

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    Numerous mobile payment solutions, which rely on new disruptive technologies, have been launched on the payment market in recent years. But despite the growing number of mobile payment apps, very few solutions have turned out to be successful, as the majority of them fail to gain a critical mass...... of users. In this paper, we investigate successful platform adoption strategies by using the Reach and Range Framework for Multi-Sided Platforms as a strategic tool to which mobile payment providers can adhere in order to tackle some of the main challenges they face throughout the evolution...... of their platforms. The analysis indicates that successful mobile payment solutions tend to be launched as one-sided platforms and then gradually be expanded into being two-sided. Our study showcases that the success of mobile payment platforms lies with the ability of the platform to balance the reach (number...

  4. Web Platform Application

    Energy Technology Data Exchange (ETDEWEB)

    Paulsworth, Ashley [Sunvestment Group, Frederick, MD (United States); Kurtz, Jim [Sunvestment Group, Frederick, MD (United States); Brun de Pontet, Stephanie [Sunvestment Group, Frederick, MD (United States)

    2016-06-15

    Sunvestment Energy Group (previously called Sunvestment Group) was established to create a web application that brings together site hosts, those who will obtain the energy from the solar array, with project developers and funders, including affinity investors. Sunvestment Energy Group (SEG) uses a community-based model that engages with investors who have some affinity with the site host organization. In addition to a financial return, these investors receive non-financial value from their investments and are therefore willing to offer lower cost capital. This enables the site host to enjoy more savings from solar through these less expensive Community Power Purchase Agreements (CPPAs). The purpose of this award was to develop an online platform to bring site hosts and investors together virtually.

  5. Mobile4D platform

    CSIR Research Space (South Africa)

    Botha, Adèle

    2010-05-01

    Full Text Available and share their own Internet/Telco service mashups. SPICE (Service Platform for Innovative Communication Environment) was also a European Union Sixth Framework Programme (FP6) funded project, which formed a consortium consisting... The remainder of the extract is a comparison matrix of candidate platforms (OPUCE, SPICE, Twisted, Mobicents) against criteria including minimal total life cycle cost, standards compliant solution, bearer and device agnosticism, and ease of use and accessibility; the individual ratings did not survive extraction.

  6. Geometric information provider platform

    Directory of Open Access Journals (Sweden)

    Meisam Yousefzadeh

    2015-07-01

    Full Text Available Renovation of existing buildings is known as an essential stage in the reduction of energy loss. A considerable part of the renovation process depends on geometric reconstruction of the building based on semantic parameters. Following many research projects which were focused on parameterizing the energy usage, various energy modelling methods were developed during the last decade. On the other hand, with the development of accurate measuring tools such as laser scanners, the interest in having accurate 3D building models is rapidly growing. But automating the generation of 3D building models from laser point clouds, or the detection of specific objects in them, is still a challenge. The goal is to design a platform through which the required geometric information can be efficiently produced to support energy simulation software. Developing a reliable procedure which extracts the required information from measured data and delivers it to a standard energy modelling system is the main purpose of the project.

  7. Energy Tracking Software Platform

    Energy Technology Data Exchange (ETDEWEB)

    Ryan Davis; Nathan Bird; Rebecca Birx; Hal Knowles

    2011-04-04

    Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And, social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.

  8. Sinking offshore platform. Nedsenkbar fralandsplatform

    Energy Technology Data Exchange (ETDEWEB)

    Einstabland, T.B.; Olsen, O.

    1988-12-19

    The invention deals with a sinking offshore platform of the gravitational type, designed to be installed on the sea bed at great depths. The platform consists of at least three inclined pillars placed on a foundation unit. At the upper end the pillars are connected to a tower structure by means of a rigid construction. The tower supports the platform deck. The rigid construction comprises a centrally positioned cylinder connected to the foundation. 11 figs.

  9. Lattice QCD simulations using the OpenACC platform

    International Nuclear Information System (INIS)

    Majumdar, Pushan

    2016-01-01

    In this article we will explore the OpenACC platform for programming Graphics Processing Units (GPUs). The OpenACC platform offers a directive based programming model for GPUs which avoids the detailed data flow control and memory management necessary in a CUDA programming environment. In the OpenACC model, programs can be written in high level languages with OpenMP like directives. We present some examples of QCD simulation codes using OpenACC and discuss their performance on the Fermi and Kepler GPUs. (paper)

  10. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.

  11. Advanced Code for Photocathode Design

    Energy Technology Data Exchange (ETDEWEB)

    Ives, Robert Lawrence [Calabazas Creek Research, Inc., San Mateo, CA (United States); Jensen, Kevin [Naval Research Lab. (NRL), Washington, DC (United States); Montgomery, Eric [Univ. of Maryland, College Park, MD (United States); Bui, Thuc [Calabazas Creek Research, Inc., San Mateo, CA (United States)

    2015-12-15

    The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input using a graphical user interface. Specific platform-dependent function calls (e.g. IMSL) were removed, and Fortran77 components were rewritten for Fortran95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previously rigid and unmodifiable library structures by implementing new materials library data sets and repositioning the library values in external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom and specific emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified and parametric representations updated and extended from previous versions (e.g., Modified Fowler-Dubridge, Modified Three-Step, etc.).

  12. Product Platform Screening at LEGO

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Steen Jensen, Thomas; Nielsen, Ole Fiil

    2012-01-01

    Product platforms offer great benefits to companies developing new products in highly competitive markets. Literature describes how a single platform can be designed from a technical point of view, but rarely mentions how the process begins. How do companies identify possible platform candidates...... after a few changes had been applied to the initial process layout. This case study shows how companies must focus on a limited selection of simultaneous projects in order to keep focus. Primary stakeholders must be involved from the very beginning, and short presentations of the platform concepts...

  13. Flexible experimental FPGA based platform

    DEFF Research Database (Denmark)

    Andersen, Karsten Holm; Nymand, Morten

    2016-01-01

    This paper presents an experimental flexible Field Programmable Gate Array (FPGA) based platform for testing and verifying digital controlled dc-dc converters. The platform supports different types of control strategies, dc-dc converter topologies and switching frequencies. The controller platform...... interface supporting configuration and reading of setup parameters, controller status and the acquisition memory in a simple way. The FPGA based platform, provides an easy way within education or research to use different digital control strategies and different converter topologies controlled by an FPGA...

  14. Introducing Platform Interactions Model for Studying Multi-Sided Platforms

    DEFF Research Database (Denmark)

    Staykova, Kalina; Damsgaard, Jan

    2018-01-01

    Multi-Sided Platforms (MSPs) function as socio-technical entities that facilitate direct interactions between various affiliated to them constituencies through developing and managing IT architecture. In this paper, we aim to explain the nature of the platform interactions as key characteristic o...

  15. cMapper: gene-centric connectivity mapper for EBI-RDF platform.

    Science.gov (United States)

    Shoaib, Muhammad; Ansari, Adnan Ahmad; Ahn, Sung-Min

    2017-01-15

    In this era of biological big data, data integration has become a common task and a challenge for biologists. The Resource Description Framework (RDF) was developed to enable interoperability of heterogeneous datasets. The EBI-RDF platform enables efficient data integration of six independent biological databases using RDF technologies and shared ontologies. However, to take advantage of this platform, biologists need to be familiar with RDF technologies and the SPARQL query language. To overcome this practical limitation of the EBI-RDF platform, we developed cMapper, a web-based tool that enables biologists to search the EBI-RDF databases in a gene-centric manner without a thorough knowledge of RDF and SPARQL. cMapper allows biologists to search data entities in the EBI-RDF platform that are connected to genes or small molecules of interest in multiple biological contexts. The input to cMapper consists of a set of genes or small molecules, and the output is a set of data entities in the six independent EBI-RDF databases connected with the given genes or small molecules in the user's query. cMapper provides output to users in the form of a graph in which nodes represent data entities and the edges represent connections between data entities and the input set of genes or small molecules. Furthermore, users can apply filters based on database, taxonomy, organ and pathways in order to focus on a core connectivity graph of their interest. Data entities from multiple databases are differentiated based on background colors. cMapper also enables users to investigate shared connections between genes or small molecules of interest. Users can view the output graph on a web browser or download it in either GraphML or JSON formats. cMapper is available as a web application with an integrated MySQL database. The web application was developed using Java and deployed on a Tomcat server. We developed the user interface using HTML5, JQuery and the Cytoscape Graph API. cMapper can be accessed at

  16. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  17. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code deals with the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  18. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
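
    The kind of least-squares combination with full covariance treatment that FERRET emphasizes can be sketched as a generalized least-squares estimate; the design matrix, data values and covariance matrix below are made up for illustration and are unrelated to FERRET's input formats.

```python
import numpy as np

# Generalized least squares with a full data covariance matrix (correlated uncertainties).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])            # model: y = a + b * t (illustrative design matrix)
y = np.array([1.05, 2.90, 5.10])      # "measurements" (made up)
C = np.array([[0.04, 0.01, 0.00],     # measurement covariance with correlated errors
              [0.01, 0.04, 0.01],
              [0.00, 0.01, 0.09]])

Cinv = np.linalg.inv(C)
cov_params = np.linalg.inv(X.T @ Cinv @ X)   # parameter covariance estimate
params = cov_params @ X.T @ Cinv @ y         # GLS estimate of (a, b)
print("parameters :", params)
print("uncertainty:", np.sqrt(np.diag(cov_params)))
```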

  19. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  20. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens possibility to convey data in a unique way yet insufficient prevention and protection might lead into QR code being exploited on behalf of attackers. This thesis starts by presenting a general introduction of background and stating two problems regarding QR code security, which followed by a comprehensive research on both QR code itself and related issues. From the research a solution taking advantages of cloud and cryptography together with an implementation come af...

  1. Current status and applications of intergrated safety assessment and simulation code system for ISA

    Energy Technology Data Exchange (ETDEWEB)

    Izquierdo, J. M.; Hortal, J.; Perea, M. Sanchez; Melendez, E. [Modeling and Simulation Area (MOSI), Nuclear Safety Council (CSN), Madrid (Spain); Queral, E.; Rivas-Lewicky, J. [Energy and Fuels Department, Technical University of Madrid (UPM), Madrid (Spain)

    2017-03-15

    This paper reviews the current status of the unified approach known as integrated safety assessment (ISA), as well as the associated SCAIS (simulation codes system for ISA) computer platform. These constitute a proposal that is the result of collaborative action among the Nuclear Safety Council (CSN), the University of Madrid (UPM), and NFQ Solutions S.L., aiming to allow independent regulatory verification of industry quantitative risk assessments. The content elaborates on discussions of the classical treatment of time in conventional probabilistic safety assessment (PSA) sequences and states important conclusions that can be used to avoid systematic and unacceptable underestimation of the failure exceedance frequencies. The unified ISA method meets this challenge by coupling deterministic and probabilistic mutual influences. The feasibility of the approach is illustrated with some examples of its application to a real size plant.

  2. Diagnostic and prognostic signatures from the small non-coding RNA transcriptome in prostate cancer

    DEFF Research Database (Denmark)

    Martens-Uzunova, E S; Jalava, S E; Dits, N F

    2011-01-01

    Prostate cancer (PCa) is the most frequent male malignancy and the second most common cause of cancer-related death in Western countries. Current clinical and pathological methods are limited in the prediction of postoperative outcome. It is becoming increasingly evident that small non-coding RNA...... signatures of 102 fresh-frozen patient samples during PCa progression by miRNA microarrays. Both platforms were cross-validated by quantitative reverse transcriptase-PCR. Besides the altered expression of several miRNAs, our deep sequencing analyses revealed strong differential expression of small nucleolar...... RNAs (snoRNAs) and transfer RNAs (tRNAs). From microarray analysis, we derived a miRNA diagnostic classifier that accurately distinguishes normal from cancer samples. Furthermore, we were able to construct a PCa prognostic predictor that independently forecasts postoperative outcome. Importantly...

  3. Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform

    Science.gov (United States)

    Roche, Rigoberto J.; Shalkhauser, Mary Jo; Hickey, Joseph P.; Briones, Janette C.

    2016-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. The NASA Glenn Research Center (GRC) team made a software defined radio (SDR) platform STRS compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper, capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort serves to provide a framework toward waveform development onto an STRS compliant platform to support future space communication systems for advanced exploration missions. The use of validated STRS compliant applications provides tested code with extensive documentation to potentially reduce risk, cost and effort in development of space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto a Reconfigurable, Intelligently-Adaptive, Communication System (RIACS) SDR platform, and the test waveform and wrapper development efforts. The paper emphasizes the infusion of the STRS Architecture onto the RIACS platform for potential use in next generation flight system SDRs for advanced exploration missions.

  4. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  5. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  6. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  7. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold, E-mail: hli@radonc.wustl.edu [Department of Radiation Oncology, Washington University School of Medicine, 4921 Parkview Place, Campus Box 8224, St. Louis, Missouri 63110 (United States)

    2016-07-15

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this
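
    The methods described above rely on Woodcock (delta) tracking for voxelized transport. As a rough illustration of the general delta-tracking idea only (not the gPENELOPE/CUDA implementation; the geometry and cross sections below are made up), a particle flight can be sampled against a majorant cross section and real collisions accepted with probability sigma(x)/sigma_max:

```python
import math
import random

def woodcock_transmission(sigma, voxel_size=1.0, n_particles=20000, seed=1):
    """Toy 1-D Woodcock (delta) tracking through a voxelized slab.

    sigma: per-voxel total cross sections in 1/cm (illustrative values only).
    Returns the fraction of particles crossing the slab without a real collision.
    """
    rng = random.Random(seed)
    sigma_max = max(sigma)                      # majorant cross section
    length = voxel_size * len(sigma)
    transmitted = 0
    for _ in range(n_particles):
        x = 0.0
        while True:
            # Sample the flight distance in the fictitious homogeneous medium.
            x += -math.log(1.0 - rng.random()) / sigma_max
            if x >= length:
                transmitted += 1
                break
            voxel = int(x / voxel_size)
            # Real collision with probability sigma(x)/sigma_max, else virtual.
            if rng.random() < sigma[voxel] / sigma_max:
                break
    return transmitted / n_particles

print(woodcock_transmission(sigma=[0.2, 0.5, 0.1, 0.8]))
```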

  8. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    International Nuclear Information System (INIS)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold

    2016-01-01

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this

  9. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model.

    Science.gov (United States)

    Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold

    2016-07-01

    The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and

  10. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  11. A practice scaffolding interactive platform

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe

    2009-01-01

    A Practice Scaffolding Interactive Platform (PracSIP) is a social learning platform which supports students in collaborative project based learning by simulating a professional practice. A PracSIP puts the core tools of the simulated practice at the students' disposal, it organizes collaboration...

  12. Paper based electronics platform

    KAUST Repository

    Nassar, Joanna Mohammad

    2017-07-20

    A flexible and non-functionalized low cost paper-based electronic system platform fabricated from common paper, such as paper based sensors, and methods of producing paper based sensors, and methods of sensing using the paper based sensors are provided. A method of producing a paper based sensor can include the steps of: a) providing a conventional paper product to serve as a substrate for the sensor or as an active material for the sensor or both, the paper product not further treated or functionalized; and b) applying a sensing element to the paper substrate, the sensing element selected from the group consisting of a conductive material, the conductive material providing contacts and interconnects, sensitive material film that exhibits sensitivity to pH levels, a compressible and/or porous material disposed between a pair of opposed conductive elements, or a combination of two or more of said sensing elements. The method of sensing can further include measuring, using the sensing element, a change in resistance, a change in voltage, a change in current, a change in capacitance, or a combination of any two or more thereof.

  13. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  14. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  15. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation of stored coded information into plain language is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically, and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage requirements; and standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring-book system which can be updated by an organised (updating) service. (author)
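
    As a minimal sketch of the coding/decoding idea described above (with entirely hypothetical codes and definitions; the real NAGRADATA keys are not reproduced here), storage codes can be mapped to plain-language definitions and back:

```python
# Hypothetical code key: storage code -> plain-language definition.
CODE_KEY = {
    "LIT01": "limestone",
    "LIT02": "marl",
    "TEC05": "fault zone",
}
# Reverse key used when coding input data for storage.
REVERSE_KEY = {definition: code for code, definition in CODE_KEY.items()}

def encode(term: str) -> str:
    """Translate a plain-language term into its storage code."""
    return REVERSE_KEY[term]

def decode(code: str) -> str:
    """Translate a stored code back into plain language on retrieval."""
    return CODE_KEY[code]

print(encode("marl"))    # LIT02
print(decode("TEC05"))   # fault zone
```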

  16. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms
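
    The last sentence describes propagating input-parameter uncertainty through a fast parametric code. The sketch below shows only that generic Monte Carlo sampling pattern; the release-fraction formula and parameter distributions are placeholders, not the actual XSOR models:

```python
import random
import statistics

def release_fraction(scrubbing_factor, containment_leak_rate):
    """Placeholder parametric model; NOT the XSOR formulation."""
    return containment_leak_rate * (1.0 - scrubbing_factor)

def sample_source_term(n_samples=10000, seed=42):
    """Propagate hypothetical input uncertainty distributions to the output."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_samples):
        scrubbing = rng.uniform(0.90, 0.99)            # assumed range
        leak_rate = rng.lognormvariate(-3.0, 0.5)      # assumed distribution
        results.append(release_fraction(scrubbing, leak_rate))
    return results

samples = sample_source_term()
print("mean release fraction:", statistics.mean(samples))
print("95th percentile:", sorted(samples)[int(0.95 * len(samples))])
```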

  17. Independent technical review, handbook

    International Nuclear Information System (INIS)

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of the project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project

  18. Independent technical review, handbook

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of the project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project.

  19. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated on the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  20. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
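
    The DLL follows a write-inputs, run, read-outputs pattern. The following Python sketch mimics that control flow with a hypothetical external executable and file formats; it is not the actual DLL code, which is built against GoldSim's external-element interface:

```python
import subprocess
from pathlib import Path

def run_external_code(inputs, workdir="run1",
                      exe="external_solver.exe",   # hypothetical executable name
                      input_name="inputs.txt",
                      output_name="outputs.txt"):
    """Write inputs, run the external application, and read its outputs back."""
    work = Path(workdir)
    work.mkdir(exist_ok=True)

    # 1. Create the input file expected by the external code (format assumed:
    #    one "name = value" pair per line).
    (work / input_name).write_text(
        "\n".join(f"{name} = {value}" for name, value in inputs.items())
    )

    # 2. Run the external application in that working directory.
    subprocess.run([exe, input_name], cwd=work, check=True)

    # 3. Read the outputs produced by the external application (same assumed format).
    outputs = {}
    for line in (work / output_name).read_text().splitlines():
        name, value = line.split("=")
        outputs[name.strip()] = float(value)
    return outputs
```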

  1. 3D shape measurement system developed on mobile platform

    Science.gov (United States)

    Wu, Zhoujie; Chang, Meng; Shi, Bowen; Zhang, Qican

    2017-02-01

    Three-dimensional (3-D) shape measurement technology based on structured light has become a hot research field, driven by increasing application requirements. Many methods have been implemented and applied in industry, but most of the equipment is large and complex and cannot be made portable. Meanwhile, the popularity of smart mobile terminals, such as smart phones, provides a platform for the miniaturization and portability of this technology. A measurement system based on a phase-shift algorithm and Gray-code patterns was studied and developed for the Android platform on a mobile phone, and it has been encapsulated into a mobile phone application so that 3-D shape data can be reconstructed on the smart phone easily and quickly. The experimental results for two measured objects are given in this paper and demonstrate that the application we developed for the mobile platform is effective.
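
    The system combines Gray-code patterns (to disambiguate fringe order) with a phase-shift algorithm. As a minimal illustration of those two standard ingredients, not the authors' Android implementation, the sketch below generates binary-reflected Gray codes and recovers the wrapped phase from four phase-shifted intensity images:

```python
import numpy as np

def gray_code(n_bits):
    """Return the n-bit binary-reflected Gray codes used to index fringe order."""
    return [i ^ (i >> 1) for i in range(2 ** n_bits)]

def wrapped_phase(i0, i1, i2, i3):
    """Four-step phase shifting with I_k = A + B*cos(phi + k*pi/2).

    Returns the wrapped phase in (-pi, pi] for each pixel.
    """
    return np.arctan2(i3 - i1, i0 - i2)

# Tiny synthetic example: eight pixels with known phase values.
phi_true = np.linspace(-3.0, 3.0, 8)
imgs = [100 + 50 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
print(np.allclose(wrapped_phase(*imgs), phi_true))   # True
print(gray_code(3))                                   # [0, 1, 3, 2, 6, 7, 5, 4]
```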

  2. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and the verification platforms for WVSNs that are available for use are very few. This situation seriously restricts the transformation from theoretical research on WVSNs to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module, designs a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve the functions of image acquisition, coding and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.

  3. Comparison of Learning Software Architecture by Developing Social Applications versus Games on the Android Platform

    Directory of Open Access Journals (Sweden)

    Bian Wu

    2012-01-01

    Full Text Available This paper describes an empirical study whose focus was on discovering differences and similarities between students working on the development of social applications and students working on the development of games using the same Android development platform. In 2010-2011, students attending the software architecture course at the Norwegian University of Science and Technology (NTNU) could choose between four types of projects. Independently of the chosen type of project, all students had to go through the same phases, produce the same documents based on the same templates, and follow exactly the same process. This study focuses on one of the project types, the Android project, to see how much the application domain affects the course project independently of the chosen technology. Our results revealed some positive effects for the students doing game development compared to social application development when learning software architecture, such as higher motivation to work with games, a better focus on quality attributes such as modifiability and testability during development, production of software architectures of higher complexity, and more productive coding work during the project. However, we did not find significant differences in awarded grades between students choosing the two different domains.

  4. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.
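
    The paper couples sparse coding and ranking-score learning in a single objective. The sketch below only illustrates the two ingredients separately and under simplifying assumptions: sparse codes are computed by plain ISTA for a fixed random dictionary, and a linear map from codes to a few known ranking scores is then fitted by least squares. It is not the authors' joint, neighborhood-based algorithm:

```python
import numpy as np

def ista_codes(X, D, lam=0.1, n_iter=200):
    """Sparse codes Z minimizing ||X - D Z||^2 + lam*||Z||_1 via ISTA (D fixed)."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2        # 1/L with L = ||D||_2^2
    Z = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        grad = D.T @ (D @ Z - X)
        Z = Z - step * grad
        Z = np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)   # soft threshold
    return Z

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))                 # 50 data points in 20 dimensions
D = rng.normal(size=(20, 30))
D /= np.linalg.norm(D, axis=0)                # unit-norm dictionary atoms
Z = ista_codes(X, D)

# Fit ranking scores from the sparse codes of a few "queried" points,
# then predict scores for the rest (a stand-in for the local linear functions).
labeled = np.arange(10)
scores = rng.normal(size=10)
w, *_ = np.linalg.lstsq(Z[:, labeled].T, scores, rcond=None)
predicted = Z.T @ w
print(predicted.shape)                        # (50,)
```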

  5. Stratospheric Platforms for Monitoring Purposes

    International Nuclear Information System (INIS)

    Konigorski, D.; Gratzel, U.; Obersteiner, M.; Schneidereit, M.

    2010-01-01

    Stratospheric platforms are emerging systems based on challenging technology. The goal is to create a platform, payload, and mission design which is able to complement satellite services on a local scale. Applications are close to traditional satellite business in telecommunication, navigation, science, and earth observation and include for example mobile telecommunications, navigation augmentation, atmospheric research, or border control. Stratospheric platforms could potentially support monitoring activities related to safeguards, e.g. by imagery of surfaces, operational conditions of nuclear facilities, and search for undeclared nuclear activities. Stratospheric platforms are intended to be flown in an altitude band between 16 and 30 km: above 16-20 km to take advantage of usually lower winds facilitating station keeping, and below 30 km to limit the challenges of achieving a reasonable payload at acceptable platform sizes. Stratospheric platforms could substitute for satellites, which are expensive and lack upgrade capabilities for new equipment. Furthermore, they have practically unlimited time over an area of interest. It is intended to keep the platforms operational and maintenance-free on a 24/7 basis with an average deployment time of 3 years. Geostationary satellites lack resolution. Potential customers like Armed Forces, National Agencies and commercial customers have indicated interest in the use of stratospheric platforms. Governmental entities are looking for cheaper alternatives to communications and surveillance satellites and stratospheric platforms could offer the following potential advantages: Lower operational cost than satellite or UAV (Unmanned Aerial Vehicles) constellation (fleet required); Faster deployment than satellite constellation; Repositioning capability and ability to loiter as required; Persistent long-term real-time services over a fairly large regional spot; Surge capability: Able to extend capability (either monitoring or communications

  6. Progress on DART code optimization

    International Nuclear Information System (INIS)

    Taboada, Horacio; Solis, Diego; Rest, Jeffrey

    1999-01-01

    This work describes the progress made on the design and development of a new optimized version of the DART code (DART-P), a mechanistic computer model for the performance calculation and assessment of aluminum dispersion fuel. It is part of a collaboration agreement between CNEA and ANL in the area of Low Enriched Uranium Advanced Fuels, carried out under the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy, signed on October 16, 1997 between US DOE and the National Atomic Energy Commission of the Argentine Republic. DART optimization is a biannual program; it has been operative since February 8, 1999 and has the following goals: 1. Design and develop a new DART calculation kernel for implementation within a parallel processing architecture. 2. Design and develop new user-friendly I/O routines to be resident on a Personal Computer (PC)/WorkStation (WS) platform. 2.1. The new input interface will be designed and developed by means of a visual interface, able to guide the user in the construction of the problem to be analyzed with the aid of a new database (described in item 3, below). The new I/O interface will include input data check controls in order to avoid corrupted input data. 2.2. The new output interface will be designed and developed by means of graphical tools, able to translate numeric data output into 'on line' graphic information. 3. Design and develop a new irradiated materials database, to be resident on the PC/WS platform, so as to facilitate the analysis of the behavior of different fuel and meat compositions with DART-P. Currently, a different version of DART is used for oxide, silicide, and advanced alloy fuels. 4. Develop rigorous general inspection algorithms in order to provide valuable DART-P benchmarks. 5. Design and develop new models, such as superplasticity, elastoplastic feedback, improved models for the calculation of fuel deformation and the evolution of the fuel microstructure for

  7. Automated Testing Infrastructure and Result Comparison for Geodynamics Codes

    Science.gov (United States)

    Heien, E. M.; Kellogg, L. H.

    2013-12-01

    The geodynamics community uses a wide variety of codes on a wide variety of both software and hardware platforms to simulate geophysical phenomena. These codes are generally variants of finite difference or finite element calculations involving Stokes flow or wave propagation. A significant problem is that codes of even low complexity will return different results depending on the platform due to slight differences in hardware, software, compiler, and libraries. Furthermore, changes to the codes during development may affect solutions in unexpected ways such that previously validated results are altered. The Computational Infrastructure for Geodynamics (CIG) is funded by the NSF to enhance the capabilities of the geodynamics community through software development. CIG has recently done extensive work in setting up an automated testing and result validation system based on the BaTLab system developed at the University of Wisconsin, Madison. This system uses 16 variants of Linux and Mac platforms on both 32 and 64-bit processors to test several CIG codes, and has also recently been extended to support testing on the XSEDE TACC (Texas Advanced Computing Center) Stampede cluster. In this work we overview the system design and demonstrate how automated testing and validation occurs and results are reported. We also examine several results from the system from different codes and discuss how changes in compilers and libraries affect the results. Finally we detail some result comparison tools for different types of output (scalar fields, velocity fields, seismogram data), and discuss within what margins different results can be considered equivalent.
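
    One class of comparison tools mentioned above checks whether scalar-field outputs produced on different platforms agree within stated margins. A minimal sketch of that kind of check (tolerances and test data are illustrative):

```python
import numpy as np

def compare_scalar_fields(reference, candidate, rel_tol=1e-6, abs_tol=1e-12):
    """Report whether two scalar fields agree within combined relative/absolute margins."""
    reference = np.asarray(reference, dtype=float)
    candidate = np.asarray(candidate, dtype=float)
    diff = np.abs(candidate - reference)
    allowed = abs_tol + rel_tol * np.abs(reference)
    worst = float(np.max(diff - allowed))
    return {
        "equivalent": bool(np.all(diff <= allowed)),
        "max_abs_diff": float(diff.max()),
        "worst_margin_excess": max(worst, 0.0),
    }

# Example: a field recomputed on another platform with small rounding differences.
ref = np.linspace(0.0, 1.0, 101)
cand = ref * (1.0 + 5e-7)
print(compare_scalar_fields(ref, cand))
```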

  8. The OpenPMU Platform for Open Source Phasor Measurements

    OpenAIRE

    Laverty, David M.; Best, Robert J.; Brogan, Paul; Al-Khatib, Iyad; Vanfretti, Luigi; Morrow, D John

    2013-01-01

    OpenPMU is an open platform for the development of phasor measurement unit (PMU) technology. A need has been identified for an open-source alternative to commercial PMU devices tailored to the needs of the university researcher and for enabling the development of new synchrophasor instruments from this foundation. OpenPMU achieves this through open-source hardware design specifications and software source code, allowing duplicates of the OpenPMU to be fabricated under open-source licenses. Th...

  9. Platform Expansion Design as Strategic Choice

    DEFF Research Database (Denmark)

    Staykova, Kalina S.; Damsgaard, Jan

    2016-01-01

    In this paper, we address how the strategic choice of platform expansion design impacts the subsequent platform strategy. We identify two distinct approaches to platform expansion – platform bundling and platform constellations, which currently co-exist. The purpose of this paper is to outline...

  10. A Typology of Multi-sided Platforms

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2015-01-01

    In this paper we address how the composition of a platform impacts the platform’s business model. By platform’s business model we mean platform features, platform architecture and platform governance. To this end, we construct the Platform Business Model Framework. We apply the framework to three...

  11. Cotton phenotyping with lidar from a track-mounted platform

    Science.gov (United States)

    French, Andrew N.; Gore, Michael A.; Thompson, Alison

    2016-05-01

    High-Throughput Phenotyping (HTP) is a discipline for rapidly identifying plant architectural and physiological responses to environmental factors such as heat and water stress. Experiments conducted since 2010 at Maricopa, Arizona with a three-fold sensor group, including thermal infrared radiometers, active visible/near infrared reflectance sensors, and acoustic plant height sensors, have shown the validity of HTP with a tractor-based system. However, results from these experiments also show that accuracy of plant phenotyping is limited by the system's inability to discriminate plant components and their local environmental conditions. This limitation may be overcome with plant imaging and laser scanning which can help map details in plant architecture and sunlit/shaded leaves. To test the capability for mapping cotton plants with a laser system, a track-mounted platform was deployed in 2015 over a full canopy and defoliated cotton crop, consisting of a scanning LIDAR driven by Arduino-controlled stepper motors. Using custom Python and Tkinter code, the platform moved autonomously along a pipe-track at 0.1 m/s while collecting LIDAR scans at 25 Hz (0.1667 deg. beam). These tests showed that an autonomous LIDAR platform can reduce HTP logistical problems and provide the capability to accurately map cotton plants and cotton bolls. In summary, a prototype track-mounted platform was developed to test the use of LIDAR scanning for High-Throughput Phenotyping and was deployed in 2015 at Maricopa, Arizona over a senescent cotton crop, moving autonomously along the pipe-track under custom Python and Tkinter control.
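
    As a rough sketch of how planar LIDAR scans from a platform moving along a track can be assembled into 3-D points, the code below assumes a hypothetical geometry: the scanner sweeps a vertical plane perpendicular to the direction of travel, the platform advances at a constant 0.1 m/s, and scans arrive at 25 Hz. It is not the deployed Python/Tkinter control code:

```python
import math

PLATFORM_SPEED = 0.1     # m/s, along-track (from the abstract)
SCAN_RATE = 25.0         # scans per second (from the abstract)
SENSOR_HEIGHT = 2.0      # m above ground (assumed)

def scans_to_points(scans):
    """Convert a sequence of (angle_deg, range_m) scan lines into (x, y, z) points.

    Each element of `scans` is one scan: a list of (angle, range) pairs measured
    in a vertical plane perpendicular to the track (angle 0 = straight down).
    """
    points = []
    for scan_index, scan in enumerate(scans):
        x = PLATFORM_SPEED * scan_index / SCAN_RATE      # along-track position
        for angle_deg, rng in scan:
            a = math.radians(angle_deg)
            y = rng * math.sin(a)                         # across-track offset
            z = SENSOR_HEIGHT - rng * math.cos(a)         # height above ground
            points.append((x, y, z))
    return points

# Two synthetic scans of a flat surface 2 m below the sensor.
demo = [[(-10.0, 2.0 / math.cos(math.radians(10))), (0.0, 2.0)],
        [(0.0, 2.0), (10.0, 2.0 / math.cos(math.radians(10)))]]
print(scans_to_points(demo))
```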

  12. The European Photovoltaic Technology Platform

    International Nuclear Information System (INIS)

    Nowak, S.; Aulich, H.; Bal, J.L.; Dimmler, B.; Garnier, A.; Jongerden, G.; Luther, J.; Luque, A.; Milner, A.; Nelson, D.; Pataki, I.; Pearsall, N.; Perezagua, E.; Pietruszko, S.; Rehak, J.; Schellekens, E.; Shanker, A.; Silvestrini, G.; Sinke, W.; Willemsen, H.

    2006-05-01

    The European Photovoltaic Technology Platform is one of the European Technology Platforms, a new instrument proposed by the European Commission. European Technology Platforms (ETPs) are a mechanism to bring together all interested stakeholders to develop a long-term vision to address a specific challenge, create a coherent, dynamic strategy to achieve that vision and steer the implementation of an action plan to deliver agreed programmes of activities and optimise the benefits for all parties. The European Photovoltaic Technology Platform has recently been established to define, support and accompany the implementation of a coherent and comprehensive strategic plan for photovoltaics. The platform will mobilise all stakeholders sharing a long-term European vision for PV, helping to ensure that Europe maintains and improves its industrial position. The platform will realise a European Strategic Research Agenda for PV for the next decade(s). Guided by a Steering Committee of 20 high level decision-makers representing all relevant European PV Stakeholders, the European PV Technology Platform comprises 4 Working Groups dealing with the subjects policy and instruments; market deployment; science, technology and applications as well as developing countries and is supported by a secretariat

  13. Vertical Relationships within Platform Marketplaces

    Directory of Open Access Journals (Sweden)

    Mark J. Tremblay

    2016-07-01

    Full Text Available In two-sided markets a platform allows consumers and sellers to interact by creating sub-markets within the platform marketplace. For example, Amazon has sub-markets for all of the different product categories available on its site, and smartphones have sub-markets for different types of applications (gaming apps, weather apps, map apps, ridesharing apps, etc.). The network benefits between consumers and sellers depend on the mode of competition within the sub-markets: more competition between sellers lowers product prices, increases the surplus consumers receive from a sub-market, and makes platform membership more desirable for consumers. However, more competition also lowers profits for a seller, which makes platform membership less desirable for a seller and reduces seller entry and the number of sub-markets available on the platform marketplace. This dynamic between seller competition within a sub-market and agents’ network benefits leads to platform pricing strategies, participation decisions by consumers and sellers, and welfare results that depend on the mode of competition. Thus, the sub-market structure is important when investigating platform marketplaces.

  14. FCJ-128 A Programmable Platform? Drupal, Modularity, and the Future of the Web

    Directory of Open Access Journals (Sweden)

    Fenwick McKelvey

    2011-10-01

    Full Text Available Sent as a walking advertisement of Canada’s technology sector, I arrived in Argentina to help a women’s rights organization develop a new website. I began using the Drupal content management platform to construct the site. Its interface brought me into the rarified world of web programming. My experience provides a way of entry into the Drupal platform – a platform I believe is re-programmable. The paper introduces the concept of re-programmability as a process by which users and code interact to alter software’s running code, and works out this concept through the case of Drupal and how its modular code can be re-programmed by its users. The paper utilizes the theory of transduction to flip the critique of web2.0 platforms on its head – focusing on the processes of becoming a platform, rather than the platform as a final state. This offers a new line of critique for web2.0 platforms, namely how they enact their re-programming.

  15. Model-Independent Diffs

    DEFF Research Database (Denmark)

    Könemann, Patrick

    just contain a list of strings, one for each line, whereas the structure of models is defined by their meta models. There are tools available which are able to compute the diff between two models, e.g. RSA or EMF Compare. However, their diff is not model-independent, i.e. it refers to the models...

  16. All Those Independent Variables.

    Science.gov (United States)

    Meacham, Merle L.

    This paper presents a case study of a sixth grade remedial math class which illustrates the thesis that only the "experimental attitude," not the "experimental method," is appropriate in the classroom. The thesis is based on the fact that too many independent variables exist in a classroom situation to allow precise measurement. The case study…

  17. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  18. Independent safety organization

    International Nuclear Information System (INIS)

    Kato, W.Y.; Weinstock, E.V.; Carew, J.F.; Cerbone, R.J.; Guppy, J.G.; Hall, R.E.; Taylor, J.H.

    1985-01-01

    Brookhaven National Laboratory has conducted a study on the need and feasibility of an independent organization to investigate significant safety events for the Office for Analysis and Evaluation of Operational Data, USNRC. The study consists of three parts: the need for an independent organization to investigate significant safety events, alternative organizations to conduct investigations, and legislative requirements. The determination of need was investigated by reviewing current NRC investigation practices, comparing aviation and nuclear industry practices, and interviewing a spectrum of representatives from the nuclear industry, the regulatory agency, and the public sector. The advantages and disadvantages of alternative independent organizations were studied, namely, an Office of Nuclear Safety headed by a director reporting to the Executive Director for Operations (EDO) of NRC; an Office of Nuclear Safety headed by a director reporting to the NRC Commissioners; a multi-member NTSB-type Nuclear Safety Board independent of the NRC. The costs associated with operating a Nuclear Safety Board were also included in the study. The legislative requirements, both new authority and changes to the existing NRC legislative authority, were studied. 134 references

  19. ALLIANCES: simulation platform for radioactive waste disposal

    International Nuclear Information System (INIS)

    Deville, E.; Montarnal, Ph.; Loth, L.; Chavant, C.

    2009-01-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES whose aim is to produce a tool for the simulation of nuclear waste storage and disposal. This type of simulations deals with highly coupled thermo-hydro-mechanical-chemical and radioactive (T-H-M-C-R) processes. ALLIANCES' aim is to accumulate within the same simulation environment the already acquired knowledge and to gradually integrate new knowledge. The current version of ALLIANCES contains the following modules: - Hydraulics and reactive transport in unsaturated and saturated media; - Multi-phase flow; - Mechanical thermal-hydraulics; - Thermo-Aeraulics; - Chemistry/Transport coupling in saturated media; - Alteration of waste package coupled with the environment; - Sensitivity analysis tools. The next releases will include more physical phenomena like: reactive transport in unsaturated flow and multicomponent multiphase flow; incorporation of responses surfaces in sensitivity analysis tools; integration of parallel numerical codes for flow and transport. Since the distribution of the first release of ALLIANCES (December 2003), the platform was used by ANDRA for his safety simulation program and by CEA for reactive transport simulations (migration of uranium in a soil, diffusion of different reactive species on laboratory samples, glass/iron/clay interaction). (authors)

  20. Bioinformatics on the Cloud Computing Platform Azure

    Science.gov (United States)

    Shanahan, Hugh P.; Owen, Anne M.; Harrison, Andrew P.

    2014-01-01

    We discuss the applicability of the Microsoft cloud computing platform, Azure, for bioinformatics. We focus on the usability of the resource rather than its performance. We provide an example of how R can be used on Azure to analyse a large amount of microarray expression data deposited at the public database ArrayExpress. We provide a walk through to demonstrate explicitly how Azure can be used to perform these analyses in Appendix S1 and we offer a comparison with a local computation. We note that the use of the Platform as a Service (PaaS) offering of Azure can represent a steep learning curve for bioinformatics developers who will usually have a Linux and scripting language background. On the other hand, the presence of an additional set of libraries makes it easier to deploy software in a parallel (scalable) fashion and explicitly manage such a production run with only a few hundred lines of code, most of which can be incorporated from a template. We propose that this environment is best suited for running stable bioinformatics software by users not involved with its development. PMID:25050811

  1. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD, the lessons learned with our prototypes and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means to supporting an ecosystem of clients, developers and other key stakeholders.

  2. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes....

  3. Platform decisions supported by gaming

    DEFF Research Database (Denmark)

    Hansen, Poul H. Kyvsgård; Mikkola, Juliana Hsuan

    2007-01-01

    Platform is an ambiguous multidisciplinary concept. The philosophy behind it is easy to communicate and makes intuitive sense. However, the ease of communication overshadows the high complexity encountered when the concept is implemented. The practical industrial platform implementation challenge can...... be described as a configuration problem with a high number of variables. These variables are different in nature; they have contradictory influences on the total performance, and their importance changes over time. Consequently, the specific platform decisions become highly complex and the consequences...

  4. SALOME. A software integration platform for multi-physics, pre-processing and visualisation

    International Nuclear Information System (INIS)

    Bergeaud, Vincent; Lefebvre, Vincent

    2010-01-01

    In order to ease the development of applications integrating simulation codes, CAD modelers and post-processing tools, CEA and EDF R&D have invested in the SALOME platform, a tool dedicated to the environment of scientific codes. The platform comes in the shape of a toolbox which offers functionalities for CAD, meshing, code coupling, visualization and GUI development. These tools can be combined to create integrated applications that make the scientific codes easier to use and well-interfaced with their environment, be it other codes, CAD and meshing tools or visualization software. Many projects in CEA and EDF R&D now use SALOME, bringing technical coherence to the software suites of our institutions. (author)

  5. CBP Phase I Code Integration

    International Nuclear Information System (INIS)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-01-01

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown and Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown and Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface

  6. CBP PHASE I CODE INTEGRATION

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  7. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can be easily utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...
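
    As a concrete toy instance of the index coding problem (not the optimization program developed in the paper), consider three receivers that each want one message and already hold the other two as side information; a single XOR transmission then suffices:

```python
# Messages held by the sender (one byte each, illustrative values).
x = [0b1010_1010, 0b0101_0101, 0b0011_1100]

# Broadcast one coded symbol instead of three separate transmissions.
broadcast = x[0] ^ x[1] ^ x[2]

def decode(side_information):
    """Each receiver recovers its message by XOR-ing out its side information."""
    value = broadcast
    for known in side_information:
        value ^= known
    return value

# Receiver i wants x[i] and knows the other two messages.
for i in range(3):
    side = [x[j] for j in range(3) if j != i]
    assert decode(side) == x[i]
print("all receivers decoded their message from one transmission")
```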

  8. Navigation and Positioning System Using High Altitude Platforms Systems (HAPS)

    Science.gov (United States)

    Tsujii, Toshiaki; Harigae, Masatoshi; Harada, Masashi

    Recently, some countries have begun conducting feasibility studies and R&D projects on High Altitude Platform Systems (HAPS). Japan has been investigating the use of an airship system that will function as a stratospheric platform for applications such as environmental monitoring, communications and broadcasting. If pseudolites were mounted on the airships, their GPS-like signals would be stable augmentations that would improve the accuracy, availability, and integrity of GPS-based positioning systems. Also, a sufficient number of HAPS can function as a positioning system independent of GPS. In this paper, a system design of the HAPS-based positioning system and its positioning error analyses are described.
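
    To illustrate how range measurements to several platform-mounted pseudolites could fix a receiver position, the sketch below solves the standard nonlinear least-squares ranging problem with a few Gauss-Newton steps. Platform coordinates and ranges are synthetic, and receiver clock-bias estimation is omitted for brevity:

```python
import numpy as np

def estimate_position(platforms, ranges, guess, n_iter=10):
    """Gauss-Newton position fix from ranges to known platform positions (no clock bias)."""
    p = np.array(guess, dtype=float)
    for _ in range(n_iter):
        diffs = p - platforms                      # (n, 3)
        dists = np.linalg.norm(diffs, axis=1)      # predicted ranges
        residuals = dists - ranges
        jacobian = diffs / dists[:, None]          # d(range_i)/d(position)
        p -= np.linalg.lstsq(jacobian, residuals, rcond=None)[0]
    return p

# Four hypothetical stratospheric platforms near 20 km altitude (metres, local frame).
platforms = np.array([[0.0, 0.0, 20000.0],
                      [30000.0, 0.0, 20000.0],
                      [0.0, 30000.0, 21000.0],
                      [30000.0, 30000.0, 19000.0]])
truth = np.array([12000.0, 8000.0, 0.0])
ranges = np.linalg.norm(platforms - truth, axis=1)
print(estimate_position(platforms, ranges, guess=[0.0, 0.0, 0.0]))
```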

  9. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  10. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  11. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  12. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  13. Hardware independence checkout software

    Science.gov (United States)

    Cameron, Barry W.; Helbig, H. R.

    1990-01-01

    ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that have not been defined in any of the source parsed. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these. By piping the output into other processes, the source is appropriately commented by generating and executing parsed scripts.
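
    The following is an illustrative sketch, in Python rather than CLIPS, of the idea this record describes: extract the names of called functions from C source, subtract those defined in the parsed files, and compare the remainder against standards lists. The regular expressions and the abbreviated ANSI/POSIX lists are assumptions for illustration, not part of the original program.

        # Sketch (not the ACSI/CLIPS program): extract called functions from C
        # source, drop local definitions and keywords, and bin the rest by
        # standards list.
        import re
        import sys

        CALL_RE = re.compile(r'\b([A-Za-z_]\w*)\s*\(')
        DEF_RE = re.compile(r'^\s*\w[\w\s\*]*\b([A-Za-z_]\w*)\s*\([^;]*\)\s*\{', re.M)

        C_KEYWORDS = {"if", "while", "for", "switch", "return", "sizeof"}
        ANSI_C = {"printf", "malloc", "free", "memcpy", "strlen"}   # abbreviated list
        POSIX = {"open", "read", "write", "close", "fork"}          # abbreviated list

        def check(paths):
            called, defined = set(), set()
            for path in paths:
                src = open(path).read()
                called.update(CALL_RE.findall(src))
                defined.update(DEF_RE.findall(src))
            # calls not defined in the parsed source and not C keywords
            external = called - defined - C_KEYWORDS
            return {
                "ansi": external & ANSI_C,
                "posix": external & POSIX,
                "unknown": external - ANSI_C - POSIX,
            }

        if __name__ == "__main__":
            for category, names in check(sys.argv[1:]).items():
                print(category, sorted(names))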

  14. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  15. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  16. RunJumpCode: An Educational Game for Educating Programming

    Science.gov (United States)

    Hinds, Matthew; Baghaei, Nilufar; Ragon, Pedrito; Lambert, Jonathon; Rajakaruna, Tharindu; Houghton, Travers; Dacey, Simon

    2017-01-01

    Programming promotes critical thinking, problem solving and analytic skills through creating solutions that can solve everyday problems. However, learning programming can be a daunting experience for a lot of students. "RunJumpCode" is an educational 2D platformer video game, designed and developed in Unity, to teach players the…

  17. Extending CANTUP code analysis to probabilistic evaluations

    International Nuclear Information System (INIS)

    Florea, S.

    2001-01-01

    Structural analysis with numerical methods based on the finite element method plays at present a central role in evaluations and predictions of structural systems that require safe and reliable operation in aggressive environmental conditions. This is also the case for the CANDU-600 fuel channel where, besides the corrosive and thermal aggression upon the Zr97.5Nb2.5 pressure tubes, prolonged irradiation has marked consequences on the evolution of the material properties. This results in an unavoidable spread of the material properties in time, affected by high uncertainties. Consequently, deterministic evaluations with computation codes based on the finite element method are supplemented by statistical and probabilistic methods for evaluating the response of structural components. This paper reports the work on extending the thermo-mechanical evaluation of the fuel channel components into the framework of probabilistic structural mechanics, based on statistical methods and developed upon deterministic CANTUP code analyses. The CANTUP code was adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station 4.0 platform. To test the statistical evaluation of the creep behaviour of the pressure tube, the longitudinal elasticity (Young's) modulus was used as a random variable with a normal distribution around the value used in the deterministic analyses. The influence of this random quantity upon the hoop and effective stresses developed in the pressure tube was studied for two time values, specific to primary and secondary creep. The results obtained after five years of creep, corresponding to secondary creep, are presented

  18. IRET: requirements for service platforms

    OpenAIRE

    Baresi, Luciano; Ripa, Gianluca; Pasquale, Liliana

    2013-01-01

    This paper describes IRENE (Indenica Requirements ElicitatioN mEthod), a methodology to elicit and model the requirements of service platforms, and IRET (IREne Tool), the Eclipse-based modeling framework we developed for IRENE.

  19. Platform attitude data acquisition system

    Digital Repository Service at National Institute of Oceanography (India)

    Afzulpurkar, S.

    A system for automatic acquisition of underwater platform attitude data has been designed, developed and tested in the laboratory. This is a micro controller based system interfacing dual axis inclinometer, high-resolution digital compass...

  20. Microneedle Platforms for Cell Analysis

    KAUST Repository

    Kavaldzhiev, Mincho

    2017-01-01

    to the development of micro-needle platforms that offer customized fabrication and new capabilities for enhanced cell analyses. The highest degree of geometrical flexibility is achieved with 3D printed micro-needles, which enable optimizing the topographical stress

  1. Elevated Fixed Platform Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Elevated Fixed Platform (EFP) is a helicopter recovery test facility located at Lakehurst, NJ. It consists of a 60 by 85 foot steel and concrete deck built atop...

  2. Radiographic inspection on offshore platforms

    International Nuclear Information System (INIS)

    Soares, Sergio Damasceno; Sperandio, Augusto Gasparoni

    1994-01-01

    One of the great challenges for non-destructive inspection is on offshore platforms, where safety is a critical issue. Inspection by gammagraphy is practically forbidden on the platform deck due to problems of personnel safety and radiological protection. Ir-192 sources are used and the risk of an accident with loss of radioisotope must be considered. It is unfeasible to use gammagraphy, because in case of an accident the rapid evacuation from the platform would be impossible. This problem does not occur when X-ray equipment is used as the radiation source. The limited practicality and portability of the X-ray equipment have prevented its use as a replacement for gammagraphy. This paper presents preliminary tests to assess the viability of radiographic testing with constant-potential X-ray equipment on offshore platforms. (author). 2 refs., 1 fig., 2 tabs, 3 photos

  3. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown

  4. The nuclear reaction model code MEDICUS

    International Nuclear Information System (INIS)

    Ibishia, A.I.

    2008-01-01

    The new computer code MEDICUS has been used to calculate cross sections of nuclear reactions. The code, implemented in the MATLAB 6.5, Mathematica 5, and Fortran 95 programming languages, can be run in graphical and command line mode. A Graphical User Interface (GUI) has been built that allows the user to perform calculations and to plot results just by mouse clicking. The MS Windows XP and Red Hat Linux platforms are supported. MEDICUS is a modern nuclear reaction code that can compute charged particle-, photon-, and neutron-induced reactions in the energy range from thresholds to about 200 MeV. The calculation of the cross sections of nuclear reactions is done in the framework of the Exact Many-Body Nuclear Cluster Model (EMBNCM), Direct Nuclear Reactions, Pre-equilibrium Reactions, Optical Model, DWBA, and Exciton Model with Cluster Emission. The code can also be used for the calculation of the nuclear cluster structure of nuclei. We have calculated nuclear cluster models for some nuclei such as 177 Lu, 90 Y, and 27 Al. It has been found that the nucleus 27 Al can be represented through two different nuclear cluster models: 25 Mg + d and 24 Na + 3 He. Cross sections as a function of energy for the reaction 27 Al( 3 He,x) 22 Na, established as a production method of 22 Na, are calculated by the code MEDICUS. Theoretical calculations of cross sections are in good agreement with experimental results. Reaction mechanisms are taken into account. (author)

  5. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2005-01-01

    This volume is the first of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald during the period March 9 – 22, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present first volume contains the following lectures: "Lévy Processes in Euclidean Spaces and Groups" by David Applebaum, "Locally Compact Quantum Groups" by Johan Kustermans, "Quantum Stochastic Analysis" by J. Martin Lindsay, and "Dilations, Cocycles and Product Systems" by B.V. Rajarama Bhat.

  6. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2006-01-01

    This is the second of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald in March, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present second volume contains the following lectures: "Random Walks on Finite Quantum Groups" by Uwe Franz and Rolf Gohm, "Quantum Markov Processes and Applications in Physics" by Burkhard Kümmerer, "Classical and Free Infinite Divisibility and Lévy Processes" by Ole E. Barndorff-Nielsen and Steen Thorbjornsen, and "Lévy Processes on Quantum Groups and Dual Groups" by Uwe Franz.

  7. Bioma platform advancements during 2017

    OpenAIRE

    FUMAGALLI DAVIDE; NIEMEYER STEFAN

    2017-01-01

    In this report we describe the advancements on the Bioma Framework made during 2017. Given that the Bioma platform is quite mature, its core has not changed recently; thus, the majority of changes concern the implementation of the models developed in the platform. Moreover, during 2017 we also set up an alternative version of the framework itself, based on a new development framework called .NET Core, with the purpose of being able to create a version of Bioma runnable on Linux. ...

  8. A GPU code for analytic continuation through a sampling method

    Directory of Open Access Journals (Sweden)

    Johan Nordström

    2016-01-01

    We here present a code for performing analytic continuation of fermionic Green’s functions and self-energies as well as bosonic susceptibilities on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVidia. Detailed scaling tests are presented, for two different GPUs, in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.

  9. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. The for- ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x written as ...

  10. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  11. A portable virtual machine target for proof-carrying code

    DEFF Research Database (Denmark)

    Franz, Michael; Chandra, Deepak; Gal, Andreas

    2005-01-01

    Virtual Machines (VMs) and Proof-Carrying Code (PCC) are two techniques that have been used independently to provide safety for (mobile) code. Existing virtual machines, such as the Java VM, have several drawbacks: First, the effort required for safety verification is considerable. Second and mor...... simultaneously providing efficient just-in-time compilation and target-machine independence. In particular, our approach reduces the complexity of the required proofs, resulting in fewer proof obligations that need to be discharged at the target machine....

  12. International exploration by independents

    International Nuclear Information System (INIS)

    Bertagne, R.G.

    1991-01-01

    Recent industry trends indicate that the smaller US independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. It is usually accepted that foreign finding costs per barrel are substantially lower than domestic because of the large reserve potential of international plays. To get involved overseas requires, however, an adaptation to different cultural, financial, legal, operational, and political conditions. Generally foreign exploration proceeds at a slower pace than domestic because concessions are granted by the government, or are explored in partnership with the national oil company. First, a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company, must be prepared; it must be followed by an ongoing evaluation of quality prospects in various sedimentary basins, and a careful planning and conduct of the operations. To successfully explore overseas also requires the presence on the team of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work, having had a considerable amount of onsite experience in various geographical and climatic environments. Independents that are best suited for foreign expansion are those that have been financially successful domestically, and have a good discovery track record. When properly approached foreign exploration is well within the reach of smaller US independents and presents essentially no greater risk than domestic exploration; the reward, however, can be much larger and can catapult the company into the big leagues

  13. International exploration by independent

    International Nuclear Information System (INIS)

    Bertragne, R.G.

    1992-01-01

    Recent industry trends indicate that the smaller U.S. independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. Foreign finding costs per barrel usually are accepted to be substantially lower than domestic costs because of the large reserve potential of international plays. To get involved in overseas exploration, however, requires the explorationist to adapt to different cultural, financial, legal, operational, and political conditions. Generally, foreign exploration proceeds at a slower pace than domestic exploration because concessions are granted by a country's government, or are explored in partnership with a national oil company. First, the explorationist must prepare a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company; next, is an ongoing evaluation of quality prospects in various sedimentary basins, and careful planning and conduct of the operations. To successfully explore overseas also requires the presence of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work. Ideally, these team members will have had a considerable amount of on-site experience in various countries and climates. Independents best suited for foreign expansion are those who have been financially successful in domestic exploration. When properly approached, foreign exploration is well within the reach of smaller U.S. independents, and presents essentially no greater risk than domestic exploration; however, the reward can be much larger and can catapult the company into the 'big leagues.'

  14. Agent independent task planning

    Science.gov (United States)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  15. International exploration by independents

    International Nuclear Information System (INIS)

    Bertagne, R.G.

    1992-01-01

    Recent industry trends indicate that the smaller U.S. independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. The problems of communications and logistics caused by different cultures and by geographic distances must be carefully evaluated. A mid-term to long-term strategy tailored to the goals and the financial capabilities of the company should be prepared and followed by a careful planning of the operations. This paper addresses some aspects of foreign exploration that should be considered before an independent venture into the foreign field. It also provides some guidelines for conducting successful overseas operations. When properly assessed, foreign exploration is well within the reach of smaller U.S. independents and presents no greater risk than domestic exploration; the rewards, however, can be much larger. Furthermore, the Oil and Gas Journal surveys of the 300 largest U.S. petroleum companies show that companies with a consistent foreign exploration policy have fared better financially during difficult times

  16. The COMET Sleep Research Platform.

    Science.gov (United States)

    Nichols, Deborah A; DeSalvo, Steven; Miller, Richard A; Jónsson, Darrell; Griffin, Kara S; Hyde, Pamela R; Walsh, James K; Kushida, Clete A

    2014-01-01

    The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments: positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment.

  17. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  18. Running code as part of an open standards policy

    OpenAIRE

    Shah, Rajiv; Kesan, Jay

    2009-01-01

    Governments around the world are considering implementing or even mandating open standards policies. They believe these policies will provide economic, socio-political, and technical benefits. In this article, we analyze the failure of the Massachusetts’s open standards policy as applied to document formats. We argue it failed due to the lack of running code. Running code refers to multiple independent, interoperable implementations of an open standard. With running code, users have choice ...

  19. Bit-Wise Arithmetic Coding For Compression Of Data

    Science.gov (United States)

    Kiely, Aaron

    1996-01-01

    Bit-wise arithmetic coding is a data-compression scheme intended especially for use with uniformly quantized data from a source with a Gaussian, Laplacian, or similar probability distribution function. Code words are of fixed length, and their bits are treated as being independent. The scheme serves as a means of progressive transmission or of overcoming buffer-overflow or rate-constraint limitations that sometimes arise when data compression is used.

  20. Peripheral Codes in ASTRA for the TJ-II

    International Nuclear Information System (INIS)

    Lopez-Bruna, D.; Reynolds, J. M.; Cappa, A.; Martinell, J.; Garcia, J.; Gutierrez-Tapia, C.

    2010-01-01

    The study of data from the TJ-II device is often done with transport calculations based on the ASTRA transport system. However, complicated independent codes are used to obtain fundamental ingredients in these calculations, such as the particle and/or energy sources. These codes are accessible from ASTRA through the procedures explained in this report. (Author) 37 refs.

  1. Authorization request for potential non-compliance with the American Standard Safety Code for Elevators Dumbwaiters and Escalators

    Energy Technology Data Exchange (ETDEWEB)

    Boyd, J.E.

    1964-09-28

    A Third Party inspection of the reactor work platforms was conducted by representatives of the Travelers Insurance Company in 1958. An inspection report submitted by these representatives described hazardous conditions noted and presented a series of recommendations to improve the operational safety of the systems. Project CGI-960, "C" & "D" Work Platform Safety Improvements -- All Reactors, was initiated to modify the platforms in compliance with the Third Party recommendations. The American Standard Safety Code for Elevators Dumbwaiters and Escalators (A-17.1) is used as a guide by the Third Party in formulating their recommendations. This code is used because there is no other applicable code for this type of equipment. While the work platforms do not and in some cases cannot comply with this code because of operational use, every effort is made to comply with the intent of the code.

  2. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  3. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low weight codewords, that gives a further advantage in the error floor region.

  4. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  5. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  6. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  7. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  8. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  9. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  10. Seqcrawler: biological data indexing and browsing platform.

    Science.gov (United States)

    Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien

    2012-07-24

    Seqcrawler takes its roots in software like SRS or Lucegene. It provides an indexing platform to ease the search of data and meta-data in biological banks and it can scale to face the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their data, there is a lack of free and open source solutions to browse one's own set of data with a flexible query system that are able to scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians to search their meta-data but also to build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of index) hosting ~400 million sequence records (whole GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault-tolerant architecture (high-availability). It has also been successfully integrated with software to add extra meta-data from blast results to enhance users' result analysis. Seqcrawler provides a complete open source search and store solution for labs or platforms needing to manage large amounts of data/meta-data with a flexible and customizable web interface. All components (search engine, visualization and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface. The solution scales easily and can also provide a high availability infrastructure.
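
    As a hedged illustration of the "simple HTTP interface" mentioned above, the sketch below sends a keyword query to a search endpoint and prints the hits. The host, path and response fields are hypothetical placeholders, not the documented Seqcrawler API.

        # Hypothetical query against a Seqcrawler-like HTTP search interface.
        import json
        import urllib.parse
        import urllib.request

        def search(base_url, query, rows=10):
            """Send a keyword query to a search endpoint and return parsed JSON."""
            params = urllib.parse.urlencode({"q": query, "rows": rows})
            with urllib.request.urlopen(f"{base_url}/search?{params}", timeout=30) as resp:
                return json.loads(resp.read().decode("utf-8"))

        if __name__ == "__main__":
            # placeholder host and fields; adjust to the actual deployment
            results = search("http://localhost:8080", "P53 homo sapiens")
            for hit in results.get("docs", []):
                print(hit.get("id"), hit.get("description"))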

  11. Seqcrawler: biological data indexing and browsing platform

    Directory of Open Access Journals (Sweden)

    Sallou Olivier

    2012-07-01

    Abstract Background Seqcrawler takes its roots in software like SRS or Lucegene. It provides an indexing platform to ease the search of data and meta-data in biological banks and it can scale to face the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their data, there is a lack of free and open source solutions to browse one’s own set of data with a flexible query system that are able to scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians to search their meta-data but also to build a larger information system with custom subsets of data. Results The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of index) hosting ~400 million sequence records (whole GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault-tolerant architecture (high-availability). It has also been successfully integrated with software to add extra meta-data from blast results to enhance users’ result analysis. Conclusions Seqcrawler provides a complete open source search and store solution for labs or platforms needing to manage large amounts of data/meta-data with a flexible and customizable web interface. All components (search engine, visualization and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface. The solution scales easily and can also provide a high availability infrastructure.

  12. Advances in the development of the Mexican platform for analysis and design of nuclear reactors: AZTLAN Platform; Avances en el desarrollo de la plataforma mexicana para analisis y diseno de reactores nucleares: AZTLAN Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gomez T, A. M.; Puente E, F. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Del Valle G, E. [IPN, Escuela Superior de Fisica y Matematicas, Av. IPN s/n, 07738 Ciudad de Mexico (Mexico); Francois L, J. L. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Paseo Cuauhnahuac 8532, Col. Progreso, 62550 Jiutepec, Morelos (Mexico); Espinosa P, G., E-mail: armando.gomez@inin.gob.mx [Universidad Autonoma Metropolitana, Unidad Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico)

    2017-09-15

    The AZTLAN platform project: development of a Mexican platform for the analysis and design of nuclear reactors, financed by the SENER-CONACYT Energy Sustainability Fund, was approved in early 2014 and formally began at the end of that year. It is a national project led by the Instituto Nacional de Investigaciones Nucleares (ININ) with the collaboration of the Instituto Politecnico Nacional (IPN), the Universidad Autonoma Metropolitana (UAM) and the Universidad Nacional Autonoma de Mexico (UNAM) as part of the development team and with the participation of the Laguna Verde Nuclear Power Plant, the National Commission of Nuclear Safety and Safeguards, the Ministry of Energy and the Karlsruhe Institute of Technology (KIT, Germany) as part of the user group. The general objective of the project is to modernize, improve and integrate the neutronic, thermo-hydraulic and thermo-mechanical codes, developed in Mexican institutions, in an integrated platform, developed and maintained by Mexican experts for the benefit of Mexican institutions. Two years into the process, important steps have been taken that have consolidated the platform. The main results of these first two years have been presented in different national and international forums. In this congress, some of the most recent results that have been implemented in the platform codes are shown in more detail. The current status of the platform from a more executive viewpoint is summarized in this paper. (Author)

  13. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
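
    A minimal sketch of the kind of quantitative comparison the record argues for is given below: each code is scored by a normalized root-mean-square deviation from experiment and the codes are ranked by that score. The code names and data values are invented for illustration and do not come from the record.

        # Toy ranking of competing codes by RMS relative deviation from experiment.
        import math

        def rms_error(predicted, measured):
            """Root-mean-square relative deviation between prediction and experiment."""
            terms = [((p - m) / m) ** 2 for p, m in zip(predicted, measured)]
            return math.sqrt(sum(terms) / len(terms))

        experiment = [1.02, 0.97, 1.10, 1.25]          # invented measurements
        predictions = {
            "code_A": [1.00, 0.95, 1.15, 1.30],        # invented code outputs
            "code_B": [1.05, 1.02, 1.00, 1.20],
        }

        ranking = sorted(predictions, key=lambda name: rms_error(predictions[name], experiment))
        for name in ranking:
            print(name, round(rms_error(predictions[name], experiment), 4))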

  14. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex, and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  15. The design and verification of probabilistic safety analysis platform NFRisk

    International Nuclear Information System (INIS)

    Hu Wenjun; Song Wei; Ren Lixia; Qian Hongtao

    2010-01-01

    To increase the technical ability in the Probabilistic Safety Analysis (PSA) field in China, it is necessary and important to study and develop an indigenous professional PSA platform. Following such a principle as 'from structure simplification to modulization to production of cut sets to minimum of cut sets', the algorithms, including the simplification algorithm, the modulization algorithm, the algorithm for conversion from a fault tree to a binary decision diagram (BDD), the algorithm for solving cut sets, the algorithm for minimizing cut sets, and so on, were designed and developed independently; the design of the data management and operation platform was completed in-house; and the verification and validation of the NFRisk platform based on 3 typical fault trees was carried out independently. (authors)
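
    The record names cut-set production and minimization among the platform's algorithms. The toy sketch below, which is not the NFRisk implementation, expands a small AND/OR fault tree into cut sets and reduces them to minimal cut sets by absorption; the gate and event names are invented.

        # Toy fault tree: gates are dicts {"type": "AND"/"OR", "inputs": [...]},
        # leaves (basic events) are plain strings not present in the tree dict.

        def cut_sets(node, tree):
            """Expand a node into a list of cut sets (each a frozenset of basic events)."""
            if isinstance(node, str) and node not in tree:
                return [frozenset([node])]
            gate = tree[node]
            child_sets = [cut_sets(child, tree) for child in gate["inputs"]]
            if gate["type"] == "OR":
                return [cs for sets in child_sets for cs in sets]
            # AND gate: cartesian combination of the children's cut sets
            combined = [frozenset()]
            for sets in child_sets:
                combined = [a | b for a in combined for b in sets]
            return combined

        def minimize(sets):
            """Keep only cut sets that contain no other cut set (absorption)."""
            return [s for s in sets if not any(other < s for other in sets)]

        tree = {
            "TOP": {"type": "AND", "inputs": ["G1", "G2"]},
            "G1": {"type": "OR", "inputs": ["pump_fails", "valve_stuck"]},
            "G2": {"type": "OR", "inputs": ["pump_fails", "power_loss"]},
        }

        mcs = sorted(set(minimize(cut_sets("TOP", tree))), key=sorted)
        print([sorted(s) for s in mcs])   # [['pump_fails'], ['power_loss', 'valve_stuck']]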

  16. Pre-processing of input files for the AZTRAN code

    International Nuclear Information System (INIS)

    Vargas E, S.; Ibarra, G.

    2017-09-01

    The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Due to the complexity of generating an input file for the code, a script based on the D language was developed to make its preparation easier. The script relies on a new input file format that includes specific cards, divided into two blocks: mandatory cards and optional cards. It also performs a pre-processing of the input file to identify possible errors within it and includes an image generator for the specific problem based on the Python interpreter. (Author)
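
    To illustrate the pre-processing idea, the sketch below checks an input file for missing mandatory cards. It is written in Python for consistency with the other examples here, whereas the script described in the record is based on the D language, and the card names are hypothetical placeholders rather than the real AZTRAN card set.

        # Hypothetical mandatory/optional card check for a card-based input file.
        import sys

        MANDATORY = {"GEOMETRY", "MATERIALS", "SOURCE"}   # placeholder card names
        OPTIONAL = {"OUTPUT", "PLOT"}                     # placeholder card names

        def check_input(path):
            """Report missing mandatory cards and unrecognized leading keywords."""
            found = set()
            with open(path) as handle:
                for line in handle:
                    stripped = line.strip()
                    if stripped:
                        # assume each non-empty line starts with a card keyword
                        found.add(stripped.split()[0].upper())
            missing = MANDATORY - found
            unknown = found - MANDATORY - OPTIONAL
            return missing, unknown

        if __name__ == "__main__":
            missing, unknown = check_input(sys.argv[1])
            if missing:
                print("Missing mandatory cards:", sorted(missing))
            if unknown:
                print("Unrecognized cards:", sorted(unknown))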

  17. Independence in appearance

    DEFF Research Database (Denmark)

    Warming-Rasmussen, Bent; Quick, Reiner; Liempd, Dennis van

    2011-01-01

    In the wake of the financial crisis, the EU Commission has published a Green Paper on the future role of the audit function in Europe. The Green Paper lists a number of proposals for tighter rules for audits and auditors in order to contribute to stabilizing the financial system. The present...... article presents research contributions to the question whether the auditor is to continue to provide both audit and non-audit services (NAS) to an audit client. Research results show that this double function for the same audit client is a problem for stakeholders' confidence in auditor independence...

  18. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.
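
    A hedged illustration of the workflow such a repository supports is shown below: download a published list of clinical codes and use it to select matching EMR records. The URL, the CSV layout and the sample codes are hypothetical placeholders, not the actual ClinicalCodes API.

        # Hypothetical code-list download and EMR filtering workflow.
        import csv
        import io
        import urllib.request

        def fetch_code_list(url):
            """Download a CSV code list and return the set of codes in its first column."""
            with urllib.request.urlopen(url, timeout=30) as resp:
                text = resp.read().decode("utf-8")
            return {row[0] for row in csv.reader(io.StringIO(text)) if row}

        def select_records(emr_rows, codes):
            """Keep EMR rows whose 'code' field appears in the code list."""
            return [row for row in emr_rows if row["code"] in codes]

        if __name__ == "__main__":
            # placeholder URL; a real study would point at the published code list
            codes = fetch_code_list("https://example.org/code_lists/diabetes.csv")
            emr = [{"patient": 1, "code": "C10E."}, {"patient": 2, "code": "H33.."}]
            print(select_records(emr, codes))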

  19. Recent advances in neutral particle transport methods and codes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    An overview of ORNL's three-dimensional neutral particle transport code, TORT, is presented. Special features of the code that make it invaluable for large applications are summarized for the prospective user. Advanced capabilities currently under development and installation in the production release of TORT are discussed; they include: multitasking on Cray platforms running the UNICOS operating system; the Adjacent-cell Preconditioning acceleration scheme; and graphics codes for displaying computed quantities such as the flux. Further developments for TORT and its companion codes to enhance its present capabilities, as well as to expand its range of applications, are discussed. Speculation on the next generation of neutron particle transport codes at ORNL, especially regarding unstructured grids and high-order spatial approximations, is also mentioned

  20. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of the Advanced Audio Coding (AAC) Huffman noiseless encoder, their optimisations and a working implementation. Much attention has been paid to optimising the demand for hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
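
    For readers unfamiliar with the underlying algorithm, the following is a minimal software sketch of Huffman coding in Python; it is generic and does not reproduce the AAC spectral codebooks or the hardware architectures discussed in the article.

        # Generic Huffman code construction and encoding (software sketch only).
        import heapq
        from collections import Counter

        def huffman_code(symbols):
            """Build a prefix code (symbol -> bit string) from a sequence of symbols."""
            freq = Counter(symbols)
            # Each heap entry: (weight, tie_breaker, {symbol: code_so_far})
            heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
            heapq.heapify(heap)
            if len(heap) == 1:                       # degenerate single-symbol case
                (_, _, table), = heap
                return {s: "0" for s in table}
            tie = len(heap)
            while len(heap) > 1:
                w1, _, t1 = heapq.heappop(heap)
                w2, _, t2 = heapq.heappop(heap)
                merged = {s: "0" + c for s, c in t1.items()}
                merged.update({s: "1" + c for s, c in t2.items()})
                heapq.heappush(heap, (w1 + w2, tie, merged))
                tie += 1
            return heap[0][2]

        if __name__ == "__main__":
            data = "this is an example of huffman coding"
            table = huffman_code(data)
            encoded = "".join(table[ch] for ch in data)
            print(len(encoded), "bits vs", 8 * len(data), "bits uncoded")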

  1. Pro Smartphone Cross-Platform Development IPhone, Blackberry, Windows Mobile, and Android Development and Distribution

    CERN Document Server

    Allen, Sarah; Lundrigan, Lee

    2010-01-01

    Learn the theory behind cross-platform development, and put the theory into practice with code using the invaluable information presented in this book. With in-depth coverage of development and distribution techniques for iPhone, BlackBerry, Windows Mobile, and Android, you'll learn the native approach to working with each of these platforms. With detailed coverage of emerging frameworks like PhoneGap and Rhomobile, you'll learn the art of creating applications that will run across all devices. You'll also be introduced to the code-signing process and the distribution of applications through t

  2. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  3. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name
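
    As a small illustration of the two-part structure described above, the sketch below splits an STRN-style identifier into its report code and sequential number; the double-hyphen separator assumed here is a common rendering in report listings, not necessarily the exact Z39.23 formatting.

        # Split an STRN-style identifier into report code and sequential number,
        # assuming a 'CODE--NUMBER' rendering (an illustrative assumption).
        def split_strn(strn):
            """Return (report_code, sequential_number) for a 'CODE--NUMBER' style STRN."""
            code, sep, number = strn.partition("--")
            return (code, number) if sep else (strn, "")

        for example in ("UCRL--52000", "NUREG/CR--1234"):
            print(example, "->", split_strn(example))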

  4. Bit-wise arithmetic coding for data compression

    Science.gov (United States)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
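
    The sketch below illustrates the modelling step behind this technique: quantizer outputs are mapped to fixed-length codewords, the probability of a one is estimated independently for each bit position, and the ideal code length under that independent-bit model is computed (a full binary arithmetic coder would then realize approximately this length). The Laplacian test source, the 8-bit offset-binary codewords and the unit step size are assumptions for illustration, not details from the article.

        # Per-bit-position probability model behind bit-wise arithmetic coding.
        import math
        import random

        def quantize(samples, bits=8):
            """Map each sample to a fixed-length unsigned codeword (offset binary)."""
            half = 1 << (bits - 1)
            out = []
            for x in samples:
                q = int(round(x)) + half          # shift so the codeword is non-negative
                out.append(min(max(q, 0), (1 << bits) - 1))
            return out

        def bitplane_probabilities(codewords, bits=8):
            """Estimate P(bit = 1) independently for every bit position."""
            ones = [0] * bits
            for w in codewords:
                for b in range(bits):
                    ones[b] += (w >> b) & 1
            n = len(codewords)
            return [c / n for c in ones]

        def ideal_code_length(codewords, probs, bits=8):
            """Bits an ideal arithmetic coder would spend under the independent-bit model."""
            total = 0.0
            for w in codewords:
                for b in range(bits):
                    p1 = min(max(probs[b], 1e-12), 1 - 1e-12)
                    bit = (w >> b) & 1
                    total += -math.log2(p1 if bit else 1 - p1)
            return total

        if __name__ == "__main__":
            random.seed(0)
            # Laplacian-like samples via a difference of exponentials
            data = [random.expovariate(0.5) - random.expovariate(0.5) for _ in range(10000)]
            cw = quantize(data)
            probs = bitplane_probabilities(cw)
            print("per-bit P(1):", [round(p, 3) for p in probs])
            print("ideal bits/sample:", round(ideal_code_length(cw, probs) / len(cw), 3))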

  5. Stratifying the Development of Product Platforms

    DEFF Research Database (Denmark)

    Sköld, Martin; Karlsson, Christer

    2013-01-01

    companies develop platforms for different aims, purposes, and product scopes. Following on from this, the requirements for platform development resources, the ways of organizing platform development, and the implications for management styles have not been explored and are presumably varying. To start...... influencing the project length, requirements for platform development resources, principles for organizing, and implications for management styles....

  6. The Educational Platform: Constructing Conceptual Frameworks.

    Science.gov (United States)

    Peca, Kathy; Isham, Mark

    2001-01-01

    The education faculty at Eastern New Mexico University used educational platforms as a means of developing the unit's conceptual framework. Faculty members developed personal platforms, then synthesized them into one curricular area platform. The resultant unit educational platform became the basis for the unit's conceptual framework, which…

  7. New offshore platform in the Mexican Gulf

    Energy Technology Data Exchange (ETDEWEB)

    Beisel, T.

    1982-04-01

    After a construction period of only 10 months, the second steel offshore platform was recently completed in the Mexican Gulf. The pattern for this structure was the Cognac platform. The erection of the new platform, called the 'Cerveza' platform, is described in the article.

  8. The Dynamics of Digital Platform Innovation

    DEFF Research Database (Denmark)

    Eaton, Ben

    2016-01-01

    Curated platforms provide an architectural basis for third parties to develop platform complements and for platform owners to control their implementation as a form of open innovation. The refusal to implement complements as innovations can cause tension between platform owners and developers. Th...

  9. The importance of board independence

    NARCIS (Netherlands)

    Zijl, N.J.M.

    2012-01-01

    Although the attributed importance of board independence is high, a clear definition of independence does not exist. Furthermore, the aim and consequences of independence are the subject of discussion and empirical evidence about the impact of independence is weak and disputable. Despite this lack

  10. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption: essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.
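
    As a self-contained taste of the kind of cipher and code-breaking the book surveys, the toy example below implements a Caesar shift cipher and a brute-force "crack" that tries every key and looks for a crib word.

        # Caesar shift cipher and brute-force key search (toy example).
        import string

        ALPHABET = string.ascii_lowercase

        def caesar(text, shift):
            """Shift each letter by 'shift' places, leaving other characters untouched."""
            out = []
            for ch in text.lower():
                if ch in ALPHABET:
                    out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
                else:
                    out.append(ch)
            return "".join(out)

        def crack(ciphertext, crib="the"):
            """Try all 26 shifts and return those whose plaintext contains a crib word."""
            return [(k, caesar(ciphertext, -k)) for k in range(26)
                    if crib in caesar(ciphertext, -k)]

        if __name__ == "__main__":
            secret = caesar("meet me at the platform at noon", 7)
            print(secret)
            print(crack(secret))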

  11. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  12. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with a neutron transport equation which includes the one-dimensional plane geometry problems, the one-dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers the problems that can be solved: eigenvalue problems, the outer iteration loop, the inner iteration loop, and finite difference solution procedures. The input and output data for ANISN are also discussed. Two-dimensional problems, as treated by the DOT code, are then presented. Finally, an overview of the Monte Carlo methods and codes is elaborated on
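
    To make the outer/inner iteration idea concrete, the sketch below runs the outer (power) iteration used for k-eigenvalue problems on a small invented operator pair; the 3x3 "transport" and "fission" matrices are toy stand-ins, and the direct linear solve stands in for the inner iterations of a code such as ANISN.

        # Outer (power) iteration for a k-eigenvalue problem on toy matrices.
        import numpy as np

        def k_eigenvalue(transport, fission, tol=1e-8, max_outer=500):
            """Power iteration: repeatedly solve transport*phi_new = fission*phi / k."""
            phi = np.ones(transport.shape[0])
            k = 1.0
            for _ in range(max_outer):
                source = fission @ phi / k
                phi_new = np.linalg.solve(transport, source)   # stands in for inner iterations
                k_new = k * (fission @ phi_new).sum() / (fission @ phi).sum()
                if abs(k_new - k) < tol:
                    return k_new, phi_new / np.linalg.norm(phi_new)
                k, phi = k_new, phi_new
            return k, phi / np.linalg.norm(phi)

        # invented diagonally dominant "transport" operator and "fission" operator
        T = np.array([[2.0, -0.5, 0.0], [-0.5, 2.0, -0.5], [0.0, -0.5, 2.0]])
        F = np.diag([0.9, 1.1, 0.9])
        k_eff, flux = k_eigenvalue(T, F)
        print("k_eff ~", round(k_eff, 5))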

  13. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
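
    The record states that the code applies stabilized linear inverse theory; the sketch below shows a generic damped least-squares step of that kind, solving for model parameters from gravity data with a damping term. The sensitivity matrix and data values are invented toy numbers, not the INVERT algorithm itself.

        # Generic damped (Tikhonov) least-squares inversion step.
        import numpy as np

        def stabilized_inverse(G, d, damping=0.1):
            """Return the model m minimizing |G m - d|^2 + damping * |m|^2."""
            n = G.shape[1]
            return np.linalg.solve(G.T @ G + damping * np.eye(n), G.T @ d)

        # toy sensitivity of 4 gravity stations to 3 topography parameters
        G = np.array([[1.0, 0.5, 0.2],
                      [0.8, 0.7, 0.3],
                      [0.4, 0.9, 0.6],
                      [0.2, 0.5, 1.0]])
        observed = np.array([1.2, 1.4, 1.5, 1.3])   # toy Bouguer anomaly values
        model = stabilized_inverse(G, observed)
        print("estimated topography parameters:", np.round(model, 3))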

  14. Disentangling Competition Among Platform Driven Strategic Groups

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric

    2015-01-01

    In platform-driven markets, competitive advantage is derived from superior platform design and configurations. For this reason, platform owners strive to create unique and inimitable platform configurals to maintain and extend their competitiveness within network economies. To disentangle firm...... competition within platform-driven markets, we opted for the UK mobile payment market as our empirical setting. By embracing the theoretical lens of strategic groups and digital platforms, this study supplements prior research by deriving a taxonomy of platform-driven strategic groups that is grounded...

  15. Autonomy, Independence, Inclusion

    Directory of Open Access Journals (Sweden)

    Filippo Angelucci

    2015-04-01

    Full Text Available The living environment must not only meet the primary needs of living, but also the expectations of improvement of life and social relations and people’s work. The need for a living environment that responds to the needs of users with their different abilities, outside of standardizations, is increasingly felt as autonomy, independence and well-being are the result of real usability and adaptability of the spaces. The project to improve the inclusivity of living space and to promote the rehabilitation of fragile users needs to be characterized as an interdisciplinary process in which the integration of specialized contributions leads to adaptive customization of spatial and technological solutions that evolve with the changing needs, functional capacities and abilities of individuals.

  16. Licensing experience with SPINLINE digital I&C platform - 15099

    International Nuclear Information System (INIS)

    Jegou, H.; Duthou, A.; Bach, J.; Burzynski, M.

    2015-01-01

    Rolls-Royce recently received a safety evaluation report from the NRC for the SPINLINE 3 digital safety instrumentation and control platform. The main Rolls-Royce interest in the NRC review was approval of the fail-safe, fault-tolerance, self-monitoring, deterministic, and communication independence features of the platform. The SPINLINE 3 platform consists of a set of standardized, modular hardware and software components and associated development tools. Rolls-Royce used a set of EPRI guidance documents to successfully develop a commercial grade dedication case of the platform. It was important to describe the technical critical characteristics for performance and dependability in the documentation submitted to NRC. The NRC audit forum was an important opportunity to effectively communicate complex technical information about the SPINLINE 3 platform. The NRC review had five interesting focus areas that offer opportunities for lessons learned. The main lesson learned is to put the same emphasis on the review for communication effectiveness as is put on the review for technical completeness and accuracy

  17. Development of an IHE MRRT-compliant open-source web-based reporting platform.

    Science.gov (United States)

    Pinto Dos Santos, Daniel; Klos, G; Kloeckner, R; Oberle, R; Dueber, C; Mildenberger, P

    2017-01-01

    To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. • A platform for structured reporting using IHE MRRT-compliant templates is presented. • Incorporating structured reporting into clinical routine is feasible. • Full source code will be provided upon request under a free license.

  18. Microneedle Platforms for Cell Analysis

    KAUST Repository

    Kavaldzhiev, Mincho

    2017-11-01

    Micro-needle platforms are the core components of many recent drug delivery and gene-editing techniques, which allow for intracellular access, controlled cell membrane stress or mechanical trapping of the nucleus. This dissertation work is devoted to the development of micro-needle platforms that offer customized fabrication and new capabilities for enhanced cell analyses. The highest degree of geometrical flexibility is achieved with 3D printed micro-needles, which enable optimizing the topographical stress environment for cells and cell populations of any size. A fabrication process for 3D-printed micro-needles has been developed as well as a metal coating technique based on standard sputter deposition. This extends the functionalities of the platforms by electrical as well as magnetic features. The micro-needles have been tested on human colon cancer cells (HCT116), showing a high degree of biocompatibility of the platform. Moreover, the capabilities of the 3D-printed micro-needles have been explored for drug delivery via the well-established electroporation technique, by coating the micro-needles with gold. Antibodies and fluorescent dyes have been delivered to HCT116 cells and human embryonic kidney cells with a very high transfection rate up to 90%. In addition, the 3D-printed electroporation platform enables delivery of molecules to suspended cells or adherent cells, with or without electroporation buffer solution, and at ultra-low voltages of 2V. In order to provide a micro-needle platform that exploits existing methods for mass fabrication a custom designed template-based process has been developed. It has been used for the production of gold, iron, nickel and poly-pyrrole micro-needles on silicon and glass substrates. A novel delivery method is introduced that activates the micro-needles by electromagnetic induction, which enables to wirelessly gain intracellular access. The method has been successfully tested on HCT116 cells in culture, where a time

  19. Moodle vs. Social Media Platforms

    DEFF Research Database (Denmark)

    Gulieva, Valeria

    Given the competition coming from various social media platforms, it is explored in this paper how students could be encouraged to use Moodle more proactively during their studies. Moodle is a course management system for online learning. It is designed to be a flexible template-based system, which......, known and widely used social platforms. It might be also due to the fact that students do not see the benefits in investing time and efforts in learning the new system. Another reason might be the mandatory nature of Moodle, i.e., it is imposed on students, rather than a free choice – and this might...

  20. ITER Dynamic Tritium Inventory Modeling Code

    International Nuclear Information System (INIS)

    Cristescu, Ioana-R.; Doerr, L.; Busigin, A.; Murdoch, D.

    2005-01-01

    A tool for tritium inventory evaluation within each sub-system of the Fuel Cycle of ITER is vital, both for the process of licensing ITER and for its operation. It is very likely that measurements of total tritium inventories may not be possible for all sub-systems; however, tritium accounting may be achieved by modeling its hold-up within each sub-system and by validating these models in real-time against the monitored flows and tritium streams between the systems. To get reliable results, an accurate dynamic modeling of the tritium content in each sub-system is necessary. In order to optimize the configuration and operation of the ITER fuel cycle, a dynamic fuel cycle model was developed progressively in the decade up to 2000-2001. As the design of some sub-systems of the fuel cycle (i.e. vacuum pumping, Neutral Beam Injectors (NBI)) has substantially progressed in the meantime, a new code has been developed under a different platform to incorporate these modifications. The new code takes over the models and algorithms for some subsystems, such as the Isotope Separation System (ISS); where simplified models had previously been used, more detailed ones have been introduced, as for the Water Detritiation System (WDS). To reflect all these changes, the new code developed within the EU participating team was named TRIMO (Tritium Inventory Modeling), to emphasize its use for assessing the tritium inventory within ITER
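
    To illustrate the kind of dynamic hold-up modelling described above, here is a deliberately tiny two-subsystem inventory balance integrated with an explicit Euler step. All transfer coefficients, the feed rate and the time step are invented for the example; they bear no relation to TRIMO's actual models or ITER data.

      import numpy as np

      feed = 1.0            # external feed into subsystem A (g/h), assumed
      k_ab = 0.20           # fractional transfer A -> B per hour, assumed
      k_ba = 0.05           # fractional transfer B -> A per hour, assumed
      k_out = 0.10          # fraction of B sent to storage per hour, assumed

      inv = np.array([0.0, 0.0])        # tritium inventories of A and B (g)
      dt, t_end = 0.1, 200.0
      for _ in range(int(t_end / dt)):
          a, b = inv
          da = feed - k_ab * a + k_ba * b
          db = k_ab * a - (k_ba + k_out) * b
          inv = inv + dt * np.array([da, db])

      print("inventories after %.0f h: A = %.2f g, B = %.2f g" % (t_end, inv[0], inv[1]))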

  1. Towards a Framework of Digital Platform Competition

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric T. K.

    2016-01-01

    between monopolistic (i.e., Pingit) and federated (i.e., Paym) mobile payment platforms to illustrate its applicability and yield principles on the nature and impact of competition among platform-driven ubiquitous systems. Preliminary findings indicate that monopolistic mobile digital platforms attempt...... to create unique configurals to obtain monopolistic power by tightly coupling platform layers, which are difficult to replicate. Conversely, federated digital platforms compete by dispersing the service layer to harness the collective resources from individual firms. Furthermore, the interaction...

  2. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....
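
    As a building block only, the sketch below shows random linear network coding over GF(2), i.e. the low-field-size end of the range that Fulcrum codes cater to; the Fulcrum expansion and the mapping between field sizes are not reproduced. Packet values and the seed are arbitrary.

      import random

      def rlnc_gf2(packets, n_coded, rng):
          """Generate (coefficient vector, coded packet) pairs over GF(2)."""
          coded = []
          for _ in range(n_coded):
              coeffs = [rng.randint(0, 1) for _ in packets]
              if not any(coeffs):
                  coeffs[rng.randrange(len(packets))] = 1
              mix = 0
              for c, p in zip(coeffs, packets):
                  if c:
                      mix ^= p              # XOR is GF(2) addition
              coded.append((coeffs, mix))
          return coded

      def decode_gf2(coded, k):
          """Gaussian elimination over GF(2); returns source packets or None."""
          rows = [(list(c), p) for c, p in coded]
          for col in range(k):
              pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
              if pivot is None:
                  return None               # not enough independent combinations
              rows[col], rows[pivot] = rows[pivot], rows[col]
              for i in range(len(rows)):
                  if i != col and rows[i][0][col]:
                      rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                                 rows[i][1] ^ rows[col][1])
          return [p for _, p in rows[:k]]

      rng = random.Random(7)
      source = [0x11, 0x22, 0x33, 0x44]          # four source "packets" (bytes)
      coded = rlnc_gf2(source, 8, rng)           # send a few extra combinations
      print("decoded correctly:", decode_gf2(coded, len(source)) == source)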

  3. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements

  4. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  5. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  6. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  7. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  8. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  9. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  10. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out on the example of the WIMSD-5B code. The WIMS code in its various versions is the most recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally the specific algorithm applied in fuel depletion calculations is outlined. (author)

  11. Cross-platform comparison of microarray data using order restricted inference

    Science.gov (United States)

    Klinglmueller, Florian; Tuechler, Thomas; Posch, Martin

    2013-01-01

    Motivation Titration experiments measuring the gene expression from two different tissues, along with total RNA mixtures of the pure samples, are frequently used for quality evaluation of microarray technologies. Such a design implies that the true mRNA expression of each gene is either constant or follows a monotonic trend between the mixtures, lending itself to the use of order restricted inference procedures. Exploiting only the postulated monotonicity of titration designs, we propose three statistical analysis methods for the validation of high-throughput genetic data and corresponding preprocessing techniques. Results Our methods allow for inference of accuracy, repeatability and cross-platform agreement, with minimal required assumptions regarding the underlying data generating process. Therefore, they are readily applicable to all sorts of genetic high-throughput data independent of the degree of preprocessing. An application to the EMERALD dataset was used to demonstrate how our methods provide a rich spectrum of easily interpretable quality metrics and allow the comparison of different microarray technologies and normalization methods. The results are on par with previous work, but provide additional new insights that cast doubt on the utility of popular preprocessing techniques, specifically concerning the EMERALD project's dataset. Availability All datasets are available on EBI’s ArrayExpress web site (http://www.ebi.ac.uk/microarray-as/ae/) under accession numbers E-TABM-536, E-TABM-554 and E-TABM-555. Source code implemented in C and R is available at: http://statistics.msi.meduniwien.ac.at/float/cross_platform/. Methods for testing and variance decomposition have been made available in the R-package orQA, which can be downloaded and installed from CRAN http://cran.r-project.org. PMID:21317143
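
    The monotonicity assumption behind order restricted inference can be made concrete with a pool-adjacent-violators fit. The sketch below is not the orQA package; the expression values are invented, and the flat-versus-isotonic comparison is only a crude indication of a monotone trend.

      import numpy as np

      def pava(y):
          """Pool-adjacent-violators: best non-decreasing fit to y (equal weights)."""
          blocks = [[float(v), 1] for v in y]          # [block mean, block size]
          i = 0
          while i < len(blocks) - 1:
              if blocks[i][0] > blocks[i + 1][0]:      # violation: pool the blocks
                  total = blocks[i][0] * blocks[i][1] + blocks[i + 1][0] * blocks[i + 1][1]
                  size = blocks[i][1] + blocks[i + 1][1]
                  blocks[i] = [total / size, size]
                  del blocks[i + 1]
                  i = max(i - 1, 0)
              else:
                  i += 1
          fit = []
          for mean, size in blocks:
              fit.extend([mean] * size)
          return np.array(fit)

      # Hypothetical expression of one gene across six titration mixtures.
      y = np.array([2.1, 2.4, 2.3, 2.9, 3.4, 3.3])
      iso = pava(y)
      sse_flat = np.sum((y - y.mean()) ** 2)
      sse_iso = np.sum((y - iso) ** 2)
      print("monotone fit:", np.round(iso, 2))
      print("SSE flat: %.3f   SSE isotonic: %.3f" % (sse_flat, sse_iso))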

  12. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  13. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...
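
    The gain behind index coding is easiest to see in the noiseless binary setting, shown below; the lattice constructions for the Gaussian broadcast channel in the paper are not attempted here. The three messages and the side-information sets are made up for the example.

      # Sender has three messages; every receiver wants all of them but already
      # knows one as side information, so two transmissions suffice instead of three.
      msgs = {"m1": 0b1010, "m2": 0b0110, "m3": 0b1100}

      tx = {"m1": msgs["m1"],                  # m1 sent uncoded
            "m2^m3": msgs["m2"] ^ msgs["m3"]}  # single XOR combination

      def receiver(side_info):
          """Recover all messages from the two transmissions plus side information."""
          known = dict(side_info)
          known["m1"] = tx["m1"]
          if "m2" in known:
              known["m3"] = tx["m2^m3"] ^ known["m2"]
          elif "m3" in known:
              known["m2"] = tx["m2^m3"] ^ known["m3"]
          return known

      rx_a = receiver({"m2": msgs["m2"]})      # knows m2, learns m1 and m3
      rx_b = receiver({"m3": msgs["m3"]})      # knows m3, learns m1 and m2
      print(rx_a == msgs, rx_b == msgs)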

  14. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    extensive work to raise the proportion of women. This has helped slightly, but women remain underrepresented at the corporate top. Why is this so? What can be done to solve it? This article presents five different types of answers relating to five discursive codes: nature, talent, business, exclusion...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  15. A mass spectrometry proteomics data management platform.

    Science.gov (United States)

    Sharma, Vagisha; Eng, Jimmy K; Maccoss, Michael J; Riffle, Michael

    2012-09-01

    Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. There exist many software pipelines for analyzing these data (each typically with its own file formats), and as technology improves, these file formats change and new formats are developed. Files produced from these myriad software programs may accumulate on hard disks or tape drives over time, with older files being rendered progressively more obsolete and unusable with each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are "organically" distributed across laboratory file systems in an ad hoc manner, (3) files formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application dubbed Mass Spectrometry Data Platform that is designed to address the failings of the file-based mass spectrometry data management approach. The database is designed such that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables being extended to support data generated by specific pipelines. Because the data are unified, they may be queried, viewed, and compared across multiple experiments using a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/.
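
    The core idea of unified tables extended per pipeline can be sketched in a few lines of SQL. The schema below is illustrative only (it is not MSDaPl's real schema, and SQLite stands in for MySQL); table and column names are invented.

      import sqlite3

      db = sqlite3.connect(":memory:")
      db.executescript("""
          CREATE TABLE experiment (                 -- common experiment annotation
              id INTEGER PRIMARY KEY,
              description TEXT,
              performed_on TEXT
          );
          CREATE TABLE search_result (              -- core, pipeline-independent fields
              id INTEGER PRIMARY KEY,
              experiment_id INTEGER REFERENCES experiment(id),
              peptide TEXT,
              charge INTEGER
          );
          CREATE TABLE search_result_pipeline_x (   -- extension for one pipeline's scores
              result_id INTEGER PRIMARY KEY REFERENCES search_result(id),
              score REAL
          );
      """)
      db.execute("INSERT INTO experiment VALUES (1, 'demo run', '2012-09-01')")
      db.execute("INSERT INTO search_result VALUES (1, 1, 'PEPTIDER', 2)")
      db.execute("INSERT INTO search_result_pipeline_x VALUES (1, 42.0)")

      row = db.execute("""
          SELECT e.description, r.peptide, x.score
          FROM search_result r
          JOIN experiment e ON e.id = r.experiment_id
          JOIN search_result_pipeline_x x ON x.result_id = r.id
      """).fetchone()
      print(row)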

  16. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16 group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218 group set is distributed with the code) and has a general P/sub N/ scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k/sub eff/, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes
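
    For readers unfamiliar with Monte Carlo criticality estimation, the toy below estimates a one-group, infinite-medium multiplication factor by sampling the fate of individual neutrons. It has none of KENO-V's multigroup, geometry or supergroup machinery, and the cross sections are invented.

      import random

      sigma_a = 1.0          # total absorption cross section (arbitrary units), assumed
      sigma_f = 0.4          # fission part of the absorption, assumed
      nu = 2.43              # neutrons emitted per fission

      rng = random.Random(2024)
      n_hist = 100_000
      produced = 0.0
      for _ in range(n_hist):
          # Each history ends in an absorption; with probability sigma_f/sigma_a
          # that absorption is a fission producing nu new neutrons.
          if rng.random() < sigma_f / sigma_a:
              produced += nu
      k_est = produced / n_hist
      print("k_inf estimate: %.3f   analytic nu*sigma_f/sigma_a: %.3f"
            % (k_est, nu * sigma_f / sigma_a))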

  17. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like any other technique, needs standards. These standards are widely used, and the methods for applying them are well established. Radiographic testing is therefore only practical when it is based on regulations that are documented in codes, standards and specifications. In Malaysia, a level one or basic radiographer may carry out radiography work based on instructions given by a level two or level three radiographer. These instructions are produced from the guidelines set out in the relevant documents, and the level two radiographer must follow the specifications given in the standard when writing them. This makes clear that radiography is a type of work in which everything must follow the rules. For codes, radiography follows the code of the American Society of Mechanical Engineers (ASME); the only code available in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With this code in place, all radiography work automatically has to follow the regulated rules and standards.

  18. Developing HYDMN code to include the transient of MNSR

    International Nuclear Information System (INIS)

    Al-Barhoum, M.

    2000-11-01

    A description of the programs added to the HYDMN code (a code for the thermal-hydraulic steady state of MNSR) to include the transient behaviour of the same MNSR is presented. The code asks for the initial conditions: the power (in kW) and the cold initial core inlet temperature (in degrees centigrade). A time-dependent study of the coolant inlet and outlet temperatures, the coolant velocity, and the pool and tank temperatures is carried out for MNSR in general and for the Syrian MNSR in particular. The study solves the differential equations taken from reference (1) using numerical methods found in reference (3). In this way the code becomes independent of any external information source. (Author)
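
    A minimal sketch of the kind of time-dependent coolant calculation described above is given below. It is a single lumped-parameter energy balance with a first-order relaxation toward the steady-state outlet temperature; the flow rate, heat capacity and time constant are assumed values, not the HYDMN equations from reference (1).

      power_kw = 30.0          # reactor power (kW), an example initial condition
      t_inlet = 20.0           # cold initial core inlet temperature (deg C)
      mdot = 0.15              # coolant mass flow (kg/s), assumed
      cp = 4186.0              # water specific heat (J/kg/K)
      tau = 120.0              # core thermal time constant (s), assumed

      t_outlet = t_inlet
      dt, t_end = 1.0, 1200.0
      for n in range(int(t_end / dt)):
          # Steady-state outlet temperature for the current power and flow.
          t_ss = t_inlet + power_kw * 1e3 / (mdot * cp)
          t_outlet += dt * (t_ss - t_outlet) / tau
          if n % 300 == 0:
              print("t = %5.0f s   outlet = %6.2f C" % (n * dt, t_outlet))
      print("final outlet temperature: %.2f C (steady state %.2f C)" % (t_outlet, t_ss))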

  19. Paracantor: A two group, two region reactor code

    Energy Technology Data Exchange (ETDEWEB)

    Stone, Stuart

    1956-07-01

    Paracantor I is a two energy group, two region, time independent reactor code which obtains a closed solution for a critical reactor assembly. The code deals with cylindrical reactors of finite length and with a radial reflector of finite thickness. It is programmed for the I.B.M. Magnetic Drum Data-Processing Machine, Type 650. The limited memory space available does not permit a flux solution to be included in the basic Paracantor code. A supplementary code, Paracantor II, has been programmed which computes fluxes, including adjoint fluxes, from the output of Paracantor I.

  20. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are stated in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  1. A method for scientific code coupling in a distributed environment

    International Nuclear Information System (INIS)

    Caremoli, C.; Beaucourt, D.; Chen, O.; Nicolas, G.; Peniguel, C.; Rascle, P.; Richard, N.; Thai Van, D.; Yessayan, A.

    1994-12-01

    This guide book deals with coupling of big scientific codes. First, the context is introduced: big scientific codes devoted to a specific discipline are coming to maturity, and there are more and more needs in terms of multi-discipline studies. Then we describe different kinds of code coupling and an example of code coupling: 3D thermal-hydraulic code THYC and 3D neutronics code COCCINELLE. With this example we identify problems to be solved to realize a coupling. We present the different numerical methods usable for the resolution of coupling terms. This leads to defining two kinds of coupling: with weak coupling we can use explicit methods, and with strong coupling we need to use implicit methods. In both cases, we analyze the link with the way the codes are parallelized. For the translation of data from one code to another, we define the notion of a Standard Coupling Interface based on a general data structure. This general structure constitutes an intermediary between the codes, thus allowing a relative independence of the codes from a specific coupling. The proposed method for the implementation of a coupling leads to a simultaneous run of the different codes, while they exchange data. Two kinds of data communication with message exchange are proposed: direct communication between codes with the use of the PVM product (Parallel Virtual Machine) and indirect communication with a coupling tool. This second way, with a general code coupling tool, is based on a coupling method, and we strongly recommend using it. This method is based on the following two principles: re-usability, which means few modifications to existing codes, and the definition of a code usable for coupling, which leads to separating the design of a code usable for coupling from the realization of a specific coupling. This coupling tool, available from the beginning of 1994, is described in general terms. (authors). figs., tabs
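
    The distinction between weak (explicit) and strong (implicit) coupling can be shown with two toy "codes" that exchange a single coupling variable. This is only a schematic of the exchange patterns discussed above; it does not use PVM, the Standard Coupling Interface or the actual THYC/COCCINELLE models.

      # Code A updates u given v, code B updates v given u.
      # The coupled fixed point is u = 4/3, v = 2/3.
      def code_a(v):
          return 0.5 * v + 1.0

      def code_b(u):
          return 0.5 * u

      # Weak (explicit) coupling: one data exchange per coupling step,
      # each code using the partner's value from the previous step.
      u, v = 0.0, 0.0
      for _ in range(5):
          u, v = code_a(v), code_b(u)
      print("weak coupling after 5 steps:  u = %.4f  v = %.4f" % (u, v))

      # Strong (implicit) coupling: iterate the exchange within a step
      # until the coupling variables stop changing.
      u, v = 0.0, 0.0
      for _ in range(100):
          u_next = code_a(v)
          v_next = code_b(u_next)
          if abs(u_next - u) < 1e-10 and abs(v_next - v) < 1e-10:
              break
          u, v = u_next, v_next
      print("strong coupling (converged):  u = %.4f  v = %.4f" % (u, v))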

  2. Computer code development plan for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for the nuclear reactor design driven since the middle of 1980s, various computer codes have been transferred into the korea nuclear industry through the technical transfer program from the worldwide major pressurized water reactor supplier or through the international code development program. These computer codes have been successfully utilized in reactor and reload core design works. As the results, design- related technologies have been satisfactorily accumulated. However, the activities for the native code development activities to substitute the some important computer codes of which usages are limited by the original technique owners have been carried out rather poorly. Thus, it is most preferentially required to secure the native techniques on the computer code package and analysis methodology in order to establish the capability required for the independent design of our own model of reactor. Moreover, differently from the large capacity loop-type commercial reactors, SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as self-controlled gas pressurizer, helical steam generator, passive residual heat removal system, etc. Considering those peculiar design characteristics for SMART, part of design can be performed with the computer codes used for the loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART. Thus, they should be modified to deal with the peculiar design characteristics of SMART. In addition to the modification efforts, various codes should be developed in several design area. Furthermore, modified or newly developed codes should be verified their reliability through the benchmarking or the test for the object design. Thus, it is necessary to proceed the design according to the

  3. Computer code development plan for SMART design

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H

    1999-03-01

    In accordance with the localization plan for the nuclear reactor design driven since the middle of 1980s, various computer codes have been transferred into the korea nuclear industry through the technical transfer program from the worldwide major pressurized water reactor supplier or through the international code development program. These computer codes have been successfully utilized in reactor and reload core design works. As the results, design- related technologies have been satisfactorily accumulated. However, the activities for the native code development activities to substitute the some important computer codes of which usages are limited by the original technique owners have been carried out rather poorly. Thus, it is most preferentially required to secure the native techniques on the computer code package and analysis methodology in order to establish the capability required for the independent design of our own model of reactor. Moreover, differently from the large capacity loop-type commercial reactors, SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as self-controlled gas pressurizer, helical steam generator, passive residual heat removal system, etc. Considering those peculiar design characteristics for SMART, part of design can be performed with the computer codes used for the loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART. Thus, they should be modified to deal with the peculiar design characteristics of SMART. In addition to the modification efforts, various codes should be developed in several design area. Furthermore, modified or newly developed codes should be verified their reliability through the benchmarking or the test for the object design. Thus, it is necessary to proceed the design according to the

  4. AZTLAN platform: Mexican platform for analysis and design of nuclear reactors

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Puente E, F.; Del Valle G, E.; Francois L, J. L.; Martin del Campo M, C.; Espinosa P, G.

    2014-10-01

    The Aztlan platform project is a national initiative led by the Instituto Nacional de Investigaciones Nucleares (ININ) which brings together the main public institutions of higher education in Mexico, such as the Instituto Politecnico Nacional, the Universidad Nacional Autonoma de Mexico and the Universidad Autonoma Metropolitana, in an effort to take a significant step toward autonomy in calculation and analysis, seeking to place Mexico in the medium term at a competitive international level in software for the analysis of nuclear reactors. This project aims to modernize, improve and integrate the neutronic, thermal-hydraulic and thermo-mechanical codes developed in Mexican institutions within an integrated platform, developed and maintained by Mexican experts for the benefit of those same institutions. The project is financed by the SENER-CONACYT mixed fund for Energy Sustainability, and aims to substantially strengthen research institutions as well as educational institutions, contributing to the formation of highly qualified human resources in the area of analysis and design of nuclear reactors. As an innovative part, the project includes the creation of a user group made up of members of the project institutions as well as the Comision Nacional de Seguridad Nuclear y Salvaguardias, the Central Nucleoelectrica de Laguna Verde (CNLV), the Secretaria de Energia (Mexico) and the Karlsruhe Institute of Technology (Germany), among others. This user group will be responsible for using the software and providing feedback to the development team so that progress meets the needs of the regulator and industry, in this case the CNLV. Finally, in order to bridge the gap with similar developments globally, the latest supercomputing technology will be used to speed up calculation times. This work intends to present the project to the national nuclear community, so a description of the proposed methodology is given, as well as the goals and objectives to be pursued for the development of the

  5. Verkenning locatiegegevens en sociale platforms

    NARCIS (Netherlands)

    van Loenen, B.; Kilic, Deniz; Van de Velde, Rob

    2017-01-01

    In this report we looked in depth at the world of commercial applications of sensor data, in which location data from smartphones are recorded on a large scale. A better insight into the world of social and commercial platforms should ensure that these are better understood

  6. Platform pricing in matching markets

    NARCIS (Netherlands)

    Goos, M.; van Cayseele, P.; Willekens, B.

    2011-01-01

    This paper develops a simple model of monopoly platform pricing accounting for two pertinent features of matching markets. 1) The trading process is characterized by search and matching frictions implying limits to positive cross-side network effects and the presence of own-side congestion.

  7. Towards trustworthy health platform cloud

    NARCIS (Netherlands)

    Deng, M.; Nalin, M.; Petkovic, M.; Baroni, I.; Marco, A.; Jonker, W.; Petkovic, M.

    2012-01-01

    To address today’s major concerns of health service providers regarding security, resilience and data protection when moving on the cloud, we propose an approach to build a trustworthy healthcare platform cloud, based on a trustworthy cloud infrastructure. This paper first highlights the main

  8. Lessons from independence

    International Nuclear Information System (INIS)

    Hauptfuhrer, R.R.

    1990-01-01

    The recent history of Oryx provides invaluable lessons for those who plan future energy strategies, relates the author of this paper. When Oryx became an independent oil and gas company, its reserves were declining, its stock was selling below asset values, and the price of oil seemed stuck below $15 per barrel. The message from Oryx management to Oryx employees was: We are in charge of our own destiny. We are about to create our own future. Oryx had developed a new, positive corporate culture and the corporate credit required for growth. This paper points to two basic principles that have guided the metamorphosis in Oryx's performance. The first objective was to improve operational efficiency and to identify the right performance indicators to measure this improvement. It states that the most critical performance indicator for an exploration and production company must be replacement and expansion of reserves at a competitive replacement cost. Oryx has cut its finding costs from $12 to $5 per barrel, while the BP acquisition provided proven reserves at a cost of only $4 per barrel. Another performance indicator measures Oryx's standing in the financial markets

  9. Independents' group posts loss

    International Nuclear Information System (INIS)

    Sanders, V.; Price, R.B.

    1992-01-01

    Low oil and gas prices and special charges caused the group of 50 U.S. independent producers Oil and Gas Journal tracks to post a combined loss in first half 1992. The group logged a net loss of $53 million in the first half compared with net earnings of $354 million in first half 1991, when higher oil prices during the Persian Gulf crisis buoyed earnings in spite of crude oil and natural gas production declines. The combined loss in the first half follows a 45% drop in the group's earnings in 1991 and compares with the OGJ group of integrated oil companies whose first half 1992 income fell 47% from the prior year. Special charges, generally related to asset writedowns, accounted for most of the almost $560 million in losses posted by about a third of the group. Nerco Oil and Gas Inc., Vancouver, Wash., alone accounted for almost half that total with charges related to an asset writedown of $238 million in the first quarter. Despite the poor first half performance, the outlook is bright for sharply improved group earnings in the second half, assuming reasonably healthy oil and gas prices and increased production resulting from acquisitions and in response to those prices

  10. PPARγ-Independent Mechanism

    Directory of Open Access Journals (Sweden)

    Christopher M. Hogan

    2011-01-01

    Full Text Available Acute and chronic lung inflammation is associated with numerous important disease pathologies including asthma, chronic obstructive pulmonary disease and silicosis. Lung fibroblasts are a novel and important target of anti-inflammatory therapy, as they orchestrate, respond to, and amplify inflammatory cascades and are the key cell in the pathogenesis of lung fibrosis. Peroxisome proliferator-activated receptor gamma (PPARγ ligands are small molecules that induce anti-inflammatory responses in a variety of tissues. Here, we report for the first time that PPARγ ligands have potent anti-inflammatory effects on human lung fibroblasts. 2-cyano-3, 12-dioxoolean-1, 9-dien-28-oic acid (CDDO and 15-deoxy-Δ12,14-prostaglandin J2 (15d-PGJ2 inhibit production of the inflammatory mediators interleukin-6 (IL-6, monocyte chemoattractant protein-1 (MCP-1, COX-2, and prostaglandin (PGE2 in primary human lung fibroblasts stimulated with either IL-1β or silica. The anti-inflammatory properties of these molecules are not blocked by the PPARγ antagonist GW9662 and thus are largely PPARγ independent. However, they are dependent on the presence of an electrophilic carbon. CDDO and 15d-PGJ2, but not rosiglitazone, inhibited NF-κB activity. These results demonstrate that CDDO and 15d-PGJ2 are potent attenuators of proinflammatory responses in lung fibroblasts and suggest that these molecules should be explored as the basis for novel, targeted anti-inflammatory therapies in the lung and other organs.

  11. A not-so-short description of the PERFECT platform

    International Nuclear Information System (INIS)

    Bugat, S.; Zeghadi, A.; Adjanor, G.

    2010-01-01

    This article describes the building of the so-called 'PERFECT platform', whose main purpose was to allow the development of the PERFECT end-products dedicated to the prediction of the degradation of material properties due to irradiation. First, the general principles used to build the platform are detailed. Such principles guided the choices of preferential development language, architecture, and operating system. The architecture of the platform is then described. It allows an easy development of the end-products, and a 'black-box' integration of the codes developed during the project. Each end-product can be seen as a sequence of modules, each module representing a physical phenomenon in time and space. The platform is very flexible, so that different methodologies can be tested and compared inside an end-product. The second part is devoted to the description of a classical PERFECT study, defined thanks to the graphical user interface developed in the project. Particular focus is placed on how modules are selected, how the input data can be entered, and how the study execution is fully controlled by the user. A final description of the post-processing facilities for the results is given.

  12. Platform capitalism: The intermediation and capitalization of digital economic circulation

    Directory of Open Access Journals (Sweden)

    Paul Langley

    2017-10-01

    Full Text Available A new form of digital economic circulation has emerged, wherein ideas, knowledge, labour and use rights for otherwise idle assets move between geographically distributed but connected and interactive online communities. Such circulation is apparent across a number of digital economic ecologies, including social media, online marketplaces, crowdsourcing, crowdfunding and other manifestations of the so-called ‘sharing economy’. Prevailing accounts deploy concepts such as ‘co-production’, ‘prosumption’ and ‘peer-to-peer’ to explain digital economic circulation as networked exchange relations characterised by their disintermediated, collaborative and democratising qualities. Building from the neologism of platform capitalism, we place ‘the platform’ – understood as a distinct mode of socio-technical intermediary and business arrangement that is incorporated into wider processes of capitalisation – at the centre of the critical analysis of digital economic circulation. To create multi-sided markets and coordinate network effects, platforms enrol users through a participatory economic culture and mobilise code and data analytics to compose immanent infrastructures. Platform intermediation is also nested in the ex-post construction of a replicable business model. Prioritising rapid up-scaling and extracting revenues from circulations and associated data trails, the model performs the structure of venture capital investment which capitalises on the potential of platforms to realise monopoly rents.

  13. Validation of the reactor dynamics code TRAB

    International Nuclear Information System (INIS)

    Raety, H.; Kyrki-Rajamaeki, R.; Rajamaeki, M.

    1991-05-01

    The one-dimensional reactor dynamics code TRAB (Transient Analysis code for BWRs) developed at VTT was originally designed for BWR analyses, but it can in its present version be used for various modelling purposes. The core model of TRAB can be used separately for LWR calculations. For PWR modelling the core model of TRAB has been coupled to circuit model SMABRE to form the SMATRA code. The versatile modelling capabilities of TRAB have also been utilized in analyses of e.g. the heating reactor SECURE and the RBMK-type reactor (Chernobyl). The report summarizes the extensive validation of TRAB. TRAB has been validated with benchmark problems, comparative calculations against independent analyses, analyses of start-up experiments of nuclear power plants and real plant transients. Comparative RBMK-type reactor calculations have been made against Soviet simulations and the initial power excursion of the Chernobyl reactor accident has also been calculated with TRAB

  14. Review of SKB's Code Documentation and Testing

    International Nuclear Information System (INIS)

    Hicks, T.W.

    2005-01-01

    safety assessment. The projects studied require that software is managed under a rigorous graded approach based on a software life-cycle methodology, with documentation requirements that include user's manuals and verification and validation documents. These requirements also include procedures for the use of external codes. Under the graded approach, reduced versions of the software life-cycle are adopted for simple codes, such as those that can be independently verified by inspection or hand calculation. SKB should provide details of its software QA procedures covering different categories of software (e.g., internal, commercial, academic, and simple codes). In order to gain greater understanding and confidence in, and become more familiar with SKB's codes, SKI could consider testing some of SKB's codes against its own codes. This would also serve as a useful background to any future sensitivity analyses that SKI might conduct with these codes. Further, SKI could review its own software QA procedures and the required extent of documentation and testing of its own codes

  15. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. A much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Theoretical analysis and simulation show that the EDW code gives much better performance than the Hadamard and MFH codes.
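
    The cross-correlation property mentioned above can be checked numerically for any candidate code set. The codewords below are placeholders chosen only to have weight three and pairwise in-phase cross-correlation of at most one; they are not generated by the actual EDW construction.

      import itertools

      codes = [                      # placeholder spectral codes (not real EDW codes)
          [1, 1, 1, 0, 0, 0, 0, 0],
          [0, 0, 1, 1, 1, 0, 0, 0],
          [0, 0, 0, 0, 1, 1, 1, 0],
      ]

      def cross_correlation(a, b):
          """In-phase cross-correlation of two binary code sequences."""
          return sum(x * y for x, y in zip(a, b))

      for (i, a), (j, b) in itertools.combinations(enumerate(codes), 2):
          print("code %d x code %d: cross-correlation = %d"
                % (i, j, cross_correlation(a, b)))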

  16. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  17. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
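
    For small parameters, the minimum distance of a linear code can be checked by enumerating all non-zero messages. The sketch below does this for the well-known ternary [4,2,3] tetracode as a stand-in; the 22 new codes from the paper are not reproduced here.

      import itertools

      q, k = 3, 2
      G = [[1, 0, 1, 1],             # generator matrix of the [4,2,3] tetracode
           [0, 1, 1, 2]]
      n = len(G[0])

      def encode(msg):
          """Encode a length-k message over GF(q) with generator matrix G."""
          return [sum(m * g for m, g in zip(msg, col)) % q for col in zip(*G)]

      d_min = min(
          sum(symbol != 0 for symbol in encode(msg))
          for msg in itertools.product(range(q), repeat=k)
          if any(msg)
      )
      print("[%d,%d,%d]_%d code" % (n, k, d_min, q))     # expected: [4,2,3]_3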

  18. Commonwealth of Independent States

    Directory of Open Access Journals (Sweden)

    Vrućinić Dušan

    2013-01-01

    Full Text Available Following the stages from the establishment itself to the present day of the functioning of such a specific regional organization as the Commonwealth of Independent States (CIS, the article seeks to further explain the meaning of its existence, efficiency and functioning. The CIS was created in order to make the dissolution of a major world super-power, which throughout the 20th century together with the USA defined the bipolar world, as painless as possible, especially for the new countries and its nationally and ethnically diverse population. During the early years after the dissolution of the USSR, the CIS played a major role in a more flexible and less severe dissolution of the Soviet empire, alleviating the consequences for its people. A more efficient functioning among the republics in all fields was also one of the tasks of the Commonwealth, to which it was devoted to the extent which was permitted by the then, not too favourable circumstances. Difficult years of economic crisis did not allow the CIS to mutually integrate its members as much as possible on the economy level. Thanks to the economic recovery of the post-Soviet states in the early 21st century, the Commonwealth has also been transformed, reformed, and renewed, and all this in order to achieve better and more fruitful cooperation between the members. The CIS may serve as a proper example of how the former Soviet Union states are inextricably linked by social, security-political, economic, cultural, communication-transport, and other ties, thanks to the centuries-long existence of the peoples of these states in this area, despite both internal and external factors which occasionally, but temporarily halt the post-Soviet integration. Mathematically expressed, the CIS members are naturally predisposed, to be reciprocally depended on each other, just as they also have the capacity for successful cooperation in the future times and epochs brought on by the modern world.

  19. ACE - Manufacturer Identification Code (MID)

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  20. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  1. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  2. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
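
    The syndrome idea at the heart of the scheme can be shown with a toy fixed-rate example: the encoder transmits only the syndrome of the source block, and the decoder searches for the sequence with that syndrome closest to its side information. The parity-check matrix, source block and side information below are hand-made; the paper's LDPC syndromes, doping bits, sum-product decoding and adaptivity are not reproduced.

      import itertools

      H = [                                   # small illustrative parity-check matrix
          [1, 1, 0, 1, 0, 0],
          [0, 1, 1, 0, 1, 0],
          [1, 0, 1, 0, 0, 1],
      ]

      def syndrome(x):
          return tuple(sum(h * b for h, b in zip(row, x)) % 2 for row in H)

      x = (1, 0, 1, 1, 0, 0)                  # source sequence at the encoder
      y = (1, 0, 1, 0, 0, 0)                  # side information: x with one bit flipped
      s = syndrome(x)                          # only these 3 bits are transmitted

      # Decoder: among all length-6 sequences with syndrome s, pick the closest to y.
      candidates = [c for c in itertools.product((0, 1), repeat=6) if syndrome(c) == s]
      x_hat = min(candidates, key=lambda c: sum(a != b for a, b in zip(c, y)))
      print("recovered:", x_hat, " matches source:", x_hat == x)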

  3. Media independence and dividend policy

    DEFF Research Database (Denmark)

    Farooq, Omar; Dandoune, Salma

    2012-01-01

    Can media pressurize managers to disgorge excess cash to shareholders? Do firms in countries with more independent media follow different dividend policies than firms with less independent media? This paper seeks to answer these questions and aims to document the relationship between media independence and dividend policies in emerging markets. Using a dataset from twenty three emerging markets, we show a significantly negative relationship between dividend policies (payout ratio and decision to pay dividend) and media independence. We argue that independent media reduces information asymmetries for stock market participants. Consequently, stock market participants in emerging markets with more independent media do not demand as high and as much dividends as their counterparts in emerging markets with less independent media. We also show that press independence is more important in defining...

  4. Integration of DYN3D inside the NURESIM platform

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Sanchez E, V. H.; Kliem, S.; Gommlich, A.; Rohde, U.

    2010-10-01

    The NURISP project (Nuclear Reactor Integrated Simulation Project) is focused on the further development of the European Nuclear Reactor Simulation (NURESIM) platform for advanced numerical reactor design and safety analysis tools. NURESIM is based on an open source platform - called SALOME - that offers flexible and powerful capabilities for pre- and post processing as well as for coupling of multi-physics and multi-scale solutions. The developments within the NURISP project are concentrated in the areas of reactors, physics, thermal hydraulics, multi-physics, and sensitivity and uncertainty methodologies. The aim is to develop experimentally validated advanced simulation tools including capabilities for uncertainty and sensitivity quantification. A unique feature of NURESIM is the flexibility in selecting the solvers for the area of interest and the interpolation and mapping schemes according to the problem under consideration. The Sub Project 3 (S P3) of NURISP is focused on the development of multi-physics methodologies at different scales and covering different physical fields (neutronics, thermal hydraulics and pin mechanics). One of the objectives of S P3 is the development of multi-physics methodologies beyond the state-of-the-art for improved prediction of local safety margins and design at pin-by-pin scale. The Karlsruhe Institute of Technology and the Research Center Dresden-Rossendorf are involved in the integration of the reactor dynamics code DYN3D into the SALOME platform for coupling with a thermal hydraulic sub-channel code (FLICA4) at fuel assembly and pin level. In this paper, the main capabilities of the SALOME platform, the steps for the integration process of DYN3D as well as selected preliminary results obtained for the DYN3D/FLICA4 coupling are presented and discussed. Finally the next steps for the validation of the coupling scheme at fuel assembly and pin basis are given. (Author)
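
    One small ingredient of such a coupling is the mapping of fields between the assembly-level and pin-level meshes. The toy below spreads assembly-averaged powers onto a pin grid with a fixed intra-assembly shape and checks that assembly averages are conserved; the numbers and the shape function are invented and this is not the actual SALOME/DYN3D/FLICA4 data exchange.

      import numpy as np

      assembly_power = np.array([[1.00, 1.10],          # 2x2 assemblies, relative power
                                 [0.95, 1.05]])
      pins = 3                                           # 3x3 pins per assembly

      shape = np.array([[0.9, 1.0, 0.9],                 # hypothetical pin form function
                        [1.0, 1.4, 1.0],
                        [0.9, 1.0, 0.9]])
      shape = shape / shape.mean()                       # normalise to mean one

      pin_power = np.kron(assembly_power, shape)         # pin-wise power map (6x6)

      # Conservation check: assembly averages of the pin map equal the input field.
      averages = pin_power.reshape(2, pins, 2, pins).mean(axis=(1, 3))
      print(np.allclose(averages, assembly_power), pin_power.shape)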

  5. Spatially coded backscatter radiography

    International Nuclear Information System (INIS)

    Thangavelu, S.; Hussein, E.M.A.

    2007-01-01

    Conventional radiography requires access to two opposite sides of an object, which makes it unsuitable for the inspection of extended and/or thick structures (airframes, bridges, floors etc.). Backscatter imaging can overcome this problem, but the indications obtained are difficult to interpret. This paper applies the coded aperture technique to gamma-ray backscatter-radiography in order to enhance the detectability of flaws. This spatial coding method involves the positioning of a mask with closed and open holes to selectively permit or block the passage of radiation. The obtained coded-aperture indications are then mathematically decoded to detect the presence of anomalies. Indications obtained from Monte Carlo calculations were utilized in this work to simulate radiation scattering measurements. These simulated measurements were used to investigate the applicability of this technique to the detection of flaws by backscatter radiography

  6. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  7. The Coding Question.

    Science.gov (United States)

    Gallistel, C R

    2017-07-01

    Recent electrophysiological results imply that the duration of the stimulus onset asynchrony in eyeblink conditioning is encoded by a mechanism intrinsic to the cerebellar Purkinje cell. This raises the general question - how is quantitative information (durations, distances, rates, probabilities, amounts, etc.) transmitted by spike trains and encoded into engrams? The usual assumption is that information is transmitted by firing rates. However, rate codes are energetically inefficient and computationally awkward. A combinatorial code is more plausible. If the engram consists of altered synaptic conductances (the usual assumption), then we must ask how numbers may be written to synapses. It is much easier to formulate a coding hypothesis if the engram is realized by a cell-intrinsic molecular mechanism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) the addition of the JENDL-2 version of the data library, (2) a direct treatment of the doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, and (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the user's manual, which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries, and sample I/O. (author)

  9. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to automatically detect potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  10. Multilevel LDPC Codes Design for Multimedia Communication CDMA System

    Directory of Open Access Journals (Sweden)

    Hou Jia

    2004-01-01

    We design multilevel coding (MLC) with a semi-bit-interleaved coded modulation (BICM) scheme based on low-density parity-check (LDPC) codes. Different from traditional designs, we join MLC and BICM together by using Gray mapping, which is suitable for transmitting data over several equivalent channels with different code rates. To perform well at a signal-to-noise ratio (SNR) very close to the capacity of the additive white Gaussian noise (AWGN) channel, a random regular LDPC code and a simple semialgebra LDPC (SA-LDPC) code are discussed in MLC with parallel independent decoding (PID). The numerical results demonstrate that the proposed scheme can achieve both power and bandwidth efficiency.
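
    The following toy sketch illustrates the bit-to-symbol step that such a multilevel scheme relies on: bits from three independent component codes select an 8-PAM symbol through a Gray map, so that with parallel independent decoding each level sees its own equivalent channel. The block length, the pass-through "encoders" and the 8-PAM constellation are illustrative assumptions and are not the LDPC construction of the paper.

        # Toy sketch of the bit-to-symbol step in multilevel coding with Gray mapping:
        # three component encoders (modelled here simply as random coded bits) each
        # supply one bit level, and the three bits select an 8-PAM symbol via a Gray map.
        import numpy as np

        def gray_to_binary(g):
            """Convert a Gray-coded integer to its binary-weighted value."""
            b = g
            mask = g >> 1
            while mask:
                b ^= mask
                mask >>= 1
            return b

        rng = np.random.default_rng(1)
        n = 8                                               # symbols per block (toy length)
        levels = [rng.integers(0, 2, n) for _ in range(3)]  # bits from 3 component codes

        symbols = []
        for k in range(n):
            gray_index = (levels[0][k] << 2) | (levels[1][k] << 1) | levels[2][k]
            amplitude = 2 * gray_to_binary(gray_index) - 7  # 8-PAM levels -7..+7
            symbols.append(amplitude)

        print("level-0 bits:", levels[0])
        print("level-1 bits:", levels[1])
        print("level-2 bits:", levels[2])
        print("8-PAM symbols:", symbols)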

  11. NORTICA—a new code for cyclotron analysis

    Science.gov (United States)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1] developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphical interface. A multiple programming language approach was used in order to combine the reliability of the numerical algorithms developed over a long period of time in the laboratory with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.

  12. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

    The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points E PC with multiplicity γ(w), where w is the weight of ...

  13. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  14. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  15. The Overshoot Phenomenon in Geodynamics Codes

    Science.gov (United States)

    Kommu, R. K.; Heien, E. M.; Kellogg, L. H.; Bangerth, W.; Heister, T.; Studley, E. H.

    2013-12-01

    The overshoot phenomenon is a common occurrence in numerical software when a continuous function on a finite-dimensional discretized space is used to approximate a discontinuous jump, for example in temperature or material concentration. The resulting solution overshoots, and undershoots, the discontinuous jump. Numerical simulations play an extremely important role in mantle convection research, both because of the strong temperature and stress dependence of viscosity and because of the inaccessibility of the deep Earth. Under these circumstances, it is essential that mantle convection simulations be extremely accurate and reliable. CitcomS and ASPECT are two finite-element-based mantle convection codes developed and maintained by the Computational Infrastructure for Geodynamics. CitcomS is designed to run on multiple high-performance computing platforms. ASPECT, an adaptive mesh refinement (AMR) code built on the deal.II library, also scales well on various HPC platforms. Both CitcomS and ASPECT exhibit the overshoot phenomenon. One attempt at controlling the overshoot uses the Entropy Viscosity method, which introduces an artificial diffusion term in the energy equation of mantle convection. This artificial diffusion term is small where the temperature field is smooth. We present results from CitcomS and ASPECT that quantify the effect of the Entropy Viscosity method in reducing the overshoot phenomenon. In the discontinuous Galerkin (DG) finite element method, the test functions used are continuous within each element but discontinuous across inter-element boundaries, so the solution space is discontinuous. FEniCS is a collection of free software tools that automate the solution of differential equations using finite element methods. In this work we also present results from a finite element mantle convection ...
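
    The idea behind the Entropy Viscosity method mentioned above can be sketched in one dimension: an artificial diffusion coefficient is built from the residual of an entropy equation, so it stays small where the field is smooth and grows near a jump, and it is capped by a first-order viscosity. The constants, the entropy choice E = T^2/2 and the steady advection setting below are illustrative assumptions, not the tuning used in CitcomS or ASPECT.

        # Schematic 1-D illustration of an entropy-viscosity artificial diffusion
        # coefficient for a temperature field with a discontinuous jump.
        import numpy as np

        n = 200
        h = 1.0 / n
        x = np.linspace(0.0, 1.0, n)
        u = 1.0                              # constant advection velocity
        T = np.where(x < 0.5, 1.0, 0.0)      # temperature with a discontinuous jump

        E = 0.5 * T**2                       # entropy function of the temperature
        # Residual of the (steady) entropy equation u * dE/dx, central differences
        dEdx = np.gradient(E, h)
        residual = np.abs(u * dEdx)

        # Entropy viscosity, capped by a first-order upwind-like viscosity
        c_e, c_max = 1.0, 0.5
        normalisation = np.max(np.abs(E - np.mean(E))) + 1e-12
        nu_entropy = c_e * h**2 * residual / normalisation
        nu_first_order = c_max * h * np.abs(u)
        nu = np.minimum(nu_entropy, nu_first_order)

        print("max viscosity near the jump :", nu.max())
        print("viscosity in smooth regions :", nu[10], nu[-10])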

  16. Independent Mobility Achieved through a Wireless Brain-Machine Interface.

    Directory of Open Access Journals (Sweden)

    Camilo Libedinsky

    Individuals with tetraplegia lack independent mobility, making them highly dependent on others to move from one place to another. Here, we describe how two macaques were able to use a wireless integrated system to control a robotic platform, over which they were sitting, to achieve independent mobility using the neuronal activity in their motor cortices. The activity of populations of single neurons was recorded using multiple electrode arrays implanted in the arm region of primary motor cortex, and decoded to achieve brain control of the platform. We found that free-running brain control of the platform (which was not equipped with any machine intelligence) was fast and accurate, resembling the performance achieved using joystick control. The decoding algorithms can be trained in the absence of joystick movements, as would be required for use by tetraplegic individuals, demonstrating that the non-human primate model is a good pre-clinical model for developing such a cortically-controlled movement prosthetic. Interestingly, we found that the response properties of some neurons differed greatly depending on the mode of control (joystick or brain control), suggesting different roles for these neurons in encoding movement intention and movement execution. These results demonstrate that independent mobility can be achieved without first training on prescribed motor movements, opening the door for the implementation of this technology in persons with tetraplegia.

  17. Code of Medical Ethics

    Directory of Open Access Journals (Sweden)

    . SZD-SZZ

    2017-03-01

    The Code was approved on December 12, 1992, at the 3rd regular meeting of the General Assembly of the Medical Chamber of Slovenia and revised on April 24, 1997, at the 27th regular meeting of the General Assembly of the Medical Chamber of Slovenia. The Code was updated and harmonized with the Medical Association of Slovenia and approved on October 6, 2016, at the regular meeting of the General Assembly of the Medical Chamber of Slovenia.

  18. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.
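
    Schematically, a supervised CSC objective of the kind described above augments the usual reconstruction-plus-sparsity terms with a discriminative loss on the feature maps. The generic form below is given only as an illustration; the weights lambda and mu, the loss l and the classifier parameters w are placeholders, not the exact formulation of the paper.

        \min_{\{d_k\},\{x_k\},w} \; \tfrac{1}{2}\Big\| y - \sum_{k=1}^{K} d_k * x_k \Big\|_2^2
          + \lambda \sum_{k=1}^{K} \| x_k \|_1
          + \mu \, \ell\big(w; \{x_k\}, c\big)

    Here y is a training image, d_k are the convolutional dictionary filters, x_k the sparse feature maps, c a class label, and l a classification loss acting on the feature maps; dropping the last term recovers the purely reconstructive CSC objective.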

  19. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated.
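
    As a minimal illustration of the last point, an overnight base cost can be escalated to the construction period and then charged interest during construction (IDC) on the cash flow. The rates, schedule and uniform spending profile below are hypothetical assumptions, not CONCEPT's cost models or data.

        # Minimal sketch: escalate an overnight base cost and add interest during
        # construction (IDC). All rates and the schedule are hypothetical.
        base_cost = 1000.0     # overnight cost in M$, at the estimate date
        escalation = 0.05      # annual escalation/inflation rate
        interest = 0.08        # annual cost of money
        years_to_start = 2     # years from estimate date to construction start
        build_years = 6        # construction duration

        # Escalate the base cost to the mid-point of construction
        escalated = base_cost * (1 + escalation) ** (years_to_start + build_years / 2)

        # Spend uniformly over construction; each year's spending accrues interest
        # until commercial operation
        idc = sum(
            (escalated / build_years) * ((1 + interest) ** (build_years - y - 0.5) - 1)
            for y in range(build_years)
        )
        total = escalated + idc
        print(f"escalated cost: {escalated:.0f} M$, IDC: {idc:.0f} M$, total: {total:.0f} M$")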

  20. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication - including voice - will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlining the key signal processing algorithms used to mitigate impairments to speech quality in VoIP networks, and offering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the ...