WorldWideScience

Sample records for executive codes

  1. Model-Driven Engineering of Machine Executable Code

    Science.gov (United States)

    Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira

    Implementing static analyses of machine-level executable code is labor-intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs performing static analyses. Further, we report important lessons learned on the benefits and drawbacks of the following technologies: using the Scala programming language as the target of code generation, using XML Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint-like tool. Finally, we report on the use of Prolog for writing model transformations.
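
    The abstract above describes generating analysis code from an XML metamodel. A minimal sketch of that generation step, with an invented metamodel format and Python (rather than the authors' Scala/XSLT pipeline) as the emitted target:

```python
# Hypothetical sketch: parse a small XML metamodel and emit class stubs.
# The metamodel schema and the emitted target language are illustrative
# assumptions, not the authors' actual XML-Schema/XSLT toolchain.
import xml.etree.ElementTree as ET

METAMODEL = """
<metamodel>
  <entity name="Instruction">
    <field name="opcode" type="int"/>
    <field name="operands" type="list"/>
  </entity>
</metamodel>
"""

def generate_stubs(xml_text: str) -> str:
    """Emit one class stub per <entity> element in the metamodel."""
    root = ET.fromstring(xml_text)
    out = []
    for entity in root.findall("entity"):
        fields = [f.get("name") for f in entity.findall("field")]
        args = ", ".join(fields)
        body = "\n".join(f"        self.{f} = {f}" for f in fields)
        out.append(
            f"class {entity.get('name')}:\n"
            f"    def __init__(self, {args}):\n{body}\n"
        )
    return "\n".join(out)

print(generate_stubs(METAMODEL))
```

    The point of the technique is that the metamodel, not hand-written code, is the single source of truth for the analysis data structures.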

  2. Code of conduct for non-executive directors and supervisors

    NARCIS (Netherlands)

    Lückerath – Rovers, M.; Bos, de A.

    2011-01-01

    After the corporate scandals at the beginning of the new millennium, corporate governance codes were drafted and implemented in national laws and regulations. Unfortunately, due to an ongoing supply of new financial scandals and societal deceptions, our society increasingly distrusts executive

  3. Provenance metadata gathering and cataloguing of EFIT++ code execution

    Energy Technology Data Exchange (ETDEWEB)

    Lupelli, I., E-mail: ivan.lupelli@ccfe.ac.uk [CCFE, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Muir, D.G.; Appel, L.; Akers, R.; Carr, M. [CCFE, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Abreu, P. [Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade de Lisboa, 1049-001 Lisboa (Portugal)

    2015-10-15

    Highlights: • An approach for automatic gathering of provenance metadata has been presented. • A provenance metadata catalogue has been created. • The overhead in the code runtime is less than 10%. • The metadata/data size ratio is about 20%. • A visualization interface based on Gephi has been presented. - Abstract: Journal publications, as the final product of research activity, are the result of an extensive and complex modeling and data analysis effort. It is of paramount importance, therefore, to capture the origins and derivation of the published data in order to achieve high levels of scientific reproducibility, transparency, and internal and external data reuse and dissemination. A consequence of the modern research paradigm is that high performance computing and data management systems, together with metadata cataloguing, have become crucial elements within the nuclear fusion scientific data lifecycle. This paper describes an approach to automatically gathering and cataloguing provenance metadata, currently under development and testing at the Culham Centre for Fusion Energy. The approach is being applied, as a proof-of-principle test, to EFIT++, a machine-agnostic code that calculates the axisymmetric equilibrium force balance in tokamaks. The proposed approach avoids any code instrumentation or modification. It is based on the observation and monitoring of input preparation, workflow and code execution, system calls, log file data collection, and interaction with the version control system. Pre-processing, post-processing, and data export and storage are monitored during the code runtime. Input data signals are captured using a data distribution platform called IDAM. The final objective of the catalogue is to create a complete description of the modeling activity, including user comments and the relationships between the data output, the main experimental database, and the execution environment. For an intershot or post-pulse analysis (∼1000
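
    The key idea above is gathering provenance without instrumenting the code: the driver observes inputs, environment, and execution from outside. A minimal sketch of such a wrapper (file names, record fields, and the JSON catalogue format are assumptions for illustration, not the EFIT++/IDAM implementation):

```python
# Hedged sketch: run an external code unmodified and record provenance
# metadata (input checksums, host, timing, exit status) in a catalogue file.
import hashlib
import json
import platform
import subprocess
import time

def run_with_provenance(cmd, input_files, catalogue_path):
    """Execute `cmd` untouched; observe and catalogue its provenance."""
    record = {
        "command": cmd,
        "host": platform.node(),
        "started": time.time(),
        # SHA-256 of each input file, so the exact inputs are recoverable.
        "inputs": {p: hashlib.sha256(open(p, "rb").read()).hexdigest()
                   for p in input_files},
    }
    result = subprocess.run(cmd, capture_output=True, text=True)
    record["finished"] = time.time()
    record["exit_status"] = result.returncode
    with open(catalogue_path, "w") as f:
        json.dump(record, f, indent=2)
    return record
```

    Because only checksums and timestamps are stored, the metadata stays small relative to the data itself, in the spirit of the ∼20% metadata/data ratio reported above.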

  4. Provenance metadata gathering and cataloguing of EFIT++ code execution

    International Nuclear Information System (INIS)

    Lupelli, I.; Muir, D.G.; Appel, L.; Akers, R.; Carr, M.; Abreu, P.

    2015-01-01

    Highlights: • An approach for automatic gathering of provenance metadata has been presented. • A provenance metadata catalogue has been created. • The overhead in the code runtime is less than 10%. • The metadata/data size ratio is about ∼20%. • A visualization interface based on Gephi, has been presented. - Abstract: Journal publications, as the final product of research activity, are the result of an extensive complex modeling and data analysis effort. It is of paramount importance, therefore, to capture the origins and derivation of the published data in order to achieve high levels of scientific reproducibility, transparency, internal and external data reuse and dissemination. The consequence of the modern research paradigm is that high performance computing and data management systems, together with metadata cataloguing, have become crucial elements within the nuclear fusion scientific data lifecycle. This paper describes an approach to the task of automatically gathering and cataloguing provenance metadata, currently under development and testing at Culham Center for Fusion Energy. The approach is being applied to a machine-agnostic code that calculates the axisymmetric equilibrium force balance in tokamaks, EFIT++, as a proof of principle test. The proposed approach avoids any code instrumentation or modification. It is based on the observation and monitoring of input preparation, workflow and code execution, system calls, log file data collection and interaction with the version control system. Pre-processing, post-processing, and data export and storage are monitored during the code runtime. Input data signals are captured using a data distribution platform called IDAM. The final objective of the catalogue is to create a complete description of the modeling activity, including user comments, and the relationship between data output, the main experimental database and the execution environment. For an intershot or post-pulse analysis (∼1000

  5. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    Science.gov (United States)

    Smith, L. M.; Hochstedler, R. D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
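
    One of the accelerations named above is replacing linear search algorithms with binary versions. A small sketch of that swap on a sorted grid lookup (the function names and the energy-grid framing are hypothetical; the real ITS subroutines are FORTRAN):

```python
# Sketch: the same bin lookup done by linear scan and by binary search.
# Both return index i such that grid[i] <= energy < grid[i+1].
import bisect

def find_bin_linear(grid, energy):
    """O(n) scan over the sorted grid."""
    for i in range(len(grid) - 1):
        if grid[i] <= energy < grid[i + 1]:
            return i
    return len(grid) - 2  # clamp values at or beyond the last edge

def find_bin_binary(grid, energy):
    """O(log n) equivalent using binary search."""
    i = bisect.bisect_right(grid, energy) - 1
    return min(max(i, 0), len(grid) - 2)
```

    In an inner loop executed billions of times per Monte Carlo run, the O(n) to O(log n) change is exactly the kind of modification that yields the ~2x speed-ups reported, without altering numerical output.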

  6. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    International Nuclear Information System (INIS)

    Smith, L.M.; Hochstedler, R.D.

    1997-01-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code)

  7. Transformation of Graphical ECA Policies into Executable PonderTalk Code

    Science.gov (United States)

    Romeikat, Raphael; Sinsel, Markus; Bauer, Bernhard

    Rules are becoming more and more important in business modeling and systems engineering and are recognized as a high-level programming paradigm. For the effective development of rules it is desirable to start at a high level, e.g. with graphical rules, and to refine them later into code of a particular rule language for implementation purposes. A model-driven approach is presented in this paper to transform graphical rules into executable code in a fully automated way. The focus is on event-condition-action policies as a special rule type. These are modeled graphically and translated into the PonderTalk language. The approach may be extended to integrate other rule types and languages as well.

  8. Combining loop unrolling strategies and code predication to reduce the worst-case execution time of real-time software

    Directory of Open Access Journals (Sweden)

    Andreu Carminati

    2017-07-01

    Full Text Available Worst-case execution time (WCET is a parameter necessary to guarantee timing constraints on real-time systems. The higher the worst-case execution time of tasks, the higher will be the resource demand for the associated system. The goal of this paper is to propose a different way to perform loop unrolling on data-dependent loops using code predication targeting WCET reduction, because existing techniques only consider loops with fixed execution counts. We also combine our technique with existing unrolling approaches. Results showed that this combination can produce aggressive WCET reductions when compared with the original code.
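
    The predication idea above can be sketched in a few lines: the data-dependent branch in a loop body is replaced by arithmetic on a 0/1 predicate, so every iteration executes the same instruction sequence regardless of the data, which is what makes unrolling analyzable for WCET. A toy example, not the authors' compiler transformation:

```python
# Sketch of code predication on a data-dependent loop.

def sum_positive_branching(xs):
    """Baseline: the branch makes per-iteration timing data-dependent."""
    total = 0
    for x in xs:
        if x > 0:
            total += x
    return total

def sum_positive_predicated(xs):
    """Predicated: every iteration does identical work; the predicate
    p is 0 or 1, so negative values simply contribute nothing."""
    total = 0
    for x in xs:
        p = int(x > 0)
        total += p * x
    return total
```

    The predicated form trades a slightly higher average cost for a flat, data-independent worst case, which is the quantity WCET analysis must bound.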

  9. Fast and Safe Concrete Code Execution for Reinforcing Static Analysis and Verification

    Directory of Open Access Journals (Sweden)

    M. Belyaev

    2015-01-01

    Full Text Available The problem of improving precision of static analysis and verification techniques for C is hard due to simplification assumptions these techniques make about the code model. We present a novel approach to improving precision by executing the code model in a controlled environment that captures program errors and contract violations in a memory and time efficient way. We implemented this approach as an executor module Tassadar as a part of bounded model checker Borealis. We tested Tassadar on two test sets, showing that its impact on performance of Borealis is minimal.The article is published in the authors’ wording.

  10. ARC Code TI: IPG Execution Service

    Data.gov (United States)

    National Aeronautics and Space Administration — The Execution Service allows users to submit, monitor, and cancel complex jobs. Each job consists of a set of tasks that perform actions such as executing...

  11. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    Directory of Open Access Journals (Sweden)

    Almeida Jonas S

    2006-03-01

    Full Text Available Abstract Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web

  12. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    Science.gov (United States)

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. 
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over
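
    The core mGrid idea in the two records above is that the user's function and its run-time variables are packed, shipped to workers, and load-balanced transparently behind a single call. A standard-library analogy in Python (mGrid itself is Matlab/PHP/Apache; here threads merely stand in for remote machines):

```python
# Sketch: a single call that packs (function, arguments) jobs and lets an
# executor balance them across workers, hiding where each job actually runs.
from concurrent.futures import ThreadPoolExecutor

def distributed_map(func, arg_tuples, workers=4):
    """Run func(*args) for every tuple in arg_tuples, concurrently.
    The caller never sees which worker handled which job."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda args: func(*args), arg_tuples))
```

    The design point this mirrors is the "single easy-to-use command" interface: all packing, dispatch, and balancing happen behind one map-style call.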

  13. Verification of the Korsar code on results of experiments executed on the PSB-VVER facility

    International Nuclear Information System (INIS)

    Roginskaya, V.L.; Pylev, S.S.; Elkin, I.V.

    2005-01-01

    Full text of publication follows: This paper presents some results of computational research executed within the framework of the verification of the KORSAR thermal-hydraulic code, which was developed at the A.P. Aleksandrov NITI (Russia). The general purpose of the work was the development of a nodalization scheme of the PSB-VVER integral facility, scheme testing, and computational modelling of the experiment 'The PSB-VVER Natural Circulation Test With Stepwise Reduction of the Primary Inventory'. The NC test was performed within the framework of the OECD PSB-VVER Project (task no. 3), which is focused on the provision of experimental data for code assessment with regard to VVER analysis. The paper presents a nodalization scheme of the PSB-VVER facility and the results of pre- and post-test calculations of the specified experiment, obtained with the KORSAR code. The experimental data and the KORSAR pre-test calculation results are in good agreement. A post-test calculation of the experiment with the KORSAR code was performed in order to assess the code's capability to simulate the phenomena relevant to the test. The code showed a reasonable prediction of the phenomena measured in the experiment. (authors)

  14. Verification of the Korsar code on results of experiments executed on the PSB-VVER facility

    Energy Technology Data Exchange (ETDEWEB)

    Roginskaya, V.L.; Pylev, S.S.; Elkin, I.V. [NSI RRC ' Kurchatov Institute' , Kurchatov Sq., 1, Moscow, 123182 (Russian Federation)

    2005-07-01

    Full text of publication follows: This paper presents some results of computational research executed within the framework of the verification of the KORSAR thermal-hydraulic code, which was developed at the A.P. Aleksandrov NITI (Russia). The general purpose of the work was the development of a nodalization scheme of the PSB-VVER integral facility, scheme testing, and computational modelling of the experiment 'The PSB-VVER Natural Circulation Test With Stepwise Reduction of the Primary Inventory'. The NC test was performed within the framework of the OECD PSB-VVER Project (task no. 3), which is focused on the provision of experimental data for code assessment with regard to VVER analysis. The paper presents a nodalization scheme of the PSB-VVER facility and the results of pre- and post-test calculations of the specified experiment, obtained with the KORSAR code. The experimental data and the KORSAR pre-test calculation results are in good agreement. A post-test calculation of the experiment with the KORSAR code was performed in order to assess the code's capability to simulate the phenomena relevant to the test. The code showed a reasonable prediction of the phenomena measured in the experiment. (authors)

  15. Self-assembled software and method of overriding software execution

    Science.gov (United States)

    Bouchard, Ann M.; Osbourn, Gordon C.

    2013-01-08

    A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.

  16. The role of the PIRT process in identifying code improvements and executing code development

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1997-01-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a low probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications

  17. The role of the PIRT process in identifying code improvements and executing code development

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G.E. [Idaho National Engineering Lab., Idaho Falls, ID (United States); Boyack, B.E. [Los Alamos National Lab., NM (United States)

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  18. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    Compiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD

  19. Numerical Analysis of Diaphragm Wall Model Executed in Poznań Clay Formation Applying Selected Fem Codes

    Directory of Open Access Journals (Sweden)

    Superczyńska M.

    2016-09-01

    Full Text Available The paper presents the results of numerical calculations of a diaphragm wall model executed in the Poznań clay formation. Two selected FEM codes were applied, Plaxis and Abaqus. A geological description of the Poznań clay formation in Poland, as well as the geotechnical conditions on a construction site in the Warsaw city area, is presented. The constitutive models of clay implemented in both Plaxis and Abaqus are discussed. The parameters of the Poznań clay constitutive models were assumed based on the authors' experimental tests. The results of the numerical analyses were compared taking into account the measured values of horizontal displacements.

  20. Execute-Only Attacks against Execute-Only Defenses

    Science.gov (United States)

    2016-02-18

    strongest implementations of execute-only defenses: it exploits novel hardware features to incorporate non-readable code to prevent direct information...build two proof-of-concept exploits that can achieve control flow hijacking on a system protected by full-featured Readactor. • We evaluate the...According to the CVE [31], 123 such arbitrary read vulnerabilities were reported between January and September of 2015, in Firefox (CVE-2015-4495

  1. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  2. Data processing with microcode designed with source coding

    Science.gov (United States)

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.
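
    A toy sketch of the step described above, assembling microcode source lines into digital instruction words. The mnemonic set and the 3-field word encoding are invented for illustration; the patent does not specify an encoding:

```python
# Hypothetical micro-assembler: each source line "OP rA rB" becomes a
# 16-bit word laid out as [4-bit opcode | 6-bit operand | 6-bit operand].
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3}

def assemble(source):
    """Assemble microcode source text into a list of instruction words."""
    words = []
    for line in source.strip().splitlines():
        op, *args = line.split()
        a = int(args[0].lstrip("r")) if len(args) > 0 else 0
        b = int(args[1].lstrip("r")) if len(args) > 1 else 0
        words.append((OPCODES[op] << 12) | (a << 6) | b)
    return words
```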

  3. Acceleration of a Monte Carlo radiation transport code

    International Nuclear Information System (INIS)

    Hochstedler, R.D.; Smith, L.M.

    1996-01-01

    Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. copyright 1996 American Institute of Physics

  4. Detecting Malicious Code by Binary File Checking

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2014-01-01

    Full Text Available The object, library and executable code is stored in binary files. The functionality of a binary file is altered when its content or program source code is changed, causing undesired effects. A direct content change is possible when the intruder knows the structural information of the binary file. The paper describes the structural properties of binary object files, how the content can be controlled by a possible intruder, and the ways to identify malicious code in such files. Because object files are inputs to linking processes, early detection of malicious content is crucial to avoid infection of the binary executable files.
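
    A minimal sketch of the checking idea above: verify the file's magic number and compare a known-good digest against the current content, so a direct content change is detected. The findings format is illustrative, and everything beyond the ELF magic bytes is omitted:

```python
# Sketch: detect tampering in a binary file via magic-number and hash checks.
import hashlib

ELF_MAGIC = b"\x7fELF"  # first four bytes of a well-formed ELF object

def fingerprint(path):
    """SHA-256 digest of the file content, recorded when the file is trusted."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def check_binary(path, known_digest):
    """Return a list of findings; an empty list means both checks pass."""
    with open(path, "rb") as f:
        data = f.read()
    findings = []
    if not data.startswith(ELF_MAGIC):
        findings.append("unexpected magic number (not an ELF object)")
    if hashlib.sha256(data).hexdigest() != known_digest:
        findings.append("content hash mismatch (possible tampering)")
    return findings
```

    Running the check on object files before they enter the link step matches the paper's point that early detection prevents the malicious content from reaching the final executable.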

  5. PREREM: an interactive data preprocessing code for INREM II. Part I: user's manual. Part II: code structure

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, M.T.; Fields, D.E.

    1981-05-01

    PREREM is an interactive computer code developed as a data preprocessor for the INREM-II (Killough, Dunning, and Pleasant, 1978a) internal dose program. PREREM is intended to provide easy access to current and self-consistent nuclear decay and radionuclide-specific metabolic data sets. Provision is made for revision of metabolic data, and the code is intended for both production and research applications. Documentation for the code is in two parts. Part I is a user's manual which emphasizes interpretation of program prompts and choice of user input. Part II stresses internal structure and flow of program control and is intended to assist the researcher who wishes to revise or modify the code or add to its capabilities. PREREM is written for execution on a Digital Equipment Corporation PDP-10 System and much of the code will require revision before it can be run on other machines. The source program length is 950 lines (116 blocks) and computer core required for execution is 212 K bytes. The user must also have sufficient file space for metabolic and S-factor data sets. Further, 64 100 K byte blocks of computer storage space are required for the nuclear decay data file. Computer storage space must also be available for any output files produced during the PREREM execution. 9 refs., 8 tabs.

  6. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed with each code. The results obtained from the logic/probabilistic analyses, as well as the computation times, are compared.

  7. Improving performance of single-path code through a time-predictable memory hierarchy

    DEFF Research Database (Denmark)

    Cilku, Bekim; Puffitsch, Wolfgang; Prokesch, Daniel

    2017-01-01

    time-predictable memory hierarchy with a prefetcher that exploits the predictability of execution traces in single-path code to speed up code execution. The new memory hierarchy reduces both the cache-miss penalty time and the cache-miss rate on the instruction cache. The benefit of the approach is demonstrated through

  8. SWAT4.0 - The integrated burnup code system driving continuous energy Monte Carlo codes MVP, MCNP and deterministic calculation code SRAC

    International Nuclear Information System (INIS)

    Kashima, Takao; Suyama, Kenya; Takada, Tomoyuki

    2015-03-01

    There have been two versions of SWAT, reflecting its development history: the revised SWAT, which uses the deterministic calculation code SRAC as a neutron transport solver, and SWAT3.1, which uses the continuous energy Monte Carlo code MVP or MCNP5 for the same purpose. It takes several hours, however, to execute one calculation with a continuous energy Monte Carlo code, even on the supercomputer of the Japan Atomic Energy Agency. Moreover, two-dimensional burnup calculation is not practical with the revised SWAT, because it has problems in producing effective cross section data and applying them to arbitrary fuel geometries when a calculation model has multiple burnup zones. Therefore, SWAT4.0 has been developed by adding to SWAT3.1 a function to utilize the deterministic code SRAC2006, which has a shorter calculation time, as an outer-module neutron transport solver for burnup calculation. SWAT4.0 executes two-dimensional burnup calculations by providing an input data template of SRAC2006 in the SWAT4.0 input data and updating the atomic number densities of the burnup zones in each burnup step. This report describes the outline, input data instructions, and calculation examples of SWAT4.0. (author)
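
The burnup stepping the abstract describes (re-run the transport solver, then update the atomic number densities of each burnup zone) can be sketched in a deliberately simplified form. The single-nuclide exponential depletion below is a hypothetical stand-in for SWAT4.0's actual depletion solver, not its real algorithm:

```python
import math

def burnup_step(n0, sigma_a, flux, dt):
    """One depletion step: exponential depletion of a single nuclide
    under constant one-group flux (a textbook simplification of what a
    burnup driver does between transport solutions)."""
    return n0 * math.exp(-sigma_a * flux * dt)

def run_burnup(n_start, sigma_a, flux, dt, steps):
    densities = [n_start]
    n = n_start
    for _ in range(steps):
        # In SWAT4.0 the transport solver (SRAC2006, MVP or MCNP) would
        # be re-executed here with the updated densities; this sketch
        # only performs the density update itself.
        n = burnup_step(n, sigma_a, flux, dt)
        densities.append(n)
    return densities
```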

  9. Computer code ANISN multiplying media and shielding calculation 2. Code description (input/output)

    International Nuclear Information System (INIS)

    Maiorino, J.R.

    1991-01-01

    The new code CCC-0514-ANISN/PC is described, as well as a ''GENERAL DESCRIPTION OF ANISN/PC code''. In addition to the ANISN/PC code, the transmittal package includes an interactive input generation programme called APE (ANISN Processor and Evaluator), which facilitates the user's preparation of input. Also included in the package is a 21-group photon cross-section master library, FLUNGP.LIB, in ISOTX format, which can be edited by an executable file LMOD.EXE. The input and output subroutines are reviewed. 6 refs, 1 fig., 1 tab

  10. Assessment of Recovery of Damages in the New Romanian Civil Code

    Directory of Open Access Journals (Sweden)

    Ion Țuțuianu

    2016-01-01

    Full Text Available The subject's approach is required also because, once the New Civil Code was adopted, it acquired a new juridical frame, but also a new perspective. A common law creditor who does not obtain the direct execution of his obligation is entitled to be compensated for the damage caused by the non-execution with an amount of money which is equivalent to the benefit that the exact, total, and duly execution of the obligation would have brought the creditor. Keywords: interest, damages, civil code, juridical responsibility

  11. Electronic Code of Federal Regulations

    Data.gov (United States)

    National Archives and Records Administration — The Electronic Code of Federal Regulations (e-CFR) is the codification of the general and permanent rules published in the Federal Register by the executive...

  12. Hide and Seek: Exploiting and Hardening Leakage-Resilient Code Randomization

    Science.gov (United States)

    2016-05-30

    HMACs generated using 128-bit AES encryption. We do not use AES encryption to generate HMACs due to its high overhead; the authors of CCFI report...execute-only permissions on memory accesses, (ii) code pointer hiding (e.g., indirection or encryption), and (iii) decoys (e.g., booby traps). Among...following techniques: they a) enforce execute-only permissions on code pages to mitigate direct information leakage, b) introduce an encryption or

  13. Parallelization of 2-D lattice Boltzmann codes

    International Nuclear Information System (INIS)

    Suzuki, Soichiro; Kaburaki, Hideo; Yokokawa, Mitsuo.

    1996-03-01

    Lattice Boltzmann (LB) codes to simulate two-dimensional fluid flow are developed on the vector parallel computer Fujitsu VPP500 and the scalar parallel computer Intel Paragon XP/S. While a 2-D domain decomposition method is used for the scalar parallel LB code, a 1-D domain decomposition method is used for the vector parallel LB code so that it can be vectorized along the axis perpendicular to the direction of the decomposition. A high parallel efficiency of 95.1% by the vector parallel calculation on 16 processors with a 1152x1152 grid and 88.6% by the scalar parallel calculation on 100 processors with an 800x800 grid are obtained. Performance models are developed to analyze the performance of the LB codes. It is shown by our performance models that the execution speed of the vector parallel code is about one hundred times faster than that of the scalar parallel code with the same number of processors, up to 100 processors. We also analyze the scalability while keeping the available memory size of one processor element at maximum. Our performance model predicts that the execution time of the vector parallel code increases by about 3% on 500 processors. Although the 1-D domain decomposition method has in general a drawback in the interprocessor communication, the vector parallel LB code is still suitable for large scale and/or high resolution simulations. (author)
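
The parallel-efficiency figures quoted above follow the standard definition E = T_serial / (p * T_parallel), i.e. speedup divided by processor count. A small sketch for checking such numbers (function names are illustrative, not from the paper):

```python
def speedup(t_serial, t_parallel):
    # Classic speedup: serial runtime over parallel runtime.
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, procs):
    # Parallel efficiency: speedup divided by processor count,
    # so 95.1% on 16 processors corresponds to a speedup of 15.216.
    return speedup(t_serial, t_parallel) / procs
```

For example, a run that is 15.216 times faster on 16 processors has efficiency 15.216 / 16 = 0.951, matching the vector parallel result reported above.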

  14. Parallelization of 2-D lattice Boltzmann codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Soichiro; Kaburaki, Hideo; Yokokawa, Mitsuo

    1996-03-01

    Lattice Boltzmann (LB) codes to simulate two dimensional fluid flow are developed on vector parallel computer Fujitsu VPP500 and scalar parallel computer Intel Paragon XP/S. While a 2-D domain decomposition method is used for the scalar parallel LB code, a 1-D domain decomposition method is used for the vector parallel LB code to be vectorized along with the axis perpendicular to the direction of the decomposition. High parallel efficiency of 95.1% by the vector parallel calculation on 16 processors with 1152x1152 grid and 88.6% by the scalar parallel calculation on 100 processors with 800x800 grid are obtained. The performance models are developed to analyze the performance of the LB codes. It is shown by our performance models that the execution speed of the vector parallel code is about one hundred times faster than that of the scalar parallel code with the same number of processors up to 100 processors. We also analyze the scalability in keeping the available memory size of one processor element at maximum. Our performance model predicts that the execution time of the vector parallel code increases about 3% on 500 processors. Although the 1-D domain decomposition method has in general a drawback in the interprocessor communication, the vector parallel LB code is still suitable for the large scale and/or high resolution simulations. (author).

  15. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's Manual, describing simulation procedures, input data preparation, output and example test cases

  16. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Full Text Available Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising, because executing reverse code can restore the previous states of a program without state saving. Two methods of generating reverse code can be found in the literature: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
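
The core idea of reverse-code generation, restoring a previous state by executing inverse instructions rather than by saving state, can be shown with a deliberately tiny sketch. The function names and the dictionary-based "machine state" are invented for illustration and are not taken from the paper:

```python
def make_forward_and_reverse(delta):
    """For the destructive update x += delta, the generated reverse code
    is simply x -= delta: the previous state is reconstructed without
    any state saving (the essence of reverse-code generation)."""
    def forward(state):
        state["x"] += delta
    def reverse(state):
        state["x"] -= delta
    return forward, reverse

state = {"x": 10}
fwd, rev = make_forward_and_reverse(3)
fwd(state)   # forward execution: x becomes 13
rev(state)   # reverse execution: x is restored to 10, nothing was saved
```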

  17. 12 CFR 1710.14 - Code of conduct and ethics.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Code of conduct and ethics. 1710.14 Section... Code of conduct and ethics. (a) General. An Enterprise shall establish and administer a written code of conduct and ethics that is reasonably designed to assure the ability of board members, executive officers...

  18. Code of Federal Regulations in XML

    Data.gov (United States)

    National Archives and Records Administration — The Code of Federal Regulations (CFR) is the codification of the general and permanent rules published in the Federal Register by the executive departments and...

  19. Lean and Efficient Software: Whole Program Optimization of Executables

    Science.gov (United States)

    2016-12-31

    Final Technical Report (Phase I - Base Period), 30-06-2014 - 31-12-2016. Lean and Efficient Software: Whole-Program Optimization of Executables. Evan Driscoll, Tom Johnson, GrammaTech, Inc., 531 Esty Street, Ithaca, NY 14850.

  20. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general P/sub N/ scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner that utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k/sub eff/, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes

  1. A Semantics for Distributed Execution of Statemate

    DEFF Research Database (Denmark)

    Fränzle, Martin; Niehaus, Jürgen; Metzner, Alexander

    2003-01-01

    We present a semantics for the statechart variant implemented in the Statemate product of i-Logix. Our semantics enables distributed code generation for Statemate models in the context of rapid prototyping for embedded control applications. We argue that it seems impossible to efficiently generate distributed code using the original Statemate semantics. The new, distributed semantics has the advantages that, first, it enables the generation of efficient distributed code, second, it preserves many aspects of the original semantics for those parts of a model that are not distributed, and third, the changes made regarding the interaction of distributed model parts are similar to the interaction between the model and its environment in the original semantics, thus giving designers a familiar execution model. The semantics has been implemented in Grace, a framework for rapid prototyping code generation...

  2. GridRun: A lightweight packaging and execution environment forcompact, multi-architecture binaries

    Energy Technology Data Exchange (ETDEWEB)

    Shalf, John; Goodale, Tom

    2004-02-01

    GridRun offers a very simple set of tools for creating and executing multi-platform binary executables. These ''fat-binaries'' archive native machine code into compact packages that are typically a fraction the size of the original binary images they store, enabling efficient staging of executables for heterogeneous parallel jobs. GridRun interoperates with existing distributed job launchers/managers like Condor and the Globus GRAM to greatly simplify the logic required launching native binary applications in distributed heterogeneous environments.
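
The idea of a "fat binary" that archives several native machine-code images and selects the right one at launch can be sketched as follows. The dictionary-based package format and the function name are assumptions for illustration, not GridRun's actual layout or API:

```python
import platform

def select_image(fat_package):
    """Pick the native machine-code image for the current architecture
    from a mapping of architecture names to images (a hypothetical
    stand-in for a fat-binary archive such as GridRun's)."""
    arch = platform.machine()
    try:
        return fat_package[arch]
    except KeyError:
        # On a heterogeneous grid, a missing image means this node
        # cannot run the job natively.
        raise RuntimeError(f"no image for architecture {arch!r}")
```

A launcher on each node would call `select_image` once and then execute the returned image, so a single staged package serves every architecture in a heterogeneous parallel job.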

  3. A solution for automatic parallelization of sequential assembly code

    Directory of Open Access Journals (Sweden)

    Kovačević Đorđe

    2013-01-01

    Full Text Available Since modern multicore processors can execute existing sequential programs only on a single core, there is a strong need for automatic parallelization of program code. Relying on existing algorithms, this paper describes a new software tool for parallelization of sequential assembly code. The main goal of this paper is to develop a parallelizator which reads sequential assembler code and at the output provides parallelized code for a MIPS processor with multiple cores. The idea is the following: the parser translates the assembler input file into program objects suitable for further processing. After that, static single assignment is carried out. Based on the data flow graph, the parallelization algorithm distributes instructions onto different cores. Once the sequential code is parallelized by the parallelization algorithm, registers are allocated with the linear allocation algorithm, and the final result is distributed assembler code for each of the cores. In the paper we evaluate the speedup of a matrix multiplication example, which was processed by the parallelizator of assembly code. The result is an almost linear speedup of code execution, which increases with the number of cores. The speedup on two cores is 1.99, while on 16 cores the speedup is 13.88.
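
The central step described above, distributing instructions onto cores based on the data flow graph, might be sketched as a greedy level-by-level scheduler. This is a simplification of the paper's algorithm, and all names below are illustrative:

```python
from collections import defaultdict

def schedule(instructions, deps, cores):
    """Greedy list scheduling sketch: instructions whose data-flow
    dependencies are all satisfied form one level, and each level is
    spread round-robin across the cores."""
    done = set()
    assignment = defaultdict(list)
    remaining = list(instructions)
    while remaining:
        # Ready set: every dependency already executed on some core.
        ready = [i for i in remaining if deps.get(i, set()) <= done]
        for n, instr in enumerate(ready):
            assignment[n % cores].append(instr)
        done.update(ready)
        remaining = [i for i in remaining if i not in done]
    return dict(assignment)
```

For instance, with instructions `a`, `b` independent and `c` depending on both, two cores execute `a` and `b` in parallel and `c` afterwards.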

  4. Writing executable assertions to test flight software

    Science.gov (United States)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    An executable assertion is a logical statement about the variables or a block of code. If there is no error during execution, the assertion statement results in a true value. Executable assertions can be used for dynamic testing of software. They can be employed for validation during the design phase, and for exception handling and error detection during the operation phase. The present investigation is concerned with the problem of writing executable assertions, taking into account the use of assertions for testing flight software. The digital flight control system and the flight control software are discussed. The considered system provides autopilot and flight director modes of operation for automatic and manual control of the aircraft during all phases of flight. Attention is given to techniques for writing and using assertions to test flight software, an experimental setup to test flight software, and language features to support efficient use of assertions.
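
An executable assertion of the kind described can be sketched as a range check on a computed value: if execution is error-free the statement is true, otherwise it flags the fault at runtime. The flight-control flavour and all names here are illustrative, not taken from the paper:

```python
def altitude_hold(current_alt, target_alt, gain):
    command = gain * (target_alt - current_alt)
    # Executable assertion: a logical statement about the variables
    # that must evaluate to true if execution is error-free. Here it
    # doubles as an operation-phase error detector for a runaway command.
    assert -1000.0 <= command <= 1000.0, "command out of range"
    return command
```

During design-phase validation the assertion documents the expected envelope; during operation it detects exceptions such as a corrupted gain or target.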

  5. Tandem Mirror Reactor Systems Code (Version I)

    International Nuclear Information System (INIS)

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.

    1985-09-01

    A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronic analysis into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost

  6. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  7. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  8. The interactive roles of parenting, emotion regulation and executive functioning in moral reasoning during middle childhood.

    Science.gov (United States)

    Hinnant, J Benjamin; Nelson, Jackie A; O'Brien, Marion; Keane, Susan P; Calkins, Susan D

    2013-01-01

    We examined mother-child co-operative behaviour, children's emotion regulation and executive function, as well as combinations of these factors, as predictors of moral reasoning in 89 10-year-old children. Dyadic co-operation was coded from videotaped observations of laboratory puzzle and speech tasks. Emotion regulation was derived from maternal report, and executive functioning was assessed with the Tower of London task. Moral reasoning was coded during mother-child conversations about morally ambiguous, peer-conflict situations. Two significant interactions indicated that children from more co-operative dyads who also had higher executive function skills had higher moral reasoning scores than other children, and children lower in both emotion regulation and executive function had lower moral reasoning scores than other children. The results contribute to the literature on the multiple and interactive levels of influence on moral reasoning in childhood.

  9. The Interactive Roles of Parenting, Emotion Regulation and Executive Functioning in Moral Reasoning during Middle Childhood

    Science.gov (United States)

    Hinnant, J. Benjamin; Nelson, Jackie A.; O’Brien, Marion; Keane, Susan P.; Calkins, Susan D.

    2013-01-01

    We examined mother-child cooperative behavior, children’s emotion regulation and executive function, as well as combinations of these factors, as predictors of moral reasoning in 89 10-year-old children. Dyadic cooperation was coded from videotaped observations of laboratory puzzle and speech tasks. Emotion regulation was derived from maternal report, and executive functioning was assessed with the Tower of London task. Moral reasoning was coded during mother-child conversations about morally ambiguous, peer-conflict situations. Two significant interactions indicated that children from more cooperative dyads who also had higher executive function skills had higher moral reasoning scores than other children, and children lower in both emotion regulation and executive function had lower moral reasoning scores than other children. The results contribute to the literature on the multiple and interactive levels of influence on moral reasoning in childhood. PMID:23650955

  10. An Efficient Platform for the Automatic Extraction of Patterns in Native Code

    Directory of Open Access Journals (Sweden)

    Javier Escalada

    2017-01-01

    Full Text Available Different software tools, such as decompilers, code quality analyzers, recognizers of packed executable files, authorship analyzers, and malware detectors, search for patterns in binary code. The use of machine learning algorithms, trained with programs taken from the huge number of applications in the existing open source code repositories, allows finding patterns not detected with the manual approach. To this end, we have created a versatile platform for the automatic extraction of patterns from native code, capable of processing big binary files. Its implementation has been parallelized, providing important runtime performance benefits for multicore architectures. Compared to single-processor execution, the average performance improvement obtained with the best configuration is a factor of 3.5, against the maximum theoretical gain of a factor of 4.

  11. Clean Code - Why you should care

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    - Martin Fowler Writing code is communication, not solely with the computer that executes it, but also with other developers and with oneself. A developer spends a lot of his working time reading and understanding code that was written by other developers or by himself in the past. The readability of the code is an important factor in the time needed to find a bug or to add new functionality, which in turn has a big impact on productivity. Code that is difficult to understand, hard to maintain and refactor, and that offers many spots for bugs to hide is not considered to be "clean code". But what can be considered "clean code", and what are the advantages of a strict application of its guidelines? In this presentation we will take a look at some typical "code smells" and proposed guidelines to improve your coding skills and write cleaner code that is less bug-prone and easier to maintain.
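
A minimal illustration of a typical "code smell" (cryptic names and a magic number) next to a cleaner rewrite of the same logic; the example is generic, not taken from the talk:

```python
# A typical "code smell": cryptic names and an unexplained magic number.
def f(l):
    r = []
    for x in l:
        if x > 18:
            r.append(x)
    return r

# The same behaviour as "clean code": an intention-revealing name and a
# named constant make the purpose readable without any extra context.
LEGAL_ADULT_AGE = 18

def adults_only(ages):
    return [age for age in ages if age > LEGAL_ADULT_AGE]
```

Both functions compute the same result; the second communicates its intent to the next reader, which is the point of the guidelines discussed.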

  12. Parallelization of the MAAP-A code neutronics/thermal hydraulics coupling

    International Nuclear Information System (INIS)

    Froehle, P.H.; Wei, T.Y.C.; Weber, D.P.; Henry, R.E.

    1998-01-01

    A major new feature, one-dimensional space-time kinetics, has been added to a developmental version of the MAAP code through the introduction of the DIF3D-K module. This code is referred to as MAAP-A. To reduce the overall job time required, a capability has been provided to run the MAAP-A code in parallel. The parallel version of MAAP-A utilizes two machines running in parallel, with the DIF3D-K module executing on one machine and the rest of the MAAP-A code executing on the other machine. Timing results obtained during the development of the capability indicate that reductions in time of 30--40% are possible. The parallel version can be run on two SPARC 20 (SUN OS 5.5) workstations connected through the ethernet. MPI (Message Passing Interface standard) needs to be implemented on the machines. If necessary the parallel version can also be run on only one machine. The results obtained running in this one-machine mode identically match the results obtained from the serial version of the code
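
The two-machine coupling described above, with the kinetics module on one machine answering the rest of the code each step, can be mimicked with message passing between two workers. The sketch below uses threads and queues as a stand-in for MPI between two machines, and the doubling "kinetics response" is purely illustrative:

```python
import queue
import threading

def kinetics_module(inbox, outbox, steps):
    # Stand-in for DIF3D-K on the second machine: receive the
    # thermal-hydraulic state each step, send back a power level.
    for _ in range(steps):
        state = inbox.get()
        outbox.put(2.0 * state)  # hypothetical kinetics response

def run_coupled(steps):
    to_kinetics, from_kinetics = queue.Queue(), queue.Queue()
    worker = threading.Thread(target=kinetics_module,
                              args=(to_kinetics, from_kinetics, steps))
    worker.start()
    powers, state = [], 1.0
    for _ in range(steps):
        to_kinetics.put(state)              # MAAP side ships its state ...
        powers.append(from_kinetics.get())  # ... and waits for the answer
        state += 0.5
    worker.join()
    return powers
```

Because the two sides exchange one message pair per time step, the slower module sets the pace, which is why offloading the kinetics to a second machine can cut overall job time.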

  13. Utilization of KENO-IV computer code with HANSEN-ROACH library

    International Nuclear Information System (INIS)

    Lima Barros, M. de; Vellozo, S.O.

    1982-01-01

    Several analyses with the KENO-IV computer code, which is based on the Monte Carlo method, and the HANSEN-ROACH cross section library were carried out, aiming to present the most convenient way to execute criticality calculations with this computer code and these cross sections. (E.G.) [pt

  14. Parallelization of MCNP4 code by using simple FORTRAN algorithms

    International Nuclear Information System (INIS)

    Yazid, P.I.; Takano, Makoto; Masukawa, Fumihiro; Naito, Yoshitaka.

    1993-12-01

    Simple FORTRAN algorithms, that rely only on open, close, read and write statements, together with disk files and some UNIX commands have been applied to parallelization of MCNP4. The code, named MCNPNFS, maintains almost all capabilities of MCNP4 in solving shielding problems. It is able to perform parallel computing on a set of any UNIX workstations connected by a network, regardless of the heterogeneity in hardware system, provided that all processors produce a binary file in the same format. Further, it is confirmed that MCNPNFS can be executed also on Monte-4 vector-parallel computer. MCNPNFS has been tested intensively by executing 5 photon-neutron benchmark problems, a spent fuel cask problem and 17 sample problems included in the original code package of MCNP4. Three different workstations, connected by a network, have been used to execute MCNPNFS in parallel. By measuring CPU time, the parallel efficiency is determined to be 58% to 99% and 86% in average. On Monte-4, MCNPNFS has been executed using 4 processors concurrently and has achieved the parallel efficiency of 79% in average. (author)
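
The communication style described, workers that interact only through open, close, read and write on shared files, can be sketched as follows. The task/result file layout and all names are assumptions for illustration, not MCNPNFS's actual format:

```python
def worker(task_file, result_file):
    """A 'worker node' in the file-based spirit of MCNPNFS: read its
    batch of histories from a plain file, write a partial tally back."""
    with open(task_file) as f:
        histories = [float(line) for line in f]
    with open(result_file, "w") as f:
        f.write(str(sum(histories)))

def combine(result_files):
    # The master merges the partial tallies written by each worker;
    # no sockets or message-passing library are needed, only files.
    total = 0.0
    for path in result_files:
        with open(path) as f:
            total += float(f.read())
    return total
```

Because every processor only needs to produce a file in an agreed format, a heterogeneous set of workstations sharing a network filesystem can cooperate, which matches the portability claim in the abstract.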

  15. Executive and language control in the multilingual brain.

    Science.gov (United States)

    Kong, Anthony Pak-Hin; Abutalebi, Jubin; Lam, Karen Sze-Yan; Weekes, Brendan

    2014-01-01

    Neuroimaging studies suggest that the neural network involved in language control may not be specific to bi-/multilingualism but is part of a domain-general executive control system. We report a trilingual case of a Cantonese (L1), English (L2), and Mandarin (L3) speaker, Dr. T, who sustained a brain injury at the age of 77 causing lesions in the left frontal lobe and in the left temporo-parietal areas, resulting in fluent aphasia. Dr. T's executive functions were impaired according to a modified version of the Stroop color-word test, and her Wisconsin Card Sorting Test performance was characterized by frequent perseveration errors. Dr. T demonstrated pathological language switching and mixing across her three languages. Code switching in Cantonese was more prominent in discourse production than in confrontation naming. Our case suggests that voluntary control of spoken word production in trilingual speakers shares neural substrata in the frontobasal ganglia system with domain-general executive control mechanisms. One prediction is that lesions to such a system would give rise to both pathological switching and impairments of executive functions in trilingual speakers.

  16. Reactor Systems Technology Division code development and configuration/quality control procedures

    International Nuclear Information System (INIS)

    Johnson, E.C.

    1985-06-01

    Procedures are prescribed for executing a code development task and implementing the resulting coding in an official version of a computer code. The responsibilities of the project manager, development staff members, and the Code Configuration/Quality Control Group are defined. Examples of forms, logs, computer job control language, and suggested outlines for reports associated with software production and implementation are included in Appendix A. 1 ref., 2 figs

  17. The procedure execution manager and its application to Advanced Photon Source operation

    International Nuclear Information System (INIS)

    Borland, M.

    1997-01-01

    The Procedure Execution Manager (PEM) combines a complete scripting environment for coding accelerator operation procedures with a manager application for executing and monitoring the procedures. PEM is based on Tcl/Tk, a supporting widget library, and the dp-tcl extension for distributed processing. The scripting environment provides support for distributed, parallel execution of procedures along with join and abort operations. Nesting of procedures is supported, permitting the same code to run as a top-level procedure under operator control or as a subroutine under control of another procedure. The manager application allows an operator to execute one or more procedures in automatic, semi-automatic, or manual modes. It also provides a standard way for operators to interact with procedures. A number of successful applications of PEM to accelerator operations have been made to date. These include start-up, shutdown, and other control of the positron accumulator ring (PAR), low-energy transport (LET) lines, and the booster rf systems. The PAR/LET procedures make nested use of PEM's ability to run parallel procedures. There are also a number of procedures to guide and assist tune-up operations, to make accelerator physics measurements, and to diagnose equipment. Because of the success of the existing procedures, expanded use of PEM is planned
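
A PEM-like manager that runs procedures in parallel and supports join and abort operations might look like the minimal sketch below. PEM itself is implemented in Tcl/Tk with dp-tcl; this Python rendering and all its names are illustrative only:

```python
import threading

class ProcedureManager:
    """Minimal sketch of PEM-style execution: each procedure runs in its
    own thread, 'join' waits for all of them, and an abort flag is
    checked before a procedure starts (illustrative, not PEM's API)."""
    def __init__(self):
        self.abort_event = threading.Event()
        self.results = {}
        self._threads = []

    def execute(self, name, procedure):
        def runner():
            # A real manager would also poll the abort flag inside
            # long-running procedures; here we only check at start.
            if not self.abort_event.is_set():
                self.results[name] = procedure()
        t = threading.Thread(target=runner)
        self._threads.append(t)
        t.start()

    def join(self):
        for t in self._threads:
            t.join()
        return self.results
```

Nesting falls out naturally: a procedure passed to `execute` can itself create a manager and launch sub-procedures, mirroring PEM's ability to run the same code as a top-level procedure or as a subroutine.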

  18. Ffuzz: Towards full system high coverage fuzz testing on binary executables.

    Directory of Open Access Journals (Sweden)

    Bin Zhang

    Full Text Available Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas in binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug-finding tool, Ffuzz, on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both the user space and the kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in both fuzz testing and symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.
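
The hybrid loop described, coverage-guided fuzzing that falls back on a solver when it gets stuck, can be modelled in miniature. Here `program` and `solver` are toy stand-ins (a callable returning the branch ids an input exercises, and a hypothetical symbolic-execution back end), not Ffuzz's actual interfaces:

```python
import random

def hybrid_fuzz(program, seed, solver, rounds=50):
    """Toy hybrid bug-finding loop: random byte mutation (the fuzzing
    stage) keeps inputs that discover new branches; when coverage stops
    growing, a 'solver' crafts an input for an unreached branch, in the
    spirit of selective symbolic execution."""
    rng = random.Random(0)          # deterministic for reproducibility
    corpus = [seed]
    covered = set(program(seed))
    for _ in range(rounds):
        data = bytearray(rng.choice(corpus))
        data[rng.randrange(len(data))] = rng.randrange(256)
        new = program(bytes(data)) - covered
        if new:                     # keep inputs that add coverage
            corpus.append(bytes(data))
            covered |= new
    solved = solver(covered)        # fuzzer stuck: ask the solver
    if solved is not None:
        covered |= program(solved)
    return covered
```

A magic-value comparison (e.g. a 4-byte header check) is exactly the kind of branch random mutation rarely hits but a solver reaches immediately, which is why combining the two stages raises coverage.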

  19. Ffuzz: Towards full system high coverage fuzz testing on binary executables.

    Science.gov (United States)

    Zhang, Bin; Ye, Jiaxi; Bi, Xing; Feng, Chao; Tang, Chaojing

    2018-01-01

    Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas in binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug-finding tool, Ffuzz, on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both the user space and the kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in both fuzz testing and symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.

  20. Directed Hidden-Code Extractor for Environment-Sensitive Malwares

    Science.gov (United States)

    Jia, Chunfu; Wang, Zhi; Lu, Kai; Liu, Xinhai; Liu, Xin

    Malware writers often use packing techniques to hide the malicious payload. A number of dynamic unpacking tools have been designed to identify and extract the hidden code in packed malware. However, such unpacking methods are all based on a highly controlled environment that is vulnerable to various anti-unpacking techniques. If the execution environment is suspicious, malware may stay inactive for a long time or stop execution immediately to evade detection. In this paper, we propose a novel approach that automatically reasons about the environment requirements imposed by malware, then directs an unpacking tool to change the controlled environment so as to extract the hidden code in the new environment. The experimental results show that our approach significantly increases the resilience of traditional unpacking tools to environment-sensitive malware.
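    The approach above reasons about environment checks; a much simpler, widely used complementary heuristic for flagging a packed payload in the first place (not this paper's method) is the byte entropy of a section:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte, ranging from 0.0 to 8.0."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def looks_packed(section: bytes, threshold: float = 7.2) -> bool:
    """Compressed or encrypted payloads approach 8 bits/byte; the threshold
    here is a common rule of thumb, not a universal constant."""
    return shannon_entropy(section) > threshold
```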

  1. Experience with Remote Job Execution

    International Nuclear Information System (INIS)

    Lynch, Vickie E.; Cobb, John W; Green, Mark L.; Kohl, James Arthur; Miller, Stephen D.; Ren, Shelly; Smith, Bradford C.; Vazhkudai, Sudharshan S.

    2008-01-01

    The Neutron Science Portal at Oak Ridge National Laboratory submits jobs to the TeraGrid for remote job execution. The TeraGrid is a network of high performance computers supported by the US National Science Foundation. There are eleven partner facilities with over a petaflop of peak computing performance and sixty petabytes of long-term storage. Globus is installed on a local machine and used for job submission. The graphical user interface is produced by Java code that reads an XML file. After submission, the status of the job is displayed in a Job Information Service window, which queries Globus for the status. The output folder produced in the scratch directory of the TeraGrid machine is returned to the portal with the globus-url-copy command, which uses the GridFTP servers on the TeraGrid machines. This folder is copied from the stage-in directory of the community account to the user's results directory, where the output can be plotted using the portal's visualization services. The primary problem with remote job execution is diagnosing execution problems. We run daily tests that submit multiple remote jobs from the portal. When these jobs fail on a computer, it is difficult to diagnose the problem from the Globus output. Successes and problems will be presented.

  2. Users' manual for the FTDRAW (Fault Tree Draw) code

    International Nuclear Information System (INIS)

    Oikawa, Tetsukuni; Hikawa, Michihiro; Tanabe, Syuichi; Nakamura, Norihiro

    1985-02-01

    This report provides the information needed to use the FTDRAW (Fault Tree Draw) code, which is designed for drawing fault trees. The FTDRAW code has several optional functions, such as overview output of a fault tree, fault tree output with English descriptions, fault tree output with Japanese descriptions, and summary tree output. Inputs for the FTDRAW code are the component failure rate information and gate information written out by an execution of the FTA-J (Fault Tree Analysis-JAERI) code system, plus option control data. Using the FTDRAW code, we can efficiently obtain fault tree drawings that are easy to read. (author)
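    FTDRAW itself draws trees produced by FTA-J; as a sketch of the AND/OR gate structure such tools operate on (the gate and event names below are invented, not FTDRAW's input format):

```python
# Minimal AND/OR fault tree evaluation; a gate fails when its logic over
# child failures is satisfied.
def evaluate(node, failed):
    """Return True if `node` fails, given the set of failed basic events."""
    kind = node[0]
    if kind == 'event':
        return node[1] in failed
    results = [evaluate(child, failed) for child in node[2]]
    return all(results) if kind == 'and' else any(results)

# Top event fails if the pump fails AND either power supply fails.
tree = ('and', 'TOP', [
    ('event', 'PUMP'),
    ('or', 'POWER', [('event', 'PSU-A'), ('event', 'PSU-B')]),
])
```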

  3. Vectorization and parallelization of a production reactor assembly code

    International Nuclear Information System (INIS)

    Vujic, J.L.; Martin, W.R.; Michigan Univ., Ann Arbor, MI

    1991-01-01

    In order to use the new features of supercomputers efficiently, production codes, usually written 10-20 years ago, must be tailored for modern computer architectures. We have chosen to optimize the CPM-2 code, a production reactor assembly code based on the collision probability transport method. Substantial speedup in the execution times was obtained with the parallel/vector version of the CPM-2 code. In addition, we have developed a new transfer probability method, which removes some of the modelling limitations of the collision probability method encoded in the CPM-2 code, and can fully utilize the parallel/vector architecture of a multiprocessor IBM 3090. (author)

  4. Vectorization and parallelization of a production reactor assembly code

    International Nuclear Information System (INIS)

    Vujic, J.L.; Martin, W.R.

    1991-01-01

    In order to efficiently use the new features of supercomputers, production codes, usually written 10-20 years ago, must be tailored for modern computer architectures. We have chosen to optimize the CPM-2 code, a production reactor assembly code based on the collision probability transport method. Substantial speedups in the execution times were obtained with the parallel/vector version of the CPM-2 code. In addition, we have developed a new transfer probability method, which removes some of the modelling limitations of the collision probability method encoded in the CPM-2 code, and can fully utilize the parallel/vector architecture of a multiprocessor IBM 3090. (author)
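    The loop-to-vector transformation described above can be illustrated generically; CPM-2 is a Fortran production code, so this numpy sketch only shows the idea of collapsing an explicit collision-probability summation loop into a single matrix-vector product:

```python
import numpy as np

def collision_sum_loop(p, phi):
    """Scalar loops: s[i] = sum over j of p[i, j] * phi[j]."""
    n = p.shape[0]
    s = np.zeros(n)
    for i in range(n):
        for j in range(n):
            s[i] += p[i, j] * phi[j]
    return s

def collision_sum_vector(p, phi):
    """Vectorized equivalent: one matrix-vector product the hardware can pipeline."""
    return p @ phi
```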

  5. Executive and Language Control in the Multilingual Brain

    Directory of Open Access Journals (Sweden)

    Anthony Pak-Hin Kong

    2014-01-01

    Full Text Available Neuroimaging studies suggest that the neural network involved in language control may not be specific to bi-/multilingualism but is part of a domain-general executive control system. We report a trilingual case of a Cantonese (L1), English (L2), and Mandarin (L3) speaker, Dr. T, who sustained a brain injury at the age of 77 causing lesions in the left frontal lobe and in the left temporo-parietal areas, resulting in fluent aphasia. Dr. T's executive functions were impaired according to a modified version of the Stroop color-word test, and her Wisconsin Card Sorting Test performance was characterized by frequent perseveration errors. Dr. T demonstrated pathological language switching and mixing across her three languages. Code switching in Cantonese was more prominent in discourse production than in confrontation naming. Our case suggests that voluntary control of spoken word production in trilingual speakers shares neural substrata in the frontobasal ganglia system with domain-general executive control mechanisms. One prediction is that lesions to such a system would give rise to both pathological switching and impairments of executive functions in trilingual speakers.

  6. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Full Text Available Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code, just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system.

  7. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code, just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)
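    The DOALL criterion means no iteration reads or writes data that another iteration writes, so iterations may run in any order; a minimal sketch (Python threads stand in for the native parallel code a JIT would emit):

```python
from concurrent.futures import ThreadPoolExecutor

def body(i):
    """Loop body with no loop-carried dependence: each iteration is independent."""
    return i * i

def run_sequential(n):
    """Original sequential loop."""
    return [body(i) for i in range(n)]

def run_doall(n, workers=4):
    """DOALL execution: iterations dispatched to a pool; map preserves order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(body, range(n)))
```

    Because the iterations are independent, the parallel and sequential versions must produce identical results, which is exactly the legality condition such a methodology checks before transforming a loop.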

  8. Simty: generalized SIMT execution on RISC-V

    OpenAIRE

    Collange , Sylvain

    2017-01-01

    We present Simty, a massively multi-threaded RISC-V processor core that acts as a proof of concept for dynamic inter-thread vectorization at the micro-architecture level. Simty runs groups of scalar threads executing SPMD code in lockstep, and assembles SIMD instructions dynamically across threads. Unlike existing SIMD or SIMT processors like GPUs or vector processors, Simty vectorizes scalar general-purpose binaries. It does not involve any instruction set extension...

  9. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system

  10. Executive Functions: Influence of Sex, Age and Its Relationship With Intelligence

    Directory of Open Access Journals (Sweden)

    Larissa de Oliveira e Ferreira

    2015-12-01

    Full Text Available The Tower of Hanoi is a tool used to evaluate executive functions. However, few studies describe which functions are evaluated by this test. This study investigates the executive functions evaluated by the Tower of Hanoi (ToH), and the influence of gender and age, and its relationship with intelligence. We evaluated 55 children and adolescents, between the ages of ten and 16, without diagnosed neuropsychiatric disorders. The results showed that performance and time to complete the Tower of Hanoi have no discriminative power when comparing age groups and sexes; there was also no significant correlation between the ToH and the execution quotient of the Wechsler Intelligence Scale for Children - Third Edition (WISC-III), perceptual organization, or processing speed. Only the coding subtest was positively related to the ToH, demonstrating that these instruments may be measuring related aspects of intelligence and executive functions, namely intelligence and working memory.
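    The planning demand of the task comes from its minimal solution of 2^n - 1 moves, which the classic recursion makes explicit:

```python
def hanoi(n, src='A', aux='B', dst='C', moves=None):
    """Return the optimal move list for n disks: move n-1 disks aside,
    move the largest disk, then move the n-1 disks on top of it."""
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, src, dst, aux, moves)
    moves.append((src, dst))
    hanoi(n - 1, aux, src, dst, moves)
    return moves
```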

  11. A general purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.; Rochester Univ., NY

    1984-01-01

    A general-purpose computer code MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility, the code is organized like a general-purpose computer, operating on a vector describing the time-dependent state of the system under simulation. The instruction set of the 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations. (orig.)
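    The organization described, a user-defined instruction set operating on a time-dependent state vector, can be sketched as follows (the operations and the decay example are invented for illustration, not MONTHY's actual instruction set):

```python
import random

def simulate(state, program, steps, seed=0):
    """Apply a user-defined 'program' (a list of operations on the state
    vector) repeatedly, in the spirit of MONTHY's organization."""
    rng = random.Random(seed)
    state = list(state)
    for _ in range(steps):
        for op in program:
            state = op(state, rng)
    return state

def decay(state, rng):
    """Example Monte Carlo operation: each particle survives a step with
    probability 0.9 (an invented op, purely illustrative)."""
    return [x for x in state if rng.random() < 0.9]
```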

  12. Mining dynamic noteworthy functions in software execution sequences.

    Science.gov (United States)

    Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of the software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving software quality and its attack-defending ability. Most analysis and evaluation of important entities, such as code-based static structure analysis, is performed without the actual software running. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, through software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. Then these traces were modeled as execution sequences and simplified so as to get simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. The experimental results were compared with those of two traditional complex network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
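    The paper's inner-importance and inter-importance indicators are not reproduced here; as a simple stand-in, ranking functions by how often and in how many traces they occur illustrates the final sorting step of such a pipeline:

```python
from collections import Counter

def rank_functions(traces):
    """Rank functions by total call frequency weighted by the number of
    traces they appear in (a frequency stand-in, not the DNFM scoring)."""
    freq = Counter()      # total occurrences across all traces
    presence = Counter()  # number of traces containing the function
    for trace in traces:
        freq.update(trace)
        presence.update(set(trace))
    score = {f: freq[f] * presence[f] for f in freq}
    return sorted(score, key=lambda f: (-score[f], f))
```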

  13. The Puzzle of Processing Speed, Memory and Executive Function Impairments in Schizophrenia: Fitting the Pieces Together

    Science.gov (United States)

    Knowles, Emma E. M.; Weiser, Mark; David, Anthony S.; Glahn, David; Davidson, Michael; Reichenberg, Abraham

    2015-01-01

    Background: Substantial impairment in digit-symbol substitution task performance in schizophrenia is well established, which has been widely interpreted as denoting a specific impairment in processing-speed ability. However, other higher-order cognitive functions might be more critical to performance on this task. To date, this has not been rigorously investigated in schizophrenia. Methods: One hundred twenty-five schizophrenia cases and 272 controls completed neuropsychological measures of processing speed, memory and executive functioning. We implemented a series of confirmatory factor and structural regression models in order to build an integrated model of processing speed, memory and executive function with which to deconstruct the digit-symbol substitution task and characterize discrepancies between cases and controls. Results: The overall structure of the processing speed, memory and executive function model was the same across groups (χ² = 208.86, p > .05), but the contribution of the specific cognitive domains to coding task performance differed significantly. When completing the task, controls relied on executive function and, indirectly, on working memory ability, while schizophrenia cases utilized an alternative set of cognitive operations whereby they relied on the same processes required to complete verbal fluency tasks. Conclusions: Successful coding task performance is predominantly reliant on executive function, rather than processing-speed or memory abilities. Schizophrenia patients perform poorly on this task due to an apparent lack of appropriate executive function input; they rely instead on an alternative cognitive pathway. PMID:25863361

  14. Vectorization of three-dimensional neutron diffusion code CITATION

    International Nuclear Information System (INIS)

    Harada, Hiroo; Ishiguro, Misako

    1985-01-01

    Three-dimensional multi-group neutron diffusion code CITATION has been widely used for reactor criticality calculations. The code is expected to be run at a high speed by using recent vector supercomputers, when it is appropriately vectorized. In this paper, vectorization methods and their effects are described for the CITATION code. Especially, calculation algorithms suited for vectorization of the inner-outer iterative calculations which spend most of the computing time are discussed. The SLOR method, which is used in the original CITATION code, and the SOR method, which is adopted in the revised code, are vectorized by odd-even mesh ordering. The vectorized CITATION code is executed on the FACOM VP-100 and VP-200 computers, and is found to run over six times faster than the original code for a practical-scale problem. The initial value of the relaxation factor and the number of inner-iterations given as input data are also investigated since the computing time depends on these values. (author)
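    Odd-even (red-black) ordering vectorizes Gauss-Seidel-type sweeps because points of one color only read neighbors of the other color, so each half-sweep becomes a single vector operation; a 1-D Laplace sketch of an SOR sweep (illustrating the ordering idea, not the CITATION code itself):

```python
import numpy as np

def sor_red_black(u, omega=1.5, sweeps=200):
    """Solve u[i] = (u[i-1] + u[i+1]) / 2 on interior points by SOR with
    odd-even ordering; boundary values u[0] and u[-1] are held fixed."""
    u = u.copy()
    for _ in range(sweeps):
        for start in (1, 2):                  # odd points, then even points
            j = np.arange(start, len(u) - 1, 2)
            gs = 0.5 * (u[j - 1] + u[j + 1])  # one vector op per color
            u[j] += omega * (gs - u[j])
    return u
```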

  15. Neutron star evolutions using tabulated equations of state with a new execution model

    Science.gov (United States)

    Anderson, Matthew; Kaiser, Hartmut; Neilsen, David; Sterling, Thomas

    2012-03-01

    The addition of nuclear and neutrino physics to general relativistic fluid codes allows for a more realistic description of hot nuclear matter in neutron star and black hole systems. This additional microphysics requires that each processor have access to large tables of data, such as equations of state, and in large simulations the memory required to store these tables locally can become excessive unless an alternative execution model is used. In this talk we present neutron star evolution results obtained using a message driven multi-threaded execution model known as ParalleX as an alternative to using a hybrid MPI-OpenMP approach. ParalleX provides the user a new way of computation based on message-driven flow control coordinated by lightweight synchronization elements which improves scalability and simplifies code development. We present the spectrum of radial pulsation frequencies for a neutron star with the Shen equation of state using the ParalleX execution model. We present performance results for an open source, distributed, nonblocking ParalleX-based tabulated equation of state component capable of handling tables that may even be too large to read into the memory of a single node.

  16. Comparison of beam deposition for three neutral beam injection codes

    International Nuclear Information System (INIS)

    Wieland, R.M.; Houlberg, W.A.; Mense, A.T.

    1979-03-01

    The three neutral beam injection codes BEAM (Houlberg, ORNL), HOFR (Howe, ORNL), and FREYA (Post, PPPL) are compared with respect to the calculation of the fast ion deposition profile H(r). Only plasmas of circular cross section are considered, with injection confined to the mid-plane of the torus. The approximations inherent in each code are pointed out, and a series of comparisons varying several parameters (beam energy and radius, machine size, and injection angle) shows excellent agreement among all the codes. A cost comparison (execution time and memory requirements) is made which points out the relative merits of each code within the context of incorporation into a plasma transport simulation code

  17. Providing Virtual Execution Environments: A Twofold Illustration

    CERN Document Server

    Grehant, Xavier

    2008-01-01

    Platform virtualization helps solve major grid computing challenges: sharing resources through flexible, user-controlled and custom execution environments while isolating failures and malicious code. Grid resource management tools will evolve to embrace support for virtual resources. We present two open source projects that transparently supply virtual execution environments. Tycoon has been developed at HP Labs to optimise resource usage by creating an economy where users bid for access to virtual machines and compete for CPU cycles. SmartDomains provides a peer-to-peer layer that automates virtual machine deployment using a description language and deployment engine from HP Labs. These projects demonstrate both client-server and peer-to-peer approaches to virtual resource management. The first case makes extensive use of virtual machine features for dynamic resource allocation. The second translates virtual machine capabilities into a sophisticated language where resource management components can b...

  18. RELAP5-3D Code for Supercritical-Pressure Light-Water-Cooled Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Riemke, Richard Allan; Davis, Cliff Bybee; Schultz, Richard Raphael

    2003-04-01

    The RELAP5-3D computer program has been improved for analysis of supercritical-pressure, light-water-cooled reactors. Several code modifications were implemented to correct code execution failures. Changes were made to the steam table generation, steam table interpolation, metastable states, interfacial heat transfer coefficients, and transport properties (viscosity and thermal conductivity). The code modifications now allow the code to run slow transients above the critical pressure as well as blowdown transients (modified Edwards pipe and modified existing pressurized water reactor model) that pass near the critical point.

  19. Counting, enumerating and sampling of execution plans in a cost-based query optimizer

    NARCIS (Netherlands)

    F. Waas; C.A. Galindo-Legaria

    1999-01-01

    Testing an SQL database system by running large sets of deterministic or stochastic SQL statements is common practice in commercial database development. However, code defects often remain undetected as the query optimizer's choice of an execution plan is not only depending on

  20. On the linear programming bound for linear Lee codes.

    Science.gov (United States)

    Astola, Helena; Tabus, Ioan

    2016-01-01

    Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced into the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form that leads to fast execution, allowing the bounds to be computed efficiently for large parameter values of the linear codes.
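    For concreteness, the Lee metric that the bound concerns (this sketch shows only the metric itself, not the linear programming formulation):

```python
def lee_weight(x, q):
    """Lee weight of a symbol in Z_q: distance to 0 around the cycle,
    i.e. min(x mod q, q - x mod q)."""
    x %= q
    return min(x, q - x)

def lee_distance(u, v, q):
    """Lee distance between two codewords over Z_q: sum of symbol weights."""
    return sum(lee_weight(a - b, q) for a, b in zip(u, v))
```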

  1. The puzzle of processing speed, memory, and executive function impairments in schizophrenia: fitting the pieces together.

    Science.gov (United States)

    Knowles, Emma E M; Weiser, Mark; David, Anthony S; Glahn, David C; Davidson, Michael; Reichenberg, Abraham

    2015-12-01

    Substantial impairment in performance on the digit-symbol substitution task in patients with schizophrenia is well established, which has been widely interpreted as denoting a specific impairment in processing speed. However, other higher order cognitive functions might be more critical to performance on this task. To date, this idea has not been rigorously investigated in patients with schizophrenia. Neuropsychological measures of processing speed, memory, and executive functioning were completed by 125 patients with schizophrenia and 272 control subjects. We implemented a series of confirmatory factor and structural regression modeling to build an integrated model of processing speed, memory, and executive function with which to deconstruct the digit-symbol substitution task and characterize discrepancies between patients with schizophrenia and control subjects. The overall structure of the processing speed, memory, and executive function model was the same across groups (χ² = 208.86, p > .05), but the contribution of the specific cognitive domains to coding task performance differed significantly. When completing the task, control subjects relied on executive function and, indirectly, on working memory ability, whereas patients with schizophrenia used an alternative set of cognitive operations whereby they relied on the same processes required to complete verbal fluency tasks. Successful coding task performance relies predominantly on executive function, rather than processing speed or memory. Patients with schizophrenia perform poorly on this task because of an apparent lack of appropriate executive function input; they rely instead on an alternative cognitive pathway. Copyright © 2015 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  2. Preparation of the TRANSURANUS code for TEMELIN NPP

    International Nuclear Information System (INIS)

    Klouzal, J.

    2011-01-01

    Since 2010, the Temelin NPP has been using TVSA-T fuel supplied by JSC TVEL. The transition process included the implementation of several new core reload design codes. The TRANSURANUS code was selected for the evaluation of fuel rod thermomechanical performance. The adaptation and validation of the code was performed by the Nuclear Research Institute Rez. The TRANSURANUS code contains a wide selection of alternative models for most of the phenomena important to fuel behaviour. It was therefore necessary to select, based on a comparison with experimental data, those most suitable for the modeling of TVSA-T fuel rods. In some cases, new models were implemented. Software tools and a methodology for the evaluation of the proposed core reload design using the TRANSURANUS code were also developed at NRI. The software tools include an interface to the core physics code ANDREA and a set of scripts for automated execution and processing of the computational runs. Independent confirmation of some of the vendor-specified core reload design criteria was performed using TRANSURANUS. (authors)

  3. EBT time-dependent point model code: description and user's guide

    International Nuclear Information System (INIS)

    Roberts, J.F.; Uckan, N.A.

    1977-07-01

    A D-T time-dependent point model has been developed to assess the energy balance in an EBT reactor plasma. Flexibility is retained in the model to permit more recent data to be incorporated as they become available from the theoretical and experimental studies. This report includes the physics models involved, the program logic, and a description of the variables and routines used. All the files necessary for execution are listed, and the code, including a post-execution plotting routine, is discussed

  4. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application code can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code to improve on the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (with OpenMP program support at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at the instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show on some examples how this can help users improve their OpenMP applications.

  5. Counting, Enumerating and Sampling of Execution Plans in a Cost-Based Query Optimizer

    NARCIS (Netherlands)

    F. Waas; C.A. Galindo-Legaria

    2000-01-01

    Testing an SQL database system by running large sets of deterministic or stochastic SQL statements is common practice in commercial database development. However, code defects often remain undetected as the query optimizer's choice of an execution plan is not only depending on the query
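    One reason plan counting matters: even with the order of the relations fixed, the number of binary join-tree shapes over n relations is the Catalan number C(n-1), so the plan space grows far too fast to test exhaustively (a generic counting sketch, not the paper's algorithm):

```python
from math import comb

def catalan(n):
    """n-th Catalan number: C(n) = binom(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

def plan_shapes(n_relations):
    """Binary join-tree shapes over n relations with a fixed leaf order;
    permuting the leaves multiplies this count by n! again."""
    return catalan(n_relations - 1)
```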

  6. Sandia reactor kinetics codes: SAK and PK1D

    International Nuclear Information System (INIS)

    Pickard, P.S.; Odom, J.P.

    1978-01-01

    The Sandia Kinetics code (SAK) is a one-dimensional coupled thermal-neutronics transient analysis code for use in simulation of reactor transients. The time-dependent cross section routines allow arbitrary time-dependent changes in material properties. The one-dimensional heat transfer routines are for cylindrical geometry and allow arbitrary mesh structure, temperature-dependent thermal properties, radiation treatment, and coolant flow and heat-transfer properties at the surface of a fuel element. The Point Kinetics 1 Dimensional Heat Transfer Code (PK1D) solves the point kinetics equations and has essentially the same heat-transfer treatment as SAK. PK1D can address extended reactor transients with minimal computer execution time

  7. Plaspp: A New X-Ray Postprocessing Capability for ASCI Codes

    International Nuclear Information System (INIS)

    Pollak, Gregory

    2003-01-01

    This report announces the availability of the beta version of a (partly) new code, Plaspp (Plasma Postprocessor). This code postprocesses (graphics) dumps from at least two ASCI code suites: Crestone Project and Shavano Project. The basic structure of the code follows that of TDG, the equivalent postprocessor code for LASNEX. In addition to some new commands, the basic differences between TDG and Plaspp are the following: Plaspp uses a graphics dump instead of the unique TDG dump, it handles the unstructured meshes that the ASCI codes produce, and it can use its own multigroup opacity data. Because of the dump format, this code should be usable by any code that produces Cartesian, cylindrical, or spherical graphics formats. This report details the new commands; the required information to be placed on the dumps; some new commands and edits that are applicable to TDG as well, but have not been documented elsewhere; and general information about execution on the open and secure networks.

  8. The discrete-dipole-approximation code ADDA: capabilities and known limitations

    NARCIS (Netherlands)

    Yurkin, M.A.; Hoekstra, A.G.

    2011-01-01

    The open-source code ADDA is described, which implements the discrete dipole approximation (DDA), a method to simulate light scattering by finite 3D objects of arbitrary shape and composition. Besides standard sequential execution, ADDA can run on a multiprocessor distributed-memory system,

  9. CMCpy: Genetic Code-Message Coevolution Models in Python

    Science.gov (United States)

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
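    CMCpy ships multiple solvers for leading eigenpairs of quasispecies models; power iteration is one simple option for nonnegative matrices (a sketch of the general technique, not necessarily CMCpy's implementation):

```python
import numpy as np

def leading_eigenpair(M, iters=1000):
    """Approximate the dominant eigenpair of M by power iteration:
    repeatedly apply M and renormalize, then take the Rayleigh quotient."""
    v = np.ones(M.shape[0]) / np.sqrt(M.shape[0])
    for _ in range(iters):
        w = M @ v
        v = w / np.linalg.norm(w)
    return v @ M @ v, v
```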

  10. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been ported to the Linux system. The MARS code was originally developed based on RELAP5/MOD3.2 and COBRA-TF. The 1-D module, which evolved from RELAP5, can alone be applied to whole-NSSS system analysis. The 3-D module, developed from COBRA-TF, can be applied to the analysis of the reactor core region, where 3-D phenomena are better treated. The MARS code also has several other code units that can be incorporated for more detailed analysis. The separate code units include containment analysis modules and a 3-D kinetics module. These code modules can optionally be invoked and coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, can be utilized for the analysis of plant containment phenomena coupled with the nuclear reactor system. The mass and energy interaction during a hypothetical coolant leakage accident can thereby be analyzed in a more realistic manner. In a similar way, 3-D kinetics can be incorporated to simulate three-dimensional reactor kinetic behavior instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, is not adequate for a PC cluster system where multiple CPUs are available. When parallelism is eventually incorporated into the MARS code, MS Windows is not considered an optimum platform. Linux, on the other hand, is generally adopted as the preferred platform for multiple code executions as well as for parallel applications. In this study, the MARS code has been modified for adaptation to the Linux platform. For the initial code modification, the Windows-specific features have been removed from the code. Since the coupled code module CONTAIN is originally in the form of a dynamic load library (DLL) in the Windows system, a similar adaptation method

  11. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been ported to the Linux system. The MARS code was originally developed based on RELAP5/MOD3.2 and COBRA-TF. The 1-D module, which evolved from RELAP5, can alone be applied to whole-NSSS system analysis. The 3-D module, developed from COBRA-TF, can be applied to the analysis of the reactor core region, where 3-D phenomena are better treated. The MARS code also has several other code units that can be incorporated for more detailed analysis. The separate code units include containment analysis modules and a 3-D kinetics module. These code modules can optionally be invoked and coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, can be utilized for the analysis of plant containment phenomena coupled with the nuclear reactor system. The mass and energy interaction during a hypothetical coolant leakage accident can thereby be analyzed in a more realistic manner. In a similar way, 3-D kinetics can be incorporated to simulate three-dimensional reactor kinetic behavior instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, is not adequate for a PC cluster system where multiple CPUs are available. When parallelism is eventually incorporated into the MARS code, MS Windows is not considered an optimum platform. Linux, on the other hand, is generally adopted as the preferred platform for multiple code executions as well as for parallel applications. In this study, the MARS code has been modified for adaptation to the Linux platform. For the initial code modification, the Windows-specific features have been removed from the code. Since the coupled code module CONTAIN is originally in the form of a dynamic load library (DLL) in the Windows system, a similar adaptation method
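The Windows-DLL-to-Linux adaptation mentioned in the abstract generally amounts to replacing `LoadLibrary`-style loading with `dlopen`-style loading of a shared object. A hedged illustration using Python's `ctypes` (the standard C math library stands in for a coupled module like CONTAIN; the `lib{name}.so.6` fallback assumes a glibc naming convention):

```python
import ctypes
import ctypes.util
import sys

def load_coupled_module(name):
    """Load a dynamically linked code module in a platform-neutral way:
    a .dll on Windows, a shared object (via dlopen) elsewhere."""
    if sys.platform.startswith("win"):
        return ctypes.CDLL(f"{name}.dll")
    # find_library consults the system linker; fall back to a common
    # glibc soname if the lookup tools are unavailable (an assumption).
    path = ctypes.util.find_library(name) or f"lib{name}.so.6"
    return ctypes.CDLL(path)

# Demonstrate with libm instead of an actual reactor-code module:
libm = load_coupled_module("m")
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
```

The same pattern (one loader, per-platform resolution) is the usual way a Windows-born coupled-code system is carried over to Linux without touching call sites.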

  12. Systems guide to MCNP (Monte Carlo Neutron and Photon Transport Code)

    International Nuclear Information System (INIS)

    Kirk, B.L.; West, J.T.

    1984-06-01

    The subject of this report is the implementation of the Los Alamos National Laboratory Monte Carlo Neutron and Photon Transport Code - Version 3 (MCNP) on different types of computer systems, especially the IBM MVS system. The report supplements the documentation of the RSIC computer code package CCC-200/MCNP. Details of the procedure to follow in executing MCNP on IBM computers, in either batch or interactive mode, are provided

  13. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    Energy Technology Data Exchange (ETDEWEB)

    Cerjan, Charles J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shi, Xizeng [Read-Rite Corporation, Fremont, CA (United States)

    2017-11-09

    The specific goals of this project were to: further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); and validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under that CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license TL-1552-98 to Read-Rite for the DADIMAG Version 2 executable code.

  14. An approach for coupled-code multiphysics core simulations from a common input

    International Nuclear Information System (INIS)

    Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; Pawlowski, Roger; Clarno, Kevin; Simunovic, Srdjan; Slattery, Stuart; Turner, John; Palmtag, Scott

    2015-01-01

    Highlights: • We describe an approach for coupled-code multiphysics reactor core simulations. • The approach can enable tight coupling of distinct physics codes with a common input. • Multi-code multiphysics coupling and parallel data transfer issues are explained. • The common input approach and how the information is processed is described. • Capabilities are demonstrated on an eigenvalue and power distribution calculation. - Abstract: This paper describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak
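The common-input preprocessing step described above can be sketched generically: one problem description is read once, and a consistent input deck is emitted for each physics code. The field names, deck formats, and file names below are hypothetical, not the actual VERAIn schema:

```python
# Hypothetical common input: a single source of truth for all physics codes.
common = {
    "title": "17x17 assembly eigenvalue",
    "power_MW": 17.7,
    "inlet_temp_K": 565.0,
    "mesh": {"nx": 17, "ny": 17, "nz": 10},
}

def neutronics_deck(c):
    """Emit only the fields the transport code needs, in its syntax."""
    m = c["mesh"]
    return (f"title {c['title']}\n"
            f"mesh {m['nx']} {m['ny']} {m['nz']}\n")

def thermal_hydraulics_deck(c):
    """Emit the fields the subchannel code needs, converted to its units."""
    return (f"* {c['title']}\n"
            f"power {c['power_MW'] * 1e6:.1f} W\n"
            f"tinlet {c['inlet_temp_K']:.1f} K\n")

# Fan-out: each coupled code receives an input derived from the same source,
# so the decks cannot drift out of consistency with each other.
decks = {"insilico.inp": neutronics_deck(common),
         "ctf.inp": thermal_hydraulics_deck(common)}
```

The design point is that consistency between the coupled codes is enforced by construction: a user edits one file and the preprocessor regenerates every deck.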

  15. Overview of the ArbiTER edge plasma eigenvalue code

    Science.gov (United States)

    Baver, Derek; Myra, James; Umansky, Maxim

    2011-10-01

    The Arbitrary Topology Equation Reader, or ArbiTER, is a flexible eigenvalue solver that is currently under development for plasma physics applications. The ArbiTER code builds on the equation parser framework of the existing 2DX code, extending it to include a topology parser. This will give the code the capability to model problems with complicated geometries (such as multiple X-points and scrape-off layers) or model equations with arbitrary numbers of dimensions (e.g. for kinetic analysis). In the equation parser framework, model equations are not included in the program's source code. Instead, an input file contains instructions for building a matrix from profile functions and elementary differential operators. The program then executes these instructions in a sequential manner. These instructions may also be translated into analytic form, thus giving the code transparency as well as flexibility. We will present an overview of how the ArbiTER code is to work, as well as preliminary results from early versions of this code. Work supported by the U.S. DOE.
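An equation-parser framework of this kind can be illustrated with a toy interpreter: the "input file" is a list of instructions naming elementary differential operators, and the program assembles the system matrix by executing them in order. This is purely illustrative and is not ArbiTER's actual instruction set:

```python
def identity(n):
    """Elementary operator: the identity matrix."""
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def second_derivative(n, h=1.0):
    """Elementary operator: 1-D finite-difference d2/dx2 on a uniform grid."""
    m = [[0.0] * n for _ in range(n)]
    for i in range(n):
        m[i][i] = -2.0 / h ** 2
        if i > 0:
            m[i][i - 1] = 1.0 / h ** 2
        if i < n - 1:
            m[i][i + 1] = 1.0 / h ** 2
    return m

OPERATORS = {"identity": identity, "d2dx2": second_derivative}

def build_matrix(instructions, n):
    """Execute parsed instructions sequentially: each adds coeff * operator,
    so the model equation lives in the input, not in the source code."""
    total = [[0.0] * n for _ in range(n)]
    for coeff, name in instructions:
        op = OPERATORS[name](n)
        for i in range(n):
            for j in range(n):
                total[i][j] += coeff * op[i][j]
    return total

# "Model equation" u'' + 3u, read from instruction data rather than hard-coded:
A = build_matrix([(1.0, "d2dx2"), (3.0, "identity")], n=4)
```

Because the instructions are data, they can equally be pretty-printed back in analytic form, which is the transparency property the abstract attributes to the equation-parser approach.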

  16. Code REX to fit experimental data to exponential functions and graphics plotting

    International Nuclear Information System (INIS)

    Romero, L.; Travesi, A.

    1983-01-01

    The REX code, written in Fortran IV, fits a set of experimental data to several kinds of functions: a straight line (Y = A + BX) and various exponential types (Y = A·B^X, Y = A·X^B, Y = A·exp(BX)), using the least-squares criterion. The fit can be done for one selected function or for all four simultaneously, and the code helps choose the function that best fits the data, since it presents the fit statistics for all of them. Furthermore, it plots the fitted function on an appropriate coordinate-axis system. An additional option also allows plotting of the experimental data used for the fitting. All data necessary to execute this code are requested from the operator at the terminal in an interactive screen-operator dialogue, with values entered through the keyboard. The code can be executed on any computer equipped with a graphics screen and keyboard terminal, with an X-Y plotter serially connected to the graphics terminal. (Author) 5 refs
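The least-squares fit to Y = A·exp(BX) can be reproduced in a few lines by linearizing (ln Y = ln A + BX) and applying the closed-form normal equations. This is a sketch of the method, not the REX code itself, and it assumes all Y values are positive:

```python
import math

def fit_exponential(xs, ys):
    """Least-squares fit of y = A * exp(B * x) via the linearization
    ln y = ln A + B x (requires y > 0)."""
    n = len(xs)
    lys = [math.log(y) for y in ys]
    sx, sy = sum(xs), sum(lys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * ly for x, ly in zip(xs, lys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope -> B
    a = math.exp((sy - b * sx) / n)                 # intercept -> A
    return a, b

# Synthetic decay data generated from A = 5, B = -0.3:
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [5.0 * math.exp(-0.3 * x) for x in xs]
A, B = fit_exponential(xs, ys)
```

The other REX forms fit the same way: Y = A·B^X linearizes with ln Y = ln A + X·ln B, and Y = A·X^B with ln Y = ln A + B·ln X.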

  17. Development of an Auto-Validation Program for MARS Code Assessments

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2006-01-01

    MARS (Multi-dimensional Analysis of Reactor Safety) code is a best-estimate thermal-hydraulic system analysis code developed at KAERI. It is important for a thermal-hydraulic computer code to be assessed against theoretical and experimental data to verify and validate the performance and the integrity of its structure, models and correlations. The code assessment effort for a complex thermal-hydraulics code such as MARS can be tedious and time-consuming, and it requires a large amount of human intervention in data transfer to see the results in graphic form. Code developers produce many versions of a code during development, and each version needs to be verified for integrity. Thus, for MARS code developers, it is desirable to have an automatic way of carrying out the code assessment calculations. In the present work, an Auto-Validation program that carries out the code assessment effort has been developed. The program uses a user-supplied configuration file (with a '.vv' extension) which contains commands to read the input file, to execute the user-selected MARS program, and to generate result graphs. The program is useful when the same set of code assessments is repeated with different versions of the code. The program is written in the Delphi programming language and runs under the Microsoft Windows environment
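A '.vv'-style configuration-driven workflow can be sketched as a simple command interpreter that reads assessment steps and dispatches them in order. This is illustrative only (the real program is written in Delphi and launches MARS itself); the command names and file names below are invented:

```python
# Hypothetical '.vv'-style configuration: one "command argument" per line.
VV_TEXT = """\
input  edwards_pipe.inp
run    mars_v31
graph  pressure_vs_time
"""

def parse_vv(text):
    """Turn the configuration text into an ordered list of (command, arg)."""
    steps = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        cmd, arg = line.split(None, 1)
        steps.append((cmd, arg))
    return steps

def run_assessment(steps, handlers):
    """Dispatch each step to its handler; return a log of what was done."""
    return [handlers[cmd](arg) for cmd, arg in steps]

handlers = {
    "input": lambda a: f"read input {a}",
    "run":   lambda a: f"executed {a}",      # the real tool would launch MARS here
    "graph": lambda a: f"plotted {a}",
}
log = run_assessment(parse_vv(VV_TEXT), handlers)
```

Re-running the same configuration against a new code version is then a one-line change (the `run` argument), which is exactly the repeatability the Auto-Validation program targets.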

  18. Ensuring that User Defined Code does not See Uninitialized Fields

    DEFF Research Database (Denmark)

    Nielsen, Anders Bach

    2007-01-01

    Initialization of objects is commonly handled by user code, often in special routines known as constructors. This applies even in a virtual machine with multiple concurrent execution engines that all share the same heap. But for a language where run-time values play a role in the type system...

  19. Asynchronous Execution of the Fast Multipole Method Using Charm++

    OpenAIRE

    AbdulJabbar, Mustafa; Yokota, Rio; Keyes, David

    2014-01-01

    Fast multipole methods (FMM) on distributed memory have traditionally used a bulk-synchronous model of communicating the local essential tree (LET) and overlapping it with computation on the local data. This could be perceived as an extreme case of data aggregation, where the whole LET is communicated at once. Charm++ allows much finer control over the granularity of communication, and has an asynchronous execution model that fits well with the structure of our FMM code. Unlike previous ...

  20. Executive Dysfunction

    Science.gov (United States)

    Rabinovici, Gil D.; Stephens, Melanie L.; Possin, Katherine L.

    2015-01-01

    Purpose of Review: Executive functions represent a constellation of cognitive abilities that drive goal-oriented behavior and are critical to the ability to adapt to an ever-changing world. This article provides a clinically oriented approach to classifying, localizing, diagnosing, and treating disorders of executive function, which are pervasive in clinical practice. Recent Findings: Executive functions can be split into four distinct components: working memory, inhibition, set shifting, and fluency. These components may be differentially affected in individual patients and act together to guide higher-order cognitive constructs such as planning and organization. Specific bedside and neuropsychological tests can be applied to evaluate components of executive function. While dysexecutive syndromes were first described in patients with frontal lesions, intact executive functioning relies on distributed neural networks that include not only the prefrontal cortex, but also the parietal cortex, basal ganglia, thalamus, and cerebellum. Executive dysfunction arises from injury to any of these regions, their white matter connections, or neurotransmitter systems. Dysexecutive symptoms therefore occur in most neurodegenerative diseases and in many other neurologic, psychiatric, and systemic illnesses. Management approaches are patient specific and should focus on treatment of the underlying cause in parallel with maximizing patient function and safety via occupational therapy and rehabilitation. Summary: Executive dysfunction is extremely common in patients with neurologic disorders. Diagnosis and treatment hinge on familiarity with the clinical components and neuroanatomic correlates of these complex, high-order cognitive processes. PMID:26039846

  1. Calculation of the absorbed dose for contamination in skin imparted by beta radiation through the Varskin code modified for 122 isotopes of interest for nuclear medicine, nuclear plants and research

    International Nuclear Information System (INIS)

    Alvarez R, J.T.

    1992-06-01

    This work presents the implementation of a modification of the Varskin code for calculating the absorbed dose to skin from contamination, imparted by external radiation fields generated by beta emitters. The data necessary to execute the code are: the isotope, dose depth, isotope activity, geometry type, source radius, and integration time for the isotope; combinations of up to five radionuclides can be run. The program was implemented in Fortran 5 by means of the FFSKIN source program, with the executable in binary form, BFFSKIN; the maximum execution time is 5 minutes. (Author)

  2. Evaluation of the efficiency and fault density of software generated by code generators

    Science.gov (United States)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, comparison against in-house manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  3. PCS a code system for generating production cross section libraries

    International Nuclear Information System (INIS)

    Cox, L.J.

    1997-01-01

    This document outlines the use of the PCS Code System. It summarizes the execution process for generating FORMAT2000 production cross section files from FORMAT2000 reaction cross section files. It also describes the process of assembling the ASCII versions of the high energy production files made from ENDL and Mark Chadwick's calculations. Descriptions of the function of each code along with its input and output and use are given. This document is under construction. Please submit entries, suggestions, questions, and corrections to (ljc at sign llnl.gov) 3 tabs

  4. Scaling Optimization of the SIESTA MHD Code

    Science.gov (United States)

    Seal, Sudip; Hirshman, Steven; Perumalla, Kalyan

    2013-10-01

    SIESTA is a parallel three-dimensional plasma equilibrium code capable of resolving magnetic islands at high spatial resolutions for toroidal plasmas. Originally designed to exploit small-scale parallelism, SIESTA has now been scaled to execute efficiently over several thousands of processors P. This scaling improvement was accomplished with minimal intrusion to the execution flow of the original version. First, the efficiency of the iterative solutions was improved by integrating the parallel tridiagonal block solver code BCYCLIC. Krylov-space generation in GMRES was then accelerated using a customized parallel matrix-vector multiplication algorithm. Novel parallel Hessian generation algorithms were integrated and memory access latencies were dramatically reduced through loop nest optimizations and data layout rearrangement. These optimizations sped up equilibria calculations by factors of 30-50. It is possible to compute solutions with granularity N/P near unity on extremely fine radial meshes (N > 1024 points). Grid separation in SIESTA, which manifests itself primarily in the resonant components of the pressure far from rational surfaces, is strongly suppressed by finer meshes. Large problem sizes of up to 300 K simultaneous non-linear coupled equations have been solved on the NERSC supercomputers. Work supported by U.S. DOE under Contract DE-AC05-00OR22725 with UT-Battelle, LLC.

  5. Tutoring executives online

    DEFF Research Database (Denmark)

    Bignoux, Stephane; Sund, Kristian J.

    2018-01-01

    Studies of learning and student satisfaction in the context of online university programmes have largely neglected programmes catering specifically to business executives. Such executives have typically been away from higher education for a number of years, and have collected substantial practical experience in the subject matters they are taught. Their expectations in terms of both content and delivery may therefore be different from non-executive students. We explore perceptions of the quality of tutoring in the context of an online executive MBA programme through participant interviews. We find that in addition to some of the tutor behaviours already discussed in the literature, executive students look specifically for practical industry knowledge and experience in tutors, when judging how effective a tutor is. This has implications for both the recruitment and training of online executive MBA tutors.

  6. Tutoring Executives Online

    DEFF Research Database (Denmark)

    Bignoux, Stephane; Sund, Kristian J.

    2016-01-01

    Studies of learning and student satisfaction in the context of online university programs have largely neglected programs catering specifically to business executives. Such executives have typically been away from higher education for a number of years, and have collected substantial practical experience in the subject matters they are taught. Their expectations in terms of both content and delivery may therefore be different from non-executive students. We explore perceptions of the quality of tutoring in the context of an online executive MBA program through participant interviews. We find that in addition to some of the tutor behaviors already discussed in the literature, executive students look specifically for practical industry knowledge and experience in tutors, when judging how effective a tutor is. This has implications for both the recruitment and training of online executive MBA tutors.

  7. MARS-KS code validation activity through the atlas domestic standard problem

    International Nuclear Information System (INIS)

    Choi, K. Y.; Kim, Y. S.; Kang, K. H.; Park, H. S.; Cho, S.

    2012-01-01

    The 2nd Domestic Standard Problem (DSP-02) exercise using the ATLAS integral effect test data was executed to transfer the integral effect test data to domestic nuclear industries and to contribute to improving the safety analysis methodology for PWRs. A small-break loss-of-coolant accident with a 6-inch break at the cold leg was determined as the target scenario, considering its technical importance and incorporating interest from the participants. Ten calculation results using the MARS-KS code were collected; the major prediction results were described qualitatively, and code prediction accuracy was assessed quantitatively using the FFTBM. In addition, special code assessment activities were carried out to find the areas where model improvement is required in the MARS-KS code. The lessons from DSP-02 and recommendations to code developers are described in this paper. (authors)
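The FFTBM accuracy figure mentioned above is commonly summarized by an average amplitude AA = Σ|F(error)| / Σ|F(experiment)| over the discrete spectrum, with smaller AA meaning better prediction. A hedged stdlib sketch (a naive DFT stands in for the FFT, and the sample traces are invented):

```python
import cmath

def dft(samples):
    """Naive discrete Fourier transform (an FFT would be used in practice)."""
    n = len(samples)
    return [sum(samples[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n))
            for j in range(n)]

def average_amplitude(experiment, calculation):
    """FFTBM-style figure of merit: spectral error over spectral signal."""
    error = [c - e for c, e in zip(calculation, experiment)]
    num = sum(abs(z) for z in dft(error))
    den = sum(abs(z) for z in dft(experiment))
    return num / den

# Invented experimental and calculated traces of a transient quantity:
exp_data  = [1.0, 2.0, 3.0, 2.0, 1.0, 0.5, 0.2, 0.1]
calc_data = [1.1, 2.1, 2.9, 2.0, 1.0, 0.6, 0.2, 0.1]
aa = average_amplitude(exp_data, calc_data)
```

A perfect prediction gives AA = 0; applying one figure of merit uniformly to every participant's trace is what makes the quantitative comparison across the ten submitted calculations possible.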

  8. A fast reactor transient analysis methodology for PCs: Volume 3, LTC program manual of the QuickBASIC code

    International Nuclear Information System (INIS)

    Ott, K.O.; Chung, L.

    1992-06-01

    This manual augments the detailed manual of the GW-BASIC version of the LTC code for an application in QuickBASIC. As most of the GW-BASIC coding of this program for ''LMR Transient Calculations'' is compatible with QuickBASIC, this manual pertains primarily to the required changes, such as the handling of input and output. The considerable reduction in computation time achieved by this conversion is demonstrated for two sample problems, using a variety of hardware and execution options. The revised code is listed. Although the severe storage limitations of GW-BASIC no longer apply, the LOF transient path has not been completed in this QuickBASIC code. Its advantages are thus primarily in the much faster running time for TOP and LOHS transients. For the fastest PC hardware (486) and execution option the computation time is reduced by a factor of 124 compared to GW-BASIC on a 386/20

  9. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model-driven engineering. DMM allows defining relationships of different model entities and their merging types at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...
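The idea of declaring merge types at the conceptual level and deriving executable merge code from them can be hedged with a small dispatch table. The merge-type names and records below are hypothetical; the actual DMM/ATL transformation targets a different technology stack:

```python
# Hypothetical merge-type registry: a DMM-style conceptual declaration
# maps each attribute to a concrete merge strategy.
MERGE_TYPES = {
    "prefer_source": lambda a, b: a if a is not None else b,
    "concat":        lambda a, b: f"{a} {b}".strip(),
    "max":           max,
}

def merge_records(spec, rec_a, rec_b):
    """Merge two records field-by-field according to the declared spec,
    playing the role of the generated merging program."""
    return {field: MERGE_TYPES[mtype](rec_a.get(field), rec_b.get(field))
            for field, mtype in spec.items()}

spec = {"id": "prefer_source", "name": "concat", "updated": "max"}
a = {"id": 7, "name": "Ada", "updated": 2009}
b = {"id": 9, "name": "Lovelace", "updated": 2010}
merged = merge_records(spec, a, b)
```

The declaration (`spec`) is the conceptual model; the dispatch over `MERGE_TYPES` is the part a model-to-code transformation would generate once per warehouse load.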

  10. SMARTS: Exploiting Temporal Locality and Parallelism through Vertical Execution

    International Nuclear Information System (INIS)

    Beckman, P.; Crotinger, J.; Karmesin, S.; Malony, A.; Oldehoeft, R.; Shende, S.; Smith, S.; Vajracharya, S.

    1999-01-01

    In the solution of large-scale numerical problems, parallel computing is becoming simultaneously more important and more difficult. The complex organization of today's multiprocessors with several memory hierarchies has forced the scientific programmer to make a choice between simple but unscalable code and scalable but extremely complex code that does not port to other architectures. This paper describes how the SMARTS runtime system and the POOMA C++ class library for high-performance scientific computing work together to exploit data parallelism in scientific applications while hiding the details of managing parallelism and data locality from the user. We present innovative algorithms, based on the macro-dataflow model, for detecting data parallelism and efficiently executing data-parallel statements on shared-memory multiprocessors. We also describe how these algorithms can be implemented on clusters of SMPs

  11. SMARTS: Exploiting Temporal Locality and Parallelism through Vertical Execution

    Energy Technology Data Exchange (ETDEWEB)

    Beckman, P.; Crotinger, J.; Karmesin, S.; Malony, A.; Oldehoeft, R.; Shende, S.; Smith, S.; Vajracharya, S.

    1999-01-04

    In the solution of large-scale numerical problems, parallel computing is becoming simultaneously more important and more difficult. The complex organization of today's multiprocessors with several memory hierarchies has forced the scientific programmer to make a choice between simple but unscalable code and scalable but extremely complex code that does not port to other architectures. This paper describes how the SMARTS runtime system and the POOMA C++ class library for high-performance scientific computing work together to exploit data parallelism in scientific applications while hiding the details of managing parallelism and data locality from the user. We present innovative algorithms, based on the macro-dataflow model, for detecting data parallelism and efficiently executing data-parallel statements on shared-memory multiprocessors. We also describe how these algorithms can be implemented on clusters of SMPs.

  12. Code Betal to calculation Alpha/Beta activities in environmental samples

    International Nuclear Information System (INIS)

    Romero, L.; Travesi, A.

    1983-01-01

    A code, BETAL, written in FORTRAN IV, was developed to automate the calculation and presentation of the results of total alpha-beta activity measurements in environmental samples. The code performs the calculations necessary to transform the activities measured as total counts into pCi/l, taking into account the efficiency of the detector used and the other necessary parameters. Furthermore, it estimates the standard deviation of the result and calculates the lower limit of detection for each measurement. The code operates in an interactive screen-operator dialogue, asking for the data necessary to calculate the activity in each case via a screen label. The code can be executed from any screen-and-keyboard terminal (whose computer accepts Fortran IV) with a printer connected to that computer. (Author) 5 refs
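The counts-to-activity conversion and the detection limit can be illustrated numerically. This sketch assumes gross counts over a live time t, a detector efficiency ε, a sample volume V, and a Currie-style lower limit of detection; the actual BETAL formulas are not given in the abstract, and the sample numbers are invented:

```python
PCI_PER_BQ = 27.027   # 1 Bq = 27.027 pCi

def activity_pci_per_l(gross_counts, bkg_counts, t_s, eff, vol_l):
    """Net count rate -> activity concentration in pCi/l."""
    net_rate = (gross_counts - bkg_counts) / t_s   # counts per second
    bq = net_rate / eff                            # disintegrations per second
    return bq * PCI_PER_BQ / vol_l

def lld_pci_per_l(bkg_counts, t_s, eff, vol_l):
    """Currie-style lower limit of detection:
    LLD_counts = 2.71 + 4.65 * sqrt(background counts)."""
    lld_counts = 2.71 + 4.65 * bkg_counts ** 0.5
    return (lld_counts / (eff * t_s)) * PCI_PER_BQ / vol_l

# Invented measurement: 1200 gross / 300 background counts in 1 h,
# 25% efficiency, 0.5 l sample.
act = activity_pci_per_l(gross_counts=1200, bkg_counts=300,
                         t_s=3600, eff=0.25, vol_l=0.5)
lld = lld_pci_per_l(bkg_counts=300, t_s=3600, eff=0.25, vol_l=0.5)
```

Reporting the LLD next to each result, as BETAL does, tells the reader whether a small measured activity is actually distinguishable from background.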

  13. THYDE-P2 code: RCS (reactor-coolant system) analysis code

    International Nuclear Information System (INIS)

    Asahi, Yoshiro; Hirano, Masashi; Sato, Kazuo

    1986-12-01

    THYDE-P2, characterized by a new thermal-hydraulic network model, is applicable to the analysis of RCS behavior in response to various disturbances, including an LB (large-break) LOCA (loss-of-coolant accident). In LB-LOCA analysis, THYDE-P2 is capable of calculating straight through from accident initiation to complete reflooding of the core without an artificial change in methods and models. The first half of the report describes the methods and models used in the THYDE-P2 code, i.e., (1) the thermal-hydraulic network model, (2) the various RCS component models, (3) the heat sources in fuel, (4) the heat transfer correlations, (5) the mechanical behavior of clad and fuel, and (6) the steady-state adjustment. The second half of the report is the user's manual for the THYDE-P2 code (version SV04L08A), containing: (1) the program control, (2) the input requirements, (3) the execution of a THYDE-P2 job, (4) the output specifications, and (5) a sample problem to demonstrate the capability of the thermal-hydraulic network model, among other things. (author)

  14. Maybe it's not Python that sucks, maybe it's my code

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Did you know that in Python integers from -5 to 256 are preallocated? Reusing them 1000 times, instead of allocating memory for a bigger integer, saves a whopping 1 millisecond of your code's execution time! Isn't that thrilling? Well, before you get that crazy, learn some basic performance tricks that you can start using today.
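The preallocation in question is a CPython implementation detail, not a language guarantee: integers in [-5, 256] are created once at interpreter startup and shared. A quick identity check (using int() on strings to defeat compile-time constant folding, which would otherwise make two literal 257s share one object):

```python
# CPython interns small integers in the range [-5, 256]; larger values
# get a fresh object on each allocation. int(str) forces a runtime
# allocation, so identity (is) reveals which values hit the cache.
small_a, small_b = int("100"), int("100")
big_a, big_b = int("257"), int("257")

cached = small_a is small_b   # both names point at the one shared object
fresh = big_a is big_b        # two distinct allocations of equal value
```

This is why `is` must never be used for numeric comparison: `big_a == big_b` is True even though they are different objects.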

  15. Additional extensions to the NASCAP computer code, volume 3

    Science.gov (United States)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  16. Litigation to execution in legal labour relationships. Study case

    Directory of Open Access Journals (Sweden)

    Dragos Lucian Radulescu

    2016-06-01

    Full Text Available Enforced execution is the legal means by which a Creditor holding an enforceable order protects his rights by resorting to the coercive force of the state. When the Debtor does not comply voluntarily, the Creditor may ask the Bailiff to commence enforced execution in any manner prescribed by law. Of course, the start of compulsory execution is limited by the conditions of admissibility imperatively specified in the law, principally the condition that the Creditor hold an enforceable order. The order to be enforced can be either an enforceable or final judgement, a judgement with provisional enforcement, or any other document that can be enforced. Procedurally, the provisions of Art. 712 of the Civil Procedure Code allow a Creditor to bring a litigation to execution against the execution itself, against the Closures issued by the Bailiff, and against any other act of enforcement. Jurisdiction in this matter lies with the Executor Court or the Court in whose district the Debtor is situated on the date of the appeal. The Debtor's appeal puts before the Parties not only the acts of execution, since the appeal is also allowed over explanations relative to the meaning, scope or application of the enforceable order, but within limits set by the legal nature of that order. Thus, according to the law, when the enforceable order is not issued by a Court or an Arbitration body, reasons of fact or law which could not be discussed during an earlier trial, at first instance or on appeal, may be invoked before the Executor Court. Basically, if enforced execution rests on an enforceable order that does not come from a Court, these reasons can be invoked when there is no other procedural means for its abolition. A complaint may also be submitted against the Closure by which an appeal for enforced execution was upheld, and the act of execution concerning the division of the

  17. General purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.

    1983-01-01

A general-purpose computer code called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility, the code is organized like a general-purpose computer, operating on a vector describing the time-dependent state of the system under simulation. The instruction set of the computer is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations.
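
The design described above, a user-defined instruction set operating on a state vector, can be illustrated with a small Python sketch. All names here (the class, the instructions) are hypothetical analogues for illustration; MONTHY itself is not reproduced.

```python
import random

class StateVectorMachine:
    """Toy analogue of the MONTHY design: a 'computer' whose instruction
    set is supplied by the user and whose operands form a vector
    describing the time-dependent state of the simulated system."""

    def __init__(self, state):
        self.state = list(state)   # state vector of the system
        self.ops = {}              # user-defined instruction set

    def define(self, name, func):
        self.ops[name] = func      # register a user instruction

    def run(self, program):
        # Execute a list of (instruction, args...) tuples in order.
        for name, *args in program:
            self.ops[name](self.state, *args)
        return self.state

rng = random.Random(42)
vm = StateVectorMachine([0.0, 0.0])

# User-defined instructions: a random Monte Carlo 'kick' and a clamp
# that conditionally resets negative components to zero.
vm.define("kick", lambda s, i: s.__setitem__(i, s[i] + rng.uniform(-1, 1)))
vm.define("clamp", lambda s, i: s.__setitem__(i, max(s[i], 0.0)))

vm.run([("kick", 0), ("clamp", 0), ("kick", 1), ("clamp", 1)])
```

The same machine could run a different physical problem simply by registering a different instruction set, which is the flexibility the abstract emphasizes.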

  18. Use of source term code package in the ELEBRA MX-850 system

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-12-01

The implementation of the source term code package in the ELEBRA MX-850 system is presented. The source term is formed when radioactive materials generated in the nuclear fuel leak toward the containment and the environment external to the reactor containment. The version implemented in the ELEBRA system is composed of five codes: MARCH 3, TRAPMELT 3, THCCA, VANESA and NAVA. The original example case was used. The example consists of a small LOCA accident in a PWR-type reactor. A sensitivity study for the TRAPMELT 3 code was carried out, modifying the 'TIME STEP' to estimate the CPU processing time for executing the original example case. (M.C.K.) [pt

  19. The computer code SEURBNUK/EURDYN (Release 1). Input and output specification

    International Nuclear Information System (INIS)

    Broadhouse, B.J.; Yerkess, A.

    1986-05-01

SEURBNUK/EURDYN is an extension of SEURBNUK-2, a two-dimensional, axisymmetric, Eulerian, finite element containment code in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. These codes are designed to model the hydrodynamic development in time of a hypothetical core disruptive accident (HCDA) in a fast breeder reactor. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations, with information on the output facilities, and aids users in avoiding some common difficulties. (UK)

  20. Carmen system: a code block for neutronic PWR calculation by diffusion theory with spacedependent feedback effects

    International Nuclear Information System (INIS)

    Ahnert, C.; Aragones, J.M.

    1982-01-01

The Carmen code (theory and user's manual) is described. This code for assembly and core calculations uses diffusion theory (Citation), with feedback in the cross sections by zone due to the effects of burnup, water density, fuel temperature, Xenon and Samarium. The burnup calculation of a full cycle is solved in only one execution of Carmen, and in a reduced computing time. (auth.)

  1. A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems

    Science.gov (United States)

    Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge

    Mobile agents are software entities consisting of code, data, state and itinerary that can migrate autonomously from host to host executing their code. Despite its benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against the attacks of malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have been barely studied in the literature. This paper presents a mechanism that avoids collusion attacks based on code passing. Our proposal is based on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing the information to extract its own variant to the hosts, and for taking trusted timestamps that will be used to verify time coherence.

  2. Autonomous execution of the Precision Immobilization Technique

    Science.gov (United States)

    Mascareñas, David D. L.; Stull, Christopher J.; Farrar, Charles R.

    2017-03-01

Over the course of the last decade great advances have been made in autonomously driving cars. The technology has advanced to the point that driverless car technology is currently being tested on publicly accessible roadways. The introduction of these technologies onto publicly accessible roadways raises questions not only of safety, but also of security. Autonomously driving cars are inherently cyber-physical systems, and as such will have novel security vulnerabilities that couple the cyber aspects of the vehicle, including the on-board computing and any network data it makes use of, with the physical nature of the vehicle, including its sensors, actuators, and the vehicle chassis. Widespread implementation of driverless car technology will require that both the cyber and physical security concerns surrounding these vehicles are addressed. In this work, we specifically developed a control policy to autonomously execute the Precision Immobilization Technique, a.k.a. the PIT maneuver. The PIT maneuver was originally developed by law enforcement to end high-speed vehicular pursuits in a quasi-safe manner. However, there is still a risk of damage/roll-over to both the vehicle executing the PIT maneuver and the vehicle subject to it. In law enforcement applications, it would be preferable to execute the PIT maneuver using an autonomous vehicle, thus removing the danger to law-enforcement officers. Furthermore, it is entirely possible that unscrupulous individuals could inject code into an autonomously-driving car to use the PIT maneuver to immobilize other vehicles while maintaining anonymity. For these reasons it is useful to know how the PIT maneuver can be implemented on an autonomous car. In this work a simple control policy based on velocity pursuit was developed to autonomously execute the PIT maneuver using only vision and range measurements, both of which are commonly collected by contemporary driverless cars. The ability of this
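
A velocity-pursuit control law of the kind the abstract mentions can be sketched in a few lines. This is a hedged illustration, not the paper's actual controller: the pure-pursuit-style geometry, the gain, and the parameter names are all assumptions; only the inputs (a bearing from vision and a range measurement) follow the abstract.

```python
import math

def pursuit_steering(bearing_rad, range_m, wheelbase_m=2.7, gain=1.0):
    """Illustrative velocity-pursuit steering law: steer so the pursuer's
    velocity vector rotates toward the measured line of sight to the
    target point (for a PIT maneuver, the target's rear quarter panel).
    Positive bearing/steering is to the left."""
    # Pure-pursuit-style curvature toward a point at (bearing, range).
    curvature = 2.0 * math.sin(bearing_rad) / max(range_m, 1e-6)
    # Bicycle-model conversion from path curvature to a steering angle.
    return math.atan(gain * wheelbase_m * curvature)

# Target dead ahead: no steering command; target to the left: steer left.
straight = pursuit_steering(0.0, 10.0)
left = pursuit_steering(0.3, 10.0)
```

Both inputs are available from commodity sensors, which is the abstract's point: nothing beyond vision and range is needed to close the loop.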

  3. Guidelines for selecting codes for ground-water transport modeling of low-level waste burial sites. Executive summary

    International Nuclear Information System (INIS)

    Simmons, C.S.; Cole, C.R.

    1985-05-01

This document was written to provide guidance to managers and site operators on how ground-water transport codes should be selected for assessing burial site performance. There is a need for a formal approach to selecting appropriate codes from the multitude of potentially useful ground-water transport codes that are currently available. Code selection is a problem that requires more than merely considering mathematical equation-solving methods. These guidelines are very general and flexible and are also meant for developing systems simulation models to be used to assess the environmental safety of low-level waste burial facilities. Code selection is only a single aspect of the overall objective of developing a systems simulation model for a burial site. The guidance given here is mainly directed toward applications-oriented users, but managers and site operators need to be familiar with this information to direct the development of scientifically credible and defensible transport assessment models. Some specific advice for managers and site operators on how to direct a modeling exercise is based on the following five steps: identify specific questions and study objectives; establish costs and schedules for achieving answers; enlist the aid of a professional model applications group; decide on an approach with the applications group and guide code selection; and facilitate the availability of site-specific data. These five steps for managers/site operators are discussed in detail following an explanation of the nine systems model development steps, which are presented first to clarify what code selection entails.

  4. Correlated sampling added to the specific purpose Monte Carlo code McPNL for neutron lifetime log responses

    International Nuclear Information System (INIS)

    Mickael, M.; Verghese, K.; Gardner, R.P.

    1989-01-01

    The specific purpose neutron lifetime oil well logging simulation code, McPNL, has been rewritten for greater user-friendliness and faster execution. Correlated sampling has been added to the code to enable studies of relative changes in the tool response caused by environmental changes. The absolute responses calculated by the code have been benchmarked against laboratory test pit data. The relative responses from correlated sampling are not directly benchmarked, but they are validated using experimental and theoretical results

  5. Tabled Execution in Scheme

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, J J; Lumsdaine, A; Quinlan, D J

    2008-08-19

Tabled execution is a generalization of memoization developed by the logic programming community. It not only saves results from tabled predicates, but also stores the set of currently active calls to them; tabled execution can thus provide meaningful semantics for programs that seemingly contain infinite recursions with the same arguments. In logic programming, tabled execution is used for many purposes, both for improving the efficiency of programs and for making tasks simpler and more direct to express than with normal logic programs. However, tabled execution is only infrequently applied in mainstream functional languages such as Scheme. We demonstrate an elegant implementation of tabled execution in Scheme, using a mix of continuation-passing style and mutable data. We also show the use of tabled execution in Scheme for a problem in formal language and automata theory, demonstrating that tabled execution can be a valuable tool for Scheme users.
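
The core idea, caching completed results while also tracking calls that are still in progress, can be sketched in Python (the paper's implementation is in Scheme with continuations; this is only an illustrative analogue). Reachability in a cyclic graph is a classic case where naive recursion loops forever but a tabled version terminates.

```python
def tabled_reachable(graph):
    """Minimal sketch of tabled execution: beyond memoization (the
    'table' of completed results), an 'active' set records calls that
    are currently being evaluated, so a query that re-enters itself
    through a cycle returns a provisional answer instead of recursing
    forever. A full tabling engine would iterate to a fixpoint; this
    single pass is enough to demonstrate the mechanism."""
    table = {}       # node -> set of nodes known reachable from it
    active = set()   # calls currently in progress

    def reach(node):
        if node in table:
            return table[node]           # completed result, reuse it
        if node in active:
            return set()                 # provisional answer, cut the loop
        active.add(node)
        result = set()
        for nxt in graph.get(node, []):
            result.add(nxt)
            result |= reach(nxt)
        active.discard(node)
        table[node] = result
        return result

    return reach

# A cyclic graph: naive recursion on reach("a") would never terminate.
graph = {"a": ["b"], "b": ["c"], "c": ["a"]}
reach = tabled_reachable(graph)
```

The active-call set is exactly what distinguishes tabling from plain memoization: the cache alone cannot help on the first pass through a cycle.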

  6. Protection of Mobile Agents Execution Using a Modified Self-Validating Branch-Based Software Watermarking with External Sentinel

    Science.gov (United States)

    Tomàs-Buliart, Joan; Fernández, Marcel; Soriano, Miguel

Critical infrastructures are usually controlled by software entities. To monitor the correct functioning of these entities, a solution based on the use of mobile agents is proposed. Some proposals to detect modifications of mobile agents, such as digital signature of the code, exist, but they are oriented to protecting software against modification or to verifying that an agent has been executed correctly. The aim of our proposal is to guarantee that the software is being executed correctly by a non-trusted host. The way proposed to achieve this objective is by improving the Self-Validating Branch-Based Software Watermarking scheme of Myles et al. The proposed modification is the incorporation of an external element called a sentinel, which controls branch targets. This technique, applied to mobile agents, can guarantee the correct operation of an agent or, at least, can detect suspicious behaviour of a malicious host during the execution of the agent instead of only after the execution of the agent has finished.

  7. About the Code of Practice of the European Mathematical Society

    DEFF Research Database (Denmark)

    Jensen, Arne

    2013-01-01

The Executive Committee of the European Mathematical Society created an Ethics Committee in the Spring of 2010. The first task of the Committee was to prepare a Code of Practice. This task was completed in the Spring of 2012 and the Code went into effect on 1 November 2012. Arne Jensen, author of this article, is Chair of the EMS Ethics Committee.

  8. User's guide for SLWDN9, a code for calculating flux-surfaced-averaging of alpha densities, currents, and heating in non-circular tokamaks

    International Nuclear Information System (INIS)

    Hively, L.M.; Miley, G.M.

    1980-03-01

The code calculates flux-surface-averaged values of alpha density, current, and electron/ion heating profiles in realistic, non-circular tokamak plasmas. The code is written in FORTRAN and executes on the CRAY-1 machine at the Magnetic Fusion Energy Computer Center

  9. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †.

    Science.gov (United States)

    Murdani, Muhammad Harist; Kwon, Joonho; Choi, Yoon-Ho; Hong, Bonghee

    2018-03-24

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes ( Ad-Hoc ) and neighborhood proximity ( Top-K ). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.
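
The weighted-sum metric described in the abstract, combining centroid distance with the intersecting road network, can be sketched as follows. The normalizations and parameter names are illustrative assumptions; the paper defines its own metric, of which this is only the general shape.

```python
import math

def zip_proximity(centroid_a, centroid_b, shared_roads, max_dist, max_roads, w=0.5):
    """Illustrative weighted-sum proximity between two ZIP code areas:
    a normalized centroid-distance term combined with a normalized
    road-network term. Lower scores mean closer ZIP codes."""
    dist = math.dist(centroid_a, centroid_b)
    dist_term = dist / max_dist                  # smaller distance -> closer
    road_term = 1.0 - shared_roads / max_roads   # more shared roads -> closer
    return w * dist_term + (1.0 - w) * road_term

# Nearby centroids plus many connecting roads should score as closer
# than distant centroids with few connecting roads.
near = zip_proximity((0, 0), (1, 1), shared_roads=8, max_dist=10.0, max_roads=10)
far = zip_proximity((0, 0), (6, 8), shared_roads=1, max_dist=10.0, max_roads=10)
```

Because both terms are normalized to [0, 1], the combined score stays in [0, 1] and the weight w directly trades off geometry against road connectivity.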

  10. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †

    Directory of Open Access Journals (Sweden)

    Muhammad Harist Murdani

    2018-03-01

Full Text Available In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.

  11. A study on the prediction capability of GOTHIC and HYCA3D code for local hydrogen concentrations

    International Nuclear Information System (INIS)

    Choi, Y. S.; Lee, W. J.; Lee, J. J.; Park, K. C.

    2002-01-01

In this study, the prediction capability of the GOTHIC and HYCA3D codes for local hydrogen concentrations was verified against experimental results. Among the experiments, executed by SNU and other organizations inside and outside of the country, the fast transient and the obstacle cases were selected. In the case of a large subcompartment, both codes show good agreement with the experimental data. But in the case of small and complex geometry or a fast transient, the results of the GOTHIC code differ significantly from the experimental ones. This indicates that the GOTHIC code is unsuitable for these cases. On the contrary, the HYCA3D code agrees well with all the experimental data.

  12. Installation of Monte Carlo neutron and photon transport code system MCNP4

    International Nuclear Information System (INIS)

    Takano, Makoto; Sasaki, Mikio; Kaneko, Toshiyuki; Yamazaki, Takao.

    1993-03-01

The continuous energy Monte Carlo code MCNP-4, including its graphic functions, has been installed on a Sun-4 SPARC-2 workstation with minor corrections. In order to validate the installed MCNP-4 code, 25 sample problems were executed on the workstation and the results were compared with the original ones. Most of the graphic functions have been demonstrated using 3 sample problems. Further, an additional 14 nuclides have been included in the continuous cross section library edited from JENDL-3. (author)
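
An installation-validation step like the one described, rerunning sample problems on the new platform and comparing against reference results, is commonly automated with a small comparison script. This is a hedged sketch; the problem names, values, and tolerance below are invented for illustration and are not MCNP-4 outputs.

```python
def compare_results(reference, installed, rel_tol=1e-3):
    """Compare per-problem results from a fresh installation against
    reference values, flagging any problem whose result deviates by
    more than a relative tolerance."""
    failures = []
    for name, ref in reference.items():
        new = installed[name]
        if abs(new - ref) > rel_tol * abs(ref):
            failures.append(name)
    return failures

# Illustrative reference vs. newly installed results for two sample problems.
reference = {"sample01": 1.0023, "sample02": 0.9871}
installed = {"sample01": 1.0024, "sample02": 0.9870}
clean = compare_results(reference, installed)
```

For a Monte Carlo code the tolerance would in practice be tied to the reported statistical uncertainty of each tally rather than a fixed fraction.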

  13. Development of the versatile reactor analysis code system, MARBLE2

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Jin, Tomoyuki; Hazama, Taira; Hirai, Yasushi

    2015-07-01

The second version of the versatile reactor analysis code system, MARBLE2, has been developed. Many new functions have been added in MARBLE2 using the base technology developed in the first version (MARBLE1). By introducing the remaining functions of the conventional code system (JOINT-FR and SAGEP-FR), MARBLE2 enables one to execute almost all analysis functions of the conventional code system with the unified user interfaces of its subsystem, SCHEME. In particular, the sensitivity analysis functionality is available in MARBLE2. On the other hand, new built-in solvers have been developed, and existing ones have been upgraded. Furthermore, some other analysis codes and libraries developed in JAEA have been consolidated and prepared in SCHEME. In addition, several analysis codes developed at other institutes have been introduced as plug-in solvers. Consequently, gamma-ray transport calculation and heating evaluation become available. As for the other subsystem, ORPHEUS, various functionality updates and speed-up techniques have been applied based on user experience with MARBLE1 to enhance its usability. (author)

  14. Investigating executive functions in children with severe speech and movement disorders using structured tasks

    Directory of Open Access Journals (Sweden)

Kristine Stadskleiv

    2014-09-01

Full Text Available Executive functions are the basis for goal-directed activity and include planning, monitoring, and inhibition, and language seems to play a role in the development of these functions. There is a tradition of studying executive function in both typical and atypical populations, and the present study investigates executive functions in children with severe speech and motor impairments who are communicating using communication aids with graphic symbols, letters and/or words. There are few neuropsychological studies of children in this group and little is known about their cognitive functioning, including executive functions. It was hypothesized that aided communication would tax executive functions more than speech. Twenty-nine children using communication aids and 27 naturally speaking children participated. Structured tasks resembling everyday activities, where the action goals had to be reached through communication with a partner, were used to obtain information about executive functions. The children (a) directed the partner to perform actions like building a Lego tower from a model the partner could not see and (b) gave information about an object without naming it to a person who had to guess what object it was. The executive functions of planning, monitoring and impulse control were coded from the children's on-task behavior. Both groups solved most of the tasks correctly, indicating that aided communicators are able to use language to direct another person to do a complex set of actions. Planning and lack of impulsivity were positively related to task success in both groups. The aided group completed significantly fewer tasks, spent longer time and showed more variation in performance than the comparison group. The aided communicators scored lower on planning and showed more impulsivity than the comparison group, while both groups showed an equal degree of monitoring of the work progress.
The results are consistent with the hypothesis that aided language

  15. Build and Execute Environment

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-21

    workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration. When application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.

  16. Development of the point-depletion code DEPTH

    International Nuclear Information System (INIS)

    She, Ding; Wang, Kan; Yu, Ganglin

    2013-01-01

Highlights: ► The DEPTH code has been developed for large-scale depletion systems. ► DEPTH uses a data library which is convenient to couple with MC codes. ► TTA and matrix exponential methods are implemented and compared. ► DEPTH is able to calculate integral quantities based on the matrix inverse. ► Code-to-code comparisons prove the accuracy and efficiency of DEPTH. -- Abstract: Burnup analysis is an important aspect of reactor physics, generally done by coupling transport calculations with point-depletion calculations. DEPTH is a newly-developed point-depletion code for handling large depletion systems and detailed depletion chains. For better coupling with Monte Carlo transport codes, DEPTH uses data libraries based on the combination of ORIGEN-2 and ORIGEN-S and allows users to assign problem-dependent libraries for each depletion step. DEPTH implements various algorithms for treating stiff depletion systems, including the Transmutation Trajectory Analysis (TTA), the Chebyshev Rational Approximation Method (CRAM), the Quadrature-based Rational Approximation Method (QRAM) and the Laguerre Polynomial Approximation Method (LPAM). Three different modes are supported by DEPTH to execute decay, constant-flux and constant-power calculations. In addition to obtaining the instantaneous quantities of radioactivity, decay heats and reaction rates, DEPTH is able to calculate integral quantities with a time-integrated solver. Calculations compared with ORIGEN-2 prove the validity of DEPTH in point-depletion calculations. The accuracy and efficiency of the depletion algorithms are also discussed. In addition, an actual pin-cell burnup case is calculated to illustrate the DEPTH code's performance in coupling with the RMC Monte Carlo code.
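
A point-depletion solve amounts to evaluating N(t) = exp(At) N(0) for the depletion matrix A. The sketch below solves a two-nuclide decay chain with a plain truncated Taylor series and checks it against the analytic Bateman solution; it is only a didactic illustration, since production codes like DEPTH use methods such as CRAM precisely because real depletion matrices are stiff and a Taylor series would fail on them.

```python
import math

def expm(A, t, terms=40):
    """Truncated Taylor series for exp(A*t); adequate for this tiny,
    well-scaled demo, not for stiff production depletion systems."""
    n = len(A)
    At = [[a * t for a in row] for row in A]
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in result]
    for k in range(1, terms):
        # term <- term @ At / k, accumulating (At)^k / k!
        term = [[sum(term[i][m] * At[m][j] for m in range(n)) / k
                 for j in range(n)] for i in range(n)]
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

# Decay chain N1 -> N2 with decay constants l1, l2 (illustrative values).
l1, l2 = 0.3, 0.1
A = [[-l1, 0.0],
     [ l1, -l2]]
N0 = [1000.0, 0.0]
t = 5.0
E = expm(A, t)
N = [sum(E[i][j] * N0[j] for j in range(2)) for i in range(2)]

# Analytic Bateman solution for the same two-nuclide chain.
n1 = N0[0] * math.exp(-l1 * t)
n2 = N0[0] * l1 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
```

The matrix-exponential and chain-by-chain (TTA/Bateman) routes agree here, which is the code-to-code consistency the abstract's comparisons establish on a much larger scale.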

  17. SIERRA Code Coupling Module: Arpeggio User Manual Version 4.44

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Thermal/Fluid Team

    2017-04-01

The SNL Sierra Mechanics code suite is designed to enable simulation of complex multiphysics scenarios. The code suite is composed of several specialized applications which can operate either in standalone mode or coupled with each other. Arpeggio is a supported utility that enables loose coupling of the various Sierra Mechanics applications by providing access to Framework services that facilitate the coupling. More importantly, Arpeggio orchestrates the execution of the applications that participate in the coupling. This document describes the various components of Arpeggio and their operation. The intent of the document is to provide a fast path for analysts interested in coupled applications via simple examples of its usage.

  18. THYDE-NEU: Nuclear reactor system analysis code

    International Nuclear Information System (INIS)

    Asahi, Yoshiro

    2002-03-01

THYDE-NEU is applicable not only to transient analyses, but also to steady state analyses of nuclear reactor systems (NRSs). In a steady state analysis, the code generates a solution satisfying the transient equations without external disturbances. In a transient analysis, the code calculates temporal NRS behaviors in response to various external disturbances in such a way that mass and energy of the coolant as well as the number of neutrons are conserved. The first half of the report is the description of the methods and models used in the THYDE-NEU code, i.e., (1) the thermal-hydraulic network model, (2) the spatial kinetics model, (3) the heat sources in fuel, (4) the heat transfer correlations, (5) the mechanical behavior of clad and fuel, and (6) the steady state adjustment. The second half of the report is the users' manual containing the items: (1) the program control, (2) the input requirements, (3) the execution of THYDE-NEU jobs, (4) the output specifications and (5) the sample calculation. (author)

  19. The Event Coordination Notation: Execution Engine and Programming Framework

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2012-01-01

ECNO (Event Coordination Notation) is a notation for modelling the behaviour of a software system on top of some object-oriented data model. ECNO has two main objectives: On the one hand, ECNO should allow modelling the behaviour of a system on the domain level; on the other hand, it should be possible to completely generate code from ECNO and the underlying object-oriented domain models. Today, there are several approaches that would allow to do this. But, most of them would require that the data models and the behaviour models use the same technology and that the code is generated together ... that was written manually. In this paper, we rephrase the main concepts of ECNO. The focus of this paper, however, is on the architecture of the ECNO execution engine and its programming framework. We will show how this framework allows us to integrate ECNO with object-oriented models, and how it works without any ...

  20. A flexible coupling scheme for Monte Carlo and thermal-hydraulics codes

    Energy Technology Data Exchange (ETDEWEB)

    Hoogenboom, J. Eduard, E-mail: J.E.Hoogenboom@tudelft.nl [Delft University of Technology (Netherlands); Ivanov, Aleksandar; Sanchez, Victor, E-mail: Aleksandar.Ivanov@kit.edu, E-mail: Victor.Sanchez@kit.edu [Karlsruhe Institute of Technology, Institute of Neutron Physics and Reactor Technology, Eggenstein-Leopoldshafen (Germany); Diop, Cheikh, E-mail: Cheikh.Diop@cea.fr [CEA/DEN/DANS/DM2S/SERMA, Commissariat a l' Energie Atomique, Gif-sur-Yvette (France)

    2011-07-01

A coupling scheme between a Monte Carlo code and a thermal-hydraulics code is being developed within the European NURISP project for comprehensive and validated reactor analysis. The scheme is flexible as it allows different Monte Carlo codes and different thermal-hydraulics codes to be used. At present the MCNP and TRIPOLI4 Monte Carlo codes can be used, together with the FLICA4 and SubChanFlow thermal-hydraulics codes. For all these codes only an original executable is necessary. A Python script drives the iterations between Monte Carlo and thermal-hydraulics calculations. It also calls a conversion program to merge a master input file for the Monte Carlo code with the appropriate temperature and coolant density data from the thermal-hydraulics calculation. Likewise it calls another conversion program to merge a master input file for the thermal-hydraulics code with the power distribution data from the Monte Carlo calculation. Special attention is given to the neutron cross section data for the various required temperatures in the Monte Carlo calculation. Results are shown for an infinite lattice of PWR fuel pin cells and a 3 x 3 BWR fuel pin cell cluster. Various possibilities for further improvement and optimization of the coupling system are discussed. (author)
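
The driver loop described above, a Python script alternating Monte Carlo and thermal-hydraulics runs and passing each code's output into the other's input, can be sketched as follows. This is a hedged sketch: the callables stand in for merging master input files and launching the real executables, and the toy solvers below are invented stand-ins with a known fixed point, not MCNP/TRIPOLI4 or FLICA4/SubChanFlow.

```python
def couple(mc_solve, th_solve, n_iter=100, tol=1e-6):
    """Fixed-point iteration between a Monte Carlo power solve and a
    thermal-hydraulics solve: each iteration feeds the current
    temperature field to the MC step and the resulting power map to
    the TH step, until the power distribution stops changing."""
    temps = None     # first MC run uses nominal (uninitialized) conditions
    power = None
    for _ in range(n_iter):
        new_power = mc_solve(temps)   # MC transport with current T/density
        temps = th_solve(new_power)   # TH solve with current power map
        if power is not None and max(
                abs(a - b) for a, b in zip(new_power, power)) < tol:
            power = new_power
            break
        power = new_power
    return power, temps

# Toy stand-ins: a linear MC response to temperature and a linear TH
# response to power, chosen so the iteration has a known fixed point.
def mc_solve(temps):
    base = [1.2, 0.8]
    if temps is None:
        return base
    return [0.5 * b + 0.5 * t for b, t in zip(base, temps)]

def th_solve(power):
    return [0.9 * p + 0.1 for p in power]

power, temps = couple(mc_solve, th_solve)
```

In the real scheme each callable would write a merged input file, run the external executable, and parse its output, but the convergence loop has exactly this shape.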

  1. A flexible coupling scheme for Monte Carlo and thermal-hydraulics codes

    International Nuclear Information System (INIS)

    Hoogenboom, J. Eduard; Ivanov, Aleksandar; Sanchez, Victor; Diop, Cheikh

    2011-01-01

    A coupling scheme between a Monte Carlo code and a thermal-hydraulics code is being developed within the European NURISP project for comprehensive and validated reactor analysis. The scheme is flexible as it allows different Monte Carlo codes and different thermal-hydraulics codes to be used. At present the MCNP and TRIPOLI4 Monte Carlo codes can be used and the FLICA4 and SubChanFlow thermal-hydraulics codes. For all these codes only an original executable is necessary. A Python script drives the iterations between Monte Carlo and thermal-hydraulics calculations. It also calls a conversion program to merge a master input file for the Monte Carlo code with the appropriate temperature and coolant density data from the thermal-hydraulics calculation. Likewise it calls another conversion program to merge a master input file for the thermal-hydraulics code with the power distribution data from the Monte Carlo calculation. Special attention is given to the neutron cross section data for the various required temperatures in the Monte Carlo calculation. Results are shown for an infinite lattice of PWR fuel pin cells and a 3 x 3 fuel BWR pin cell cluster. Various possibilities for further improvement and optimization of the coupling system are discussed. (author)

  2. Runtime Detection of C-Style Errors in UPC Code

    Energy Technology Data Exchange (ETDEWEB)

    Pirkelbauer, P; Liao, C; Panas, T; Quinlan, D

    2011-09-29

Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.

  3. V.S.O.P. (99/05) computer code system

    International Nuclear Information System (INIS)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Scherer, W.

    2005-11-01

V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady state and time-dependent) is restricted to HTRs and to two spatial dimensions. The code can simulate the reactor operation from the initial core towards the equilibrium core. V.S.O.P. (99/05) represents the further development of V.S.O.P. (99). Compared to its precursor, the code system has been improved in many details. Major improvements and extensions have been included concerning the neutron spectrum calculation, the 3-d neutron diffusion options, and the thermal hydraulic section with respect to 'multi-pass'-fuelled pebble-bed cores. This latest code version was developed and tested under the WINDOWS-XP operating system. The storage requirement for the executables and the basic libraries associated with the code amounts to about 15 MB. Another 5 MB are required, if desired, for storage of the source code (∼65000 Fortran statements). (orig.)

  4. V.S.O.P. (99/05) computer code system

    Energy Technology Data Exchange (ETDEWEB)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Scherer, W.

    2005-11-01

V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady state and time-dependent) is restricted to HTRs and to two spatial dimensions. The code can simulate the reactor operation from the initial core towards the equilibrium core. V.S.O.P. (99/05) represents the further development of V.S.O.P. (99). Compared to its precursor, the code system has been improved in many details. Major improvements and extensions have been included concerning the neutron spectrum calculation, the 3-d neutron diffusion options, and the thermal hydraulic section with respect to 'multi-pass'-fuelled pebble-bed cores. This latest code version was developed and tested under the WINDOWS-XP operating system. The storage requirement for the executables and the basic libraries associated with the code amounts to about 15 MB. Another 5 MB are required, if desired, for storage of the source code (∼65000 Fortran statements). (orig.)

  5. The SWAN coupling code: user's guide

    International Nuclear Information System (INIS)

    Litaudon, X.; Moreau, D.

    1988-11-01

    Coupling of slow waves in a plasma near the lower hybrid frequency is well known, and linear theory with a density step followed by a constant gradient can be used with some confidence. With the aid of the computer code SWAN, which stands for 'Slow Wave Antenna', the following parameters can be numerically calculated: n parallel power spectrum, directivity (weighted by the current drive efficiency), reflection coefficients (amplitude and phase) both before and after the E-plane junctions, scattering matrix at the plasma interface, scattering matrix at the E-plane junctions, maximum electric fields in secondary waveguides and the locations where they occur, the effect of passive waveguides on each side of the antenna, and the effect of a finite magnetic field in front of the antenna (for a homogeneous plasma). This manual gives the basic information on the main assumptions of the coupling theory and on the use and general structure of the code itself. It answers the following questions: What are the main assumptions of the physical model? How is a job executed? What are the input parameters of the code? What are the output results, and where are they written? (author)

  6. Standardization of the time for the execution of HANARO start-up and shutdown procedures

    International Nuclear Information System (INIS)

    Choi, H. Y.; Lim, I. C.; Hwang, S. R.; Kang, T. J.; Youn, D. B.

    2003-01-01

    For the standardization of the time to execute HANARO start-up and shutdown procedures, code names were assigned to the individual procedures and the work times were investigated. The data recorded by the operators during start-up and shutdown were statistically analyzed. The analysis results will be used for the standardization of the start-up and shutdown procedures and will be reflected in the procedure document

  7. Development and Execution of an Impact Cratering Application on a Computational Grid

    Directory of Open Access Journals (Sweden)

    E. Huedo

    2005-01-01

    Full Text Available Impact cratering is an important geological process of special interest in Astrobiology. Its numerical simulation comprises the execution of a high number of tasks, since the search space of input parameter values includes the projectile diameter, the water depth and the impactor velocity. Furthermore, the execution time of each task is not uniform because of the different numerical properties of each experimental configuration. Grid technology is a promising platform on which to execute this kind of application, since it provides the end user with a performance much higher than that achievable within any single organization. However, the scheduling of each task on a Grid involves challenging issues due to the unpredictable and heterogeneous behavior of both the Grid and the numerical code. This paper evaluates the performance of a Grid infrastructure based on the Globus toolkit and the GridWay framework, which provides the adaptive and fault-tolerance functionality required to harness Grid resources, in the simulation of the impact cratering process. The experiments have been performed on a testbed composed of resources shared by five sites interconnected by RedIRIS, the Spanish Research and Education Network.
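
    The record describes a parameter sweep over projectile diameter, water depth and impactor velocity, with non-uniform per-task runtimes scheduled across five Grid sites. A minimal Python sketch of that pattern follows; the parameter ranges and the per-task cost model are entirely illustrative assumptions, since the abstract gives neither.

```python
from itertools import product

# Illustrative parameter ranges (the actual values used in the study
# are not given in the abstract).
diameters = [100, 250, 500, 1000]   # projectile diameter (m)
depths = [0, 500, 1000, 3000]       # water depth (m)
velocities = [15, 20, 30]           # impactor velocity (km/s)

tasks = list(product(diameters, depths, velocities))

def estimate_cost(d, h, v):
    """Hypothetical cost model: task runtime is non-uniform, here
    assumed to grow with diameter and velocity."""
    return d * v / (1 + h / 1000)

# Greedy longest-processing-time assignment to 5 sites: sort tasks by
# estimated cost, always give the next task to the least-loaded site.
loads = {site: 0.0 for site in range(5)}
schedule = {site: [] for site in range(5)}
for task in sorted(tasks, key=lambda t: -estimate_cost(*t)):
    site = min(loads, key=loads.get)
    schedule[site].append(task)
    loads[site] += estimate_cost(*task)
```

    GridWay performs this kind of assignment adaptively at runtime; the static greedy heuristic here only illustrates why a cost estimate matters when task times are non-uniform.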

  8. The structure of affective action representations: temporal binding of affective response codes.

    Science.gov (United States)

    Eder, Andreas B; Müsseler, Jochen; Hommel, Bernhard

    2012-01-01

    Two experiments examined the hypothesis that preparing an action with a specific affective connotation involves the binding of this action to an affective code reflecting this connotation. This integration into an action plan should lead to a temporary occupation of the affective code, which should impair the concurrent representation of affectively congruent events, such as the planning of another action with the same valence. This hypothesis was tested with a dual-task setup that required a speeded choice between approach- and avoidance-type lever movements after having planned and before having executed an evaluative button press. In line with the code-occupation hypothesis, slower lever movements were observed when the lever movement was affectively compatible with the prepared evaluative button press than when the two actions were affectively incompatible. Lever movements related to approach and avoidance and evaluative button presses thus seem to share a code that represents affective meaning. A model of affective action control that is based on the theory of event coding is discussed.

  9. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    Science.gov (United States)

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until the code in question is returned as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
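
    The evaluation procedure described above (build queries from word combinations, stop at the smallest combination that top-ranks the target code) can be sketched in a few lines of Python. The three-entry code table and the relevance score below are toy assumptions, not the engines or data from the study.

```python
from itertools import combinations

# Toy stand-in for the ICD-10 table (real evaluations load the full list).
ICD10 = {
    "E10": "type 1 diabetes mellitus",
    "E11": "type 2 diabetes mellitus",
    "I10": "essential primary hypertension",
}

def score(query_words, text):
    """Crude relevance score: fraction of the entry's words matched."""
    words = text.split()
    return sum(w in words for w in query_words) / len(words)

def min_words_for_match(code):
    """Smallest word combination that uniquely top-ranks `code`."""
    words = ICD10[code].split()
    for k in range(1, len(words) + 1):
        for combo in combinations(words, k):
            scores = {c: score(combo, t) for c, t in ICD10.items()}
            top = scores[code]
            if top > 0 and all(s < top for c, s in scores.items() if c != code):
                return combo
    return tuple(words)
```

    For the toy table, a single discriminating word suffices (e.g. "1" separates E10 from E11); real ICD-10 texts overlap far more, which is what makes the minimum-word question interesting.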

  10. A ''SuperCode'' for performing systems analysis of tokamak experiments and reactors

    International Nuclear Information System (INIS)

    Haney, S.W.; Barr, W.L.; Crotinger, J.A.; Perkins, L.J.; Solomon, C.J.; Chaniotakis, E.A.; Freidberg, J.P.; Wei, J.; Galambos, J.D.; Mandrekas, J.

    1992-01-01

    A new code, named the ''SUPERCODE,'' has been developed to fill the gap between currently available zero-dimensional systems codes and highly sophisticated, multidimensional plasma performance codes. The former are comprehensive in content and fast to execute, but rather simple in terms of the accuracy of their physics and engineering models. The latter contain state-of-the-art plasma physics modelling but are limited in engineering content and time consuming to run. The SUPERCODE upgrades the reliability and accuracy of systems codes by calculating the self-consistent 1 1/2-dimensional MHD-transport plasma evolution in a realistic engineering environment. By a combination of variational techniques and careful formulation, there is only a modest increase in CPU time over 0-D runs, thereby making the SUPERCODE suitable for use as a systems studies tool. In addition, considerable effort has been expended to make the code user- and programming-friendly, as well as operationally flexible, with the hope of encouraging wide usage throughout the fusion community

  11. 5 CFR 842.211 - Senior Executive Service, Defense Intelligence Senior Executive Service, and Senior Cryptologic...

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Senior Executive Service, Defense Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. 842.211 Section 842.211... EMPLOYEES RETIREMENT SYSTEM-BASIC ANNUITY Eligibility § 842.211 Senior Executive Service, Defense...

  12. Remodularizing Java Programs for Improved Locality of Feature Implementations in Source Code

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    Explicit traceability between features and source code is known to help programmers to understand and modify programs during maintenance tasks. However, the complex relations between features and their implementations are not evident from the source code of object-oriented Java programs.... Consequently, the implementations of individual features are difficult to locate, comprehend, and modify in isolation. In this paper, we present a novel remodularization approach that improves the representation of features in the source code of Java programs. Both forward- and reverse restructurings...... are supported through on-demand bidirectional restructuring between feature-oriented and object-oriented decompositions. The approach includes a feature location phase based on tracing program execution, a feature representation phase that reallocates classes into a new package structure based on single...

  13. OFFSCALE: A PC input processor for the SCALE code system. The ORIGNATE processor for ORIGEN-S

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-11-01

    OFFSCALE is a suite of personal computer input processor programs developed at Oak Ridge National Laboratory to provide an easy-to-use interface for modules in the SCALE-4 code system. ORIGNATE is a program in the OFFSCALE suite that serves as a user-friendly interface for the ORIGEN-S isotopic generation and depletion code. It is designed to assist an ORIGEN-S user in preparing an input file for execution of light-water-reactor (LWR) fuel depletion and decay cases. ORIGNATE generates an input file that may be used to execute ORIGEN-S in SCALE-4. ORIGNATE features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up an ORIGEN-S input file and perform error checking. This capability increases productivity and decreases the chance of user error

  14. Regulation of Say on Pay: Engineering Incentives for Executives and Directors – Experiences from the United States and Implications for Regulation in Switzerland

    OpenAIRE

    Müller, Lukas

    2011-01-01

    The debate about the compensation of executives and directors is a discussion about incentives and agency costs. This article analyzes basic tools to reduce agency costs and also assesses the ongoing debate about the future regulation of the compensation of executives and directors. It draws upon legislative experience from the United States. Recently proposed legislation in Switzerland attempts to empower shareholders with the draft of the Swiss Code of Obligations (CO). The main motivation ...

  15. Memory bottlenecks and memory contention in multi-core Monte Carlo transport codes

    International Nuclear Information System (INIS)

    Tramm, J.R.; Siegel, A.R.

    2013-01-01

    The simulation of whole nuclear cores through the use of Monte Carlo codes requires an impracticably long time-to-solution. We have extracted a kernel that executes only the most computationally expensive steps of the Monte Carlo particle transport algorithm - the calculation of macroscopic cross sections - in an effort to expose bottlenecks within multi-core, shared memory architectures. (authors)
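
    The kernel described above computes macroscopic cross sections: a density-weighted sum of per-nuclide table lookups with interpolation, which is dominated by irregular memory accesses rather than arithmetic. A hedged NumPy sketch of such a lookup kernel follows; the grids, cross sections and densities are random placeholders, not real nuclear data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nuclides, n_grid = 50, 1000
# Each nuclide has its own sorted energy grid and tabulated cross sections.
egrid = np.sort(rng.random((n_nuclides, n_grid)))  # per-nuclide energy grids
xs = rng.random((n_nuclides, n_grid))              # microscopic cross sections
density = rng.random(n_nuclides)                   # number densities

def macro_xs(E):
    """Macroscopic cross section at energy E: binary-search each
    nuclide's grid, interpolate, and sum weighted by density.  The
    scattered table reads are what make this kernel memory-bound."""
    total = 0.0
    for i in range(n_nuclides):
        j = np.searchsorted(egrid[i], E)
        j = min(max(j, 1), n_grid - 1)             # clamp to valid interval
        f = (E - egrid[i, j - 1]) / (egrid[i, j] - egrid[i, j - 1])
        total += density[i] * ((1 - f) * xs[i, j - 1] + f * xs[i, j])
    return total
```

    Every call touches fifty different tables at data-dependent offsets, so on multi-core hardware many concurrent histories contend for memory bandwidth, which is the bottleneck the paper exposes.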

  16. Validity of the Italian Code of Ethics for everyday nursing practice.

    Science.gov (United States)

    Gobbi, Paola; Castoldi, Maria Grazia; Alagna, Rosa Anna; Brunoldi, Anna; Pari, Chiara; Gallo, Annamaria; Magri, Miriam; Marioni, Lorena; Muttillo, Giovanni; Passoni, Claudia; Torre, Anna La; Rosa, Debora; Carnevale, Franco A

    2016-12-07

    The research question for this study was as follows: Is the Code of Ethics for Nurses in Italy (Code) a valid or useful decision-making instrument for nurses faced with ethical problems in their daily clinical practice? Focus groups were conducted to analyze specific ethical problems through 11 case studies. The analysis was conducted using sections of the Code as well as other relevant documents. Each focus group had a specific theme and nurses participated freely in the discussions according to their respective clinical competencies. The executive administrative committee of the local nursing licensing council provided approval for conducting this project. Measures were taken to protect the confidentiality of consenting participants. The answer to the research question posed for this investigation was predominantly positive. Many sections of the Code were useful for discussion and identifying possible solutions for the ethical problems presented in the 11 cases. We concluded that the Code of Ethics for Nurses in Italy can be a valuable aid in daily practice in most clinical situations that can give rise to ethical problems. © The Author(s) 2016.

  17. A comparison of two three-dimensional shell-element transient electromagnetics codes

    International Nuclear Information System (INIS)

    Yugo, J.J.; Williamson, D.E.

    1992-01-01

    Electromagnetic forces due to eddy currents strongly influence the design of components for the next generation of fusion devices. An effort has been made to benchmark two computer programs used to generate transient electromagnetic loads: SPARK and EddyCuFF. Two simple transient field problems were analyzed, both of which had been previously analyzed by the SPARK code with results recorded in the literature. A third problem that uses an ITER inboard blanket benchmark model was analyzed as well. This problem was driven with a self-consistent, distributed multifilament plasma model generated by an axisymmetric physics code. The benchmark problems showed good agreement between the two shell-element codes. Variations in calculated eddy currents of 1--3% have been found for similar, finely meshed models. A difference of 8% was found in induced current and 20% in force for a coarse mesh and complex, multifilament field driver. Because comparisons were made to results obtained from literature, model preparation and code execution times were not evaluated

  18. Development of a parallelization strategy for the VARIANT code

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Khalil, H.S.; Palmiotti, G.; Tatsumi, M.

    1996-01-01

    The VARIANT code solves the multigroup steady-state neutron diffusion and transport equation in three-dimensional Cartesian and hexagonal geometries using the variational nodal method. VARIANT consists of four major parts that must be executed sequentially: input handling, calculation of response matrices, the solution algorithm (i.e., inner-outer iteration), and output of results. The objective of the parallelization effort was to reduce the overall computing time by distributing the work of the two computationally intensive (sequential) tasks, the coupling coefficient calculation and the iterative solver, equally among a group of processors. This report describes the code's calculations and gives performance results on one of the benchmark problems used to test the code. The performance analysis on the IBM SPx system shows good efficiency for well-load-balanced programs. Even for relatively small problem sizes, respectable efficiencies are seen on the SPx. An extension to achieve a higher degree of parallelism will be addressed in future work. 7 refs., 1 tab

  19. WinBUGSio: A SAS Macro for the Remote Execution of WinBUGS

    Directory of Open Access Journals (Sweden)

    Michael K. Smith

    2007-09-01

    Full Text Available This is a macro which facilitates remote execution of WinBUGS from within SAS. The macro pre-processes data for WinBUGS, writes the WinBUGS batch-script, executes this script and reads in output statistics from the WinBUGS log-file back into SAS native format. The user specifies the input and output file names and directory path as well as the statistics to be monitored in WinBUGS. The code works best for a model that has already been set up and checked for convergence diagnostics within WinBUGS. An obvious extension of the use of this macro is for running simulations where the input and output files all have the same name but all that differs between simulation iterations is the input dataset. The functionality and syntax of the macro call are described in this paper and illustrated using a simple linear regression model.
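
    The macro's workflow (pre-process data, write a batch script, execute it, read summary statistics back from the log) is SAS-specific, but the same pipeline can be sketched in Python. The file names, the Windows `cmd` invocation, and the log format below are purely illustrative assumptions, not the macro's actual interface.

```python
import subprocess
from pathlib import Path

def run_winbugs(script_text, script_path="run.bat", log_path="log.txt"):
    """Hypothetical Python analogue of the macro's pipeline: write the
    batch script, execute it, then parse statistics from the log."""
    Path(script_path).write_text(script_text)               # write batch script
    subprocess.run(["cmd", "/c", script_path], check=True)  # execute it
    return parse_log(Path(log_path).read_text())            # read stats back

def parse_log(text):
    """Pull 'name mean sd' style summary lines out of a log (assumed
    format); non-numeric lines are skipped."""
    stats = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 3:
            name, mean, sd = parts
            try:
                stats[name] = (float(mean), float(sd))
            except ValueError:
                pass  # header or diagnostic line, not a statistic
    return stats
```

    As in the macro, only the input dataset needs to change between simulation iterations; the script and log names can stay fixed.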

  20. Applying Hamming Code to Memory System of Safety Grade PLC (POSAFE-Q) Processor Module

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taehee; Hwang, Sungjae; Park, Gangmin [POSCO Nuclear Technology, Seoul (Korea, Republic of)

    2013-05-15

    If some errors such as inverted bits occur in the memory, instructions and data will be corrupted. As a result, the PLC may execute the wrong instructions or refer to the wrong data. Hamming code can be considered as a solution for mitigating this misoperation. In this paper, we apply Hamming code and then inspect whether it is suitable for the memory system of the processor module. We applied Hamming code to the existing safety-grade PLC (POSAFE-Q). Inspection data are collected and will be referred to for improving the soundness of the PLC. In our future work, we will try to improve the time delay caused by the Hamming calculation. This will include CPLD optimization and memory architecture or parts alteration. In addition to these Hamming-code-based works, we will explore methodologies such as mirroring for the soundness of the safety-grade PLC. Hamming-code-based works can correct single-bit errors, but they are limited with respect to multi-bit errors.
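
    As background to the approach, a Hamming(7,4) code protects 4 data bits with 3 parity bits and corrects any single flipped bit by recomputing the parity checks: the three check results form a syndrome that directly indexes the erroneous position. A minimal sketch (the PLC implementation runs in a CPLD, not in software):

```python
def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

def hamming74_correct(c):
    """Return (corrected codeword, error position or 0 if clean)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # the failing checks spell the position
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the single erroneous bit back
    return c, syndrome
```

    A double-bit error produces a nonzero syndrome pointing at the wrong position, which is the multi-bit limitation the abstract notes; SECDED variants add one more parity bit to at least detect that case.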

  1. Applying Hamming Code to Memory System of Safety Grade PLC (POSAFE-Q) Processor Module

    International Nuclear Information System (INIS)

    Kim, Taehee; Hwang, Sungjae; Park, Gangmin

    2013-01-01

    If some errors such as inverted bits occur in the memory, instructions and data will be corrupted. As a result, the PLC may execute the wrong instructions or refer to the wrong data. Hamming code can be considered as a solution for mitigating this misoperation. In this paper, we apply Hamming code and then inspect whether it is suitable for the memory system of the processor module. We applied Hamming code to the existing safety-grade PLC (POSAFE-Q). Inspection data are collected and will be referred to for improving the soundness of the PLC. In our future work, we will try to improve the time delay caused by the Hamming calculation. This will include CPLD optimization and memory architecture or parts alteration. In addition to these Hamming-code-based works, we will explore methodologies such as mirroring for the soundness of the safety-grade PLC. Hamming-code-based works can correct single-bit errors, but they are limited with respect to multi-bit errors

  2. Code for plant identification (KKS) key in PC version

    International Nuclear Information System (INIS)

    Pannenbaecker, K.

    1991-01-01

    The plant identification system (KKS), a joint development of German plant operators, construction firms, and other power-plant-oriented organisations, has decisively influenced the technical and organisational activities of planning and erection as well as the operation and maintenance of all kinds of power plants. Its fundamentals are three key parts: the operation, armature, and function keys. Their management and application are handled by a plant-identification key code in a PC version, which is briefly described in this report. (orig.) [de

  3. Commentary: Mentoring the mentor: executive coaching for clinical departmental executive officers.

    Science.gov (United States)

    Geist, Lois J; Cohen, Michael B

    2010-01-01

    Departmental executive officers (DEOs), department chairs, and department heads in medical schools are often hired on the basis of their accomplishments in research as well as their skills in administration, management, and leadership. These individuals are also expected to be expert in multiple areas, including negotiation, finance and budgeting, mentoring, and personnel management. At the same time, they are expected to maintain and perhaps even enhance their personal academic standing for the purposes of raising the level of departmental and institutional prestige and for recruiting the next generation of physicians and scientists. In the corporate world, employers understand the importance of training new leaders in requisite skill enhancement that will lead to success in their new positions. These individuals are often provided with extensive executive training to develop the necessary competencies to make them successful leaders. Among the tools employed for this purpose are the use of personal coaches or executive training courses. The authors propose that the use of executive coaching in academic medicine may be of benefit for new DEOs. Experience using an executive coach suggests that this was a valuable growth experience for new leaders in the institution.

  4. Single-instruction multiple-data execution

    CERN Document Server

    Hughes, Christopher J

    2015-01-01

    Having hit power limitations to even more aggressive out-of-order execution in processor cores, many architects in the past decade have turned to single-instruction-multiple-data (SIMD) execution to increase single-threaded performance. SIMD execution, or having a single instruction drive execution of an identical operation on multiple data items, was already well established as a technique to efficiently exploit data parallelism. Furthermore, support for it was already included in many commodity processors. However, in the past decade, SIMD execution has seen a dramatic increase in the set of

  5. Executive Orders from 1994-2013

    Data.gov (United States)

    National Archives and Records Administration — The President of the United States manages the operations of the Executive branch of Government through Executive orders. After the President signs an Executive...

  6. 75 FR 55574 - Joint Public Roundtable on Swap Execution Facilities and Security-Based Swap Execution Facilities

    Science.gov (United States)

    2010-09-13

    ...; File No. 4-612] Joint Public Roundtable on Swap Execution Facilities and Security-Based Swap Execution Facilities AGENCY: Commodity Futures Trading Commission (``CFTC'') and Securities and Exchange Commission... discuss swap execution facilities and security-based swap execution facilities in the context of certain...

  7. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    , can be written as short C kernels operating locally on the underlying mesh, with no explicit parallelism. The executable code is then generated in C, CUDA or OpenCL and executed in parallel on the target architecture. The system also offers features of special relevance to the geosciences. In particular, the large scale separation between the vertical and horizontal directions in many geoscientific processes can be exploited to offer the flexibility of unstructured meshes in the horizontal direction, without the performance penalty usually associated with those methods.

  8. Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder

    Science.gov (United States)

    Staats, Matt

    2009-01-01

    We present work on a prototype tool based on the JavaPathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF enables various tasks useful in automatic test generation to be performed, such as test suite reduction and execution of generated tests.
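
    For context, an MC/DC obligation requires, for each condition in a decision, a pair of tests that differ only in that condition while the decision's outcome flips, showing the condition independently affects the result. A small Python sketch of enumerating such pairs for a boolean decision (this is the coverage criterion itself, independent of JPF):

```python
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """For each condition index, find one pair of input vectors that
    differ only in that condition and flip the decision's outcome."""
    pairs = {}
    for i in range(n_conditions):
        for vec in product([False, True], repeat=n_conditions):
            flipped = list(vec)
            flipped[i] = not flipped[i]          # toggle condition i only
            if decision(*vec) != decision(*flipped):
                pairs[i] = (vec, tuple(flipped))
                break                            # one witness pair suffices
    return pairs

# Example decision with three conditions: a and (b or c)
obligations = mcdc_pairs(lambda a, b, c: a and (b or c), 3)
```

    The JPF-based tool works the other way around: it instruments the source with these obligations and lets the model checker search for concrete inputs that satisfy them.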

  9. The Significance of the 2014 Corporate Governance Code of the Bank of Russia

    Directory of Open Access Journals (Sweden)

    Anna Vladislavovna Shashkova

    2014-01-01

    Full Text Available The present article focuses on corporate governance in Russia, as well as on the approval in 2014 of the Code of Corporate Governance by the Bank of Russia and by the Russian Government. The article also explains the concept behind the well-known foreign term Compliance. Compliance is a system based on binding rules of conduct contained in the regulations which are mandatory for the company. In order to fulfill best practices and implement local acts on the most important issues for the company, many foreign companies as well as large Russian companies have formed special Compliance departments. Taking into account such international experience and international corporate governance principles, the Bank of Russia has elaborated the Corporate Governance Code, approved by the Russian Government in February 2014. The Corporate Governance Code regulates a number of the most important issues of corporate governance, such as shareholders' rights and fair treatment of shareholders; the Board of Directors; the Corporate Secretary of the Company; the system of remuneration of members of the Board of Directors, executive bodies and other key executives of the company; the system of risk management and internal control; disclosure of information about the company and the information policy of the company; and major corporate actions. The most important issue analyzed by the author is the problem of the composition of the Board of Directors: the presence of independent directors in the company. According to the author, the new Corporate Governance Code reflects the latest trends as well as the current situation with corporate governance in Russia today.

  10. A vectorized Monte Carlo code for modeling photon transport in SPECT

    International Nuclear Information System (INIS)

    Smith, M.F.; Floyd, C.E. Jr.; Jaszczak, R.J.

    1993-01-01

    A vectorized Monte Carlo computer code has been developed for modeling photon transport in single photon emission computed tomography (SPECT). The code models photon transport in a uniform attenuating region and photon detection by a gamma camera. It is adapted from a history-based Monte Carlo code in which photon history data are stored in scalar variables and photon histories are computed sequentially. The vectorized code is written in FORTRAN77 and uses an event-based algorithm in which photon history data are stored in arrays and photon history computations are performed within DO loops. The indices of the DO loops range over the number of photon histories, and these loops may take advantage of the vector processing unit of our Stellar GS1000 computer for pipelined computations. Without the use of the vector processor the event-based code is faster than the history-based code because of numerical optimization performed during conversion to the event-based algorithm. When only the detection of unscattered photons is modeled, the event-based code executes 5.1 times faster with the use of the vector processor than without; when the detection of scattered and unscattered photons is modeled the speed increase is a factor of 2.9. Vectorization is a valuable way to increase the performance of Monte Carlo code for modeling photon transport in SPECT
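
    The history-based versus event-based distinction can be illustrated with a toy photon-transmission estimate: the scalar version advances one history at a time, while the event-based version keeps history data in arrays so the computation runs as vectorized (pipelined) operations, exactly the restructuring described above. The attenuation coefficient and slab geometry here are illustrative, not the SPECT model.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 0.15        # linear attenuation coefficient (1/cm), illustrative
depth = 10.0     # slab thickness (cm)
n = 100_000      # number of photon histories

def transmitted_scalar():
    """History-based style: compute photon histories sequentially."""
    count = 0
    for _ in range(n):
        if rng.exponential(1.0 / mu) > depth:  # free path beyond the slab
            count += 1
    return count

def transmitted_vector():
    """Event-based style: history data live in arrays, and the free-path
    sampling and comparison run as whole-array operations."""
    path = rng.exponential(1.0 / mu, size=n)
    return int(np.count_nonzero(path > depth))
```

    Both estimate the transmission probability exp(-mu * depth); the array version is the one a vector unit (or NumPy) can pipeline, mirroring the paper's DO-loop-over-histories reorganization.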

  11. An Automatic Instruction-Level Parallelization of Machine Code

    Directory of Open Access Journals (Sweden)

    MARINKOVIC, V.

    2018-02-01

    Full Text Available Prevailing multicores and novel manycores have made a great challenge of the modern day: parallelization of embedded software that is still written as sequential. In this paper, automatic code parallelization is considered, focusing on developing a parallelization tool at the binary level as well as on the validation of this approach. A novel instruction-level parallelization algorithm for assembly code is developed; it uses the register names after SSA conversion to find independent blocks of code and then schedules the independent blocks using METIS to achieve good load balance. The sequential consistency is verified, and the validation is done by measuring the program execution time on the target architecture. Great speedup, taken as the performance measure in the validation process, and optimal load balancing are achieved for multicore RISC processors with 2 to 16 cores (e.g., MIPS, MicroBlaze). In particular, for 16 cores, the average speedup is 7.92x, while in some cases it reaches 14x. The approach to automatic parallelization provided by this paper is useful to researchers and developers in the area of parallelization as the basis for further optimizations, as the back-end of a compiler, or as the code parallelization tool for an embedded system.
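
    A toy illustration of the independence test underlying such an approach: after SSA renaming gives each definition a unique register name, straight-line blocks that touch disjoint register sets carry no data dependence and can be scheduled on different cores. The instruction model below is a deliberate simplification (the real tool analyzes machine code and uses METIS for the schedule):

```python
# Toy model: an instruction is (destination register, source registers).
# Two straight-line blocks are independent if their register sets are
# disjoint -- a simplification of the SSA-based dependence analysis.
blocks = [
    [("r1", ()), ("r2", ("r1",))],   # block A: defines/uses r1, r2
    [("r3", ()), ("r4", ("r3",))],   # block B: defines/uses r3, r4
    [("r5", ("r2",))],               # block C: reads r2, depends on A
]

def regs(block):
    """All register names a block defines or uses."""
    s = set()
    for dest, srcs in block:
        s.add(dest)
        s.update(srcs)
    return s

def independent(a, b):
    """No shared registers means no data dependence between the blocks."""
    return regs(a).isdisjoint(regs(b))
```

    Here blocks A and B can run concurrently on separate cores, while C must be scheduled after A; a graph partitioner like METIS then balances such blocks across cores.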

  12. Concurrent Models for Object Execution

    OpenAIRE

    Diertens, Bob

    2012-01-01

    In previous work we developed a framework of computational models for the concurrent execution of functions on different levels of abstraction. It shows that the traditional sequential execution of functions is just one possible implementation of an abstract computational model that allows for the concurrent execution of functions. We use this framework as a base for the development of abstract computational models that allow for the concurrent execution of objects.

  13. Investigating executive functions in children with severe speech and movement disorders using structured tasks.

    Science.gov (United States)

    Stadskleiv, Kristine; von Tetzchner, Stephen; Batorowicz, Beata; van Balkom, Hans; Dahlgren-Sandberg, Annika; Renner, Gregor

    2014-01-01

    Executive functions are the basis for goal-directed activity and include planning, monitoring, and inhibition, and language seems to play a role in the development of these functions. There is a tradition of studying executive function in both typical and atypical populations, and the present study investigates executive functions in children with severe speech and motor impairments who are communicating using communication aids with graphic symbols, letters, and/or words. There are few neuropsychological studies of children in this group and little is known about their cognitive functioning, including executive functions. It was hypothesized that aided communication would tax executive functions more than speech. Twenty-nine children using communication aids and 27 naturally speaking children participated. Structured tasks resembling everyday activities, where the action goals had to be reached through communication with a partner, were used to get information about executive functions. The children (a) directed the partner to perform actions like building a Lego tower from a model the partner could not see and (b) gave information about an object without naming it to a person who had to guess what object it was. The executive functions of planning, monitoring, and impulse control were coded from the children's on-task behavior. Both groups solved most of the tasks correctly, indicating that aided communicators are able to use language to direct another person to do a complex set of actions. Planning and lack of impulsivity was positively related to task success in both groups. The aided group completed significantly fewer tasks, spent longer time and showed more variation in performance than the comparison group. The aided communicators scored lower on planning and showed more impulsivity than the comparison group, while both groups showed an equal degree of monitoring of the work progress. The results are consistent with the hypothesis that aided language tax

  14. Executive Energy Leadership Academy | NREL

    Science.gov (United States)

    Executive Energy Leadership Academy Executive Energy Leadership Academy NREL's Executive Energy Leadership Academy is a nationally renowned program that provides non-technical business, governmental, and foreground. Leadership Program The Leadership Program is designed for community and industry leaders with an

  15. Efficient data management techniques implemented in the Karlsruhe Monte Carlo code KAMCCO

    International Nuclear Information System (INIS)

    Arnecke, G.; Borgwaldt, H.; Brandl, V.; Lalovic, M.

    1974-01-01

    The Karlsruhe Monte Carlo Code KAMCCO is a forward neutron transport code with an eigenfunction and a fixed source option, including time-dependence. A continuous energy model is combined with a detailed representation of neutron cross sections, based on linear interpolation, Breit-Wigner resonances and probability tables. All input is processed into densely packed, dynamically addressed parameter fields and networks of pointers (addresses). Estimation routines are decoupled from random walk and analyze a storage region with sample records. This technique leads to fast execution with moderate storage requirements and without any I/O-operations except in the input and output stages. 7 references. (U.S.)
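The record above describes packing per-isotope data into dense parameter fields addressed through a network of pointers. A rough illustration of that layout (not KAMCCO's actual data structures; all names and values are invented) is a flat array per quantity plus an offset table, with linear interpolation as the abstract mentions:

```python
import bisect

# Hypothetical sketch: per-isotope energy grids and cross sections are
# packed into single flat lists, and an offset table plays the role of
# the pointer network. Isotope names and numbers are illustrative only.
energies = []   # packed energy grids, one isotope after another
xs = []         # packed cross sections, parallel to `energies`
offsets = {}    # isotope -> (start, length): the "pointers"

def pack(isotope, grid, sigma):
    offsets[isotope] = (len(energies), len(grid))
    energies.extend(grid)
    xs.extend(sigma)

def lookup(isotope, e):
    """Linear interpolation in the packed table, as in the abstract."""
    start, n = offsets[isotope]
    grid = energies[start:start + n]
    sig = xs[start:start + n]
    i = bisect.bisect_left(grid, e)
    if i == 0:
        return sig[0]
    if i == n:
        return sig[-1]
    t = (e - grid[i - 1]) / (grid[i] - grid[i - 1])
    return sig[i - 1] + t * (sig[i] - sig[i - 1])

pack("U235", [1.0, 10.0, 100.0], [50.0, 20.0, 5.0])
pack("H1",   [1.0, 100.0],       [30.0, 2.0])
```

Packing everything into a few flat arrays is what lets a code of this vintage run without I/O between the input and output stages.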

  16. FASTDART - A fast, accurate and friendly version of DART code

    International Nuclear Information System (INIS)

    Rest, Jeffrey; Taboada, Horacio

    2000-01-01

    A new enhanced, visual version of the DART code is presented. DART is a mechanistic-model-based code developed for the performance calculation and assessment of aluminum dispersion fuel. The major features of this new version are a new, time-saving calculation routine able to run on a PC, a friendly visual input interface, and a plotting facility. This version, available for silicide and U-Mo fuels, adds faster execution and visual interfaces to the established accuracy of the DART models for fuel performance prediction. It is part of a collaboration agreement between ANL and CNEA in the area of Low Enriched Uranium Advanced Fuels, held by the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy. (author)

  17. Mongolia; Report on the Observance of Standards and Codes-Fiscal Transparency

    OpenAIRE

    International Monetary Fund

    2001-01-01

    This report provides an assessment of fiscal transparency practices in Mongolia against the requirements of the IMF Code of Good Practices on Fiscal Transparency. This paper analyzes the government's participation in the financial and nonfinancial sectors of the economy. Executive Directors appreciated the achievements, and stressed the need for improvements in the areas of fiscal transparency. They emphasized the need for addressing weaknesses of fiscal data, maintaining a legal framework fo...

  18. Educating Executive Function

    Science.gov (United States)

    Blair, Clancy

    2016-01-01

    Executive functions are thinking skills that assist with reasoning, planning, problem solving, and managing one’s life. The brain areas that underlie these skills are interconnected with and influenced by activity in many different brain areas, some of which are associated with emotion and stress. One consequence of the stress-specific connections is that executive functions, which help us to organize our thinking, tend to be disrupted when stimulation is too high and we are stressed out, or too low when we are bored and lethargic. Given their central role in reasoning and also in managing stress and emotion, scientists have conducted studies, primarily with adults, to determine whether executive functions can be improved by training. By and large, results have shown that they can be, in part through computer-based videogame-like activities. Evidence of wider, more general benefits from such computer-based training, however, is mixed. Accordingly, scientists have reasoned that training will have wider benefits if it is implemented early, with very young children as the neural circuitry of executive functions is developing, and that it will be most effective if embedded in children’s everyday activities. Evidence produced by this research, however, is also mixed. In sum, much remains to be learned about executive function training. Without question, however, continued research on this important topic will yield valuable information about cognitive development. PMID:27906522

  19. Grid-based Parallel Data Streaming Implemented for the Gyrokinetic Toroidal Code

    International Nuclear Information System (INIS)

    Klasky, S.; Ethier, S.; Lin, Z.; Martins, K.; McCune, D.; Samtaney, R.

    2003-01-01

    We have developed a threaded parallel data streaming approach using Globus to transfer multi-terabyte simulation data from a remote supercomputer to the scientist's home analysis/visualization cluster, as the simulation executes, with negligible overhead. Data transfer experiments show that this concurrent approach is preferable to writing to local disk and then transferring the data for post-processing. The present approach is conducive to using the grid to pipeline the simulation with post-processing and visualization. We have applied this method to the Gyrokinetic Toroidal Code (GTC), a 3-dimensional particle-in-cell code used to study microturbulence in magnetic confinement fusion from first-principles plasma theory
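The overlap the abstract describes, where transfer proceeds in a background thread while the simulation keeps computing, can be sketched with a bounded producer/consumer buffer. This is a minimal illustration, not the GTC/Globus implementation; `run_step` and `send_remote` are hypothetical stand-ins:

```python
import queue
import threading

# Sketch of threaded data streaming: a simulation loop hands completed
# time-step data to a background sender thread, so network transfer
# overlaps with computation. All names here are illustrative.
def stream_simulation(n_steps, run_step, send_remote, depth=4):
    buf = queue.Queue(maxsize=depth)   # bounded buffer throttles the producer

    def sender():
        while True:
            chunk = buf.get()
            if chunk is None:          # sentinel: simulation finished
                break
            send_remote(chunk)

    t = threading.Thread(target=sender)
    t.start()
    for step in range(n_steps):
        buf.put(run_step(step))        # blocks only if sender falls far behind
    buf.put(None)
    t.join()
```

The bounded queue is the design point: it caps memory use on the compute side while letting the two threads run concurrently, which is why the overhead on the simulation is small.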

  20. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    Science.gov (United States)

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
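The core idea, using an FSM model to compute input sequences that drive the implementation into deep protocol states, can be sketched with a breadth-first search over a toy state machine. The FSM below is an invented TCP-like example, not one from the paper:

```python
from collections import deque

# Model-guided exploration sketch: the protocol FSM yields the shortest
# message sequence reaching a target state, which then seeds deeper
# analysis (symbolic execution in the paper). Hypothetical FSM.
fsm = {
    ("INIT", "SYN"): "SYN_RCVD",
    ("SYN_RCVD", "ACK"): "ESTABLISHED",
    ("ESTABLISHED", "DATA"): "ESTABLISHED",
    ("ESTABLISHED", "FIN"): "CLOSED",
}

def paths_to(state, start="INIT"):
    """Breadth-first search for the shortest message sequence reaching `state`."""
    q = deque([(start, [])])
    seen = {start}
    while q:
        s, path = q.popleft()
        if s == state:
            return path
        for (src, msg), dst in fsm.items():
            if src == s and dst not in seen:
                seen.add(dst)
                q.append((dst, path + [msg]))
    return None
```

Random fuzzing rarely stumbles into `CLOSED` here, but the model hands over the `SYN, ACK, FIN` prefix directly, which is the coverage advantage the abstract claims over tools like SPIKE.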

  1. MCOR - Monte Carlo depletion code for reference LWR calculations

    Energy Technology Data Exchange (ETDEWEB)

    Puente Espel, Federico, E-mail: fup104@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Tippayakul, Chanatip, E-mail: cut110@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Ivanov, Kostadin, E-mail: kni1@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Misu, Stefan, E-mail: Stefan.Misu@areva.com [AREVA, AREVA NP GmbH, Erlangen (Germany)

    2011-04-15

    Research highlights: > Introduction of a reference Monte Carlo based depletion code with extended capabilities. > Verification and validation results for MCOR. > Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similar sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the capabilities just mentioned, the MCOR code's newest enhancements focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, to name only the most important. The article describes the capabilities of the MCOR code system, from its design and development to its latest improvements and further ameliorations. Additionally
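Capability (a), the predictor-corrector depletion algorithm, can be illustrated on a deliberately tiny model. One common variant depletes over the step with the beginning-of-step flux (predictor), recomputes the flux at end of step, and redoes the step with the averaged flux (corrector). The one-nuclide model and the `flux_of` closure are toy stand-ins for the MCNP5/KORIGEN coupling, not MCOR's actual scheme:

```python
import math

def deplete(n0, flux, sigma, dt):
    """Analytic depletion of a single nuclide under a constant flux."""
    return n0 * math.exp(-sigma * flux * dt)

def predictor_corrector_step(n0, flux_of, sigma, dt):
    # Predictor: deplete over dt with the beginning-of-step (BOS) flux.
    n_pred = deplete(n0, flux_of(n0), sigma, dt)
    # Corrector: redo the step with the average of BOS and EOS fluxes.
    avg_flux = 0.5 * (flux_of(n0) + flux_of(n_pred))
    return n0 * math.exp(-sigma * avg_flux * dt)
```

When the flux actually depends on the composition, this halves the error of a plain one-shot step for the same number of transport solutions, which is why depletion codes favor it over finer stepping.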

  2. MCOR - Monte Carlo depletion code for reference LWR calculations

    International Nuclear Information System (INIS)

    Puente Espel, Federico; Tippayakul, Chanatip; Ivanov, Kostadin; Misu, Stefan

    2011-01-01

    Research highlights: → Introduction of a reference Monte Carlo based depletion code with extended capabilities. → Verification and validation results for MCOR. → Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similar sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the capabilities just mentioned, the MCOR code's newest enhancements focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, to name only the most important. The article describes the capabilities of the MCOR code system, from its design and development to its latest improvements and further ameliorations

  3. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Science.gov (United States)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.
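The supervisory functions listed above, scheduling and allocation of resources among processes, can be made concrete with a toy round-robin scheduler. The process representation and time quantum are invented for this sketch and bear no relation to the Space Station/Base design:

```python
from collections import deque

# Toy illustration of one executive responsibility: round-robin
# scheduling of ready processes in fixed time slices.
def round_robin(processes, quantum=2):
    """processes: list of (name, remaining_time). Returns completion order."""
    ready = deque(processes)
    order = []
    while ready:
        name, remaining = ready.popleft()
        remaining -= quantum          # run the process for one time slice
        if remaining > 0:
            ready.append((name, remaining))   # not done: back of the queue
        else:
            order.append(name)
    return order
```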

  4. Executive control systems in the engineering design environment. M.S. Thesis

    Science.gov (United States)

    Hurst, P. W.

    1985-01-01

    An executive control system (ECS) is a software structure for unifying various applications codes into a comprehensive system. It provides a library of applications, a uniform access method through a central user interface, and a data management facility. A survey of twenty-four executive control systems designed to unify various CAD/CAE applications for use in diverse engineering design environments within government and industry was conducted. The goals of this research were to establish system requirements, to survey state-of-the-art architectural design approaches, and to provide an overview of the historical evolution of these systems. Foundations for design are presented and include environmental settings, system requirements, major architectural components, and a system classification scheme based on knowledge of the supported engineering domain(s). An overview of the design approaches used in developing the major architectural components of an ECS is presented with examples taken from the surveyed systems. Attention is drawn to four major areas of ECS development: interdisciplinary usage; standardization; knowledge utilization; and computer science technology transfer.

  5. Code ''Repol'' to fit experimental data with a polynomial and its graphics plotting

    International Nuclear Information System (INIS)

    Travesi, A.; Romero, L.

    1983-01-01

    The ''Repol'' code performs the fitting of a set of experimental data with a polynomial of mth degree (max. 10), using the least squares criterion. Further, it plots the fitted polynomial, in the appropriate coordinate axis system, on a plotter. An additional option also allows graphic plotting of the experimental data used for the fit. The data necessary to execute this code are requested from the operator on the screen, in an iterative way, through a screen-operator dialogue, and the values are entered through the keyboard. This code is written in Fortran IV and, because of its structured programming in subroutine blocks, can be adapted to any computer with a graphic screen and keyboard terminal and a serially connected plotter whose software has the Hewlett Packard ''Graphics 1000''. (author)
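The least-squares polynomial fit Repol performs can be sketched in a few lines of modern code. The original is Fortran IV and also drives a plotter; this pure-Python version only mirrors the fitting idea via the normal equations, with all data values invented:

```python
# Least-squares fit of a degree-m polynomial (m <= 10, as in Repol),
# solving the normal equations by Gaussian elimination with pivoting.
def fit_poly(xs, ys, degree):
    assert degree <= 10, "Repol caps the degree at 10"
    n = degree + 1
    # Normal equations A c = b: A[i][j] = sum x^(i+j), b[i] = sum y x^i.
    a = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n                        # back substitution
    for i in range(n - 1, -1, -1):
        s = sum(a[i][j] * coeffs[j] for j in range(i + 1, n))
        coeffs[i] = (b[i] - s) / a[i][i]
    return coeffs                             # coeffs[k] multiplies x**k

def eval_poly(coeffs, x):
    return sum(c * x ** k for k, c in enumerate(coeffs))
```

For data lying exactly on x² + x + 1, the fit recovers the coefficients [1, 1, 1] up to rounding.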

  6. Development of a PC code package for the analysis of research and power reactors

    International Nuclear Information System (INIS)

    Urli, N.

    1992-06-01

    Computer codes available for performing reactor physics calculations for nuclear research reactors and power reactors are normally suited to running on mainframe computers. With the rapid development in the speed and memory of PCs, and their affordable prices, it became feasible to develop PC versions of commonly used codes. The present work, performed under an IAEA-sponsored research contract, has successfully developed a code package for running on a PC. This package includes the cross-section generating code PSU-LEOPARD and the 2D and 1D spatial diffusion codes MCRAC and MCYC 1D. To adapt PSU-LEOPARD for a PC, the binary library has been reorganized to decimal form, upgraded to the FORTRAN-77 standard, and its arrays and subroutines reorganized to conform to a PC compiler. Similarly, a PC version of MCRAC in FORTRAN-77 and the 1D code MCYC 1D have been developed. Tests, verification and benchmark results show excellent agreement with the results obtained from mainframe calculations. The execution speeds are also very satisfactory. 12 refs, 4 figs, 3 tabs

  7. FISP5 - an extended and improved version of the fission product inventory code FISP

    International Nuclear Information System (INIS)

    Tobias, A.

    1978-05-01

    In order to accommodate the UKFPDD-1 fission product data library the CEGB fission product inventory code FISP4 has been modified and extended. The opportunity was taken to revise the algorithm used for calculating the nuclide concentrations during irradiation in order to reduce the problem of rounding errors which arise as a result of the computer limitation to a finite word length. The resulting code FISP5 is shown in addition to offer considerable improvement in execution time in comparison with FISP4. Details of the revised algorithm are given together with a brief users' guide to FISP5. (author)
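The rounding problem the abstract alludes to is easy to see in the kind of calculation a fission product inventory code performs. As an illustrative example (not FISP5's actual algorithm), Bateman's solution for a two-member decay chain A → B contains a difference of decaying exponentials whose denominator cancels catastrophically when the two decay constants are nearly equal, exactly the finite-word-length hazard the revised algorithm addresses:

```python
import math

# Bateman solution for the daughter population in a chain
# A -> B -> (stable), starting from pure A. Illustrative only.
def daughter_atoms(n_a0, lam_a, lam_b, t):
    """Atoms of B at time t; ill-conditioned when lam_a is close to lam_b."""
    return (n_a0 * lam_a / (lam_b - lam_a)
            * (math.exp(-lam_a * t) - math.exp(-lam_b * t)))
```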

  8. Caregiver person-centeredness and behavioral symptoms during mealtime interactions: development and feasibility of a coding scheme.

    Science.gov (United States)

    Gilmore-Bykovskyi, Andrea L

    2015-01-01

    Mealtime behavioral symptoms are distressing and frequently interrupt eating for the individual experiencing them and others in the environment. A computer-assisted coding scheme was developed to measure caregiver person-centeredness and behavioral symptoms for nursing home residents with dementia during mealtime interactions. The purpose of this pilot study was to determine the feasibility, ease of use, and inter-observer reliability of the coding scheme, and to explore the clinical utility of the coding scheme. Trained observers coded 22 observations. Data collection procedures were acceptable to participants. Overall, the coding scheme proved to be feasible, easy to execute and yielded good to very good inter-observer agreement following observer re-training. The coding scheme captured clinically relevant, modifiable antecedents to mealtime behavioral symptoms, but would be enhanced by the inclusion of measures for resident engagement and consolidation of items for measuring caregiver person-centeredness that co-occurred and were difficult for observers to distinguish. Published by Elsevier Inc.

  9. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    Energy Technology Data Exchange (ETDEWEB)

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' Loss of offsite power fault transient.
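The coupling pattern TALINK provides, two codes advancing in lockstep and exchanging interface data each time step, can be sketched with toy models. The two "codes" below are invented scalar functions, not RELAP5 or PANTHER, and the feedback coefficients are arbitrary:

```python
# Hypothetical sketch of code coupling: a thermal-hydraulics model and
# a neutron-kinetics model advance together, exchanging power and
# coolant temperature at the interface each step.
def coupled_transient(n_steps, dt=1.0):
    power, coolant_t = 1.0, 500.0
    history = []
    for _ in range(n_steps):
        # thermal-hydraulics step: temperature driven by power
        coolant_t += dt * (10.0 * power - 0.02 * (coolant_t - 500.0))
        # neutronics step: negative temperature feedback on power
        power += dt * (-1e-3 * (coolant_t - 500.0)) * power
        history.append((power, coolant_t))
    return history
```

A real interface program additionally manages time-step negotiation and data-format translation between the codes; the lockstep exchange above is the essential loop.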

  10. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    International Nuclear Information System (INIS)

    Page, R.; Jones, J.R.

    1997-01-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' Loss of offsite power fault transient

  11. Associations among dispositional mindfulness, self-compassion, and executive function proficiency in early adolescents.

    Science.gov (United States)

    Shin, Hee-Sung; Black, David S; Shonkoff, Eleanor Tate; Riggs, Nathaniel R; Pentz, Mary Ann

    2016-12-01

    The study objective was to examine the effects of two conceptually related constructs, self-compassion and dispositional mindfulness, on executive function (EF) proficiency among early adolescents. Executive function refers to a set of psychological processes governing emotional regulation, organization, and planning. While the benefits of positive psychology appear evident for mental health and wellness, little is known about the etiological relationship between dispositional mindfulness and self-compassion in their associations with EF. Two hundred and ten early adolescents attending middle school (age M=12.5 years; SD=0.5; 21% Hispanic, 18% Mixed/bi-racial, 47% White, and 9% Other/Missing; 37.1% on free lunch program) self-reported levels of dispositional mindfulness (Mindful Attention Awareness Scale; MAAS), self-compassion (Self-Compassion Scale; SCS; self-judgment and self-kindness domains), and EF proficiency (Behavior Rating Inventory of Executive Function; BRIEF-SR). A sequential linear regression stepwise approach was taken entering the independent variables as separate models in the following order: self-kindness, self-judgement, and dispositional mindfulness. All models controlled for participant age and sex. SCS self-kindness was not associated with EF proficiency, but SCS self-judgment (reverse-coded) contributed to the variance in EF (β=0.40, p mindfulness appears to outweigh that of specific self-compassion domains, when independent of contemplative training.

  12. Load balancing in highly parallel processing of Monte Carlo code for particle transport

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Takemiya, Hiroshi; Kawasaki, Takuji

    1998-01-01

    In parallel processing of Monte Carlo (MC) codes for neutron, photon and electron transport problems, particle histories are assigned to processors, making use of the independence of the calculation for each particle. Although the main part of an MC code is easily parallelized by this method, it is necessary, and practically difficult, to optimize the code with respect to load balancing in order to attain a high speedup ratio in highly parallel processing. In fact, the speedup ratio in the case of 128 processors remains at only about one hundred on the test bed used for the performance evaluation. Through parallel processing of the MCNP code, which is widely used in the nuclear field, it is shown that it is difficult to attain high performance by static load balancing, especially in neutron transport problems, and that a load balancing method which dynamically changes the number of assigned particles, minimizing the sum of the computational and communication costs, overcomes the difficulty, resulting in a reduction of nearly fifteen percent in execution time. (author)
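The dynamic rebalancing idea can be sketched simply: measure each processor's achieved rate on the previous batch and redistribute the next batch proportionally. This cost model is an illustrative simplification, not the paper's exact formulation (which also accounts for communication cost):

```python
# Dynamic load balancing sketch: histories per processor are rebalanced
# from the measured time per history, so faster processors get more work.
def rebalance(total_particles, times, counts):
    """times[i]: wall time processor i spent on counts[i] histories."""
    rates = [c / t for c, t in zip(counts, times)]        # histories/second
    total_rate = sum(rates)
    shares = [int(total_particles * r / total_rate) for r in rates]
    shares[0] += total_particles - sum(shares)            # absorb rounding
    return shares
```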

  13. A competence executive coaching model

    Directory of Open Access Journals (Sweden)

    Pieter Koortzen

    2010-07-01

    Research purpose: The purpose of this article is to address the training and development needs of consulting psychologists by presenting a competence executive coaching model for the planning, implementation and evaluation of executive coaching interventions. Research design, approach and method: The study was conducted while one of the authors was involved in teaching doctoral students in consulting psychology and executive coaching, specifically in the USA. The approach involved a literature review of executive coaching models and a qualitative study using focus groups to develop and evaluate the competence executive coaching model. Main findings: The literature review provided scant evidence of competence executive coaching models, and there seems to be a specific need for this in the training of coaches in South Africa. Hence the model that was developed is an attempt to provide trainers with a structured model for the training of coaches. Contribution/value-add: The uniqueness of this competence model lies not only in its six distinct coaching intervention phases, but also in the competencies required in each.

  14. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    Directory of Open Access Journals (Sweden)

    Shameng Wen

    Full Text Available Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.

  15. Hardware-Assisted System for Program Execution Security of SOC

    Directory of Open Access Journals (Sweden)

    Wang Xiang

    2016-01-01

    Full Text Available With the rapid development of embedded systems, their security has become more and more important. Most embedded systems are at risk from a series of software attacks, such as buffer overflow attacks and Trojan viruses. In addition, with the rapid growth in the number of embedded systems and their wide application, attacks on embedded hardware are also increasing. This paper presents a new hardware-assisted security mechanism that protects a program's code and data by monitoring its normal execution. The mechanism monitors three types of information: the start/end addresses of the program's basic blocks; a lightweight hash value for each basic block; and the address of the next basic block. These parameters are extracted by additional tools running on a PC and stored in the security module. During normal program execution, the security module compares the real-time state of the program with the stored information. If an anomaly is detected, it triggers the appropriate security response, suspending the program and jumping to a specified location. The module has been tested and validated on an SOPC with an OR1200 processor. The experimental analysis shows that the proposed mechanism can defend against a wide range of common software and physical attacks with low performance penalties and minimal overhead.
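A software model of the monitor described above: offline, each basic block is reduced to a (hash, allowed-successors) entry; at run time the observed block stream is checked against that table. The block contents, addresses, and truncated SHA-256 "lightweight hash" are invented for this sketch, not taken from the paper's hardware design:

```python
import hashlib

def block_hash(code_bytes):
    return hashlib.sha256(code_bytes).hexdigest()[:8]   # lightweight stand-in

# Reference table built offline by analysis tools (hypothetical blocks).
reference = {
    0x1000: {"hash": block_hash(b"mov;cmp;jz"), "next": {0x1010, 0x1020}},
    0x1010: {"hash": block_hash(b"add;ret"),    "next": set()},
}

def check_stream(stream):
    """stream: iterable of (address, code_bytes) as basic blocks execute."""
    prev = None
    for addr, code in stream:
        entry = reference.get(addr)
        if entry is None or entry["hash"] != block_hash(code):
            return "alarm: tampered or unknown block"     # code integrity
        if prev is not None and addr not in reference[prev]["next"]:
            return "alarm: illegal control transfer"      # control-flow integrity
        prev = addr
    return "ok"
```

Checking both the block hash and the successor address is what lets such a monitor catch code tampering and control-flow hijacking (e.g., a buffer-overflow redirect) with the same table.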

  16. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    Full Text Available System relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: On the one hand, it natively provides versatile options to efficiently process tree or graph data structures. On the other hand, Prolog's non-determinism and backtracking ease testing of different variations of the program flow without much effort. A rule-based approach with Prolog makes it possible to characterize the verification goals in a concise and declarative way. In this paper, we describe our approach to verify the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which then are further tested against verification goals. The different program-flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study, where we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain specific language (DSL) in Prolog to express the verification goals.
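As a rough analogue of the paper's semaphore case study (which works in Prolog over a C++ AST), the same shape of check, extract a call sequence from an AST, then test a usage rule over it, can be shown with Python's own `ast` module. The acquire/release balance rule below is a toy stand-in for the paper's CTL-based verification goals:

```python
import ast

def call_sequence(source):
    """Collect method-call names (obj.method()) from the parsed source."""
    calls = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute):
            calls.append(node.func.attr)
    return calls

def semaphores_balanced(calls):
    """Toy rule: every acquire is matched by a later release."""
    depth = 0
    for c in calls:
        if c == "acquire":
            depth += 1
        elif c == "release":
            depth -= 1
            if depth < 0:
                return False        # release without matching acquire
    return depth == 0
```

Note that `ast.walk` visits nodes breadth-first, so for a flat sequence of statements the calls come out in source order; a real analysis, like the paper's, would follow the derived execution tree instead.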

  17. Tripoli-4, a three-dimensional poly-kinetic particle transport Monte-Carlo code

    International Nuclear Information System (INIS)

    Both, J.P.; Lee, Y.K.; Mazzolo, A.; Peneliau, Y.; Petit, O.; Roesslinger, B.; Soldevila, M.

    2003-01-01

    In this update of the Monte-Carlo transport code Tripoli-4, we list and describe its current main features. The code computes coupled neutron-photon propagation as well as the electron-photon cascade shower. While providing the user with common biasing techniques, it also implements an automatic weighting scheme. Tripoli-4 enables the user to compute the following physical quantities: flux, multiplication factor, current, reaction rate, dose equivalent rate, as well as energy deposition and recoil energies. For each physical quantity of interest, the Monte-Carlo simulation offers different types of estimators. Tripoli-4 has support for execution in parallel mode. Special features and applications are also presented.
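The point that one physical quantity admits several estimators can be made concrete with a toy case: the flux in a purely absorbing slab, scored both by a track-length estimator and by a collision estimator. Geometry, cross section, and sample counts are invented and unrelated to Tripoli-4 internals; the two estimators have the same expected value but different variances:

```python
import random

def slab_flux(n_particles, sigma_t, length, seed=0):
    """Two flux estimators for a purely absorbing slab (toy problem)."""
    rng = random.Random(seed)
    track, collide = 0.0, 0.0
    for _ in range(n_particles):
        d = rng.expovariate(sigma_t)         # distance to first collision
        if d >= length:
            track += length                  # particle crossed the slab
        else:
            track += d                       # track-length estimator: path in cell
            collide += 1.0 / sigma_t         # collision estimator: 1/Sigma_t per collision
    return track / n_particles, collide / n_particles
```

Both averages converge to (1 − exp(−Σ·L))/Σ, illustrating why a code offers several estimators per tally: equal means, different statistical efficiency depending on the problem.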

  18. Tripoli-4, a three-dimensional poly-kinetic particle transport Monte-Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Both, J P; Lee, Y K; Mazzolo, A; Peneliau, Y; Petit, O; Roesslinger, B; Soldevila, M [CEA Saclay, Dir. de l' Energie Nucleaire (DEN/DM2S/SERMA/LEPP), 91 - Gif sur Yvette (France)

    2003-07-01

    In this update of the Monte-Carlo transport code Tripoli-4, we list and describe its current main features. The code computes coupled neutron-photon propagation as well as the electron-photon cascade shower. While providing the user with common biasing techniques, it also implements an automatic weighting scheme. Tripoli-4 enables the user to compute the following physical quantities: flux, multiplication factor, current, reaction rate, dose equivalent rate, as well as energy deposition and recoil energies. For each physical quantity of interest, the Monte-Carlo simulation offers different types of estimators. Tripoli-4 has support for execution in parallel mode. Special features and applications are also presented.

  19. Educating executive function.

    Science.gov (United States)

    Blair, Clancy

    2017-01-01

    Executive functions are thinking skills that assist with reasoning, planning, problem solving, and managing one's life. The brain areas that underlie these skills are interconnected with and influenced by activity in many different brain areas, some of which are associated with emotion and stress. One consequence of the stress-specific connections is that executive functions, which help us to organize our thinking, tend to be disrupted when stimulation is too high and we are stressed out, or too low when we are bored and lethargic. Given their central role in reasoning and also in managing stress and emotion, scientists have conducted studies, primarily with adults, to determine whether executive functions can be improved by training. By and large, results have shown that they can be, in part through computer-based videogame-like activities. Evidence of wider, more general benefits from such computer-based training, however, is mixed. Accordingly, scientists have reasoned that training will have wider benefits if it is implemented early, with very young children as the neural circuitry of executive functions is developing, and that it will be most effective if embedded in children's everyday activities. Evidence produced by this research, however, is also mixed. In sum, much remains to be learned about executive function training. Without question, however, continued research on this important topic will yield valuable information about cognitive development. WIREs Cogn Sci 2017, 8:e1403. doi: 10.1002/wcs.1403 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  20. Essays in Executive Compensation

    NARCIS (Netherlands)

    D. Zhang (Dan)

    2012-01-01

This dissertation focuses on how executive compensation is designed and its implications for corporate finance and government regulations. Chapter 2 analyzes several proposals to restrict CEO compensation and calibrates two models of executive compensation that describe how firms would

  1. Auto Code Generation for Simulink-Based Attitude Determination Control System

    Science.gov (United States)

    MolinaFraticelli, Jose Carlos

    2012-01-01

This paper details the work done to auto-generate C code from a Simulink-based Attitude Determination Control System (ADCS) to be used on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used for hardware-in-the-loop testing of satellite components in a convenient manner, with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation: the execution order of these models can change as a result. Great care must be taken to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus the process can be called a success, since all the output requirements are met. Based on these results, it can be argued that the generated C code can be used effectively by any desired platform as long as it follows the specific memory requirements established in the Simulink model.
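The acceptance criterion described (the generated code's outputs must stay within bounds of the simulation's outputs) amounts to a tolerance check over matching output traces. The signal values and tolerance below are assumptions for illustration, not NASA's actual data or criteria:

```python
import numpy as np

# Hypothetical output traces of one signal: from the Simulink simulation and
# from the auto-generated C code run on the target (values are illustrative).
sim_output = np.array([0.000, 0.120, 0.250, 0.375, 0.500])
gen_output = np.array([0.000, 0.120, 0.250, 0.380, 0.500])

TOL = 1e-2  # illustrative acceptance bound, not an actual requirement

max_err = float(np.max(np.abs(sim_output - gen_output)))
assert max_err <= TOL, f"outputs diverge by {max_err}"
print(f"max deviation {max_err:.4f} is within tolerance {TOL}")
```

In practice such a check would be run per output signal and per test scenario, with bounds taken from the flight-software requirements.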

2. Contribution to the building of an execution engine for UML models for the simulation of concurrent and timed applications

    International Nuclear Information System (INIS)

    Benyahia, A.

    2012-01-01

Model Driven Engineering (MDE) places models at the heart of the software engineering process. MDE helps manage the complexity of software systems and improve the quality of the development process. The Model Driven Architecture (MDA) initiative from the Object Management Group (OMG) defines a framework for building design flows in the context of MDE. MDA relies heavily on formalisms normalized by the OMG, such as UML for modeling, QVT for model transformations, and so on. This work deals with the execution semantics of the UML language applied to embedded real-time applications. In this context, the OMG has a norm which defines an execution model for a subset of UML called fUML (foundational UML subset). This execution model gives a precise semantics to UML models, which can be used for analyzing models, generating code, or verifying transformations. The goal of this PhD thesis is to define and build an execution engine for UML models of embedded real-time systems which takes into account the explicit hypotheses made by the designer about the execution semantics at a high level of abstraction, in order to be able to execute models as early as possible in the design flow of a system. To achieve this goal, we have extended the fUML execution model along three axes important for embedded real-time systems: - Concurrency: fUML does not provide any mechanism for handling concurrent activities in its execution engine. We address this issue by introducing an explicit scheduler, which allows us to control the execution of concurrent tasks. - Time: fUML does not provide any means of handling time. By adding a clock to the execution model, we can take into account elapsed time as well as temporal constraints on the execution of activities. - Profiles: fUML does not take profiles into account, which makes it difficult to customize the execution engine with new semantic variants.
The execution engine we propose allows the use of UML models with
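Two of the extensions described above (an explicit scheduler for concurrency and a clock for time) can be sketched as a minimal discrete-event execution engine. This is a generic illustration of the mechanism, not the fUML engine of the thesis:

```python
import heapq

class Engine:
    """Toy discrete-event engine: an explicit scheduler plus a logical clock."""

    def __init__(self):
        self.clock = 0.0
        self._queue = []  # entries: (fire time, sequence number, action)
        self._seq = 0     # sequence number breaks ties deterministically
        self.trace = []

    def schedule(self, delay, action):
        """Register an action to run `delay` time units from now."""
        heapq.heappush(self._queue, (self.clock + delay, self._seq, action))
        self._seq += 1

    def run(self):
        """Pop actions in clock order, advancing the logical clock."""
        while self._queue:
            self.clock, _, action = heapq.heappop(self._queue)
            action(self)

eng = Engine()
eng.schedule(2.0, lambda e: e.trace.append(("taskB", e.clock)))
eng.schedule(1.0, lambda e: e.trace.append(("taskA", e.clock)))
eng.run()
print(eng.trace)  # tasks fire in clock order, not submission order
```

Controlling execution order through such an explicit scheduler is what lets an engine impose (and vary) a concurrency semantics instead of inheriting whatever the host runtime happens to do.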

  3. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes

    International Nuclear Information System (INIS)

    Garcia J, T.; Cardenas V, J.

    2015-09-01

A methodology for uncertainty analysis was implemented for simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, the same codes used to perform safety analysis in the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the propagation type, from the input parameters to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and of high importance in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of ordered statistics and the Wilks formula, it was determined that the minimum number of executions required to obtain uncertainty bands that cover 95% of the population at a confidence level of 95% is 93; it is important to mention that in this method the number of executions does not depend on the number of selected input parameters. Routines in Fortran 90 were implemented to automate the uncertainty analysis of transients for the RELAP/SCDAP code. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated the closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; while in the second analysis, the simulation of a total loss of power accident (station blackout, SBO) was carried out with the MELCOR code, obtaining the uncertainty band for the
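The run count quoted in the record (93 executions for two-sided 95%/95% tolerance limits, independent of the number of input parameters) follows from the first-order Wilks formula and can be checked directly:

```python
def wilks_one_sided_n(gamma=0.95, beta=0.95):
    """Smallest N with coverage 1 - gamma**N >= beta (first-order, one-sided)."""
    n = 1
    while 1 - gamma**n < beta:
        n += 1
    return n

def wilks_two_sided_n(gamma=0.95, beta=0.95):
    """Smallest N with 1 - g**N - N*(1-g)*g**(N-1) >= beta (first-order, two-sided)."""
    n = 2
    while 1 - gamma**n - n * (1 - gamma) * gamma**(n - 1) < beta:
        n += 1
    return n

print(wilks_one_sided_n(), wilks_two_sided_n())  # -> 59 93
```

The two-sided count of 93 matches the abstract; the familiar one-sided 95/95 count is 59. Note that `gamma` (population coverage) and `beta` (confidence) enter the formula, but the number of uncertain input parameters does not, which is why the method scales to many parameters.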

  4. Non-execution of laws or decisions of a taxation body

    Directory of Open Access Journals (Sweden)

    Ekaterina Vladimirovna Shestakova

    2015-09-01

Full Text Available Objective: to consider the protection of both the state and the taxpayer in order to improve the implementation of laws and decisions of tax bodies, based on the fact that the central problem of tax legislation is the non-execution of the Taxation Code and judicial acts. Methods: general and specific scientific methods of research were used, including systematic-structural, problem-theoretical, formal-legal, and logical methods, etc. Results: the reasons for non-execution or improper execution of legislative acts and court decisions by taxpayers are analyzed. It is concluded that the taxpayer has fewer possibilities to force the tax body to execute a court decision, while these possibilities of the taxpayer are hidden in the Taxation Code. Recommendations are given to increase the opportunities for taxpayers to protect their rights against the arbitrariness of bureaucratic bodies, and means of protecting taxpayers' rights are systematized. Scientific novelty: the main problem of non-execution of the laws and decisions of tax bodies is the presence of persons who do not fulfill their responsibilities as taxpayers. In addition, the reasons for non-execution of laws and decisions of tax authorities have not been systematized, nor have the issues of combating these trends or the statutory possibilities of the taxpayer's non-execution of the laws and decisions of tax bodies as a means of protecting their interests. However, the development of the law-enforcement system can be achieved by improving the protection of taxpayers' rights. The author proposes specific measures to improve taxpayer protection, which will increase citizens' awareness and budget replenishment. The practical significance of the study lies in protecting the state by improving the protection of every citizen and legal entity. The taxpayer should know that the state is not only a punitive mechanism but a mechanism that can protect a particular taxpayer.

  5. A meta-model for computer executable dynamic clinical safety checklists.

    Science.gov (United States)

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

A safety checklist is a type of cognitive tool reinforcing the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversight and ignorance. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to patient context are increasingly developed. However, the current hard-coded approach of implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort of informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The results show that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We propose a novel meta-model for dynamic checklists with the purpose of facilitating their creation. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model was validated by implementing a use case in the system.

  6. Comparative Evaluation and Case Studies of Shared-Memory and Data-Parallel Execution Patterns

    Directory of Open Access Journals (Sweden)

    Xiaodong Zhang

    1999-01-01

Full Text Available Shared-memory and data-parallel programming models are two important paradigms for scientific applications. Both models provide high-level program abstractions and simple, uniform views of network structures. The common features of the two models significantly simplify program coding and debugging for scientific applications. However, the underlying execution and overhead patterns differ significantly between the two models due to their programming constraints, and due to the different and complex structures of the interconnection networks and systems which support the two models. We performed this experimental study to present implications and comparisons of execution patterns on two commercial architectures. We implemented a standard electromagnetic simulation program (EM) and a linear system solver using the shared-memory model on the KSR-1 and the data-parallel model on the CM-5. Our objectives are to examine the execution pattern changes required for an implementation transformation between the two models; to study memory access patterns; to address scalability issues; and to investigate the relative costs and advantages/disadvantages of using the two models for scientific computations. Our results indicate that the EM program tends to become computation-intensive in the KSR-1 shared-memory system and memory-demanding in the CM-5 data-parallel system when the systems and the problems are scaled. The EM program, a highly data-parallel program, performed extremely well, and the linear system solver, a highly control-structured program, suffered significantly in the data-parallel model on the CM-5. Our study provides further evidence that matching the execution patterns of algorithms to parallel architectures achieves better performance.

  7. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  8. 76 FR 66235 - Bar Code Technologies for Drugs and Biological Products; Retrospective Review Under Executive...

    Science.gov (United States)

    2011-10-26

    ... interactions, overdoses, and patient allergies) and retail pharmacy-based computer systems that use a bar-coded... drugs. The goal of this initiative is to implement a system to further ensure patient safety and to..., and ideas on the need, maturity, and acceptability of alternative identification technologies for the...

  9. Code REPOL to fit experimental data with a polynomial, and its graphics plotting

    International Nuclear Information System (INIS)

    Romero, L.; Travesi, A.

    1983-01-01

The REPOL code performs the fitting of a set of experimental data with a polynomial of m-th degree (max. 10), using the least-squares criterion. Further, it plots the fitted polynomial, in an appropriate coordinate-axis system, on a plotter. An additional option also allows the plotting of the experimental data used for the fit. The data necessary to execute this code are requested from the operator on screen, interactively, through a screen-operator dialogue, and the values are entered through the keyboard. This code is written in Fortran IV and, because of its structured programming in subroutine blocks, can be adapted to any computer with a graphic screen and keyboard terminal and a serially connected plotter whose software supports Hewlett-Packard Graphics 1000. (Author) 5 refs
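The fit REPOL performs (a least-squares polynomial of degree up to 10) can be reproduced with modern tooling. The sketch below uses NumPy's polynomial module rather than the original Fortran IV, and the data points are illustrative, not taken from the report:

```python
import numpy as np

# Illustrative experimental data (not from the report), roughly y = 2*x**2 + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8, 51.2])

degree = 2  # REPOL accepts degrees up to 10
coeffs = np.polynomial.polynomial.polyfit(x, y, degree)  # least-squares criterion
fitted = np.polynomial.polynomial.polyval(x, coeffs)

print("coefficients c0..c2:", np.round(coeffs, 2))
print("max residual:", float(np.max(np.abs(fitted - y))))
```

As in REPOL, the same fitted coefficients can then drive both the polynomial curve and the overlay of the raw experimental points on a plot.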

  10. Optical CDMA with Embedded Spectral-Polarization Coding over Double Balanced Differential-Detector

    Science.gov (United States)

    Huang, Jen-Fa; Yen, Chih-Ta; Chen, Bo-Hau

A spectral-polarization coding (SPC) optical code-division multiple-access (OCDMA) configuration structured over an arrayed-waveguide grating (AWG) router is proposed. A polarization-division double balanced detector is adopted to perform difference detection and enhance system performance. The signal-to-noise ratio (SNR) is derived taking the effect of phase-induced intensity noise (PIIN) into account. The result indicates an SNR improvement of up to 9 dB over conventional spectral-amplitude coding (SAC) structures with Walsh-Hadamard codes. Mathematical derivations of the SNR demonstrate that a system embedded with orthogonal states of polarization (SOP) will effectively suppress PIIN. In addition, we analyze the bit error rate (BER) versus the number of active users under different encoding schemes and compare them with our proposed scheme. The BER versus effective power under the different encoding schemes, for the same number of simultaneously active users, is also revealed. Finally, the polarization-matching factor and the difference between simulated and experimental values are discussed.

  11. Acceptance and validation test report for HANSF code version 1.3.2

    International Nuclear Information System (INIS)

    PIEPHO, M.G.

    2001-01-01

The HANSF code, Version 1.3.2, is a stand-alone code that runs only in DOS. As a result, it runs on any Windows platform, since each Windows platform can create a DOS-prompt window and execute HANSF in the DOS window. The HANSF code is proprietary to Fauske and Associates, Inc. (FAI) of Burr Ridge, IL, the developers of the code. The SNF Project has a license from FAI to run the HANSF code on any computer, but only for work related to the SNF Project. The SNF Project owns the MCO.FOR routine, which is the main routine in HANSF for CVDF applications. The HANSF code calculates physical variables such as temperature, pressure, and oxidation rates due to chemical reactions of uranium metal/fuel with water or oxygen. The code is used by the Spent Nuclear Fuel (SNF) Project at Hanford; for example, in the report Thermal Analysis of Cold Vacuum Drying of Spent Nuclear Fuel (HNF-SD-SNF-CN-023). The primary facilities of interest are the K-Basins, Cold Vacuum Drying Facility (CVDF), Canister Storage Building (CSB) and T Plant. The overall summary is presented in Section 2.0, variances in Section 3.0, the comprehensive assessment in Section 4.0, results in Section 5.0, the evaluation in Section 6.0, and a summary of activities in Section 7.0

  12. A hybrid gyrokinetic ion and isothermal electron fluid code for astrophysical plasma

    Science.gov (United States)

    Kawazura, Y.; Barnes, M.

    2018-05-01

This paper describes a new code for simulating astrophysical plasmas that solves a hybrid model composed of gyrokinetic ions (GKI) and an isothermal electron fluid (ITEF) (Schekochihin et al. (2009) [9]). This model captures ion kinetic effects that are important near the ion gyro-radius scale, while electron kinetic effects are ordered out by an electron-ion mass-ratio expansion. The code is developed by incorporating the ITEF approximation into AstroGK, an Eulerian δf gyrokinetics code specialized to a slab geometry (Numata et al. (2010) [41]). The new code treats the linear terms in the ITEF equations implicitly, while the nonlinear terms are treated explicitly. We show linear and nonlinear benchmark tests to prove the validity and applicability of the simulation code. Since the fast electron timescale is eliminated by the mass-ratio expansion, the Courant-Friedrichs-Lewy condition is much less restrictive than in full gyrokinetic codes; the present hybrid code runs ∼2√(mi/me) ∼ 100 times faster than AstroGK with a single ion species and kinetic electrons, where mi/me is the ion-electron mass ratio. The improvement in computational time makes it feasible to execute ion-scale gyrokinetic simulations with high velocity-space resolution and to run multiple simulations to determine the dependence of turbulent dynamics on parameters such as the electron-ion temperature ratio and plasma beta.
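The quoted speed-up comes from eliminating the fast electron timescale; the factor 2√(mi/me) can be checked for a hydrogen plasma, where mi/me is the proton-electron mass ratio:

```python
import math

MI_OVER_ME = 1836.15267  # proton-electron mass ratio (CODATA value)

speedup = 2 * math.sqrt(MI_OVER_ME)
print(f"estimated speed-up: {speedup:.0f}x")  # ~86x, the order of the quoted ~100x
```

The factor is of order 100 as stated in the abstract; the precise gain in practice also depends on implementation details not captured by this scaling estimate.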

  13. Automatic ID heat load generation in ANSYS code

    International Nuclear Information System (INIS)

    Wang, Zhibi.

    1992-01-01

Detailed power density profiles are critical to the execution of a thermal analysis using a finite element (FE) code such as ANSYS. Unfortunately, as yet there is no easy way to directly input precise power profiles into ANSYS. A straightforward way to do this is to hand-calculate the power at each node or element and then type the data into the code. Every time a change is made to the FE model, the data must be recalculated and reentered. One way to solve this problem is to generate a set of discrete data, using another code such as PHOTON2, and curve-fit the data. Using curve-fitted formulae has several disadvantages. It is time consuming because of the need to run a second code to generate the data, curve-fit, check the data, etc. Additionally, because there is no generality across different beamlines or different parameters, the above work must be repeated for each case. And errors in the power profiles due to curve-fitting result in errors in the analysis. To solve the problem once and for all, with the capability to apply to any insertion device (ID), a program for the ID power profile was written in the ANSYS Parametric Design Language (APDL). This program is implemented as an ANSYS command with input parameters of peak magnetic field, deflection parameter, length of the ID, and distance from the source. Once the command is issued, all the heat load will be automatically generated by the code
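The key idea above, evaluating an analytic profile at each node instead of hand-typing curve-fitted values, can be sketched generically (in Python rather than APDL). The Gaussian profile and all parameter values here are illustrative assumptions, not the actual insertion-device power-density formula:

```python
import math

def power_density(x, y, p_peak, sigma_x, sigma_y):
    """Illustrative bell-shaped power-density profile (W/mm^2).
    This is a placeholder, not the real ID power-density formula."""
    return p_peak * math.exp(-0.5 * ((x / sigma_x) ** 2 + (y / sigma_y) ** 2))

# Hypothetical FE node coordinates (mm): the profile is evaluated per node,
# mirroring how the APDL macro generates the heat load automatically.
nodes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]
loads = {i: power_density(x, y, p_peak=100.0, sigma_x=1.5, sigma_y=1.0)
         for i, (x, y) in enumerate(nodes)}
for i, q in loads.items():
    print(f"node {i}: {q:.2f} W/mm^2")
```

Because the loads are regenerated from the beam parameters on every run, remeshing the FE model no longer requires recalculating and retyping the data by hand.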

  14. Executive functioning in pre-school children with autism spectrum disorders: The relationship between executive functioning and language

    OpenAIRE

    Linnerud, Ida Cathrine Wang

    2014-01-01

    Background: Executive function difficulties are prevalent in children with autism spectrum disorders (ASD) and there are several indications of a modifying relationship between executive functions and language in children. However, there is limited research on the relationship between executive functioning and language in young children with ASD. The current study compared real-world executive functioning between groups of children with ASD, language disorders (LD), and typical development (T...

  15. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

In order to assess the uncertainty quantification of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, an Unprotected Loss of Flow (ULOF) transient is selected as a typical case of an Anticipated Transient without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution work are performed using the PAPIRUS program. The sensitivity analysis is carried out on the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the figure of merit (FoM) are picked out. The dominant parameters selected are closely related to the development process of the ULOF event.

  16. The computer code SEURBNUK-2

    International Nuclear Information System (INIS)

    Yerkess, A.

    1984-01-01

SEURBNUK-2 has been designed to model the hydrodynamic development in time of a hypothetical core-disruptive accident in a fast breeder reactor. SEURBNUK-2 is a two-dimensional, axisymmetric, Eulerian, finite-difference containment code. The numerical procedure adopted in SEURBNUK to solve the hydrodynamic equations is based on the semi-implicit ICE method. SEURBNUK has a full thin-shell treatment for tanks of arbitrary shape and includes the effects of the compressibility of the fluid. Fluid flow through porous media and porous structures can also be accommodated. An important feature of SEURBNUK is that the thin-shell equations are solved quite separately from those of the fluid, and the time step for the fluid-flow calculation can be an integer multiple of that for calculating the shell motion. The interaction of the shell with the fluid is then considered as a modification to the coefficients in the implicit pressure equations, the modifications naturally depending on the behaviour of the thin-shell section within the fluid cell. The code is limited to dealing with a single fluid, the coolant, whereas the bubble and the cover gas are treated as cavities of uniform pressure calculated via appropriate pressure-volume-energy relationships. This manual describes the input data specifications needed for the execution of SEURBNUK-2 calculations, and nine sample problems of varying degrees of complexity highlight the code's capabilities. After explaining the output facilities, information is included to help those unfamiliar with SEURBNUK-2 avoid the common pitfalls experienced by novices

  17. Television and children's executive function.

    Science.gov (United States)

    Lillard, Angeline S; Li, Hui; Boguszewski, Katie

    2015-01-01

    Children spend a lot of time watching television on its many platforms: directly, online, and via videos and DVDs. Many researchers are concerned that some types of television content appear to negatively influence children's executive function. Because (1) executive function predicts key developmental outcomes, (2) executive function appears to be influenced by some television content, and (3) American children watch large quantities of television (including the content of concern), the issues discussed here comprise a crucial public health issue. Further research is needed to reveal exactly what television content is implicated, what underlies television's effect on executive function, how long the effect lasts, and who is affected. © 2015 Elsevier Inc. All rights reserved.

  18. SHEAN (Simplified Human Error Analysis code) and automated THERP

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1993-01-01

    One of the most widely used human error analysis tools is THERP (Technique for Human Error Rate Prediction). Unfortunately, this tool has disadvantages. The Nuclear Regulatory Commission, realizing these drawbacks, commissioned Dr. Swain, the author of THERP, to create a simpler, more consistent tool for deriving human error rates. That effort produced the Accident Sequence Evaluation Program Human Reliability Analysis Procedure (ASEP), which is more conservative than THERP, but a valuable screening tool. ASEP involves answering simple questions about the scenario in question, and then looking up the appropriate human error rate in the indicated table (THERP also uses look-up tables, but four times as many). The advantages of ASEP are that human factors expertise is not required, and the training to use the method is minimal. Although not originally envisioned by Dr. Swain, the ASEP approach actually begs to be computerized. That WINCO did, calling the code SHEAN, for Simplified Human Error ANalysis. The code was done in TURBO Basic for IBM or IBM-compatible MS-DOS, for fast execution. WINCO is now in the process of comparing this code against THERP for various scenarios. This report provides a discussion of SHEAN
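The ASEP/SHEAN approach described above, answering simple questions about a scenario and then looking up a screening human-error probability in a table, amounts to a keyed table lookup. The question keys and probability values below are entirely hypothetical placeholders, not the actual ASEP tables:

```python
# Hypothetical screening table: (diagnosis required, time available) -> HEP.
# Keys and probabilities are illustrative only; the real ASEP tables differ.
SCREENING_HEP = {
    (True,  "short"): 1.0,
    (True,  "long"):  0.1,
    (False, "short"): 0.05,
    (False, "long"):  0.01,
}

def screening_hep(diagnosis_required: bool, time_available: str) -> float:
    """Return a screening human-error probability for the scenario answers."""
    return SCREENING_HEP[(diagnosis_required, time_available)]

print(screening_hep(True, "long"))
```

Encoding the tables this way is what makes the method computerizable: the analyst answers the questions, and the program performs the lookup, which is essentially what SHEAN automates for the ASEP procedure.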

  19. Assessing Executive Functioning: A Pragmatic Review

    Science.gov (United States)

    Hass, Michael R.; Patterson, Ashlea; Sukraw, Jocelyn; Sullivan, Brianna M.

    2014-01-01

    Despite the common usage of the term "executive functioning" in neuropsychology, several aspects of this concept remain unsettled. In this paper, we will address some of the issues surrounding the notion of executive functioning and how an understanding of executive functioning and its components might assist school-based practitioners…

  20. Guidelines for Automation Project Execution

    OpenAIRE

    Takkinen, Heidi

    2011-01-01

The purpose of this Master's thesis was to create instructions for executing an automation project. Sarlin Oy Ab needed directions on how to execute an automation project. Sarlin is starting up a new business area offering total project solutions for customers. Sarlin focuses on small and minor automation projects in domestic markets. The thesis presents issues related to project execution, from project theory to kick-off and termination. Site work is one importan...

  1. The CAIN computer code for the generation of MABEL input data sets: a user's manual

    International Nuclear Information System (INIS)

    Tilley, D.R.

    1983-03-01

    CAIN is an interactive FORTRAN computer code designed to overcome the substantial effort involved in manually creating the thermal-hydraulics input data required by MABEL-2. CAIN achieves this by processing output from either of the whole-core codes, RELAP or TRAC, interpolating where necessary, and by scanning RELAP/TRAC output in order to generate additional information. This user's manual describes the actions required in order to create RELAP/TRAC data sets from magnetic tape, to create the other input data sets required by CAIN, and to operate the interactive command procedure for the execution of CAIN. In addition, the CAIN code is described in detail. This programme of work is part of the Nuclear Installations Inspectorate (NII)'s contribution to the United Kingdom Atomic Energy Authority's independent safety assessment of pressurized water reactors. (author)
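The interpolation step CAIN performs on RELAP/TRAC output, resampling coarse whole-core time histories onto the time points the MABEL input deck needs, can be sketched with linear interpolation. The quantities and time grids below are illustrative assumptions, not data from the report:

```python
import numpy as np

# Illustrative RELAP/TRAC output: a coarse coolant-pressure history (MPa)
t_relap = np.array([0.0, 5.0, 10.0, 20.0])
p_relap = np.array([15.5, 12.0, 8.0, 4.0])

# Times at which the MABEL-2 input deck needs values (assumed for illustration)
t_mabel = np.array([0.0, 2.5, 7.5, 15.0])

p_mabel = np.interp(t_mabel, t_relap, p_relap)  # piecewise-linear interpolation
print(p_mabel.tolist())  # -> [15.5, 13.75, 10.0, 6.0]
```

A tool like CAIN repeats this for every thermal-hydraulic variable MABEL requires, then writes the resampled histories into the input-deck format, which is the drudgery the code was built to remove.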

  2. Design and implementation of a software tool intended for simulation and test of real time codes

    International Nuclear Information System (INIS)

    Le Louarn, C.

    1986-09-01

The objective of real-time software testing is to reveal processing errors and unmet functional requirements or timing constraints in a code. From the perspective of safety analysis of nuclear equipment in power plants, testing should be carried out independently of the physical process (which is generally not available), and because casual hardware failures must be considered. We propose here a simulation and test tool, entirely in software, with extensive interactive possibilities for testing assembly code running on a microprocessor. The OST (outil d'aide a la simulation et au Test de logiciels temps reel, a tool to aid the simulation and testing of real-time software) simulates code execution and the behaviour of the hardware or software environment. Test execution is closely monitored and much useful information is automatically saved. After presenting methods and tools dedicated to real-time software, this thesis details the OST system. We show the internal mechanisms and objects of the system: particularly ''events'' (which describe the evolution of the system under test) and mnemonics (which describe the variables). Then we detail the interactive means available to the user for constructing the test data and the environment of the tested software. Finally, a prototype implementation is presented along with the results of the tests carried out. This demonstrates the many advantages of using an automatic tool over a manual investigation. In conclusion, further developments necessary to complete the final tool are reviewed [fr

  3. 40 CFR 68.155 - Executive summary.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Executive summary. 68.155 Section 68...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Risk Management Plan § 68.155 Executive summary. The owner or operator shall provide in the RMP an executive summary that includes a brief description of the following...

  4. Executions in The Bahamas

    Directory of Open Access Journals (Sweden)

    Lofquist, William Steele

    2010-01-01

Full Text Available The stories of those who have been executed in the Bahamas are heretofore untold. In telling these stories and in linking them to the changing course of Bahamian history, the present research adds an important dimension to our understanding of Bahamian history and politics. The major theme of this effort is that the changing practice of the death penalty is much more than a consequence of changes in crime. The use of the death penalty parallels the changing interests of colonial rulers, the changing practice of slavery, and the changing role of the Bahamas in colonial and regional affairs. Four distinctive eras of death penalty practice can be identified: (1) the slave era, where executions and commutations were used liberally and with a clear racial patterning; (2) a long era of stable colonialism, a period of marginalization and few executions; (3) an era of unstable colonialism characterized by intensive and efficient use of the death penalty; and (4) the current independence era of high murder rates and equally high impediments to the use of executions.

  5. How Do South Korean Female Executives' Definitions of Career Success Differ from Those of Male Executives?

    Science.gov (United States)

    Cho, Yonjoo; Park, Jiwon; Han, Soo Jeoung; Ju, Boreum; You, Jieun; Ju, Ahreum; Park, Chan Kyun; Park, Hye Young

    2017-01-01

    Purpose: The purpose of this study was to compare South Korean female executives' definitions of career success with those of male executives, identify their career development strategies for success and provide implications for research and practice. Two research questions guiding our inquiry included: How do female executives' definitions of…

  6. Calculation of the absorbed dose for contamination in skin imparted by beta radiation through the Varskin code modified for 122 isotopes of interest for nuclear medicine, nuclear plants and research; Calculo de dosis absorbida para contaminacion en piel impartida por radiacion beta mediante el codigo Varskin modificado para 122 isotopos de interes para medicina nuclear, plantas nucleares e investigacion

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez R, J T

    1992-06-15

In this work, the implementation of a modification of the Varskin code for calculating the absorbed dose to skin from contamination by external beta-emitting radiation fields is presented. The data necessary for executing the code are: isotope, dose depth, isotope activity, geometry type, source radius and integration time; combinations of up to five radionuclides can be run. The program was implemented in Fortran 5 as the FFSKIN source program, with the executable in binary form named BFFSKIN; the maximum execution time is 5 minutes. (Author)
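
The input quantities listed above can be sketched as a simple record plus a validation step. The field names below are illustrative assumptions only; the actual FFSKIN input format is not reproduced here:

```python
from dataclasses import dataclass

@dataclass
class SkinDoseInput:
    """One beta-emitting source term for a skin-dose run (hypothetical
    field names; not the real FFSKIN input layout)."""
    isotope: str            # beta emitter, e.g. "Sr-90"
    dose_depth_cm: float    # skin depth at which dose is evaluated
    activity_bq: float      # source activity
    geometry: str           # geometry type, e.g. "point" or "disk"
    source_radius_cm: float
    integration_time_s: float

MAX_NUCLIDES = 5  # the code accepts combinations of up to five radionuclides

def validate_run(sources):
    """Reject runs that exceed the five-radionuclide limit."""
    if not 1 <= len(sources) <= MAX_NUCLIDES:
        raise ValueError("a run combines one to five radionuclides")
    return sources
```

A run would then be one to five `SkinDoseInput` records passed through `validate_run` before execution.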

  7. Code implementation of partial-range angular scattering cross sections: GAMMER and MORSE

    International Nuclear Information System (INIS)

    Ward, J.T. Jr.

    1978-01-01

A partial-range (finite-element) method has been previously developed for representing multigroup angular scattering in Monte Carlo photon transport. Computer application of the method, with preliminary quantitative results, is discussed here. A multigroup photon cross section processing code, GAMMER, was written which utilized ENDF File 23 point data and the Klein--Nishina formula for Compton scattering. The cross section module of MORSE, along with several execution routines, was rewritten to permit use of the method with photon transport. Both conventional and partial-range techniques were applied, for comparison, to calculating the angular and spectral penetration of 6-MeV photons through a six-inch iron slab. GAMMER was found to run 90% faster than SMUG, with further improvement evident for multiple-media situations; MORSE cross section storage was reduced by one-third; cross section processing was greatly simplified; and execution time was reduced by 15%. Particle penetration was clearly more forward peaked, as moment accuracy is retained to extremely high order. This method of cross section treatment offers potential savings in both storage and handling, as well as improved accuracy and running time in the actual execution phase. 3 figures, 4 tables

  8. TVF-NMCRC-A powerful program for writing and executing simulation inputs for the FLUKA Monte Carlo Code system

    International Nuclear Information System (INIS)

    Mark, S.; Khomchenko, S.; Shifrin, M.; Haviv, Y.; Schwartz, J.R.; Orion, I.

    2007-01-01

We at the Negev Monte Carlo Research Center (NMCRC) have developed a powerful new interface for writing and executing FLUKA input files: TVF-NMCRC. With the TVF tool a FLUKA user can easily write an input file without any previous experience. The TVF-NMCRC tool is a LINUX program that has been verified on the most common LINUX-based operating systems, and is suitable for the latest version of FLUKA (FLUKA 2006.3)

  9. Order information coding in working memory: Review of behavioural studies and cognitive mechanisms

    Directory of Open Access Journals (Sweden)

    Barbara Dolenc

    2014-06-01

Full Text Available Executive processes, such as coding for sequential order, are of extreme importance for higher-order cognitive tasks. One of the significant questions is how order information is coded in working memory and which cognitive mechanisms and processes mediate it. The aim of this review paper is to summarize the results of studies that explore whether order and item memory are two separable processes. Furthermore, we reviewed evidence for each of the proposed cognitive mechanisms that might mediate order processing. Previous behavioural and neuroimaging data suggest different representation and processing of item and order information in working memory. Both kinds of information are maintained and recalled separately, and this separation seems to hold for recognition as well as for recall. To explain the results of studies of order coding, numerous cognitive mechanisms have been proposed. We focused on four different mechanisms by which order information might be coded and retrieved, namely inter-item associations, direct coding, hierarchical coding and magnitude coding. Each of the mechanisms can explain some aspects of order information coding, but none of them is able to explain all of the empirical findings. Given its complex nature, it is not surprising that a single mechanism has difficulty accounting for all the behavioral data, and memory for order may be more accurately characterized as the result of a set of mechanisms rather than a single one. Moreover, the findings beg the question of whether different types of memory for order information might exist.

  10. Autism Spectrum Disorder and intact executive functioning.

    Science.gov (United States)

    Ferrara, R; Ansermet, F; Massoni, F; Petrone, L; Onofri, E; Ricci, P; Archer, T; Ricci, S

    2016-01-01

The earliest notions concerning autism (Autism Spectrum Disorders, ASD) describe a disturbance in executive functioning. Despite changing definitions, executive functions, expressed as the higher cognitive skills required for complex behaviors and linked to the prefrontal cortex, are held to be defective in autism. Specific difficulties at the level of executive functioning have been identified in children presenting autism or verbal disabilities. Nevertheless, the developmental deficit of executive functioning in autism is highly diversified, with huge individual variation, and may even be absent. The aim of the present study was to examine the current standing of intact executive functioning in ASD. Analysis of ASD populations, whether high-functioning, Asperger's or autism Broad Phenotype, studied over a range of executive functions including response inhibition, planning, cognitive flexibility, cognitive inhibition, and alerting networks, indicates an absence of damage/impairment compared to typically-developed normal control subjects. These findings of intact executive functioning in ASD subjects provide a strong foundation on which to construct applications for growth environments and the rehabilitation of autistic subjects.

  11. EMI Execution Service (EMI-ES) Specification

    CERN Document Server

    Schuller, B

    2010-01-01

This document provides the interface specification, including related data models such as the state model, activity description, and resource and activity information, of an execution service matching the needs of the EMI production middleware stack composed of ARC, gLite and UNICORE components. This service is therefore referred to as the EMI Execution Service (or “ES” for short). This document is a continuation of the work previously known as the GENEVA, then AGU (“ARC, gLite UNICORE”), then PGI execution service. As a starting point, v0.42 of the “PGI Execution Service Specification” (doc15839) was used.

  12. Disciplining and Screening Top Executives

    NARCIS (Netherlands)

    S. Dominguez Martinez (Silvia); O.H. Swank (Otto); B. Visser (Bauke)

    2006-01-01

Boards of directors face the twin task of disciplining and screening executives. To perform these tasks directors do not have detailed information about executives' behaviour, and only infrequently have information about the success or failure of initiated strategies, reorganizations,

  13. Nurse executive transformational leadership found in participative organizations.

    Science.gov (United States)

    Dunham-Taylor, J

    2000-05-01

    The study examined a national sample of 396 randomly selected hospital nurse executives to explore transformational leadership, stage of power, and organizational climate. Results from a few nurse executive studies have found nurse executives were transformational leaders. As executives were more transformational, they achieved better staff satisfaction and higher work group effectiveness. This study integrates Bass' transformational leadership model with Hagberg's power stage theory and Likert's organizational climate theory. Nurse executives (396) and staff reporting to them (1,115) rated the nurse executives' leadership style, staff extra effort, staff satisfaction, and work group effectiveness using Bass and Avolio's Multifactor Leadership Questionnaire. Executives' bosses (360) rated executive work group effectiveness. Executives completed Hagberg's Personal Power Profile and ranked their organizational climate using Likert's Profile of Organizational Characteristics. Nurse executives used transformational leadership fairly often; achieved fairly satisfied staff levels; were very effective according to bosses; were most likely at stage 3 (power by achievement) or stage 4 (power by reflection); and rated their hospital as a Likert System 3 Consultative Organization. Staff satisfaction and work group effectiveness decreased as nurse executives were more transactional. Higher transformational scores tended to occur with higher educational degrees and within more participative organizations. Transformational qualities can be enhanced by further education, by achieving higher power stages, and by being within more participative organizations.

  14. A simple hypothesis of executive function

    Directory of Open Access Journals (Sweden)

    Bruno eKopp

    2012-06-01

Full Text Available Executive function is traditionally conceptualized as a set of abilities required to guide behavior toward goals. Here, an integrated theoretical framework for executive function is developed which has its roots in the notion of hierarchical mental models. Further, following Duncan (2010a,b), executive function is construed as a hierarchical recursive system of test-operation-test-exit units (Miller, Galanter, and Pribram, 1960). Importantly, it is shown that this framework can be used to model the main regional prefrontal syndromes, which are characterized by apathetic, disinhibited and dysexecutive cognition and behavior, respectively. Implications of these considerations for the neuropsychological assessment of executive function are discussed.

  15. Further assessment of the chemical modelling of iodine in IMPAIR 3 code using ACE/RTF data

    International Nuclear Information System (INIS)

    Cripps, R.C.; Guentay, S.

    1996-01-01

This paper introduces the assessment of the computer code IMPAIR 3 (Iodine Matter Partitioning And Iodine Release) which simulates physical and chemical iodine processes in a LWR containment with one or more compartments under conditions relevant to a severe accident in a nuclear reactor. The first version was published in 1992 to replace both the multi-compartment code IMPAIR 2/M and the single-compartment code IMPAIR 2.2. IMPAIR 2.2 was restricted to a single pH value specified before program execution and precluded any variation of pH or calculation of H+ changes during program execution. This restriction is removed in IMPAIR 3. Results of the IMPAIR 2.2 assessment using ACE/RTF Test 2 and the acidic phase of Test 3B data were presented at the 3rd CSNI Workshop. The purpose of the current assessment is to verify the IMPAIR 3 capability to follow the whole test duration with changing boundary conditions. Besides revisiting ACE/RTF Test 3B, Test 4 data were also used for the current assessment. A limited data analysis was conducted using the outcome of the current ACEX iodine work to understand the iodine behaviour observed during these tests. This paper presents comparisons of the predicted results with the test data. The code capabilities are demonstrated with a focus on still unresolved modelling problems. The unclear gaseous molecular iodine behaviour and its inconclusive effect on the calculated behaviour in the acidic phase of Test 4, and the importance of the catalytic effect of stainless steel, are also indicated. (author) 18 figs., 1 tab., 11 refs

  16. Further assessment of the chemical modelling of iodine in IMPAIR 3 code using ACE/RTF data

    Energy Technology Data Exchange (ETDEWEB)

    Cripps, R C; Guentay, S [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-12-01

This paper introduces the assessment of the computer code IMPAIR 3 (Iodine Matter Partitioning And Iodine Release) which simulates physical and chemical iodine processes in a LWR containment with one or more compartments under conditions relevant to a severe accident in a nuclear reactor. The first version was published in 1992 to replace both the multi-compartment code IMPAIR 2/M and the single-compartment code IMPAIR 2.2. IMPAIR 2.2 was restricted to a single pH value specified before program execution and precluded any variation of pH or calculation of H+ changes during program execution. This restriction is removed in IMPAIR 3. Results of the IMPAIR 2.2 assessment using ACE/RTF Test 2 and the acidic phase of Test 3B data were presented at the 3rd CSNI Workshop. The purpose of the current assessment is to verify the IMPAIR 3 capability to follow the whole test duration with changing boundary conditions. Besides revisiting ACE/RTF Test 3B, Test 4 data were also used for the current assessment. A limited data analysis was conducted using the outcome of the current ACEX iodine work to understand the iodine behaviour observed during these tests. This paper presents comparisons of the predicted results with the test data. The code capabilities are demonstrated with a focus on still unresolved modelling problems. The unclear gaseous molecular iodine behaviour and its inconclusive effect on the calculated behaviour in the acidic phase of Test 4, and the importance of the catalytic effect of stainless steel, are also indicated. (author) 18 figs., 1 tab., 11 refs.

  17. Executive Functioning Heterogeneity in Pediatric ADHD.

    Science.gov (United States)

    Kofler, Michael J; Irwin, Lauren N; Soto, Elia F; Groves, Nicole B; Harmon, Sherelle L; Sarver, Dustin E

    2018-04-28

Neurocognitive heterogeneity is increasingly recognized as a valid phenomenon in ADHD, with most estimates suggesting that executive dysfunction is present in only about 33%-50% of these children. However, recent critiques question the veracity of these estimates because our understanding of executive functioning in ADHD is based, in large part, on data from single tasks developed to detect gross neurological impairment rather than the specific executive processes hypothesized to underlie the ADHD phenotype. The current study is the first to comprehensively assess heterogeneity in all three primary executive functions in ADHD using a criterion battery that includes multiple tests per construct (working memory, inhibitory control, set shifting). Children ages 8-13 (M = 10.37, SD = 1.39) with and without ADHD (N = 136; 64 girls; 62% Caucasian/Non-Hispanic) completed a counterbalanced series of executive function tests. Accounting for task unreliability, results indicated significantly improved sensitivity and specificity relative to prior estimates, with 89% of children with ADHD demonstrating objectively-defined impairment on at least one executive function (62% impaired working memory, 27% impaired inhibitory control, 38% impaired set shifting; 54% impaired on one executive function, 35% impaired on two or all three executive functions). Children with working memory deficits showed higher parent- and teacher-reported ADHD inattentive and hyperactive/impulsive symptoms (BF₁₀ = 5.23 × 10⁴), and were slightly younger (BF₁₀ = 11.35) than children without working memory deficits. Children with vs. without set shifting or inhibitory control deficits did not differ on ADHD symptoms, age, gender, IQ, SES, or medication status. Taken together, these findings confirm that ADHD is characterized by neurocognitive heterogeneity, while suggesting that contemporary, cognitively-informed criteria may provide improved precision for identifying a

  18. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...
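
The idea of defining a code by a bipartite graph can be illustrated in miniature: variable nodes on one side, parity checks on the other, and a word is a codeword when every check's neighbourhood sums to zero mod 2. This is a generic sketch of graph-based codes, not the specific expander constructions studied in the paper:

```python
def parity_check_matrix(edges, n_vars, n_checks):
    """Build a binary parity-check matrix H from a bipartite graph:
    variable node v participates in check c iff (v, c) is an edge."""
    H = [[0] * n_vars for _ in range(n_checks)]
    for v, c in edges:
        H[c][v] = 1
    return H

def is_codeword(H, word):
    """A word is in the code iff every check sums to 0 modulo 2."""
    return all(sum(h * w for h, w in zip(row, word)) % 2 == 0 for row in H)

# A 3-bit repetition code as a tiny bipartite graph:
# checks x0 + x1 = 0 and x1 + x2 = 0 (mod 2).
H = parity_check_matrix([(0, 0), (1, 0), (1, 1), (2, 1)], n_vars=3, n_checks=2)
```

Message-passing decoders operate directly on this graph, exchanging estimates along the edges rather than working with H as a dense matrix.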

  19. Building the blocks of executive functioning: differentiating early developing processes contributing to executive functioning skills

    NARCIS (Netherlands)

    Mandell, D.J.; Ward, S.E.

    2011-01-01

    The neural processes that underlie executive function begin to develop in infancy. However, it is unclear how the behavior manifested by these processes are related or if they can be differentiated early in development. This study seeks to examine early emerging executive functioning skills in

  20. The Impact of Hedge Fund Activism on Target Firm Performance, Executive Compensation and Executive Wealth

    Directory of Open Access Journals (Sweden)

    Andrew Carrothers

    2017-10-01

Full Text Available This paper examines the relationship between hedge fund activism and target firm performance, executive compensation, and executive wealth. It introduces a theoretical framework that describes the activism process as a sequence of discrete decisions. The methodology uses regression analysis on a matched sample based on firm size, industry, and market-to-book ratio. All regressions control for industry and year fixed effects. Schedule 13D Securities and Exchange Commission (SEC) filings are the source for the statistical sample of hedge fund target firms. I supplement that data with target firm financial, operating, and share price information from the CRSP-COMPUSTAT merged database. Activist hedge funds target undervalued or underperforming firms with high profitability and cash flows. They do not avoid firms with powerful CEOs. Leverage, executive compensation, pay for performance and CEO turnover increase at target firms after the arrival of the activist hedge fund. Target firm executives’ wealth is more sensitive to changes in share price after hedge fund activism events, suggesting that the executive team experiences changes to their compensation structure that provide incentive to take action to improve returns to shareholders. The top executives reap rewards for increasing firm value but not for increased risk taking.

  1. A fortran code CVTRAN to provide cross-section file for TWODANT by using macroscopic file written by SRAC

    International Nuclear Information System (INIS)

    Yamane, Tsuyoshi; Tsuchihashi, Keichiro

    1999-03-01

A code CVTRAN provides the macroscopic cross-sections in the format of the XSLIB file, one of the standard interface files for the two-dimensional Sn transport code TWODANT, by reading a macroscopic cross-section file in the PDS format prepared by SRAC execution. While the two-dimensional Sn transport code TWOTRAN published by LANL is installed as a module in the SRAC code system, several functions such as alpha search, concentration search, zone thickness search and various edits are suppressed. Since the TWODANT code was released by LANL, its short running time, stable convergence and plenty of edits have attracted many users. The code CVTRAN makes TWODANT available to the SRAC user by providing the macroscopic cross-sections in a card-image file XSLIB. CVTRAN also provides material-dependent fission spectra in a card-image format file CVLIB, together with group velocities, group boundary energies and material names. The user can feed them into the TWODANT input, if necessary, by cut-and-paste. (author)
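
The card-image output described above can be mimicked with a small fixed-width formatter. The 80-column, 10-character-field layout below is a generic sketch of card-image files, not the actual XSLIB/CVLIB record format:

```python
def card_images(records, width=80, field=10):
    """Format each record as one fixed-width 'card image' line:
    floats in E-format, other values right-justified in each field
    (a generic sketch, not the real XSLIB/CVLIB layout)."""
    lines = []
    for rec in records:
        line = "".join(
            f"{x:>{field}.4E}" if isinstance(x, float) else f"{x!s:>{field}}"
            for x in rec
        )
        # Pad or truncate to the classic 80-column card width.
        lines.append(line[:width].ljust(width))
    return lines
```

A record such as `("FUEL", 1.0, 0.5)` then becomes a single 80-character line that downstream codes can read positionally.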

  2. On the automated assessment of nuclear reactor systems code accuracy

    International Nuclear Information System (INIS)

    Kunz, Robert F.; Kasmala, Gerald F.; Mahaffy, John H.; Murray, Christopher J.

    2002-01-01

An automated code assessment program (ACAP) has been developed to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. The tool provides a suite of metrics for quality of fit to specific data sets, and the means to produce one or more figures of merit (FOM) for a code, based on weighted averages of results from the batch execution of a large number of code-experiment and code-code data comparisons. Accordingly, this tool has the potential to significantly streamline the verification and validation (V and V) processes in NRS code development environments, which are characterized by rapidly evolving software, many contributing developers and a large and growing body of validation data. In this paper, a survey of data conditioning and analysis techniques is summarized which focuses on their relevance to NRS code accuracy assessment. A number of methods are considered for their applicability to the automated assessment of the accuracy of NRS code simulations. A variety of data types and computational modeling methods are considered from a spectrum of mathematical and engineering disciplines. The goal of the survey was to identify needs, issues and techniques to be considered in the development of an automated code assessment procedure, to be used in United States Nuclear Regulatory Commission (NRC) advanced thermal-hydraulic (T/H) code consolidation efforts. The ACAP software was designed based in large measure on the findings of this survey. An overview of this tool is summarized and several NRS data applications are provided. The paper is organized as follows: The motivation for this work is first provided by background discussion that summarizes the relevance of this subject matter to the nuclear reactor industry. Next, the spectrum of NRS data types is classified into categories, in order to provide a basis for assessing individual comparison methods. Then, a summary of the survey is provided, where each
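
The weighted-average figure of merit described for ACAP can be sketched in miniature. The metric names, scores and weights below are invented for illustration and do not come from ACAP itself:

```python
def figure_of_merit(comparisons, weights):
    """Combine per-comparison accuracy scores into one weighted figure
    of merit (FOM). `comparisons` is a list of dicts mapping metric
    name -> score; `weights` maps metric name -> weight."""
    total = wsum = 0.0
    for metrics in comparisons:
        for name, score in metrics.items():
            w = weights.get(name, 0.0)
            total += w * score
            wsum += w
    return total / wsum if wsum else float("nan")

# Two hypothetical code-experiment comparisons, each scored by two metrics.
fom = figure_of_merit(
    [{"fit": 0.8, "trend": 0.9}, {"fit": 0.6, "trend": 0.7}],
    weights={"fit": 2.0, "trend": 1.0},
)
```

Batch-executing many such comparisons and averaging them this way is what lets a single FOM track a code's accuracy as the software and the validation data set evolve.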

  3. Sexual Harassment and Organizational Outcomes Executive Summary

    Science.gov (United States)

    2011-10-01

Sexual Harassment and Organizational Outcomes: Executive Summary. Charlie L. Law, DEFENSE EQUAL... No. 99-11. The summary discusses the quid pro quo type of sexual harassment (e.g., sexual coercion), which should drive organizational efforts to...

  4. Executable Architecture Research at Old Dominion University

    Science.gov (United States)

    Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.

    2011-01-01

Executable Architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap and identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents the research and results of both doctoral efforts and puts them into a common context of state-of-the-art systems engineering methods supporting more agility.

  5. Executive function, episodic memory, and Medicare expenditures.

    Science.gov (United States)

    Bender, Alex C; Austin, Andrea M; Grodstein, Francine; Bynum, Julie P W

    2017-07-01

We examined the relationship between health care expenditures and cognition, focusing on differences across cognitive systems defined by global cognition, executive function, or episodic memory. We used linear regression models to compare annual health expenditures by cognitive status in 8125 Nurses' Health Study participants who completed a cognitive battery and were enrolled in Medicare parts A and B. Adjusting for demographics and comorbidity, executive impairment was associated with higher total annual expenditures of $1488 per person (P < .05), whereas no such association with episodic memory impairment was found. Expenditures exhibited a linear relationship with executive function, but not episodic memory ($584 higher for every 1 standard deviation decrement in executive function; P < .01). Impairment in executive function is specifically and linearly associated with higher health care expenditures. Focusing on management strategies that address early losses in executive function may be effective in reducing costly services. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  6. Independence and executive remuneration for supervisory board members and non-executive directors

    Directory of Open Access Journals (Sweden)

    Hana Horak

    2011-01-01

Full Text Available In this paper, the author analyses the issues of independence and remuneration of members of supervisory boards and of non-executive directors on administrative boards. The question of independence has developed into one of the fundamental issues of corporate governance. Members of these company organs should have the appropriate qualifications and specific knowledge and skills in order to weigh up company business reasonably and impartially and to reach decisions in the best interests of the company, its members and other stakeholders. So that they can act accordingly, the presumption is that they are independent. Recently, after the financial crises, it is precisely the independence of supervisory board members, that is, of non-executive directors, which is considered to be the foundation of the fight against the opportunism of management and main shareholders. The author analyses the European Union recommendations on independence and executive remuneration for members of supervisory and administrative boards, together with their implementation in Croatian law and practice.

  7. Ground Operations Aerospace Language (GOAL). Volume 4: Interpretive code translator

    Science.gov (United States)

    1973-01-01

This specification identifies and describes the principal functions and elements of the Interpretive Code Translator which has been developed for use with the GOAL Compiler. This translator enables the user to convert a compiled GOAL program to a highly general binary format which is designed to enable interpretive execution. The translator program provides user controls which are designed to enable the selection of various output types and formats. These controls provide a means for accommodating many of the implementation options which are discussed in the Interpretive Code Guideline document. The technical design approach is given. The relationship between the translator and the GOAL compiler is explained and the principal functions performed by the translator are described. Specific constraints regarding the use of the translator are discussed. The control options are described. These options enable the user to select outputs to be generated by the translator and to control various aspects of the translation processing.
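
Translating a compiled program into a general binary format for interpretive execution can be illustrated with a toy stack-machine interpreter. The opcodes below are invented for illustration and bear no relation to GOAL's actual instruction format:

```python
# Toy opcodes (invented): each instruction is a tuple of opcode and operands.
PUSH, ADD, MUL = "PUSH", "ADD", "MUL"

def interpret(program):
    """Execute a tiny stack-machine program and return the top of stack."""
    stack = []
    for op, *args in program:
        if op == PUSH:
            stack.append(args[0])
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return stack[-1]

# (2 + 3) * 4 expressed in the toy instruction format.
result = interpret([(PUSH, 2), (PUSH, 3), (ADD,), (PUSH, 4), (MUL,)])
```

The appeal of such a format, as with the GOAL translator, is that the same instruction stream can be executed on any host that carries the interpreter, with output options controlled at translation time rather than baked into the program.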

  8. Programming a real code in a functional language (part 1)

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, C.P.

    1991-09-10

    For some, functional languages hold the promise of allowing ease of programming massively parallel computers that imperative languages such as Fortran and C do not offer. At LLNL, we have initiated a project to write the physics of a major production code in Sisal, a functional language developed at LLNL in collaboration with researchers throughout the world. We are investigating the expressibility of Sisal, as well as its performance on a shared-memory multiprocessor, the Y-MP. An interesting aspect of the project is that Sisal modules can call Fortran modules, and are callable by them. This eliminates the rewriting of 80% of the production code that would not benefit from parallel execution. Preliminary results indicate that the restrictive nature of the language does not cause problems in expressing the algorithms we have chosen. Some interesting aspects of programming in a mixed functional-imperative environment have surfaced, but can be managed. 8 refs.

  9. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    Science.gov (United States)

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  10. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
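The workflow the repository supports (shared, versioned queries that turn raw records into reusable clinical concepts) can be illustrated with a minimal sketch. The table, columns, and thresholds below are invented for illustration and are not the actual MIMIC-III schema or its sepsis criteria:

```python
import sqlite3

# Minimal in-memory stand-in for a critical-care database.
# (Table and column names are illustrative, not the real MIMIC-III schema.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vitals (stay_id INTEGER, heart_rate REAL, sbp REAL)")
conn.executemany(
    "INSERT INTO vitals VALUES (?, ?, ?)",
    [(1, 125.0, 85.0), (2, 72.0, 118.0), (3, 140.0, 60.0)],
)

# A shared, versioned "concept" query: flag stays meeting simple
# physiologic criteria (tachycardia plus hypotension). Thresholds invented.
CONCEPT_SQL = """
    SELECT stay_id FROM vitals
    WHERE heart_rate > 100 AND sbp < 90
    ORDER BY stay_id
"""
flagged = [row[0] for row in conn.execute(CONCEPT_SQL)]
print(flagged)  # → [1, 3]
```

Keeping such concept queries in one public repository, rather than re-deriving them per study, is what makes downstream analyses comparable.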

  11. Comparative analyses of the internal radiation exposures due to food chain pathway using FOOD III code

    International Nuclear Information System (INIS)

    Choi, Yong Ho; Chung, Kyu Hoi; Kim, Jin Kyu; Lee, Jeong Ho

    1988-01-01

    In order to develop a food-chain computer code suitable to the environmental conditions of Korea, the FOOD III code was partially modified. The execution results for a Korean adult male were compared with those from the Canadian version of FOOD III to deduce a more realistic approach to dose assessment. The amounts of Mn-54, Co-50, Co-60, I-131 and I-132 released from Kori unit 1 in 1984 were used as the source terms for the sample calculation. The maximum atmospheric dispersion factor (X/Q) value on the site boundary was applied. Through the code modification, organ doses decreased by about 20∼70% and the effective committed dose equivalent by about 40%, to 7.935×10⁻⁶ Sv/y, which is 0.16% of the ICRP limit of 5×10⁻³ Sv/y. (Author)
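The final bookkeeping step of such a dose assessment, converting annual nuclide intakes into a committed effective dose and comparing it against a limit, can be sketched as follows. The intake figures are invented; the dose coefficients are typical ICRP ingestion values but are shown only for illustration, not as FOOD III data:

```python
# Illustrative ingestion-dose bookkeeping of the kind a food-chain code
# performs: annual intake of each nuclide (Bq/y) times a committed
# effective dose coefficient (Sv/Bq). Intakes are invented.
intake_bq_per_y = {"I-131": 50.0, "Co-60": 10.0}
dose_coeff_sv_per_bq = {"I-131": 2.2e-8, "Co-60": 3.4e-9}

dose_sv_per_y = sum(intake_bq_per_y[n] * dose_coeff_sv_per_bq[n]
                    for n in intake_bq_per_y)

# Compare against a 5e-3 Sv/y limit, as the abstract does.
fraction_of_limit = dose_sv_per_y / 5e-3
print(f"{dose_sv_per_y:.3e} Sv/y, {100 * fraction_of_limit:.4f}% of limit")
```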

  12. The contribution of executive control to semantic cognition: Convergent evidence from semantic aphasia and executive dysfunction.

    Science.gov (United States)

    Thompson, Hannah E; Almaghyuli, Azizah; Noonan, Krist A; Barak, Ohr; Lambon Ralph, Matthew A; Jefferies, Elizabeth

    2018-01-03

    Semantic cognition, as described by the controlled semantic cognition (CSC) framework (Rogers et al., Neuropsychologia, 76, 220), involves two key components: activation of coherent, generalizable concepts within a heteromodal 'hub' in combination with modality-specific features (spokes), and a constraining mechanism that manipulates and gates this knowledge to generate time- and task-appropriate behaviour. Executive-semantic goal representations, largely supported by executive regions such as frontal and parietal cortex, are thought to allow the generation of non-dominant aspects of knowledge when these are appropriate for the task or context. Semantic aphasia (SA) patients have executive-semantic deficits, and these are correlated with general executive impairment. If the CSC proposal is correct, patients with executive impairment should not only exhibit impaired semantic cognition, but should also show characteristics that align with those observed in SA. This possibility remains largely untested, as patients selected on the basis that they show executive impairment (i.e., with 'dysexecutive syndrome') have not been extensively tested on tasks tapping semantic control and have not been previously compared with SA cases. We explored conceptual processing in 12 patients showing symptoms consistent with dysexecutive syndrome (DYS) and 24 SA patients, using a range of multimodal semantic assessments which manipulated control demands. Patients with executive impairments, despite not being selected to show semantic impairments, nevertheless showed parallel patterns to SA cases. They showed strong effects of distractor strength, cues and miscues, and probe-target distance, plus minimal effects of word frequency on comprehension (unlike semantic dementia patients with degradation of conceptual knowledge).
This supports a component process account of semantic cognition in which retrieval is shaped by control processes, and confirms that deficits in SA patients reflect

  13. Culture, executive function, and social understanding.

    Science.gov (United States)

    Lewis, Charlie; Koyasu, Masuo; Oh, Seungmi; Ogawa, Ayako; Short, Benjamin; Huang, Zhao

    2009-01-01

    Much of the evidence from the West has shown links between children's developing self-control (executive function), their social experiences, and their social understanding (Carpendale & Lewis, 2006, chapters 5 and 6), across a range of cultures including China. This chapter describes four studies conducted in three Oriental cultures, suggesting that the relationships among social interaction, executive function, and social understanding are different in these cultures, implying that social and executive skills are underpinned by key cultural processes.

  14. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor, namely, VVER-1000. Five experiments were executed, dealing with loss-of-coolant scenarios (small, intermediate, and large break loss of coolant accidents), a primary-to-secondary leak, and a parametric study (natural circulation test) aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis presented in the paper regards the large break loss of coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and setups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of such codes in predicting phenomena relevant for safety on the basis of fixed criteria.

  15. Theoretical background and user's manual for the computer code on groundwater flow and radionuclide transport calculation in porous rock

    International Nuclear Information System (INIS)

    Shirakawa, Toshihiko; Hatanaka, Koichiro

    2001-11-01

    In order to provide a basic manual covering the input data, output data, and execution of a computer code for groundwater flow and radionuclide transport calculation in heterogeneous porous rock, we investigated the theoretical background of the geostatistical computer codes and prepared a user's manual for the code, which calculates three-dimensional groundwater flow, the paths of migrating radionuclides, and one-dimensional radionuclide migration. In this report, based on the above investigation, we describe the geostatistical background for simulating a heterogeneous permeability field. We also describe the file structure, the input and output data, and an example calculation for the programs that simulate the heterogeneous permeability field and calculate groundwater flow and radionuclide transport. The information in this report can therefore be used to model heterogeneous porous rock and to analyze groundwater flow and radionuclide transport. (author)
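The geostatistical step described above, representing heterogeneous rock by a randomly distributed permeability field and feeding it into a flow calculation, can be sketched minimally as below. The field here is uncorrelated log-normal (a real geostatistical simulator would impose spatial correlation, e.g. via sequential Gaussian simulation), and all parameter values are illustrative:

```python
import random

# Heterogeneous permeability field: log10(k) drawn from a Gaussian,
# giving a log-normal permeability per cell. Parameters are illustrative.
random.seed(42)

N_CELLS = 1000
LOG10_K_MEAN = -12.0   # mean log10 permeability [m^2]
LOG10_K_STD = 0.5      # spatial variability of log10 permeability

perm = [10 ** random.gauss(LOG10_K_MEAN, LOG10_K_STD) for _ in range(N_CELLS)]

MU = 1.0e-3     # water viscosity [Pa*s]
GRAD_P = 1.0e4  # pressure-gradient magnitude [Pa/m]

# Darcy's law per cell (flux magnitude): q = (k / mu) * |dP/dx|
flux = [k / MU * GRAD_P for k in perm]

median_k = sorted(perm)[N_CELLS // 2]
print(f"median permeability = {median_k:.2e} m^2")
```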

  16. Developmental Changes in Executive Functioning

    Science.gov (United States)

    Lee, Kerry; Bull, Rebecca; Ho, Ringo M. H.

    2013-01-01

    Although early studies of executive functioning in children supported Miyake et al.'s (2000) three-factor model, more recent findings supported a variety of undifferentiated or two-factor structures. Using a cohort-sequential design, this study examined whether there were age-related differences in the structure of executive functioning among…

  17. SSC-K code users manual (rev.1)

    International Nuclear Information System (INIS)

    Kwon, Y. M.; Lee, Y. B.; Chang, W. P.; Hahn, D.

    2002-01-01

    The Super System Code of KAERI (SSC-K) is a best-estimate system code for analyzing a variety of off-normal conditions or accidents in the heat transport system of a pool-type LMR design. It is being developed at the Korea Atomic Energy Research Institute (KAERI) on the basis of SSC-L, originally developed at BNL to analyze loop-type LMR transients. SSC-K can handle both loop- and pool-type LMR designs. SSC-K contains detailed mechanistic models of transient thermal, hydraulic, neutronic, and mechanical phenomena to describe the response of the reactor core, coolant, fuel elements, and structures to accident conditions. This report provides a revised User's Manual (rev.1) of the SSC-K computer code, focusing on phenomenological model descriptions for new thermal, hydraulic, neutronic, and mechanical modules. A comprehensive description of the models for the pool-type reactor is given in Chapters 2 and 3; the steady-state plant characterization prior to the initiation of a transient is described in Chapter 2, and the transient counterparts are discussed in Chapter 3. The intermediate heat exchanger (IHX) and the electromagnetic (EM) pump are described in Chapters 4 and 5, respectively. A model of the passive safety decay heat removal system (PSDRS) is discussed in Chapter 6, and models for various reactivity feedback effects are discussed in Chapter 7. In Chapter 8, the constitutive laws and correlations required to execute SSC-K are described. New models developed for SSC-K rev.1 are the two-dimensional hot pool model in Chapter 9 and the long-term cooling model in Chapter 10. Finally, a brief description of the MINET code adopted to simulate the BOP is presented in Chapter 11. Based on test runs for typical LMFBR accident analyses, it was found that the present version of SSC-K can be used for the safety analysis of KALIMER. However, further validation of SSC-K is required for real applications. It is noted that the user's manual of SSC-K will be revised later with the

  18. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  19. Questionnaire-based assessment of executive functioning: Psychometrics.

    Science.gov (United States)

    Castellanos, Irina; Kronenberger, William G; Pisoni, David B

    2018-01-01

    The psychometric properties of the Learning, Executive, and Attention Functioning (LEAF) scale were investigated in an outpatient clinical pediatric sample. As a part of clinical testing, the LEAF scale, which broadly measures neuropsychological abilities related to executive functioning and learning, was administered to parents of 118 children and adolescents referred for psychological testing at a pediatric psychology clinic; 85 teachers also completed LEAF scales to assess reliability across different raters and settings. Scores on neuropsychological tests of executive functioning and academic achievement were abstracted from charts. Psychometric analyses of the LEAF scale demonstrated satisfactory internal consistency, parent-teacher inter-rater reliability in the small to large effect size range, and test-retest reliability in the large effect size range, similar to values for other executive functioning checklists. Correlations between corresponding subscales on the LEAF and other behavior checklists were large, while most correlations with neuropsychological tests of executive functioning and achievement were significant but in the small to medium range. Results support the utility of the LEAF as a reliable and valid questionnaire-based assessment of delays and disturbances in executive functioning and learning. Applications and advantages of the LEAF and other questionnaire measures of executive functioning in clinical neuropsychology settings are discussed.

  20. EXECUTIVE FUNCTIONING IN SCHIZOPHRENIA

    Directory of Open Access Journals (Sweden)

    Gricel eOrellana

    2013-06-01

    The executive function (EF) is a set of abilities which allows us to invoke voluntary control of our behavioral responses. These functions enable human beings to develop and carry out plans, make up analogies, obey social rules, solve problems, adapt to unexpected circumstances, do many tasks simultaneously, and locate episodes in time and place. EF includes divided and sustained attention, working memory, set-shifting, flexibility, planning, and the regulation of goal-directed behavior, and can be defined as a brain function underlying the human faculty to act or think not only in reaction to external events but also in relation to internal goals and states. EF is mostly associated with the dorsolateral prefrontal cortex (PFC). Besides EF, the PFC is involved in self-regulation of behavior, i.e. the ability to regulate behavior according to internal goals and constraints, particularly in less structured situations. Self-regulation of behavior is subtended by the ventral medial/orbital PFC. Impairment of EF is one of the most commonly observed deficits in schizophrenia through the various disease stages. Impairments in tasks measuring conceptualization, planning, cognitive flexibility, verbal fluency, the ability to solve complex problems, and working memory occur in schizophrenia. Disorders detected by executive tests are consistent with evidence from functional neuroimaging, which has shown PFC dysfunction in patients while performing these kinds of tasks. Schizophrenic patients also exhibit deficits in odor identification, decision-making, and self-regulation of behavior, suggesting dysfunction of the orbital PFC. However, impairment in executive tests is mainly explained by dysfunction of prefronto-striato-thalamic, prefronto-parietal, and prefronto-temporal neural networks. Disorders in executive functions may be considered central to schizophrenia, and it has been suggested that negative symptoms may be explained by that executive dysfunction.

  1. Adaptation of OCA-P, a probabilistic fracture-mechanics code, to a personal computer

    International Nuclear Information System (INIS)

    Ball, D.G.; Cheverton, R.D.

    1985-01-01

    The OCA-P probabilistic fracture-mechanics code can now be executed on a personal computer with 512 kilobytes of memory, a math coprocessor, and a hard disk. A user's guide for this particular adaptation has been prepared, and additional importance-sampling techniques for OCA-P have been developed that allow the sampling of only the tails of selected distributions. Features have also been added to OCA-P that permit RTNDT to be used as an 'independent' variable in the calculation of P
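The tail-sampling idea, concentrating samples in the low-probability tail of a distribution and reweighting them, can be illustrated with a generic importance-sampling sketch. This is not the OCA-P implementation; the target distribution, threshold, and proposal are illustrative:

```python
import math
import random

# Estimate a small tail probability P(X > t) for standard-normal X by
# drawing from a proposal shifted into the tail, then reweighting each
# sample by the likelihood ratio (target density / proposal density).
random.seed(0)

def phi(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

T = 3.0       # "failure" threshold
SHIFT = 3.0   # proposal N(SHIFT, 1) concentrates samples beyond T
N = 100_000

acc = 0.0
for _ in range(N):
    x = random.gauss(SHIFT, 1.0)
    if x > T:
        acc += phi(x) / phi(x - SHIFT)  # importance weight

estimate = acc / N
print(f"P(X > 3) ~ {estimate:.2e}")  # exact value is about 1.35e-3
```

Sampling only the tail makes rare "failure" events common under the proposal, so far fewer samples are needed than with plain Monte Carlo.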

  2. SSYST, a code-system for analysing transient LWR fuel rod behaviour under off-normal conditions

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Gulden, W.

    1983-01-01

    SSYST is a code-system for analysing transient fuel rod behaviour under off-normal conditions, developed conjointly by the Institut fuer Kernenergetik und Energiesysteme (IKE), Stuttgart, and Kernforschungszentrum Karlsruhe (KfK) under contract of Projekt Nukleare Sicherheit (PNS) at KfK. The main differences between SSYST and similar codes are (1) an open-ended modular code organisation, and (2) a preference for simple models, wherever possible. While the first feature makes SSYST a very flexible tool, easily adapted to changing requirements, the second feature leads to short execution times. The analysis of transient rod behaviour under LOCA boundary conditions takes 2 min cpu-time (IBM-3033), so that extensive parametric studies become possible. This paper gives an outline of the overall code organisation and a general overview of the physical models implemented. Besides explaining the routine application of SSYST in the analysis of loss-of-coolant accidents, examples are given of special applications which have led to a satisfactory understanding of the decisive influence of deviations from rotational symmetry on the fuel rod perimeter. (author)

  3. SSYST: A code-system for analyzing transient LWR fuel rod behaviour under off-normal conditions

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Gulden, W.

    1983-01-01

    SSYST is a code-system for analyzing transient fuel rod behaviour under off-normal conditions, developed conjointly by the Institut fuer Kernenergetik und Energiesysteme (IKE), Stuttgart, and Kernforschungszentrum Karlsruhe (KfK) under contract of Projekt Nukleare Sicherheit (PNS) at KfK. The main differences between SSYST and similar codes are an open-ended modular code organization, and a preference for simple models, wherever possible. While the first feature makes SSYST a very flexible tool, easily adapted to changing requirements, the second feature leads to short execution times. The analysis of transient rod behaviour under LOCA boundary conditions takes 2 min cpu-time (IBM-3033), so that extensive parametric studies become possible. This paper gives an outline of the overall code organisation and a general overview of the physical models implemented. Besides explaining the routine application of SSYST in the analysis of loss-of-coolant accidents, examples are given of special applications which have led to a satisfactory understanding of the decisive influence of deviations from rotational symmetry on the fuel rod perimeter.

  4. Executive Competencies of Nurses within the Veterans Health Administration: Comparison of Current and Future Nurse Executive Views

    National Research Council Canada - National Science Library

    Sutto, Natalie B

    2006-01-01

    ...). Using the Delphi method for executive decision-making, 144 current nurse executives, as well as 168 nurses identified for potential selection to this position, judged the relative importance of SKAs...

  5. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
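The statistical core of such a technique, sampling executions, counting hits on the target event, and maintaining a Bayesian posterior over the reachability probability, can be sketched on a toy program. The program and the Beta(1, 1) prior below are illustrative, and real statistical symbolic execution samples symbolic paths rather than concrete inputs:

```python
import random

# Sample inputs, observe whether the target event (e.g. an assert
# violation) is reached, and update a Beta posterior on its probability.
random.seed(1)

def program(x, y):
    """Toy program; the 'target event' fires with probability 0.1 * 0.5."""
    return x > 0.9 and y > 0.5

alpha, beta = 1.0, 1.0  # uniform Beta(1, 1) prior over P(target)
for _ in range(10_000):
    hit = program(random.random(), random.random())
    alpha += hit
    beta += not hit

posterior_mean = alpha / (alpha + beta)
print(f"estimated P(target) ~ {posterior_mean:.3f}")  # true value is 0.05
```

The posterior also supports hypothesis testing (e.g. "is P(target) below some bound?") by integrating the Beta density, which is the Bayesian machinery the abstract refers to.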

  6. On the numerical verification of industrial codes

    International Nuclear Information System (INIS)

    Montan, Sethy Akpemado

    2013-01-01

    Numerical verification of industrial codes, such as those developed at EDF R and D, is required to estimate the precision and the quality of computed results, all the more so for codes running in HPC environments where millions of instructions are performed each second. These programs usually use external libraries (MPI, BLACS, BLAS, LAPACK). In this context, it is necessary to have a tool that is as non-intrusive as possible, to avoid rewriting the original code. In this regard, the CADNA library, which implements Discrete Stochastic Arithmetic, appears to be a promising approach for industrial applications. In the first part of this work, we are interested in an efficient implementation of the BLAS routine DGEMM (General Matrix Multiply) using Discrete Stochastic Arithmetic. The implementation of a basic algorithm for matrix product using stochastic types leads to an overhead greater than 1000 for a 1024 * 1024 matrix compared to the standard version and commercial versions of xGEMM. Here, we detail different solutions to reduce this overhead and the results we have obtained. A new routine, Dgemm-CADNA, has been designed; it reduces the overhead from 1100 to 35 compared to optimized BLAS implementations (GotoBLAS). Then, we focus on the numerical verification of Telemac-2D computed results. Performing a numerical validation with the CADNA library shows that more than 30% of the numerical instabilities occurring during an execution come from the dot product function. A more accurate implementation of the dot product with compensated algorithms is presented in this work. We show that implementing these kinds of algorithms to improve the accuracy of computed results does not alter the code's performance. (author)
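A compensated dot product of the kind referred to here can be sketched with error-free transformations (Knuth's TwoSum and Dekker's TwoProduct). This is a generic textbook construction, not the Telemac-2D implementation:

```python
# Error-free transformations: each float operation's rounding error is
# itself a float and can be recovered exactly.
def two_sum(a, b):
    s = a + b
    bp = s - a
    return s, (a - (s - bp)) + (b - bp)

def split(a):
    c = 134217729.0 * a  # 2^27 + 1, Dekker's splitting constant
    hi = c - (c - a)
    return hi, a - hi

def two_prod(a, b):
    p = a * b
    ah, al = split(a)
    bh, bl = split(b)
    return p, ((ah * bh - p) + ah * bl + al * bh) + al * bl

def dot2(xs, ys):
    """Compensated dot product: rounding errors accumulate separately,
    roughly doubling the effective working precision."""
    s = comp = 0.0
    for a, b in zip(xs, ys):
        p, e1 = two_prod(a, b)
        s, e2 = two_sum(s, p)
        comp += e1 + e2
    return s + comp

# Ill-conditioned example: massive cancellation defeats the naive sum.
xs, ys = [1e16, 1.0, -1e16], [1.0, 1.0, 1.0]
naive = sum(a * b for a, b in zip(xs, ys))
print(naive, dot2(xs, ys))  # → 0.0 1.0
```

Because the compensation loop is simple floating-point arithmetic with no branches, such algorithms vectorize well, which is consistent with the abstract's claim that accuracy can be improved without hurting performance.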

  7. Toward synthesizing executable models in biology.

    Science.gov (United States)

    Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav

    2014-01-01

    Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
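The synthesis idea, searching a space of candidate models for those consistent with experimental observations, reduces to a toy enumeration in the simplest case. The observations and candidate Boolean update functions below are invented; real synthesis tools search far larger spaces symbolically:

```python
# Toy "synthesis from data": enumerate candidate Boolean update rules for
# one gene and keep those consistent with observed state transitions.
observations = [  # ((a, b), next value of a)
    ((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1),
]

candidates = {
    "a AND b": lambda a, b: a & b,
    "a OR b": lambda a, b: a | b,
    "NOT a": lambda a, b: 1 - a,
    "b": lambda a, b: b,
}

consistent = [name for name, f in candidates.items()
              if all(f(*state) == nxt for state, nxt in observations)]
print(consistent)  # → ['a AND b']
```

When several candidates survive, the remaining ambiguity is exactly what the "disambiguating experiments" mentioned above are computed to resolve.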

  8. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary files consist of 11 files, one for the organ code and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which were simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by this same program, and incorporation of this program into another data-processing program was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology
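The two-stage lookup described above (an organ code whose first digit selects a pathology dictionary, with the result written as organ.pathology) can be sketched as follows. The dictionary entries and name-to-code mappings are hypothetical; only the '131.3661' example comes from the text:

```python
# Hypothetical dictionaries: the names and most numbers are invented;
# only the final '131.3661' example is taken from the abstract.
ORGANS = {"lung": "131", "skull": "11"}
PATHOLOGY_BY_GROUP = {
    "1": {"pneumonia": "3661", "fracture": "41"},
}

def acr_code(organ_name, pathology_name):
    """Two-stage lookup: resolve the organ code, then the pathology
    dictionary selected by the organ code's first digit."""
    organ = ORGANS[organ_name]
    pathology = PATHOLOGY_BY_GROUP[organ[0]][pathology_name]
    return f"{organ}.{pathology}"

print(acr_code("lung", "pneumonia"))  # → 131.3661
```

Because encoding is a pure table lookup, decoding is the reverse lookup over the same tables, which is why one program can serve both directions.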

  9. EMI Execution Service Specification 1.0

    CERN Document Server

    Schuller, B. (JUELICH); Smirnova, O (Lund University); Konstantinov, A. (Oslo University); Skou Andersen, M. (University of Copenhagen); Riedel, M. (JUELICH); Memon, A.S. (JUELICH); Memon, M.S. (JUELICH); Zangrando, L. (INFN); Sgaravatto, M. (INFN); Frizziero, E. (INFN)

    2010-01-01

    This document provides the interface specification, including related data models such as state model, activity description, resource and activity information, of an execution service, matching the needs of the EMI production middleware stack composed of ARC, gLite and UNICORE components. This service therefore is referred to as the EMI Execution Service (or “ES” for short). This document is a continuation of the work previously known as the GENEVA, then AGU (“ARC, gLite UNICORE”), then PGI execution service.

  10. PERCEPTIONS OF EXECUTIVE PAYMENT ABUSE IN PUBLIC INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Tudor Pendiuc

    2013-07-01

    In the aftermath of the financial crisis, executive compensation abuse has been deeply criticized; thus, the timeliness of this research is undeniable. The article highlights the importance of learning from other institutions' past and present experiences of executive compensation abuse by presenting the participants' shared experiences with executive compensation abuse and by studying how participants perceive it. The main objective of this research lies in exploring participants' shared experiences concerning executive compensation abuse, as well as their perceptions, discrepancies, and unresolved questions, presented within an ample, interconnected qualitative and quantitative methodological approach. A sample of 20 individuals was chosen for the triangulation method. From the resultant triangulation, six new themes derive from the interview/questionnaire questions specifically referring to executive payment abuse, namely: (a) ethics means distinguishing between right and wrong, (b) perspectives on ethical behaviour, (c) types of executive payment abuse, and (d) the participants' perceptions of the institution and colleagues.

  11. An Embodied Account of Early Executive-Function Development

    Science.gov (United States)

    Gottwald, Janna M.; Achermann, Sheila; Marciszko, Carin; Lindskog, Marcus; Gredebäck, Gustaf

    2016-01-01

    The importance of executive functioning for later life outcomes, along with its potential to be positively affected by intervention programs, motivates the need to find early markers of executive functioning. In this study, 18-month-olds performed three executive-function tasks—involving simple inhibition, working memory, and more complex inhibition—and a motion-capture task assessing prospective motor control during reaching. We demonstrated that prospective motor control, as measured by the peak velocity of the first movement unit, is related to infants’ performance on simple-inhibition and working memory tasks. The current study provides evidence that motor control and executive functioning are intertwined early in life, which suggests an embodied perspective on executive-functioning development. We argue that executive functions and prospective motor control develop from a common source and a single motive: to control action. This is the first demonstration that low-level movement planning is related to higher-order executive control early in life. PMID:27765900

  12. Code REX to fit experimental data to exponential functions and graphics plotting; Codigo REX para ajuste de datos experimentales a funciones exponenciales y su representacion grafica

    Energy Technology Data Exchange (ETDEWEB)

    Romero, L.; Travesi, A.

    1983-07-01

    The REX code, written in Fortran IV, performs the fitting of a set of experimental data to different kinds of functions, such as a straight line (Y = A + BX) and various exponential types (Y = A·B^X, Y = A·X^B, Y = A·exp(BX)), using the least-squares criterion. The fitting can be done directly for one selected function or for the four simultaneously, and the code allows choosing the function that best fits the data, since it presents the fitting statistics for all of them. Further, it plots the fitted function in the appropriate coordinate axis system. An additional option also allows plotting the experimental data used for the fitting. All the data necessary to execute this code are requested from the operator on the terminal screen through an interactive screen-operator dialogue, and the values are entered through the keyboard. This code can be executed on any computer provided with a graphics screen and a keyboard terminal, with an X-Y plotter serially connected to the graphics terminal. (Author) 5 refs.
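One of the fits REX performs, Y = A·exp(BX), can be reproduced by linearizing to ln Y = ln A + BX and solving an ordinary least-squares problem. A stdlib-only sketch on synthetic data (this mirrors the classical approach; whether REX itself linearizes is not stated in the abstract):

```python
import math

# Synthetic data generated from Y = A*exp(B*X) with A = 2, B = 0.5.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]

# Linearize: ln Y = ln A + B*X, then solve ordinary least squares.
lys = [math.log(y) for y in ys]
n = len(xs)
sx, sy = sum(xs), sum(lys)
sxx = sum(x * x for x in xs)
sxy = sum(x * ly for x, ly in zip(xs, lys))

B = (n * sxy - sx * sy) / (n * sxx - sx * sx)
A = math.exp((sy - B * sx) / n)
print(f"A = {A:.3f}, B = {B:.3f}")  # recovers A = 2.000, B = 0.500
```

The other exponential forms fit the same way: Y = A·X^B linearizes under ln Y vs ln X, and Y = A·B^X under ln Y vs X with slope ln B.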

  13. Questionnaire-based assessment of executive functioning: Case studies.

    Science.gov (United States)

    Kronenberger, William G; Castellanos, Irina; Pisoni, David B

    2018-01-01

    Delays in the development of executive functioning skills are frequently observed in pediatric neuropsychology populations and can have a broad and significant impact on quality of life. As a result, assessment of executive functioning is often relevant for the development of formulations and recommendations in pediatric neuropsychology clinical work. Questionnaire-based measures of executive functioning behaviors in everyday life have unique advantages and complement traditional neuropsychological measures of executive functioning. Two case studies of children with spina bifida are presented to illustrate the clinical use of a new questionnaire measure of executive and learning-related functioning, the Learning, Executive, and Attention Functioning Scale (LEAF). The LEAF emphasizes clinical utility in assessment by incorporating four characteristics: brevity in administration, breadth of additional relevant content, efficiency of scoring and interpretation, and ease of availability for use. LEAF results were consistent with another executive functioning checklist in documenting everyday behavior problems related to working memory, planning, and organization while offering additional breadth of assessment of domains such as attention, processing speed, and novel problem-solving. These case study results demonstrate the clinical utility of questionnaire-based measurement of executive functioning in pediatric neuropsychology and provide a new measure for accomplishing this goal.

  14. Preliminary study for unified management of CANDU safety codes and construction of database system

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae

    2003-03-01

    A Graphical User Interface (GUI) is needed for the unified management of the CANDU safety codes, together with a database system for their validation; a preliminary study of both is carried out in the first stage of the present work. The input and output structures and data flow of CATHENA and PRESCON2 are investigated, and the variables exchanged between the two codes are identified. Furthermore, PC versions of the CATHENA and PRESCON2 codes are developed for the interaction of these codes with the GUI. The PC versions are assessed by comparing their results with those obtained on an HP workstation or taken from the FSAR (Final Safety Analysis Report). A preliminary study of the GUI for the safety codes in the unified management system is performed, and a sample GUI program is demonstrated. Visual C++ is selected as the programming language for the development of the GUI system. Data for the Wolsong plants, the reactor core, and thermal-hydraulic experiments performed inside and outside the country are collected and classified according to the structure of the database system, of which two types are considered for the final web-based system. Preliminary GUI programming for the database system is demonstrated and will be updated in future work

  15. An empirical analysis of executive behaviour with hospital executive information systems in Taiwan.

    Science.gov (United States)

    Huang, Wei-Min

    2013-01-01

    Existing health information systems largely only support the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other results include the significant impact of perceived environmental uncertainty on scan searches; information retrieval behaviour and focused searches on mental models and perceived efficiency; scan searches on mental model building; learning styles and model building on perceived efficiency; and finally the impact of mental model maintenance on perceived efficiency and effectiveness.

  16. Executive dysfunction, brain aging, and political leadership.

    Science.gov (United States)

    Fisher, Mark; Franklin, David L; Post, Jerrold M

    2014-01-01

    Decision-making is an essential component of executive function, and a critical skill of political leadership. Neuroanatomic localization studies have established the prefrontal cortex as the critical brain site for executive function. In addition to the prefrontal cortex, white matter tracts as well as subcortical brain structures are crucial for optimal executive function. Executive function shows a significant decline beginning at age 60, and this is associated with age-related atrophy of prefrontal cortex, cerebral white matter disease, and cerebral microbleeds. Notably, age-related decline in executive function appears to be a relatively selective cognitive deterioration, generally sparing language and memory function. While an individual may appear to be functioning normally with regard to relatively obvious cognitive functions such as language and memory, that same individual may lack the capacity to integrate these cognitive functions to achieve normal decision-making. From a historical perspective, global decline in cognitive function of political leaders has been alternatively described as a catastrophic event, a slowly progressive deterioration, or a relatively episodic phenomenon. Selective loss of executive function in political leaders is less appreciated, but increased utilization of highly sensitive brain imaging techniques will likely bring greater appreciation to this phenomenon. Former Israeli Prime Minister Ariel Sharon was an example of a political leader with a well-described neurodegenerative condition (cerebral amyloid angiopathy) that creates a neuropathological substrate for executive dysfunction. Based on the known neuroanatomical and neuropathological changes that occur with aging, we should probably assume that a significant proportion of political leaders over the age of 65 have impairment of executive function.

  17. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    Science.gov (United States)

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists and statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for the UPGMA and neighbor-joining algorithms. However, conventional computers cannot process more than a few tens of taxa in a reasonable amount of time, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred compared with the PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to accelerate phylogenetics algorithm performance not only on the desktop but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
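
    The UPGMA algorithm accelerated in this work is itself compact. A naive O(n^3) Python sketch (illustrative only; PHYLIP's optimized C code and the programmable-logic design discussed in this record work differently) shows the core operation the hardware must repeat: find the closest pair of clusters and merge them, re-averaging the remaining distances weighted by cluster size.

```python
import itertools

def upgma(d):
    """Naive O(n^3) UPGMA sketch: `d` maps frozenset({a, b}) -> distance.
    The closest pair of clusters is merged into a nested tuple, and the
    distances to the remaining clusters are re-averaged, weighted by the
    number of leaves in each merged cluster."""
    d = dict(d)
    clusters = set().union(*d)         # current cluster labels
    size = {c: 1 for c in clusters}    # leaf counts
    while len(clusters) > 1:
        pair = min(d, key=d.get)       # closest pair under the current metric
        a, b = tuple(pair)
        merged = (a, b)
        size[merged] = size[a] + size[b]
        for c in clusters - {a, b}:
            # UPGMA mean: average over all leaf-to-leaf distances
            d[frozenset({merged, c})] = (
                size[a] * d[frozenset({a, c})]
                + size[b] * d[frozenset({b, c})]
            ) / (size[a] + size[b])
        d = {p: v for p, v in d.items() if not (p & {a, b})}
        clusters = (clusters - {a, b}) | {merged}
    return clusters.pop()
```

Each iteration of the `while` loop is the step whose repeated execution dominates the runtime and is the natural target for hardware acceleration.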

  18. Executable UML Modeling For Automotive Embedded Systems

    International Nuclear Information System (INIS)

    Gerard, Sebastien

    2000-01-01

    Engineers increasingly face the difficult problem of sophisticated real-time systems while time to market keeps shrinking. Object-oriented modeling supported by the UML standard brings effective solutions to such problems. However, the means of specifying the real-time aspects of an application are not yet fully satisfactory. Existing industrial proposals provide good answers to the problem of specifying concurrency, but they remain limited with regard to specifying the quantitative real-time properties of an application. This work aims to construct a complete and consistent UML methodology based on a profile dedicated to modeling and prototyping automotive embedded systems. This profile contains all the extensions needed to express the quantitative real-time properties of an application easily. Moreover, thanks to the formalization of UML protocol state machines, real-time concepts have been well integrated into the object-oriented paradigm. The main result of this deep integration is that a user is now able to model real-time systems through the classical object-oriented view, i.e., without needing any specific expertise in the real-time area. In order to answer an industrial requirement, system prototyping (a key point for the car industry), the ACCORD/UML approach also allows executable models of an application to be built. For that purpose, the method supplies a set of rules for removing ambiguous points in the UML semantics and completing its semantic variation points, so as to obtain a complete and coherent global model of an application that is executable. The UML extensions and the formalization of their use carried out throughout this thesis also provide a complete and unambiguous modeling framework for the development of automotive electronic systems. This is also a base particularly well suited to tackling other facets of system development, such as automatic and optimized code generation, validation, simulation, or testing. (author) [fr]

  19. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, multiple-coding and single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  20. Code for plant identification (KKS) key in PC version. KKS-Schluessel-Programm in PC-Version

    Energy Technology Data Exchange (ETDEWEB)

    Pannenbaecker, K. (GABO Gesellschaft fuer Ablauforganisation und Informationsverarbeitung mbH, Erlangen (Germany) GABO Gesellschaft fuer Ablauforganisation und Informationsverarbeitung mbH, Muenchen (Germany))

    1991-11-01

    The plant identification system (KKS), developed jointly by German plant operators, construction firms, and other power-plant-oriented organisations, has decisively influenced the technical and organisational activities of planning and erection as well as the operation and maintenance of power plants of all kinds. Its fundamentals are three key parts: the operation, armature, and function keys. Their management and application are handled by a plant-identification key code in a PC version, which is briefly described in this report. (orig.).

  1. 39 CFR 211.3 - Executive orders and other executive pronouncements; circulars, bulletins, and other issuances of...

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service, revised as of 2010-07-01: Executive orders and other executive pronouncements; circulars, bulletins, and other issuances of the Office of Management and Budget. Section 211.3, Postal Service, UNITED STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION APPLICATION OF...

  2. Installation of aerosol behavior model into multi-dimensional thermal hydraulic analysis code AQUA

    International Nuclear Information System (INIS)

    Kisohara, Naoyuki; Yamaguchi, Akira

    1997-12-01

    The safety analysis of an FBR plant system for sodium leak phenomena needs to evaluate the deposition of aerosol particles on the components in the plant, the chemical reaction of the aerosol with humidity in the air, and the effect of combustion heat transferred through the aerosol to the structural components. For this purpose, the ABC-INTG (Aerosol Behavior in Containment-INTeGrated Version) code has been developed and used until now. This code calculates aerosol behavior in a gas region of uniform temperature and pressure with a one-cell model. More detailed calculation of aerosol behavior, however, requires the installation of the aerosol model into the multi-cell thermal-hydraulic analysis code AQUA. AQUA can calculate the carrier gas flow, temperature, and the spatial distribution of the aerosol concentration. ABC-INTG, on the other hand, can calculate the generation, agglomeration, and deposition of aerosol particles on walls and floors, and can work out the distribution of aerosol particle size. Thus, the combination of these two codes makes it possible to treat an aerosol model coupling the spatial distribution of aerosol concentration with the distribution of aerosol particle size. This report describes the aerosol behavior model, how to install it into AQUA, and the new subroutines added to the code. Furthermore, test calculations with a simple structural model were executed with this code, and appropriate results were obtained. The code thus promises to predict aerosol behavior through coupled analysis with multi-dimensional gas thermodynamics for sodium combustion evaluation. (J.P.N.)

  3. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed conditions of pressure drop or flowrate, variable or not with respect to time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  4. The development, qualification and availability of AECL analytical, scientific and design codes

    International Nuclear Information System (INIS)

    Kupferschmidt, W.C.H.; Fehrenbach, P.J.; Wolgemuth, G.A.; McDonald, B.H.; Snell, V.G.

    2001-01-01

    Over the past several years, AECL has embarked on a comprehensive program to develop, qualify and support its key safety and licensing codes, and to make executable versions of these codes available to the international nuclear community. To this end, we have instituted a company-wide Software Quality Assurance (SQA) Program for Analytical, Scientific and Design Computer Programs to ensure that the design, development, maintenance, modification, procurement and use of computer codes within AECL is consistent with today's quality assurance standards. In addition, we have established a comprehensive Code Validation Project (CVP) with the goal of qualifying AECL's 'front-line' safety and licensing codes by 2001 December. The outcome of this initiative will be qualified codes, which are properly verified and validated for the expected range of applications, with associated statements of accuracy and uncertainty for each application. The code qualification program, based on the CSA N286.7 standard, is intended to ensure (1) that errors are not introduced into safety analyses because of deficiencies in the software, (2) that an auditable documentation base is assembled that demonstrates to the regulator that the codes are of acceptable quality, and (3) that these codes are formally qualified for their intended applications. Because AECL and the Canadian nuclear utilities (i.e., Ontario Power Generation, Bruce Power, Hydro Quebec and New Brunswick Power) generally use the same safety and licensing codes, the nuclear industry in Canada has agreed to work cooperatively together towards the development, qualification and maintenance of a common set of analysis tools, referred to as the Industry Standard Toolset (IST). 
This paper provides an overview of the AECL Software Quality Assurance Program and the Code Validation Project, and their associated linkages to the Canadian nuclear community's Industry Standard Toolset initiative to cooperatively qualify and support commonly

  5. Executive summary

    NARCIS (Netherlands)

    van Nimwegen, N.; van Nimwegen, N.; van der Erf, R.

    2009-01-01

    The Demography Monitor 2008 gives a concise overview of current demographic trends and related developments in education, the labour market and retirement for the European Union and some other countries. This executive summary highlights the major findings of the Demography Monitor 2008 and further

  6. Validation of the ATHLET-code 2.1A by calculation of the ECTHOR experiment; Validierung des ATHLET-Codes 2.1A anhand des Einzeleffekt-Tests ECTHOR

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Andreas; Sarkadi, Peter; Schaffrath, Andreas [TUEV NORD SysTec GmbH und Co. KG, Hamburg (Germany)

    2010-05-15

    Before a numerical code (e.g. ATHLET) is used to simulate physical phenomena that are new or unknown to the code and/or the user, the user must establish the applicability of the code and his own experience in handling it by means of a so-called validation. For this purpose, parametric studies with the code are executed and the results compared with verified experimental data. Corresponding reference values are available in the form of so-called single-effect tests (e.g. ECTHOR). In this work the system code ATHLET Mod. 2.1 Cycle A is validated under the above-named aspects by post-test calculation of the ECTHOR experiment. The ECTHOR tests investigated the clearing of a water-filled model of a loop seal by an air stream, including momentum exchange at the phase interface, under adiabatic and atmospheric conditions. The post-test calculations show that the analytical results meet the experimental data within the reproducibility of the experiments. Further findings of the parametric studies are: - The experimental results obtained with the water-air system (ECTHOR) can be transferred to a water-steam system if the densities of the phases are equal in both cases. - The initial water level in the loop seal has no influence on the results as long as the gas mass flow is increased moderately. - The loop seal is appropriately nodalized if the mean length of the control volumes is about 1.5 times the hydraulic pipe diameter. (orig.)
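
    The last finding gives a simple nodalization rule of thumb. A hypothetical helper, not part of ATHLET itself, could apply it as follows:

```python
def nodalize(pipe_length_m, hydraulic_diameter_m, factor=1.5):
    """Split a pipe into control volumes whose mean length is roughly
    `factor` times the hydraulic diameter, following the nodalization
    finding of the ECTHOR post-test study. Hypothetical helper for
    illustration; ATHLET input decks are prepared differently."""
    target = factor * hydraulic_diameter_m
    n = max(1, round(pipe_length_m / target))  # number of control volumes
    return n, pipe_length_m / n                # count and actual mean length
```

For example, a 1.5 m loop seal of 0.05 m hydraulic diameter would be split into 20 control volumes of 0.075 m each.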

  7. Validation of the ATHLET-code 2.1A by calculation of the ECTHOR experiment; Validierung des ATHLET-Codes 2.1A anhand des Einzeleffekt-Tests ECTHOR

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Andreas; Sarkadi, Peter; Schaffrath, Andreas [TUEV NORD SysTec GmbH und Co. KG, Hamburg (Germany)

    2010-06-15

    Before a numerical code (e.g. ATHLET) is used to simulate physical phenomena that are new or unknown to the code and/or the user, the user must establish the applicability of the code and his own experience in handling it by means of a so-called validation. For this purpose, parametric studies with the code are executed and the results compared with verified experimental data. Corresponding reference values are available in the form of so-called single-effect tests (e.g. ECTHOR). In this work the system code ATHLET Mod. 2.1 Cycle A is validated under the above-named aspects by post-test calculation of the ECTHOR experiment. The ECTHOR tests investigated the clearing of a water-filled model of a loop seal by an air stream, including momentum exchange at the phase interface, under adiabatic and atmospheric conditions. The post-test calculations show that the analytical results meet the experimental data within the reproducibility of the experiments. Further findings of the parametric studies are: - The experimental results obtained with the water-air system (ECTHOR) can be transferred to a water-steam system if the densities of the phases are equal in both cases. - The initial water level in the loop seal has no influence on the results as long as the gas mass flow is increased moderately. - The loop seal is appropriately nodalized if the mean length of the control volumes is about 1.5 times the hydraulic pipe diameter. (orig.)

  8. Fatal injection: a survey of modern code injection attack countermeasures

    Directory of Open Access Journals (Sweden)

    Dimitris Mitropoulos

    2017-11-01

    Full Text Available With a code injection attack (CIA) an attacker can introduce malicious code into a computer program or system that fails to properly encode data that comes from an untrusted source. A CIA can have different forms depending on the execution context of the application and the location of the programming flaw that leads to the attack. Currently, CIAs are considered one of the most damaging classes of application attacks since they can severely affect an organisation’s infrastructure and cause financial and reputational damage to it. In this paper we examine and categorize the countermeasures developed to detect the various attack forms. In particular, we identify two distinct categories. The first incorporates static program analysis tools used to eliminate flaws that can lead to such attacks during the development of the system. The second involves the use of dynamic detection safeguards that prevent code injection attacks while the system is in production mode. Our analysis is based on nonfunctional characteristics that are considered critical when creating security mechanisms. Such characteristics involve usability, overhead, implementation dependencies, false positives and false negatives. Our categorization and analysis can help both researchers and practitioners either to develop novel approaches, or use the appropriate mechanisms according to their needs.
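
    The programming flaw behind one classic CIA form, SQL injection, and its standard countermeasure can be shown in a few lines. The schema and data below are hypothetical; the point is that a parameterized query lets the database driver encode the untrusted value, so the payload is treated as data rather than code:

```python
import sqlite3

# Hypothetical schema and user data, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

payload = "' OR '1'='1"  # classic injection string from an untrusted source

# Vulnerable: untrusted data is concatenated into the SQL text,
# so the payload rewrites the query's logic.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '" + payload + "'").fetchall()

# Safe: the driver encodes the value; the payload is matched as a
# literal (and nonexistent) user name.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (payload,)).fetchall()

print(vulnerable)  # leaks every row: [('s3cret',)]
print(safe)        # [] -- no user is literally named "' OR '1'='1"
```

A static analysis tool from the paper's first category would flag the string concatenation at development time; a dynamic safeguard from the second category would detect the anomalous query in production.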

  9. Towards Synthesizing Executable Models in Biology

    Directory of Open Access Journals (Sweden)

    Jasmin Fisher

    2014-12-01

    Full Text Available Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell’s behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions, even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modelling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
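
    As a toy illustration of synthesis from data, the sketch below enumerates a tiny space of candidate Boolean update rules and keeps those consistent with every observed state transition. Real synthesis tools use SAT/SMT solvers and far richer rule spaces; the function name and rule encoding here are invented for illustration only.

```python
from itertools import product

def synthesize_rules(transitions, n_genes):
    """Toy synthesis by enumeration: for each gene, search a small space of
    candidate Boolean update rules (op applied to genes i and j) and keep
    those consistent with every observed (current, next) state transition."""
    ops = {
        "AND": lambda a, b: a and b,
        "OR":  lambda a, b: a or b,
        "NOT": lambda a, b: not a,  # second input ignored
        "ID":  lambda a, b: a,      # copy of the first input
    }
    consistent = []
    for gene in range(n_genes):
        fits = []
        for name, (i, j) in product(ops, product(range(n_genes), repeat=2)):
            f = ops[name]
            # keep the rule only if it reproduces every observed transition
            if all(nxt[gene] == f(cur[i], cur[j]) for cur, nxt in transitions):
                fits.append((name, i, j))
        consistent.append(fits)
    return consistent
```

When several rules survive for a gene, the model is under-determined, which is exactly the situation where the disambiguating experiments mentioned above become useful.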

  10. Grid workflow job execution service 'Pilot'

    Science.gov (United States)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
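
    The core scheduling idea, executing a directed acyclic graph of tasks in dependency order, can be sketched with Python's standard graphlib. This is an illustrative stand-in: the real Pilot service submits each task to a WS-GRAM computing element over its authenticated REST API rather than calling a local function.

```python
from graphlib import TopologicalSorter

def run_workflow(tasks, deps):
    """Execute a workflow job given as a DAG: `deps` maps each task name to
    the set of tasks it depends on, `tasks` maps names to callables.
    Illustrative sketch of the scheduling idea only."""
    log = []
    for name in TopologicalSorter(deps).static_order():
        log.append(name)
        tasks[name]()  # here: run locally; in Pilot: submit to a grid CE
    return log

# Toy job: fetch -> (transform_a, transform_b) -> merge
results = {}
tasks = {
    "fetch":       lambda: results.setdefault("data", [3, 1, 2]),
    "transform_a": lambda: results.setdefault("a", sorted(results["data"])),
    "transform_b": lambda: results.setdefault("b", sum(results["data"])),
    "merge":       lambda: results.setdefault("out", (results["a"], results["b"])),
}
deps = {"transform_a": {"fetch"}, "transform_b": {"fetch"},
        "merge": {"transform_a", "transform_b"}}
log = run_workflow(tasks, deps)
```

A production service would additionally match each task against available resources and handle the conditional execution logic mentioned above; the topological ordering is the invariant both share.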

  11. Grid workflow job execution service 'Pilot'

    International Nuclear Information System (INIS)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-01-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.

  12. Helping Behavior in Executives' Global Networks

    DEFF Research Database (Denmark)

    Miller, Stewart; Mors, Marie Louise; McDonald, Michael

    2014-01-01

    Drawing on research on helping behavior in networks at the upper echelons, we develop and test theory about helping behavior in senior executive networks. We examine the location and relational dependence of the network contact. Our results reveal that executives are more likely to perceive insiders in their network to be helpful, but geographic location has no effect on expectations of receiving help. With regard to relational dependence, executives who are more dependent on their contacts are more likely to perceive them to be helpful. We also examine whether perceived helpfulness affects an executive's willingness to engage in risky new business development, an important performance indicator, and indeed find that executives who perceive their networks to be helpful are more likely to be willing to take risky decisions. We test these arguments using primary data on 1845 relationships

  13. MIV Project: Executive Summary

    DEFF Research Database (Denmark)

    Ravazzotti, Mariolina T.; Jørgensen, John Leif; Neefs, Marc

    1997-01-01

    Under the ESA contract #11453/95/NL/JG(SC), aiming at assessing the feasibility of Rendez-vous and docking of unmanned spacecrafts, a reference mission scenario was defined. This report gives an executive summary of the achievements and results from the project.

  14. The computer code SEURBNUK/EURDYN. Pt. 2

    International Nuclear Information System (INIS)

    Yerkess, A.; Broadhouse, B.J.; Smith, B.L.

    1987-01-01

    SEURBNUK-2 is a two-dimensional, axisymmetric, Eulerian, finite difference containment code. The numerical procedure adopted in SEURBNUK to solve the hydrodynamic equations is based on the semi-implicit ICE method, which itself is an extension of the MAC algorithm. SEURBNUK has a finite difference thin shell treatment for vessels and internal structures of arbitrary shape and includes the effects of the compressibility of the fluid. Fluid flow through porous media and porous structures can also be accommodated. SEURBNUK/EURDYN is an extension of SEURBNUK-2 in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. This has been achieved by coupling the shell elements with axisymmetric triangular elements. Within the code, the equations of motion for the structures are solved quite separately from those for the fluid, and the timestep for the fluid can be an integer multiple of that for the structures. The interaction of the structures with the fluid is then treated as a modification to the coefficients in the pressure equations, the modifications naturally depending on the behaviour of the structures within the fluid cell. The code is limited to dealing with a single fluid, the coolant; the bubble and the cover gas are treated as cavities of uniform pressure calculated via appropriate pressure-volume-energy relationships. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations. After explaining the output facilities, information is included to help users avoid some common pitfalls
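
    The integer timestep ratio between fluid and structures described above amounts to simple subcycling. A hypothetical driver loop (the real coupling additionally feeds the structural response back into the pressure-equation coefficients) might look like this:

```python
def advance_coupled(t_end, dt_fluid, n_sub, fluid_step, structure_step):
    """Subcycled time integration sketch: the structures take n_sub smaller
    steps of dt_fluid/n_sub inside each fluid step, mirroring the integer
    timestep ratio SEURBNUK/EURDYN allows between fluid and structures.
    Hypothetical driver for illustration only."""
    n_steps = round(t_end / dt_fluid)  # assume t_end is a multiple of dt_fluid
    dt_struct = dt_fluid / n_sub
    t = 0.0
    for _ in range(n_steps):
        fluid_step(dt_fluid)           # one fluid step
        for _ in range(n_sub):
            structure_step(dt_struct)  # n_sub structure sub-steps
        t += dt_fluid
    return t
```

Subcycling lets the structural elements keep the small timestep their stiffness demands without forcing the (more expensive) fluid solve to run at the same rate.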

  15. Supramodal Executive Control of Attention

    Directory of Open Access Journals (Sweden)

    ALFREDO eSPAGNA

    2015-02-01

    Full Text Available The human attentional system can be subdivided into three functional networks of alerting, orienting, and executive control. Although these networks have been extensively studied in the visuospatial modality, whether the same mechanisms are deployed across different sensory modalities remains unclear. In this study we used the attention network test for the visuospatial modality, in addition to two auditory variants with spatial and frequency manipulations, to examine cross-modal correlations between network functions. Results showed that among the visual and auditory tasks the effects of executive control, but not the effects of alerting and orienting, were significantly correlated. These findings suggest that while alerting and orienting functions rely more upon modality-specific processes, the executive control of attention coordinates complex behavior via supramodal mechanisms.

  16. Executive functions and self-regulation.

    Science.gov (United States)

    Hofmann, Wilhelm; Schmeichel, Brandon J; Baddeley, Alan D

    2012-03-01

    Self-regulation is a core aspect of adaptive human behavior that has been studied, largely in parallel, through the lenses of social and personality psychology as well as cognitive psychology. Here, we argue for more communication between these disciplines and highlight recent research that speaks to their connection. We outline how basic facets of executive functioning (working memory operations, behavioral inhibition, and task-switching) may subserve successful self-regulation. We also argue that temporary reductions in executive functions underlie many of the situational risk factors identified in the social psychological research on self-regulation and review recent evidence that the training of executive functions holds significant potential for improving poor self-regulation in problem populations. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. ATDM LANL FleCSI: Topology and Execution Framework

    Energy Technology Data Exchange (ETDEWEB)

    Bergen, Benjamin Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-06

    FleCSI is a compile-time configurable C++ framework designed to support multi-physics application development. As such, FleCSI attempts to provide a very general set of infrastructure design patterns that can be specialized and extended to suit the needs of a broad variety of solver and data requirements. This means that FleCSI is potentially useful to many different ECP projects. Current support includes multidimensional mesh topology, mesh geometry, and mesh adjacency information, n-dimensional hashed-tree data structures, graph partitioning interfaces, and dependency closures (to identify data dependencies between distributed-memory address spaces). FleCSI introduces a functional programming model with control, execution, and data abstractions that are consistent with state-of-the-art task-based runtimes such as Legion and Charm++. The model also provides support for fine-grained, data-parallel execution with backend support for runtimes such as OpenMP and C++17. The FleCSI abstraction layer provides the developer with insulation from the underlying runtimes, while allowing support for multiple runtime systems, including conventional models like asynchronous MPI. The intent is to give developers a concrete set of user-friendly programming tools that can be used now, while allowing flexibility in choosing runtime implementations and optimizations that can be applied to architectures and runtimes that arise in the future. This project is essential to the ECP Ristra Next-Generation Code project, part of ASC ATDM, because it provides a hierarchically parallel programming model that is consistent with the design of modern system architectures, but which allows for the straightforward expression of algorithmic parallelism in a portably performant manner.
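
    The control, execution, and data abstractions described above can be illustrated, very loosely, with a toy task-based runtime in Python: tasks declare the fields they read and write, and the runtime orders them by those dependencies before executing. FleCSI itself is a C++ framework; the task names and data fields below are invented for illustration.

    ```python
    # Toy analogue (not FleCSI's C++ API) of a task-based execution model:
    # tasks declare reads/writes, and the runtime derives an execution order.
    from graphlib import TopologicalSorter

    tasks = {
        # name: (reads, writes, function) -- all hypothetical
        "init":   (set(),      {"u"},  lambda d: d.update(u=[1.0, 2.0, 3.0])),
        "halo":   ({"u"},      {"g"},  lambda d: d.update(g=d["u"][:1])),
        "update": ({"u", "g"}, {"u2"}, lambda d: d.update(u2=[x + d["g"][0] for x in d["u"]])),
    }

    def build_order(tasks):
        # a task depends on every task that writes a field it reads
        deps = {name: {other
                       for other, (_, writes, _) in tasks.items()
                       if writes & reads}
                for name, (reads, _, _) in tasks.items()}
        return list(TopologicalSorter(deps).static_order())

    data = {}
    for name in build_order(tasks):
        tasks[name][2](data)   # execute tasks in dependency order
    ```

    In a real task-based runtime the same dependency information also drives distributed-memory data movement, which is what the dependency closures mentioned above compute.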

  18. 8 CFR 1003.0 - Executive Office for Immigration Review.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Executive Office for Immigration Review. 1003.0 Section 1003.0 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE GENERAL PROVISIONS EXECUTIVE OFFICE FOR IMMIGRATION REVIEW § 1003.0 Executive Office for...

  19. Parallelization of a numerical simulation code for isotropic turbulence

    International Nuclear Information System (INIS)

    Sato, Shigeru; Yokokawa, Mitsuo; Watanabe, Tadashi; Kaburaki, Hideo.

    1996-03-01

    A parallel pseudospectral code which solves the three-dimensional Navier-Stokes equation by direct numerical simulation is developed, and its execution time, parallelization efficiency, load balance and scalability are evaluated. A vector parallel supercomputer, the Fujitsu VPP500 with up to 16 processors, is used for calculations with Fourier modes up to 256x256x256. Good scalability with the number of processors is achieved when the number of Fourier modes is fixed. For small numbers of Fourier modes, the calculation time of the program is proportional to NlogN, the ideal complexity of the 3D-FFT on vector parallel processors. It is found that the calculation performance decreases as the number of Fourier modes increases. (author)
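
    The NlogN cost quoted above comes from the FFTs at the heart of any pseudospectral method. A minimal single-processor illustration of the spectral derivative, using numpy rather than the original vector-parallel code:

    ```python
    import numpy as np

    # Pseudospectral differentiation: transform, multiply by i*k, transform
    # back. Each FFT costs O(N log N), which dominates the solver's runtime.
    N = 64
    x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
    u = np.sin(3.0 * x)

    k = np.fft.fftfreq(N, d=1.0 / N)          # integer wavenumbers
    dudx = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

    # spectral differentiation is exact for resolved modes, up to round-off
    err = np.max(np.abs(dudx - 3.0 * np.cos(3.0 * x)))
    ```

    A 3-D solver applies the same transform along each axis, which is why distributing the 3D-FFT across processors dictates the code's load balance.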

  20. Development of an object-oriented simulation code for repository performance assessment

    International Nuclear Information System (INIS)

    Tsujimoto, Keiichi; Ahn, J.

    1999-01-01

    As understanding for mechanisms of radioactivity confinement by a deep geologic repository improves at the individual process level, it has become imperative to evaluate consequences of individual processes to the performance of the whole repository system. For this goal, the authors have developed a model for radionuclide transport in, and release from, the repository region by incorporating multiple-member decay chains and multiple waste canisters. A computer code has been developed with C++, an object-oriented language. By utilizing the feature that a geologic repository consists of thousands of objects of the same kind, such as the waste canister, the repository region is divided into multiple compartments and objects for simulation of radionuclide transport. Massive computational tasks are distributed over, and executed by, multiple networked workstations, with the help of parallel virtual machine (PVM) technology. Temporal change of the mass distribution of 28 radionuclides in the repository region for the time period of 100 million yr has been successfully obtained by the code
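
    The multiple-member decay chains mentioned above follow the Bateman equations. A minimal sketch of the per-compartment bookkeeping, using a hypothetical two-member chain with made-up decay constants rather than the authors' C++ classes:

    ```python
    import math

    def decay_step(n, lams, dt):
        """Advance chain inventories n by dt (explicit Euler, illustration only)."""
        out = n[:]
        for i, lam in enumerate(lams):
            out[i] -= lam * n[i] * dt            # decay of member i
            if i + 1 < len(n):
                out[i + 1] += lam * n[i] * dt    # feeds its daughter
        return out

    # hypothetical parent (half-life 10) -> daughter (half-life 100) chain
    lams = [math.log(2) / 10.0, math.log(2) / 100.0]
    n = [1.0, 0.0]
    for _ in range(1000):                        # integrate to t = 10
        n = decay_step(n, lams, dt=0.01)
    ```

    In an object-oriented design of the kind described, each canister object would carry such an inventory vector, and the compartments exchange mass through transport terms added to the same update.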

  1. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods for estimating the minimum distances of evaluation codes into the setting of affine variety codes. Finally, we describe the connection to the theory of one-point geometric Goppa codes.
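
    For readers unfamiliar with the construction: the simplest evaluation code arises by evaluating all polynomials of degree below k at the points of a finite field. The following small worked example builds a [7,3] Reed-Solomon code over GF(7) this way and checks its minimum distance n-k+1 = 5 by brute force:

    ```python
    from itertools import product

    p, k = 7, 3
    points = range(p)

    def codeword(coeffs):
        # evaluate c0 + c1*x + c2*x^2 at every point of GF(p)
        return tuple(sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p
                     for x in points)

    code = {codeword(c) for c in product(range(p), repeat=k)}
    # for a linear code, minimum distance = minimum nonzero weight
    min_dist = min(sum(v != 0 for v in w) for w in code if any(w))
    ```

    Affine variety codes generalize this picture by evaluating polynomials on the solution set of a system of equations rather than on the whole affine space.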

  2. [Ecological executive function characteristics and effects of executive function on social adaptive function in school-aged children with epilepsy].

    Science.gov (United States)

    Xu, X J; Wang, L L; Zhou, N

    2016-02-23

    To explore the characteristics of ecological executive function in school-aged children with idiopathic or probably symptomatic epilepsy and to examine the effects of executive function on social adaptive function. A total of 51 school-aged children with idiopathic or probably symptomatic epilepsy aged 5-12 years at our hospital and 37 normal children of the same gender, age and educational level were included. The differences in ecological executive function and social adaptive function were compared between the two groups with the Behavior Rating Inventory of Executive Function (BRIEF) and the Child Adaptive Behavior Scale; Pearson's correlation test and multiple stepwise linear regression were used to explore the impact of executive function on social adaptive function. The scores of school-aged children with idiopathic or probably symptomatic epilepsy in global executive composite (GEC), behavioral regulation index (BRI) and metacognition index (MI) of BRIEF ((62±12), (58±13) and (63±12), respectively) were significantly higher than those of the control group ((47±7), (44±6) and (48±8), respectively). The scores of children with idiopathic or probably symptomatic epilepsy in adaptive behavior quotient (ADQ), independence, cognition and self-control ((86±22), (32±17), (49±14) and (41±16), respectively) were significantly lower than those of the control group ((120±12), (59±14), (59±7) and (68±10), respectively). School-aged children with idiopathic or probably symptomatic epilepsy may have significant ecological executive function impairment and social adaptive function reduction. The aspects of BRI, inhibition and working memory in ecological executive function are significantly related to social adaptive function in school-aged children with epilepsy.

  3. Executive Headteachers: What's in a Name? Executive Summary

    Science.gov (United States)

    Theobald, Katy; Lord, Pippa

    2016-01-01

    Executive headteachers (EHTs) are becoming increasingly prevalent as the self-improving school system matures; there are over 620 EHTs in the school workforce today; and the number recorded in the School Workforce Census (SWC) has increased by 240 per cent between 2010 and 2014. The role is still evolving locally and nationally and, as EHTs take…

  4. Executive functions in anorexia nervosa

    OpenAIRE

    Jauregui-Lobera, Ignacio

    2014-01-01

    Introduction: The pathophysiologic mechanisms that account for the development and persistence of anorexia nervosa (AN) remain unclear. With respect to the neuropsychological functioning, the executive functions have been reported to be altered, especially cognitive flexibility and decision-making processes. Objectives: The aim of this study was to review the current state of the neuropsychological studies focused on anorexia nervosa, especially those highlighting the executive functions. Met...

  5. 8 CFR 3.0 - Executive Office for Immigration Review

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Executive Office for Immigration Review 3.0 Section 3.0 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY GENERAL PROVISIONS EXECUTIVE OFFICE FOR IMMIGRATION REVIEW § 3.0 Executive Office for Immigration Review Regulations of the Executive Office for...

  6. Streamlining of the RELAP5-3D Code

    International Nuclear Information System (INIS)

    Mesina, George L; Hykes, Joshua; Guillen, Donna Post

    2007-01-01

    The methodology employed follows Dijkstra's structured programming paradigm, which is based on splitting programs into sub-sections, each with single points of entry and exit, and in which control is passed downward through the structure with no unconditional branches to higher levels. GO TO commands are typically avoided, since they alter the flow and control of a program's execution by allowing a jump from one place in the routine to another. The restructuring of RELAP5-3D subroutines is complicated by several issues. The first is the use of code other than standard FORTRAN77. The second is the restructuring limitations of FOR( ) STRUCT. The third is the existence of pre-compiler directives and the complication of nested directives. Techniques were developed to overcome all of these difficulties, and these are reported. By implementing these developments, all subroutines of RELAP were restructured. Measures of code improvement relative to maintenance and development are presented.

  7. Executive Functions in Premanifest Huntington’s Disease

    Science.gov (United States)

    You, S. Christine; Geschwind, Michael D.; Sha, Sharon J.; Apple, Alexandra; Satris, Gabriella; Wood, Kristie A.; Johnson, Erica T.; Gooblar, Jonathan; Feuerstein, Jeanne S.; Finkbeiner, Steven; Kang, Gail A.; Miller, Bruce L.; Hess, Christopher P.; Kramer, Joel H.; Possin, Katherine L.

    2014-01-01

    Objective We investigated the viability of psychometrically robust executive function measures as markers for premanifest Huntington’s disease (HD). Methods Fifteen premanifest HD subjects and 42 controls were compared on the NIH EXAMINER executive function battery. This battery yields an overall Executive Composite score, plus Working Memory, Cognitive Control, and Fluency Scores that are measured on psychometrically matched scales. The scores were correlated with two disease markers, disease burden and striatal volumes, in the premanifest HD subjects. Results The premanifest HD subjects scored significantly lower on the Working Memory Score. The Executive Composite positively correlated with striatal volumes, and Working Memory Score negatively correlated with disease burden. The Cognitive Control and Fluency Scores did not differ between the groups or correlate significantly with the disease markers. Conclusions The NIH EXAMINER Executive Composite and Working Memory Score are sensitive markers of cognitive dysfunction, striatal volume, and disease burden in premanifest HD. PMID:24375511

  8. Challenging executive dominance in European democracy

    NARCIS (Netherlands)

    Curtin, D.

    2014-01-01

    Executive dominance in the contemporary EU is part of a wider migration of executive power towards types of decision making that eschew electoral accountability and popular democratic control. This democratic gap is fed by far-going secrecy arrangements and practices exercised in a concerted fashion

  9. Challenging Executive Dominance in European Democracy

    NARCIS (Netherlands)

    Curtin, D.

    2013-01-01

    Executive dominance in the contemporary EU is part of a wider migration of executive power towards types of decision making that eschew electoral accountability and popular democratic control. This democratic gap is fed by far‐going secrecy arrangements and practices exercised in a concerted fashion

  10. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  11. File Managing and Program Execution in Web Operating Systems

    OpenAIRE

    Bravetti, Mario

    2010-01-01

    Web Operating Systems can be seen as an extension of traditional Operating Systems where the addresses used to manage files and execute programs (via the basic load/execution mechanism) are extended from local filesystem path-names to URLs. A first consequence is that, similarly as in traditional web technologies, executing a program at a given URL, can be done in two modalities: either the execution is performed client-side at the invoking machine (and relative URL addressing in the executed...

  12. Modern multicore and manycore architectures: Modelling, optimisation and benchmarking a multiblock CFD code

    Science.gov (United States)

    Hadade, Ioan; di Mare, Luca

    2016-08-01

    Modern multicore and manycore processors exhibit multiple levels of parallelism through a wide range of architectural features such as SIMD for data parallel execution or threads for core parallelism. The exploitation of multi-level parallelism is therefore crucial for achieving superior performance on current and future processors. This paper presents the performance tuning of a multiblock CFD solver on Intel SandyBridge and Haswell multicore CPUs and the Intel Xeon Phi Knights Corner coprocessor. Code optimisations have been applied on two computational kernels exhibiting different computational patterns: the update of flow variables and the evaluation of the Roe numerical fluxes. We discuss at great length the code transformations required for achieving efficient SIMD computations for both kernels across the selected devices including SIMD shuffles and transpositions for flux stencil computations and global memory transformations. Core parallelism is expressed through threading based on a number of domain decomposition techniques together with optimisations pertaining to alleviating NUMA effects found in multi-socket compute nodes. Results are correlated with the Roofline performance model in order to assert their efficiency for each distinct architecture. We report significant speedups for single thread execution across both kernels: 2-5X on the multicore CPUs and 14-23X on the Xeon Phi coprocessor. Computations at full node and chip concurrency deliver a factor of three speedup on the multicore processors and up to 24X on the Xeon Phi manycore coprocessor.
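
    As a language-level analogue of the SIMD transformations discussed above, the same stencil-style update can be written as a scalar loop or as one whole-array expression that maps naturally onto data-parallel lanes. This is an illustrative numpy sketch, not the paper's C++ kernels:

    ```python
    import numpy as np

    u = np.linspace(0.0, 1.0, 10_000)   # hypothetical 1-D flow variable

    def update_scalar(u):
        # one element per loop iteration, the pattern SIMD tuning removes
        out = np.empty(len(u) - 2)
        for i in range(1, len(u) - 1):
            out[i - 1] = 0.5 * (u[i - 1] + u[i + 1]) - u[i]
        return out

    def update_vectorized(u):
        # whole-array expression: each operation applies across all lanes
        return 0.5 * (u[:-2] + u[2:]) - u[1:-1]

    a = update_scalar(u)
    b = update_vectorized(u)
    ```

    In the compiled kernels of the paper the same transformation is expressed with intrinsics, shuffles and transpositions so that the flux stencil feeds full SIMD registers rather than single scalars.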

  13. User's manual of a computer code for seismic hazard evaluation for assessing the threat to a facility by fault model. SHEAT-FM

    International Nuclear Information System (INIS)

    Sugino, Hideharu; Onizawa, Kunio; Suzuki, Masahide

    2005-09-01

    To establish the reliability evaluation method for aged structural component, we developed a probabilistic seismic hazard evaluation code SHEAT-FM (Seismic Hazard Evaluation for Assessing the Threat to a facility site - Fault Model) using a seismic motion prediction method based on fault model. In order to improve the seismic hazard evaluation, this code takes the latest knowledge in the field of earthquake engineering into account. For example, the code involves a group delay time of observed records and an update process model of active fault. This report describes the user's guide of SHEAT-FM, including the outline of the seismic hazard evaluation, specification of input data, sample problem for a model site, system information and execution method. (author)

  14. Executable Behaviour and the π-Calculus (extended abstract)

    Directory of Open Access Journals (Sweden)

    Bas Luttik

    2015-08-01

    Full Text Available Reactive Turing machines extend classical Turing machines with a facility to model observable interactive behaviour. We call a behaviour executable if, and only if, it is behaviourally equivalent to the behaviour of a reactive Turing machine. In this paper, we study the relationship between executable behaviour and behaviour that can be specified in the pi-calculus. We establish that all executable behaviour can be specified in the pi-calculus up to divergence-preserving branching bisimilarity. The converse, however, is not true due to (intended) limitations of the model of reactive Turing machines. That is, the pi-calculus allows the specification of behaviour that is not executable up to divergence-preserving branching bisimilarity. Motivated by an intuitive understanding of executability, we then consider a restriction on the operational semantics of the pi-calculus that does associate with every pi-term executable behaviour, at least up to the version of branching bisimilarity that does not require the preservation of divergence.

  15. An upgraded version of the nucleon meson transport code: NMTC/JAERI97

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Yoshizawa, Nobuaki; Kosako, Kazuaki; Ishibashi, Kenji

    1998-02-01

    The nucleon-meson transport code NMTC/JAERI is upgraded to NMTC/JAERI97, which has new features not only in its physics models and nuclear data but also in its computational procedures. NMTC/JAERI97 implements the following two new physics models: an intranuclear cascade model taking account of in-medium nuclear effects, and a preequilibrium calculation model based on the exciton model. For treating the nucleon transport process more accurately, the nucleon-nucleus cross sections are revised to those derived from the systematics of Pearlstein. Moreover, the level density parameter derived by Ignatyuk is included as a new option for the particle evaporation calculation. Beyond those physical aspects, a new geometry package based on Combinatorial Geometry with a multi-array system and the importance sampling technique are implemented in the code. A tally function is also employed for obtaining such physical quantities as neutron energy spectra, heat deposition and nuclide yield without editing a history file. The resultant NMTC/JAERI97 is tuned to be executed on the UNIX system. This paper describes the functions, physics models and geometry model adopted in NMTC/JAERI97 and explains how to use the code. (author)

  16. Mother-Child Communication: The Influence of ADHD Symptomatology and Executive Functioning on Paralinguistic Style

    Directory of Open Access Journals (Sweden)

    Elizabeth Nilsen

    2016-08-01

    Full Text Available Paralinguistic style, involving features of speech such as pitch and volume, is an important aspect of one’s communicative competence. However, little is known about the behavioral traits and cognitive skills that relate to these aspects of speech. This study examined the extent to which ADHD traits and executive functioning related to the paralinguistic styles of 8- to 12-year-old children and their mothers. Data were collected via parent report (ADHD traits), independent laboratory tasks of executive functioning (working memory, inhibitory control, cognitive flexibility), and an interactive problem-solving task (completed by mothers and children jointly) which was coded for paralinguistic speech elements (i.e., pitch level/variability; volume level/variability). Dyadic data analyses revealed that elevated ADHD traits in children were associated with a more exaggerated paralinguistic style (i.e., elevated and more variable pitch/volume) for both mothers and children. Mothers’ paralinguistic style was additionally predicted by an interaction of mothers’ and children’s ADHD traits, such that mothers with elevated ADHD traits showed exaggerated paralinguistic styles particularly when their children also had elevated ADHD traits. Highlighting a cognitive mechanism, children with weaker inhibitory control showed more exaggerated paralinguistic styles.

  17. Conceptualization and Operationalization of Executive Function

    Science.gov (United States)

    Baggetta, Peter; Alexander, Patricia A.

    2016-01-01

    Executive function is comprised of different behavioral and cognitive elements and is considered to play a significant role in learning and academic achievement. Educational researchers frequently study the construct. However, because of its complexity functionally, the research on executive function can at times be both confusing and…

  18. 32 CFR 724.702 - Executive management.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Executive management. 724.702 Section 724.702 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY PERSONNEL NAVAL DISCHARGE REVIEW BOARD Organization of the Naval Discharge Review Board § 724.702 Executive management. The...

  19. A science-based executive for autonomous planetary vehicles

    Science.gov (United States)

    Peters, S.

    2001-01-01

    If requests for scientific observations, rather than specific plans, are uplinked to an autonomous execution system on the vehicle, it can adjust its execution based upon actual performance. Such a science-based executive control system has been developed and demonstrated for the Rocky7 research rover.

  20. Executive control of attention in narcolepsy.

    Directory of Open Access Journals (Sweden)

    Sophie Bayard

    Full Text Available BACKGROUND: Narcolepsy with cataplexy (NC) is a disabling sleep disorder characterized by early loss of hypocretin neurons that project to areas involved in the attention network. We characterized the executive control of attention in drug-free patients with NC to determine whether the executive deficits observed in patients with NC are specific to the disease itself or whether they reflect performance changes due to the severity of excessive daytime sleepiness. METHODOLOGY: Twenty-two patients with NC compared to 22 patients with narcolepsy without cataplexy (NwC) matched for age, gender, intellectual level, objective daytime sleepiness and number of sleep onset REM periods (SOREMPs) were studied. Thirty-two matched healthy controls were included. All participants underwent a standardized interview, completed questionnaires, and neuropsychological tests. All patients underwent a polysomnography followed by multiple sleep latency tests (MSLT), with neuropsychological evaluation performed the same day between MSLT sessions. PRINCIPAL FINDINGS: Irrespective of diagnosis, patients reported higher self-reported attentional complaints associated with the intensity of depressive symptoms. Patients with NC performed slower and more variably on simple reaction time tasks than patients with NwC, who did not differ from controls. Patients with NC and NwC generally performed slower, reacted more variably, and made more errors than controls on executive functioning tests. Individual profile analyses showed a clear heterogeneity of the severity of executive deficit. This severity was related to objective sleepiness, higher number of SOREMPs on the MSLT, and lower intelligence quotient. The nature and severity of the executive deficits were unrelated to NC and NwC diagnosis. CONCLUSIONS: We demonstrated that drug-free patients with NC and NwC complained of attention deficit, with altered executive control of attention being explained by the severity of objective

  1. 29 CFR 541.100 - General rule for executive employees.

    Science.gov (United States)

    2010-07-01

    ... REGULATIONS DEFINING AND DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Executive Employees § 541.100 General rule for executive employees. (a) The term...

  2. Validation of the DRAGON/DONJON code package for MNR using the IAEA 10 MW benchmark problem

    International Nuclear Information System (INIS)

    Day, S.E.; Garland, W.J.

    2000-01-01

    The first step in developing a framework for reactor physics analysis is to establish the appropriate and proven reactor physics codes. The chosen code package is tested by executing a benchmark problem and comparing the results to the accepted standards. The IAEA 10 MW Benchmark problem is suitable for static reactor physics calculations on plate-fueled research reactor systems and has been used previously to validate codes for the McMaster Nuclear Reactor (MNR). The flexible and advanced geometry capabilities of the DRAGON transport theory code make it a desirable tool, and the accompanying DONJON diffusion theory code also has useful features applicable to safety analysis work at MNR. This paper describes the methodology used to benchmark the DRAGON/DONJON code package against this problem, and the results herein extend the domain of validation of this code package. The results are directly applicable to MNR and are relevant to a reduced-enrichment fuel program. The DRAGON transport code models used in this study are based on the 1-D infinite slab approximation, whereas the DONJON diffusion code models are defined in 3-D Cartesian geometry. The cores under consideration are composed of HEU (93% enrichment), MEU (45% enrichment) and LEU (20% enrichment) fuel and are examined in a fresh state, as well as at beginning-of-life (BOL) and end-of-life (EOL) exposures. The required flux plots and flux-ratio plots are included, as are transport theory code k∞ and diffusion theory code keff results. In addition to this, selected isotope atom densities are charted as a function of fuel burnup. Results from this analysis are compared to and are in good agreement with previously published results. (author)
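
    The k∞ and keff figures compared in such benchmarks are dominant eigenvalues of the lattice and core operators. As a toy illustration of how a diffusion code obtains keff, the following runs power iteration on a 1-D slab with invented one-group constants; this is not a DRAGON/DONJON model and the numbers bear no relation to MNR:

    ```python
    import numpy as np

    n, h = 50, 1.0
    D, sig_a, nu_sig_f = 1.0, 0.10, 0.12   # hypothetical one-group constants

    # one-group diffusion operator M = -D d2/dx2 + sig_a (zero-flux boundaries)
    M = np.diag(np.full(n, 2.0 * D / h**2 + sig_a))
    M += np.diag(np.full(n - 1, -D / h**2), 1)
    M += np.diag(np.full(n - 1, -D / h**2), -1)

    # power iteration on M phi = (1/k) F phi, with fission source F = nu_sig_f*I
    phi, k = np.ones(n), 1.0
    for _ in range(200):
        phi_new = np.linalg.solve(M, nu_sig_f * phi / k)
        k *= (nu_sig_f * phi_new).sum() / (nu_sig_f * phi).sum()
        phi = phi_new
    ```

    Production codes solve the same fixed-source-per-outer-iteration problem, but with multigroup cross sections, 3-D meshes and acceleration of the outer iterations.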

  3. DCHAIN-SP 2001: High energy particle induced radioactivity calculation code

    Energy Technology Data Exchange (ETDEWEB)

    Kai, Tetsuya; Maekawa, Fujio; Kasugai, Yoshimi; Takada, Hiroshi; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kosako, Kazuaki [Sumitomo Atomic Energy Industries, Ltd., Tokyo (Japan)

    2001-03-01

    For the purpose of contributing to safety design calculations for induced radioactivities in the JAERI/KEK high-intensity proton accelerator project facilities, DCHAIN-SP, which calculates high energy particle induced radioactivity, has been updated to DCHAIN-SP 2001. The following three items were improved: (1) Fission yield data are included to apply the code to experimental facility design for nuclear transmutation of long-lived radioactive waste where fissionable materials are treated. (2) Activation cross section data below 20 MeV are revised. In particular, attention is paid to cross section data of materials which have a close relation to the facilities, i.e., mercury, lead and bismuth, and to tritium production cross sections which are important in terms of safety of the facilities. (3) The user interface for input/output data is refined so that calculations can be performed more efficiently than in the previous version. Information needed for use of the code is attached in the Appendices: the DCHAIN-SP 2001 manual, the procedures of installation and execution of DCHAIN-SP, and sample problems. (author)

  4. The discrete-dipole-approximation code ADDA: Capabilities and known limitations

    International Nuclear Information System (INIS)

    Yurkin, Maxim A.; Hoekstra, Alfons G.

    2011-01-01

    The open-source code ADDA is described, which implements the discrete dipole approximation (DDA), a method to simulate light scattering by finite 3D objects of arbitrary shape and composition. Besides standard sequential execution, ADDA can run on a multiprocessor distributed-memory system, parallelizing a single DDA calculation. Hence the size parameter of the scatterer is in principle limited only by total available memory and computational speed. ADDA is written in C99 and is highly portable. It provides full control over the scattering geometry (particle morphology and orientation, and incident beam) and allows one to calculate a wide variety of integral and angle-resolved scattering quantities (cross sections, the Mueller matrix, etc.). Moreover, ADDA incorporates a range of state-of-the-art DDA improvements, aimed at increasing the accuracy and computational speed of the method. We discuss both physical and computational aspects of the DDA simulations and provide a practical introduction into performing such simulations with the ADDA code. We also present several simulation results, in particular, for a sphere with size parameter 320 (100-wavelength diameter) and refractive index 1.05.
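
    To give a flavour of the linear system at the core of the method, the following is a drastically simplified scalar caricature: each dipole's polarization responds to the incident field plus the fields of all other dipoles. Real DDA is vectorial, with tensor Green's functions and polarizabilities derived from the refractive index; the geometry, polarizability and scalar Green's function below are all schematic.

    ```python
    import numpy as np

    # Scalar caricature of the DDA system: each dipole polarization P_j
    # satisfies P_j/alpha - sum_{l != j} G(r_j, r_l) P_l = E_inc(r_j).
    kwave = 2.0 * np.pi                 # wavenumber for unit wavelength
    alpha = 0.05 + 0.0j                 # hypothetical scalar polarizability

    r = np.linspace(0.0, 0.5, 8)        # 8 dipoles along a line
    E_inc = np.exp(1j * kwave * r)      # incident plane wave

    n = len(r)
    A = np.empty((n, n), dtype=complex)
    for j in range(n):
        for l in range(n):
            if j == l:
                A[j, l] = 1.0 / alpha
            else:
                d = abs(r[j] - r[l])
                A[j, l] = -np.exp(1j * kwave * d) / d   # scalar Green's fn

    P = np.linalg.solve(A, E_inc)       # dipole polarizations
    ```

    ADDA solves the analogous (much larger) system iteratively with FFT-accelerated matrix-vector products, which is what makes hundred-wavelength scatterers feasible.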

  5. 32 CFR 700.320 - The Civilian Executive Assistants.

    Science.gov (United States)

    2010-07-01

    ... the Navy. (b) Each Civilian Executive Assistants, within his or her assigned area of responsibility... 32 National Defense 5 2010-07-01 2010-07-01 false The Civilian Executive Assistants. 700.320... of the Navy The Office of the Secretary of the Navy/the Civilian Executive Assistants § 700.320 The...

  6. Rumination prospectively predicts executive functioning impairments in adolescents.

    Science.gov (United States)

    Connolly, Samantha L; Wagner, Clara A; Shapero, Benjamin G; Pendergast, Laura L; Abramson, Lyn Y; Alloy, Lauren B

    2014-03-01

    The current study tested the resource allocation hypothesis, examining whether baseline rumination or depressive symptom levels prospectively predicted deficits in executive functioning in an adolescent sample. The alternative to this hypothesis was also evaluated by testing whether lower initial levels of executive functioning predicted increases in rumination or depressive symptoms at follow-up. A community sample of 200 adolescents (ages 12-13) completed measures of depressive symptoms, rumination, and executive functioning at baseline and at a follow-up session approximately 15 months later. Adolescents with higher levels of baseline rumination displayed decreases in selective attention and attentional switching at follow-up. Rumination did not predict changes in working memory or sustained and divided attention. Depressive symptoms were not found to predict significant changes in executive functioning scores at follow-up. Baseline executive functioning was not associated with change in rumination or depression over time. Findings partially support the resource allocation hypothesis that engaging in ruminative thoughts consumes cognitive resources that would otherwise be allocated towards difficult tests of executive functioning. Support was not found for the alternative hypothesis that lower levels of initial executive functioning would predict increased rumination or depressive symptoms at follow-up. Our study is the first to find support for the resource allocation hypothesis using a longitudinal design and an adolescent sample. Findings highlight the potentially detrimental effects of rumination on executive functioning during early adolescence. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Modeling of FREYA fast critical experiments with the Serpent Monte Carlo code

    International Nuclear Information System (INIS)

    Fridman, E.; Kochetkov, A.; Krása, A.

    2017-01-01

    Highlights: • FREYA – the EURATOM project executed to support fast lead-based reactor systems. • Critical experiments in the VENUS-F facility during the FREYA project. • Characterization of the critical VENUS-F cores with Serpent. • Comparison of the numerical Serpent results to the experimental data. - Abstract: The FP7 EURATOM project FREYA has been executed between 2011 and 2016 with the aim of supporting the design of fast lead-cooled reactor systems such as MYRRHA and ALFRED. During the project, a number of critical experiments were conducted in the VENUS-F facility located at SCK·CEN, Mol, Belgium. The Monte Carlo code Serpent was one of the codes applied for the characterization of the critical VENUS-F cores. Four critical configurations were modeled with Serpent, namely the reference critical core, the clean MYRRHA mock-up, the full MYRRHA mock-up, and the critical core with the ALFRED island. This paper briefly presents the VENUS-F facility, provides a detailed description of the aforementioned critical VENUS-F cores, and compares the numerical results calculated by Serpent to the available experimental data. The compared parameters include keff, point kinetics parameters, fission rate ratios of important actinides to that of U235 (spectral indices), axial and radial distribution of fission rates, and lead void reactivity effect. The reported results show generally good agreement between the calculated and experimental values. Nevertheless, the paper also reveals some noteworthy issues requiring further attention. This includes the systematic overprediction of reactivity and systematic underestimation of the U238 to U235 fission rate ratio.

  8. Coupling of RELAP5-3D and GAMMA codes for Nuclear Hydrogen System Analysis

    International Nuclear Information System (INIS)

    Jin, Hyung Gon

    2007-02-01

    RELAP5-3D is one of the most important system analysis codes in the nuclear field; it has been developed for best-estimate transient simulation of light water reactor coolant systems during postulated accidents. The GAMMA code is a multi-dimensional, multi-component mixture analysis code with a complete set of chemical reaction models, developed for the air-ingress safety analysis of HTGRs (High Temperature Gas-cooled Reactors). The two codes, RELAP5-3D and GAMMA, are coupled for nuclear-hydrogen system analysis, which requires the capability to analyze both multi-component gas mixtures and two-phase flow. Coupling the two codes took four steps. First, the GAMMA code was converted from an executable into a DLL (dynamic link library), and RELAP5-3D was recompiled in the Compaq Visual Fortran environment for debugging purposes. Second, the two programs were synchronized in terms of time and time step; based on this time coupling, the coupled code can calculate simultaneously. Time-step coupling was accomplished successfully and tested with a simple test input. Third, source-term coupling was implemented and tested with two different inputs: a simple test condition with no chemical reaction, and a chemical reaction model including four non-condensable gas species (He, O2, CO, and CO2). Finally, to analyze a combined-cycle system, heat-flux coupling was implemented and demonstrated with a simple heat exchanger model.
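
    The time-step synchronization described above can be sketched as a lock-step loop in which both codes take the smaller of their two proposed steps and exchange source terms at each synchronization point (the solver stubs are hypothetical, not the RELAP5-3D or GAMMA interfaces):

```python
class ToySolver:
    """Stand-in for one of the coupled codes (hypothetical; not the
    RELAP5-3D or GAMMA API)."""
    def __init__(self, name, dt_max):
        self.name = name
        self.dt_max = dt_max   # largest stable step this code allows
        self.time = 0.0
        self.source = 0.0      # source term received from the partner code

    def propose_dt(self):
        return self.dt_max

    def advance(self, dt):
        self.time += dt        # a real code would solve its field equations here

    def export_source(self):
        return 1.0             # e.g. a wall heat flux or a species mass flow


def run_coupled(a, b, t_end):
    """Lock-step time coupling: both codes take the same step, chosen as the
    smaller of the two proposed steps, and exchange source terms at every
    synchronization point (source-term coupling)."""
    while a.time < t_end - 1e-12:
        dt = min(a.propose_dt(), b.propose_dt(), t_end - a.time)
        a.source, b.source = b.export_source(), a.export_source()
        a.advance(dt)
        b.advance(dt)
    return a.time, b.time


ta, tb = run_coupled(ToySolver("A", 0.02), ToySolver("B", 0.05), 1.0)
```

    The clamp on the final step keeps both codes landing exactly on the synchronization horizon, which is the essential property of the time coupling described in the abstract.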

  9. Executive functioning: a scoping review of the occupational therapy literature.

    Science.gov (United States)

    Cramm, Heidi A; Krupa, Terry M; Missiuna, Cheryl A; Lysaght, Rosemary M; Parker, Kevin H

    2013-06-01

    Increasingly recognized as an important factor in the performance of complex, goal-directed tasks, executive functioning is understood in different ways across disciplines. The aim was to explore the ways in which executive functioning is conceptualized, discussed, described, and implied in the occupational therapy literature. A scoping review of the occupational therapy literature was conducted following Levac, Colquhoun, and O'Brien's (2010) recommended methodology. Executive functioning is described both as a set of performance component skills or processes and as the executive occupational performance inherent in complex occupations. Executive functioning is implicit in occupational performance and engagement, and some health conditions seem to be commonly associated with impaired executive functioning. Assessing executive functioning requires dynamic occupation- and performance-based assessment. Interventions targeting executive functioning are grounded in metacognitive approaches. Executive functioning is a complex construct that is conceptualized with considerable variance within the occupational therapy literature, creating barriers to effective service delivery.

  10. Hyperactivity in boys with attention-deficit/hyperactivity disorder (ADHD): The role of executive and non-executive functions.

    Science.gov (United States)

    Hudec, Kristen L; Alderson, R Matt; Patros, Connor H G; Lea, Sarah E; Tarle, Stephanie J; Kasper, Lisa J

    2015-01-01

    Motor activity of boys (age 8-12 years) with (n=19) and without (n=18) ADHD was objectively measured with actigraphy across experimental conditions that varied with regard to demands on executive functions. Activity exhibited during two n-back (1-back, 2-back) working memory tasks was compared to activity during a choice-reaction time (CRT) task that placed relatively fewer demands on executive processes and during a simple reaction time (SRT) task that required mostly automatic processing with minimal executive demands. Results indicated that children in the ADHD group exhibited greater activity compared to children in the non-ADHD group. Further, both groups exhibited the greatest activity during conditions with high working memory demands, followed by the reaction time and control task conditions, respectively. The findings indicate that large-magnitude increases in motor activity are predominantly associated with increased demands on working memory, though demands on non-executive processes are sufficient to elicit small to moderate increases in motor activity as well. Published by Elsevier Ltd.

  11. Financial Management for Childcare Executive Officers.

    Science.gov (United States)

    Foster-Jorgensen, Karen; Harrington, Angela

    This handbook is designed to assist childcare executive officers (CEOs) in managing the finances of their programs. The guide is divided into five sections. Section 1, "Financial Entrepreneurship," advocates the adoption of an entrepreneurial spirit in directors and recommends: (1) becoming the Chief Executive Officer of the program; (2) actively…

  12. Executive functioning in low birth weight children entering kindergarten.

    Science.gov (United States)

    Miller, S E; DeBoer, M D; Scharf, R J

    2018-01-01

    Poor executive functioning is associated with life-long difficulty. Identification of children at risk for executive dysfunction is important for early intervention to improve neurodevelopmental outcomes. This study examines the relationship between birthweight and executive functioning in US children during kindergarten. Our hypothesis was that children with higher birthweights would have better executive function scores. We evaluated data from 17,506 US children from the Early Childhood Longitudinal Study-Kindergarten 2011 cohort. Birthweight and gestational age were obtained by parental survey. Executive functions were directly assessed using the number reverse test and the card sort test, measuring working memory and cognitive flexibility, respectively. Teacher evaluations were used for additional executive functions. Data were analyzed using SAS to run all linear and logistic regressions. For every kilogram of birthweight, working memory scores increased by 1.47. As birthweight increases, executive function scores improve, even among infants born at normal weight. Further evaluation of this population, including interventions and progression through school, is needed.

  13. Innovative behavior in nurse executives.

    Science.gov (United States)

    Adams, C E

    1994-05-01

    This study addresses the problem-solving styles of hospital nurse executives and explores the relationship between problem-solving style and leader effectiveness. The Kirton Adaption-Innovation Inventory (KAI) and the Leader Effectiveness and Adaptability Description-Self (LEAD-S) were the instruments used to survey nurse executives from 66 medium-sized urban California hospitals. The majority of respondents used innovative approaches, but no correlation was found between problem-solving style and leader effectiveness.

  14. Supervision of execution of dismantling

    International Nuclear Information System (INIS)

    Canizares, J.

    2015-01-01

    Enresa created an organizational structure that covers the various areas involved in the effective control of the Decommissioning Project. One of these areas is the technical supervision of works in the Decommissioning Project, carried out by the Execution Department, which reports to the Technical Management. Within this structure, the Execution Department acts as liaison between the project, the disciplines involved in its development, and the specialized companies contracted for the work, so that the project achieves its intended target. Equally important is ensuring that these activities are carried out correctly, according to the project documentation. (Author)

  15. Execution and executability

    Science.gov (United States)

    Bradford, Robert W.; Harrison, Denise

    2015-09-01

    "We have a new strategy to grow our organization." Developing the plan is just the start. Implementing it in the organization is the real challenge. Many organizations don't fail due to lack of strategy; they struggle because it isn't effectively implemented. After working with hundreds of companies on strategy development, Denise and Robert have distilled the critical areas where organizations need to focus in order to enhance profitability through superior execution. If these questions are important to your organization, you'll find useful answers in the following articles: Do you find yourself overwhelmed by too many competing priorities? How do you limit how many strategic initiatives/projects your organization is working on at one time? How do you balance your resource requirements (time and money) with the availability of these resources? How do you balance your strategic initiative requirements with the day-to-day requirements of your organization?

  16. A Sales Execution Strategy Guide for Technology Startups

    Directory of Open Access Journals (Sweden)

    Ian Gilbert

    2011-10-01

    The majority of startups fail to consider sales execution as part of their overall strategy. This article demonstrates how a sales execution strategy can help a company take a product or service to market more efficiently and effectively by focusing on the customers that are key to generating revenue. Combined with techniques for recruiting effectively and measuring sales outcomes, a sales execution strategy helps technology startups exceed growth aspirations and potentially reduce or even eliminate the requirement for external investment. In this article, we first describe the focus of assistance currently given to startups and the reasons why sales execution strategies are often overlooked. Next, we outline recommendations for developing, implementing, and supporting a sales execution strategy. Finally, we summarize the key points presented in the article.

  17. Metacognition and executive functioning in Elementary School

    Directory of Open Access Journals (Sweden)

    Trinidad García

    This study analyzes differences in metacognitive skills and executive functioning between two groups of students (10-12 years) with different levels of metacognitive knowledge (high n = 50, low n = 64). Groups were established based on the students' scores on a test of knowledge of strategy use. Metacognitive skills were assessed by means of self-report: students reported the frequency with which they applied these strategies during the planning, execution, and evaluation phases of learning. Information about students' executive functioning was provided by families and teachers, who completed two parallel forms of a behavior rating scale. The results indicated that: (a) the group with high metacognitive knowledge reported using their metacognitive skills more frequently than their peers in the other group, with statistically significant differences in the planning and execution phases; (b) both families and teachers reported better executive functioning in the students with high metacognitive knowledge, with statistically significant differences in planning, functional memory, focus, and sustained attention. These results show an association between different levels of metacognitive knowledge and differences in metacognitive skills and executive functions, and suggest the need to emphasize this set of variables in order to encourage students to acquire increasing levels of control over their learning process.

  18. Quality circles: the nurse executive as mentor.

    Science.gov (United States)

    Flarey, D L

    1991-12-01

    Changes within and around the health care environment are forcing health care executives to reexamine their managerial and leadership styles to confront the resulting turbulence. The nurse executive is charged with the profound responsibility of directing the delivery of nursing care throughout the organization. Care delivered today must be of high quality. Declining financial resources as well as personnel shortages cause the executive to be an effective innovator in meeting the increasing demands. Quality circles offer the nurse executive an avenue of recourse. Circles have been effectively implemented in the health care setting, as has been consistently documented over time. By way of a participative management approach, quality circles may lead to increased employee morale and productivity, cost savings, and decreased employee turnover rates, as well as realization of socialization and self-actualization needs. A most effective approach to their introduction would be implementation at the first-line manager level. This promotes an acceptance of the concept at the management level as well as a training course for managers to implement the process at the unit level. The nurse executive facilitates the process at the first-line manager level. This facilitation will cause a positive outcome to diffuse throughout the entire organization. Quality circles offer the nurse executive the opportunity to challenge the existing environmental turmoil and effect a positive and lasting change.

  19. A User's Manual for MASH V1.5 - A Monte Carlo Adjoint Shielding Code System

    Energy Technology Data Exchange (ETDEWEB)

    C. O. Slater; J. M. Barnes; J. O. Johnson; J.D. Drischler

    1998-10-01

    The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete-ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. The current version, MASH v1.5, is the successor to the original MASH v1.0 code system initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem.
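
    The folding step described in the abstract, combining the coupling-surface fluence with the adjoint dose importance, is numerically a sum of products over surface segments and energy groups. A toy sketch (illustrative numbers, not MASH's coupling code):

```python
def fold_dose(fluence, importance):
    """Fold a forward fluence with an adjoint dose importance.

    fluence[s][g]    : fluence in surface segment s, energy group g
                       (from the forward discrete-ordinates calculation)
    importance[s][g] : dose importance of segment s, group g
                       (from the adjoint Monte Carlo calculation)
    Returns the detector dose response: sum over s and g of the products.
    """
    return sum(f * i
               for f_row, i_row in zip(fluence, importance)
               for f, i in zip(f_row, i_row))

# 2 surface segments x 3 energy groups (illustrative numbers)
phi = [[1.0, 0.5, 0.2],
       [0.8, 0.4, 0.1]]
imp = [[0.1, 0.2, 0.4],
       [0.1, 0.3, 0.5]]
dose = fold_dose(phi, imp)
```

    Because the fold is linear in the fluence, the same adjoint importance can be reused for many source orientations and distances, which is the efficiency argument behind the forward/adjoint coupling.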

  20. Naps Enhance Executive Attention in Preschool-Aged Children.

    Science.gov (United States)

    Cremone, Amanda; McDermott, Jennifer M; Spencer, Rebecca M C

    2017-09-01

    Executive attention is impaired following sleep loss in school-aged children, adolescents, and adults. Whether naps improve attention relative to nap deprivation in preschool-aged children is unknown. The aim of this study was to compare executive attention in preschool children following a nap and an interval of wake. Sixty-nine children, 35-70 months of age, completed a Flanker task to assess executive attention following a nap and an equivalent interval of wake. Overall, accuracy was greater after the nap compared with the wake interval. Reaction time(s) did not differ between the nap and wake intervals. Results did not differ between children who napped consistently and those who napped inconsistently, suggesting that naps benefit executive attention of preschoolers regardless of nap habituality. These results indicate that naps enhance attention in preschool children. As executive attention supports executive functioning and learning, nap promotion may improve early education outcomes. © The Author 2017. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  1. 78 FR 55244 - Senior Executive Service Performance Review Board; Membership

    Science.gov (United States)

    2013-09-10

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD Senior Executive Service Performance Review Board... the membership of the Defense Nuclear Facilities Safety Board (DNFSB) Senior Executive Service (SES... rating of a senior executive's performance, the executive's response, and the higher level official's...

  2. The contribution of executive control to semantic cognition: Convergent evidence from semantic aphasia and executive dysfunction

    OpenAIRE

    Thompson, Hannah E; Almaghyuli, Azizah; Noonan, Krist A.; Barak, Ohr; Lambon Ralph, Matthew; Jefferies, Elizabeth

    2018-01-01

    Semantic cognition, as described by the Controlled Semantic Cognition (CSC) framework (Rogers, Patterson, Jefferies, & Lambon Ralph, 2015), involves two key components: activation of coherent, generalizable concepts within a heteromodal ‘hub’ in combination with modality-specific features (spokes), and a constraining mechanism that manipulates and gates this knowledge to generate time- and task- appropriate behaviour. Executive-semantic goal representations, largely supported by executive...

  3. Parallelization characteristics of the DeCART code

    International Nuclear Information System (INIS)

    Cho, J. Y.; Joo, H. G.; Kim, H. Y.; Lee, C. C.; Chang, M. H.; Zee, S. Q.

    2003-12-01

    domain using MPI. With the memory distribution capability, the memory requirement of about 11 GBytes for a simplified SMART core problem is reduced by a factor of about 11 when using 12 processors. It is therefore concluded that the parallel capability of DeCART, together with its memory distribution, makes it possible not only to solve problems efficiently via parallel computing but also to solve huge problems via memory distribution on affordable Linux clusters; this parallel execution feature significantly increases the practical applicability of the DeCART code.

  4. PL-MOD: a computer code for modular fault tree analysis and evaluation

    International Nuclear Information System (INIS)

    Olmos, J.; Wolf, L.

    1978-01-01

    The computer code PL-MOD has been developed to implement the modular methodology for fault tree analysis. In the modular approach, fault tree structures are characterized by recursively relating the top tree event to all basic event inputs through a set of equations, each defining an independent modular event of the tree. The advantages of tree modularization are that it is a more compact representation than the minimal cut-set description and that its recursive form is well suited to fault tree quantification. In its present version, PL-MOD modularizes fault trees and evaluates top and intermediate event failure probabilities, as well as basic component and modular event importance measures, very efficiently: its execution time for the modularization and quantification of a reduced fault tree of a PWR High Pressure Injection System was 25 times shorter than that needed to generate the equivalent minimal cut-set description with the computer code MOCUS.
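
    The recursive evaluation that makes the modular form attractive can be sketched for simple AND/OR trees: when every subtree is an independent module, the top-event probability follows directly from the gate equations. A minimal sketch (not PL-MOD itself; the tree and probabilities are invented):

```python
def gate_prob(node, basic):
    """Recursively evaluate a fault tree of AND/OR gates.

    node  : ("basic", name) or ("and" | "or", [children])
    basic : dict mapping basic-event names to failure probabilities
    Assumes all basic events are independent and each appears only once,
    i.e., every subtree is an independent module.
    """
    kind = node[0]
    if kind == "basic":
        return basic[node[1]]
    child_probs = [gate_prob(child, basic) for child in node[1]]
    if kind == "and":
        # AND gate: product of child probabilities
        p = 1.0
        for q in child_probs:
            p *= q
        return p
    # OR gate: 1 minus the product of the complements
    p = 1.0
    for q in child_probs:
        p *= (1.0 - q)
    return 1.0 - p

tree = ("or", [("and", [("basic", "pump_a"), ("basic", "pump_b")]),
               ("basic", "valve")])
top = gate_prob(tree, {"pump_a": 0.1, "pump_b": 0.1, "valve": 0.01})
```

    The recursion visits each gate once, which is why a modular evaluation can be much faster than enumerating minimal cut sets for large trees.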

  5. The Risk Preferences of U.S. Executives

    DEFF Research Database (Denmark)

    Brenner, Steffen

    2015-01-01

    In this paper, I elicit risk attitudes of U.S. executives by calibrating a subjective option valuation model for option exercising data (1996 to 2008), yielding approximately 65,000 values of relative risk aversion (RRA) for almost 7,000 executives. The observed behavior is generally consistent with moderate risk aversion and a median (mean) RRA close to one (three). Values are validated for chief executive officers (CEOs) by testing theory-based predictions on the influence of individual characteristics on risk preferences such as gender, marital status, religiosity, and intelligence. Senior managers such as CEOs, presidents, and chairpersons of the boards of directors are significantly less risk averse than non-senior executives. RRA heterogeneity is strongly correlated with sector membership and firm-level variables such as size, performance, and capital structure. Alternative factors influencing option…

  6. Stateless and stateful implementations of faithful execution

    Science.gov (United States)

    Pierson, Lyndon G; Witzke, Edward L; Tarman, Thomas D; Robertson, Perry J; Eldridge, John M; Campbell, Philip L

    2014-12-16

    A faithful execution system includes system memory, a target processor, and protection engine. The system memory stores a ciphertext including value fields and integrity fields. The value fields each include an encrypted executable instruction and the integrity fields each include an encrypted integrity value for determining whether a corresponding one of the value fields has been modified. The target processor executes plaintext instructions decoded from the ciphertext while the protection engine is coupled between the system memory and the target processor. The protection engine includes logic to retrieve the ciphertext from the system memory, decrypt the value fields into the plaintext instructions, perform an integrity check based on the integrity fields to determine whether any of the corresponding value fields have been modified, and provide the plaintext instructions to the target processor for execution.
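
    The pairing of value fields and integrity fields can be illustrated with a toy protection engine: a keystream cipher stands in for the encryption and a per-cell HMAC stands in for the integrity value. All names and the key are hypothetical; this sketch is not the patented design:

```python
import hashlib
import hmac

KEY = b"demo-key"  # illustrative key; a real engine would use protected key storage

def _keystream(index, length):
    # Per-cell keystream derived from the key and the cell index.
    return hashlib.sha256(KEY + index.to_bytes(4, "big")).digest()[:length]

def protect(instructions):
    """Build (value field, integrity field) pairs for plaintext instructions."""
    fields = []
    for i, ins in enumerate(instructions):
        value = bytes(a ^ b for a, b in zip(ins, _keystream(i, len(ins))))
        integrity = hmac.new(KEY, value + i.to_bytes(4, "big"), hashlib.sha256).digest()
        fields.append((value, integrity))
    return fields

def fetch_and_check(fields, i):
    """Protection-engine fetch: verify the integrity field, then decrypt.
    Returns None if the value field was modified."""
    value, integrity = fields[i]
    expected = hmac.new(KEY, value + i.to_bytes(4, "big"), hashlib.sha256).digest()
    if not hmac.compare_digest(integrity, expected):
        return None
    return bytes(a ^ b for a, b in zip(value, _keystream(i, len(value))))

mem = protect([b"ADD r1,r2", b"JMP 0x10"])
ok = fetch_and_check(mem, 0)                        # decodes cleanly
tampered = bytes([mem[1][0][0] ^ 0xFF]) + mem[1][0][1:]
mem[1] = (tampered, mem[1][1])                      # flip bits in a value field
bad = fetch_and_check(mem, 1)                       # integrity check now fails
```

    Binding the cell index into both the keystream and the integrity value also defeats the replay of a valid ciphertext cell at a different address, a property the stateful variants of faithful execution emphasize.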

  7. Renewal and change for health care executives.

    Science.gov (United States)

    Burke, G C; Bice, M O

    1991-01-01

    Health care executives must consider renewal and change within their own lives if they are to breathe life into their own institutions. Yet numerous barriers to executive renewal exist, including time pressures, fatigue, cultural factors, and trustee attitudes. This essay discusses such barriers and suggests approaches that health care executives may consider for programming renewal into their careers. These include self-assessment for professional and personal goals, career or job change, process vs. outcome considerations, solitude, networking, lifelong education, surrounding oneself with change agents, business travel and sabbaticals, reading outside the field, physical exercise, mentoring, learning from failures, a sense of humor, spiritual reflection, and family and friends. Renewal is a continuous, lifelong process requiring constant learning. Individual executives would do well to develop a framework for renewal in their careers and organizations.

  8. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low…
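
    The syndrome mechanism underlying this kind of Slepian-Wolf coding can be illustrated with the [7,4] Hamming code, the simplest binary BCH code. Unlike the paper's rate-adaptive scheme, the rate here is fixed and X and Y are assumed to differ in at most one bit:

```python
# Parity-check matrix of the [7,4] Hamming code (column j is j in binary).
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(bits):
    """3-bit syndrome of a length-7 bit vector."""
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def dsc_decode(synd_x, y):
    """Slepian-Wolf decoding: recover X from its 3-bit syndrome and the side
    information Y, assuming X and Y differ in at most one position."""
    s = [a ^ b for a, b in zip(synd_x, syndrome(y))]  # syndrome of the error pattern
    pos = s[0] * 4 + s[1] * 2 + s[2]                  # 1-based bit position; 0 = no error
    x_hat = list(y)
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]   # side information: one bit flipped (position 4)
x_hat = dsc_decode(syndrome(x), y)
```

    The encoder thus transmits only 3 bits instead of 7; rate adaptation in the paper amounts to requesting additional syndrome bits over the feedback channel when this kind of decoding is not yet reliable.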

  9. Software enhancements and modifications to Program FDTD executable on the Cray X-MP computer

    Energy Technology Data Exchange (ETDEWEB)

    Stringer, J.C.

    1987-09-04

    This report summarizes enhancements and modifications to PROGRAM FDTD executable on the Cray X-MP computer system. Specifically, the tasks defined and performed under this effort are revision of the material encoding/decoding scheme to allow material type specification on an individual cell basis; modification of the I/O buffering scheme to maximize the use of available central memory and minimize the number of physical I/O accesses; user interface enhancements. Provide enhanced input/output features for greater flexibility; increased modularity. Divide the code into additional modules for ease of maintenance and future enhancements; and assist in the conversion and testing of FDTD to Floating Point Systems scientific computers and associated peripheral devices.

  10. What executives should remember.

    Science.gov (United States)

    Drucker, Peter F

    2006-02-01

    In more than 30 essays for Harvard Business Review, Peter Drucker (1909-2005) urged readers to take on the hard work of thinking--always combined, he insisted, with decisive action. He closely analyzed the phenomenon of knowledge work--the growing call for employees who use their minds rather than their hands--and explained how it challenged the conventional wisdom about the way organizations should be run. He was intrigued by employees who knew more about certain subjects than their bosses or colleagues but who still had to cooperate with others in a large organization. As the business world matured in the second half of the twentieth century, executives came to think that they knew how to run companies--and Drucker took it upon himself to poke holes in their assumptions, lest organizations become stale. But he did so sympathetically, operating from the premise that his readers were intelligent, hardworking people of goodwill. Well suited to HBR's format of practical, idea-based essays for executives, his clear-eyed, humanistic writing enriched the magazine time and again. This article is a compilation of the savviest management advice Drucker offered HBR readers over the years--in short, his greatest hits. It revisits the following insightful, influential contributions: "The Theory of the Business" (September-October 1994), "Managing for Business Effectiveness" (May-June 1963), "What Business Can Learn from Nonprofits" (July-August 1989), "The New Society of Organizations" (September-October 1992), "The Information Executives Truly Need" (January-February 1995), "Managing Oneself" (March-April 1999 republished January 2005), "They're Not Employees, They're People" (February 2002), "What Makes an Effective Executive" (June 2004).

  11. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
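
    The graph construction used in this line of work (following Fimmel et al. 2016) is easy to sketch: each trinucleotide b1b2b3 contributes the edges b1 → b2b3 and b1b2 → b3, and a trinucleotide code is circular exactly when the resulting directed graph is acyclic. A minimal sketch, with toy codes chosen purely for illustration:

```python
def code_graph(code):
    """Directed graph G(X) of a trinucleotide code X (Fimmel et al. 2016):
    each trinucleotide b1 b2 b3 adds the edges b1 -> b2b3 and b1b2 -> b3."""
    edges = {}
    for t in code:
        edges.setdefault(t[0], set()).add(t[1:])
        edges.setdefault(t[:2], set()).add(t[2])
    return edges

def is_acyclic(edges):
    """DFS-based cycle detection; a code is circular iff G(X) is acyclic."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}
    def visit(v):
        color[v] = GRAY
        for w in edges.get(v, ()):
            c = color.get(w, WHITE)
            if c == GRAY or (c == WHITE and not visit(w)):
                return False  # back edge found: a cycle exists
        color[v] = BLACK
        return True
    return all(visit(v) for v in list(edges) if color.get(v, WHITE) == WHITE)

circular = is_acyclic(code_graph({"ACG", "TCG"}))   # no cycle: circular code
not_circular = is_acyclic(code_graph({"AAA"}))      # A -> AA -> A is a cycle
```

    The longest-path result cited in the abstract lives in the same graph: for self-complementary circular codes, the longest paths bound how many nucleotides must be read before the reading frame is fixed.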

  12. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length are required to support a high number of users in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed by simple algebraic means. Four sets of DEU code families based on the code weight W and number of users N are constructed for the combinations (even, even), (even, odd), (odd, odd) and (odd, even). This gives the DEU code flexibility in the selection of code weight and number of users. These features make the code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms previously reported codes. In addition, simulation results from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and can support long spans at high data rates.
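
    The in-phase cross correlation property at the heart of such code designs is easy to state in code. A minimal sketch, using a hypothetical weight-2 code matrix rather than an actual DEU construction:

```python
# The in-phase cross correlation between binary code sequences X and Y is
# sum_i x_i * y_i.  The DEU construction itself is not reproduced here; the
# matrix below is an illustrative placeholder with the "ideal" property that
# every pair of distinct rows overlaps in exactly one chip position.
def cross_correlation(x, y):
    return sum(a * b for a, b in zip(x, y))

# hypothetical weight-2 code for 3 users (rows = users, columns = chips)
code = [
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
]

pairwise = [cross_correlation(code[i], code[j])
            for i in range(len(code)) for j in range(i + 1, len(code))]
```

    A fixed (here unit) cross correlation between every pair of users is what allows MAI to be cancelled by balanced detection in SAC-OCDMA receivers.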

  13. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list decoding algorithm to matrix-product codes with polynomial units, which are quasi-cyclic codes. Furthermore, it allows us to consider unique decoding for matrix-product codes with polynomial units.
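
    The underlying matrix-product construction $[C_1 ... C_s] \cdot A$ can be illustrated over GF(2); the toy constituent codes and matrix below are placeholders, not the nested Reed-Solomon setting of the paper:

```python
# A matrix-product code [C1 ... Cs]·A over GF(2): each codeword is the
# concatenation of the l blocks  block_j = sum_i A[i][j] * c_i (mod 2),
# where c_i ranges over the constituent code C_i.
from itertools import product

def matrix_product_code(codes, A):
    s, l = len(A), len(A[0])
    words = set()
    for choice in product(*codes):            # one codeword from each C_i
        blocks = []
        for j in range(l):
            block = [0] * len(choice[0])
            for i in range(s):
                if A[i][j]:
                    block = [(b + c) % 2 for b, c in zip(block, choice[i])]
            blocks.append(block)
        words.add(tuple(b for blk in blocks for b in blk))
    return words

# nested binary codes C2 ⊆ C1 of length 2 and the classic (u, u+v) matrix
C1 = [(0, 0), (1, 1), (1, 0), (0, 1)]          # full space F_2^2
C2 = [(0, 0), (1, 1)]                          # repetition code
A = [[1, 1],
     [0, 1]]
C = matrix_product_code([C1, C2], A)
```

    With this choice of A the construction reduces to the familiar (u, u+v) construction, producing a length-4 binary code with |C1|·|C2| = 8 codewords.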

  14. Design review of the SYVAC Executive

    International Nuclear Information System (INIS)

    Lane, G.D.

    1985-01-01

    The report documents the current SYVAC Executive design. Eight criteria for evaluating aspects of the design as objectively as possible are described as well as the results of applying them. Principal benefits arising from implementation of design recommendations are perceived to be improved maintainability and better delineation of the interface between the Executive and SYVAC submodels. (author)

  15. Executive compensation and firm performance: Evidence from Indian firms

    Directory of Open Access Journals (Sweden)

    Mehul Raithatha

    2016-09-01

    Full Text Available The study examines the relationship between executive compensation and firm performance among Indian firms. The evidence suggests that firm performance measured by accounting, as well as market-based measures, significantly affects executive compensation. We also test for the presence of persistence in executive compensation by employing the system-generalised methods of moments (GMM estimator. We find significant persistence in executive compensation among the sample firms. Further, we report the absence of pay–performance relationship among the smaller sample firms and business group affiliated firms. Thus, our findings cast doubts over the performance-based executive compensation practices of Indian business group affiliated firms.

  16. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
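
    The UD baseline that coding partitions weaken is itself decidable by the classical Sardinas-Patterson procedure, sketched here in Python (a standard algorithm, not code from the paper):

```python
# Sardinas-Patterson test: a code is uniquely decipherable (UD) iff no
# "dangling suffix" generated from it is itself a codeword.
def dangling(a_set, b_set):
    """Suffixes left over when a word of one set is a proper prefix of the other."""
    out = set()
    for u in a_set:
        for v in b_set:
            if u != v:
                if u.startswith(v):
                    out.add(u[len(v):])
                elif v.startswith(u):
                    out.add(v[len(u):])
    return out

def is_uniquely_decipherable(words):
    code = set(words)
    seen = set()
    stack = list(dangling(code, code))      # the initial suffix set S1
    while stack:
        w = stack.pop()
        if w in code:
            return False                    # ambiguity: a suffix is a codeword
        if w not in seen:
            seen.add(w)
            stack.extend(dangling(code, {w}))
    return True
```

    For instance, {0, 01, 10} is not UD (the string 010 parses as 0·10 or 01·0), while the prefix code {0, 10, 11} is.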

  17. 78 FR 41191 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2013-07-09

    ... DEPARTMENT OF TRANSPORTATION Surface Transportation Board Senior Executive Service Performance... Transportation Board (STB) publishes the names of the Persons selected to serve on its Senior Executive Service... performance appraisal system making senior executives accountable for organizational and individual goal...

  18. Review of SKB's Code Documentation and Testing

    International Nuclear Information System (INIS)

    Hicks, T.W.

    2005-01-01

    SKB is in the process of developing the SR-Can safety assessment for a KBS 3 repository. The assessment will be based on quantitative analyses using a range of computational codes aimed at developing an understanding of how the repository system will evolve. Clear and comprehensive code documentation and testing will engender confidence in the results of the safety assessment calculations. This report presents the results of a review undertaken on behalf of SKI aimed at providing an understanding of how codes used in the SR 97 safety assessment and those planned for use in the SR-Can safety assessment have been documented and tested. Having identified the codes used by SKB, several codes were selected for review. Consideration was given to codes used directly in SKB's safety assessment calculations as well as to some of the less visible codes that are important in quantifying the different repository barrier safety functions. SKB's documentation and testing of the following codes were reviewed: COMP23 - a near-field radionuclide transport model developed by SKB for use in safety assessment calculations. FARF31 - a far-field radionuclide transport model developed by SKB for use in safety assessment calculations. PROPER - SKB's harness for executing probabilistic radionuclide transport calculations using COMP23 and FARF31. The integrated analytical radionuclide transport model that SKB has developed to run in parallel with COMP23 and FARF31. CONNECTFLOW - a discrete fracture network model/continuum model developed by Serco Assurance (based on the coupling of NAMMU and NAPSAC), which SKB is using to combine hydrogeological modelling on the site and regional scales in place of the HYDRASTAR code. DarcyTools - a discrete fracture network model coupled to a continuum model, recently developed by SKB for hydrogeological modelling, also in place of HYDRASTAR.
ABAQUS - a finite element material model developed by ABAQUS, Inc, which is used by SKB to model repository buffer

  19. An Efficient Code-Based Threshold Ring Signature Scheme with a Leader-Participant Model

    Directory of Open Access Journals (Sweden)

    Guomin Zhou

    2017-01-01

    Full Text Available Digital signature schemes with additional properties have broad applications, such as protecting the identity of signers by allowing a signer to anonymously sign a message in a group of signers (also known as a ring). While the number-theoretic problems underlying conventional schemes are still secure at the time of this research, the situation could change with advances in quantum computing. There is a pressing need to design PKC schemes that are secure against quantum attacks. In this paper, we propose a novel code-based threshold ring signature scheme with a leader-participant model. A leader is appointed, who chooses some shared parameters for other signers to participate in the signing process. This leader-participant model enhances the performance because every participant, including the leader, can execute the decoding algorithm (as part of the signing process) upon receiving the shared parameters from the leader. The time complexity of our scheme is close to that of Courtois et al.'s (2001) scheme. The latter is often used as a basis to construct other types of code-based signature schemes. Moreover, as a threshold ring signature scheme, our scheme is as efficient as a normal code-based ring signature.

  20. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
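
    The error-correction yardstick used in such comparisons is the minimum Hamming distance of the code, which determines how many errors are correctable. A sketch with a toy binary code (not the RF codes of the paper):

```python
# A binary code with minimum Hamming distance d can correct up to
# floor((d-1)/2) errors.  The code below is the length-3 even-weight code,
# which has d = 2: it detects single errors but corrects none.
from itertools import combinations

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def min_distance(code):
    return min(hamming(u, v) for u, v in combinations(code, 2))

code = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]  # even-weight code
d = min_distance(code)
correctable = (d - 1) // 2
```

    Highly redundant codes whose codewords nevertheless sit close together, as the paper finds for receptive field codes, have small d and hence weak error correction despite their redundancy.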

  1. 76 FR 69770 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2011-11-09

    ... OFFICE OF PERSONNEL MANAGEMENT Senior Executive Service Performance Review Board AGENCY: Office of... of a senior executive's performance by the supervisor, and considers recommendations to the appointing authority regarding the performance of the senior executive. Office of Personnel Management. John...

  2. 76 FR 78257 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2011-12-16

    ... FEDERAL RETIREMENT THRIFT INVESTMENT BOARD Senior Executive Service Performance Review Board... appointment of the members of the Senior Executive Service Performance Review Boards for the Federal... appropriate personnel actions for members of the Senior Executive Service. DATES: This notice is effective...

  3. 78 FR 44577 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2013-07-24

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Senior Executive Service Performance... notice announces the appointment of the members of the Senior Executive Service Performance Review Boards... other appropriate personnel actions for incumbents of Senior Executive Service, Senior Level and Senior...

  4. 77 FR 54570 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2012-09-05

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD Senior Executive Service Performance Review Board AGENCY... the Defense Nuclear Facilities Safety Board (DNFSB) Senior Executive Service (SES) Performance Review.... The PRB shall review and evaluate the initial summary rating of the senior executive's performance...

  5. Executive functioning in older adults with hoarding disorder.

    Science.gov (United States)

    Ayers, Catherine R; Wetherell, Julie Loebach; Schiehser, Dawn; Almklov, Erin; Golshan, Shahrokh; Saxena, Sanjaya

    2013-11-01

    Hoarding disorder (HD) is a chronic and debilitating psychiatric condition. Midlife HD patients have been found to have neurocognitive impairment, particularly in areas of executive functioning, but the extent to which this is due to comorbid psychiatric disorders has not been clear. The purpose of the present investigation was to examine executive functioning in geriatric HD patients without any comorbid Axis I disorders (n = 42) compared with a healthy older adult comparison group (n = 25). We hypothesized that older adults with HD would perform significantly worse on measures of executive functioning (Wisconsin Card Sort Task [Psychological Assessment Resources, Lutz, Florida, USA] (Psychological Assessment Resources, 2003) and the Wechsler Adult Intelligence Scale-IV digit span and letter-number sequencing tests [Pearson, San Antonio, TX, USA]). Older adults with HD showed significant differences from healthy older controls in multiple aspects of executive functioning. Compared with healthy controls, older adults with HD committed significantly more total, non-perseverative errors and conceptual level responses on the Wisconsin Card Sort Task and had significantly worse performance on the Wechsler Adult Intelligence Scale-IV digit span and letter-number sequencing tests. Hoarding symptom severity was strongly correlated with executive dysfunction in the HD group. Compared with demographically-matched controls, older adults with HD have dysfunction in several domains of executive functioning including mental control, working memory, inhibition, and set shifting. Executive dysfunction is strongly correlated with hoarding severity and is not attributable to comorbid psychiatric disorders in HD patients. These results have broad clinical implications suggesting that executive functioning should be assessed and taken into consideration when developing intervention strategies for older adults with HD. Copyright © 2013 John Wiley & Sons, Ltd.

  6. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.

  7. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Full Text Available Abstract Background We present Pegasys – a flexible, modular and customizable software system that facilitates the execution of heterogeneous biological sequence analysis tools and the integration of their data. Results The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation are available for download at http://bioinformatics.ubc.ca/pegasys/.
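
    The scheduling idea, running every analysis whose prerequisites are complete in parallel, can be sketched generically. Task names and the executor below are illustrative, not the Pegasys API, and the sketch assumes the dependency graph is acyclic:

```python
# Execute a DAG of analyses: each pass launches, in parallel, every task
# whose dependencies have all finished, mirroring the "non-serial dependent
# analyses are executed in parallel" behaviour described above.
from concurrent.futures import ThreadPoolExecutor

def run_workflow(tasks, deps):
    """tasks: name -> callable; deps: name -> set of prerequisite names."""
    done, results = set(), {}
    with ThreadPoolExecutor() as pool:
        while len(done) < len(tasks):
            ready = [t for t in tasks
                     if t not in done and deps.get(t, set()) <= done]
            futures = {t: pool.submit(tasks[t]) for t in ready}
            for t, f in futures.items():
                results[t] = f.result()   # wait for the whole batch
                done.add(t)
    return results

order = []
tasks = {
    "mask_repeats": lambda: order.append("mask_repeats"),
    "align": lambda: order.append("align"),
    "predict_genes": lambda: order.append("predict_genes"),
    "export_gff": lambda: order.append("export_gff"),
}
deps = {"align": {"mask_repeats"},
        "predict_genes": {"mask_repeats"},
        "export_gff": {"align", "predict_genes"}}
run_workflow(tasks, deps)
```

    Here alignment and gene prediction run concurrently once repeat masking finishes, and the GFF export waits for both, which is exactly the dependency pattern a workflow data structure has to capture.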

  8. Strategic management: a new dimension of the nurse executive's role.

    Science.gov (United States)

    Johnson, L J

    1990-09-01

    The growth of corporate orientation for health care structures, with a focus on bottom-line management, has radically altered the role of nurse executives. With the organization's emphasis on performance, productivity, and results, successful nurse executives are now integrating the management of the delivery of nursing care with the management of complex corporate structures and relationships. The editor of Executive Development discusses the rapidly changing expectations and demands of the contemporary nurse executive's work. The nurse executive's role can be viewed from many perspectives: its scope, its value, its structure, its content. Content--"What does the nurse executive do that makes a real difference?"--is the focus here.

  9. 76 FR 76122 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2011-12-06

    ... CHEMICAL SAFETY AND HAZARD INVESTIGATION BOARD Senior Executive Service Performance Review Board... change in the membership of the Senior Executive Service Performance Review Board for the Chemical Safety... Senior Executive Service (SES) and makes recommendations as to final annual performance ratings for...

  10. Corporate governance going astray: executive remuneration built to fail

    NARCIS (Netherlands)

    Winter, J.; Grundmann, S.; Haar, B.; Merkt, H.; Mülbert, P.O.; Wellenhofer, M.; Baum, H.; von Hein, J.; von Hippel, T.; Pistor, K.; Roth, M.; Schweitzer, H.

    2010-01-01

    Modern remuneration systems for executive directors include substantial elements of performance based pay. The idea behind this is that by rewarding executives for performance their interests become aligned with those of the company’s shareholders, thus bridging the principal-agent gap. Executive

  11. 78 FR 67147 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2013-11-08

    ... FEDERAL RETIREMENT THRIFT INVESTMENT BOARD Senior Executive Service Performance Review Board... appointment of the members of the Senior Executive Service Performance Review Boards for the Federal... actions for members of the Senior Executive Service. DATES: This notice is effective November 5, 2013. FOR...

  12. 78 FR 57837 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2013-09-20

    ... CHEMICAL SAFETY AND HAZARD INVESTIGATION BOARD Senior Executive Service Performance Review Board... change in the membership of the Senior Executive Service Performance Review Board for the Chemical Safety... Senior Executive Service (SES) and makes recommendations as to final annual performance ratings for...

  13. 77 FR 60450 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2012-10-03

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary Senior Executive Service Performance... announces the appointment of the members of the Senior Executive Service Performance Review Boards for the... appropriate personnel actions for incumbents of Senior Executive Service, Senior Level and Senior Professional...

  14. 77 FR 70779 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2012-11-27

    ... FEDERAL RETIREMENT THRIFT INVESTMENT BOARD Senior Executive Service Performance Review Board... appointment of the members of the Senior Executive Service Performance Review Boards for the Federal... actions for members of the Senior Executive Service. DATES: This notice is effective November 27, 2012...

  15. 77 FR 66191 - Senior Executive Service-Performance Review Board

    Science.gov (United States)

    2012-11-02

    ... OFFICE OF PERSONNEL MANAGEMENT Senior Executive Service--Performance Review Board AGENCY: Office... performance review boards. The board reviews and evaluates the initial appraisal of a senior executive's... performance of the senior executive. U.S. Office of Personnel Management. John Berry, Director. The following...

  16. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  17. Repetitive thinking, executive functioning, and depressive mood in the elderly.

    Science.gov (United States)

    Philippot, Pierre; Agrigoroaei, Stefan

    2017-11-01

    Previous findings and the depressive-executive dysfunction hypothesis suggest that the established association between executive functioning and depression is accounted for by repetitive thinking. Investigating the association between executive functioning, repetitive thinking, and depressive mood, the present study empirically tested this mediational model in a sample of older adults, while focusing on both concrete and abstract repetitive thinking. This latter distinction is important given the potential protective role of concrete repetitive thinking, in contrast to the depletive effect of abstract repetitive thinking. A sample of 43 elderly volunteers, between 75 and 95 years of age, completed tests of executive functioning (the Stroop test, the Trail Making test, and the Fluency test), and questionnaires of repetitive thinking and depression. Positive correlations were observed between abstract repetitive thinking and depressive mood, and between concrete repetitive thinking and executive functioning; a negative correlation was observed between depressive mood and executive functioning. Further, mediational analysis evidenced that the relation between executive functioning and depressive mood was mediated by abstract repetitive thinking. The present data provide, for the first time, empirical support to the depressive-executive dysfunction hypothesis: the lack of executive resources would favor a mode of abstract repetitive thinking, which in turn would deplete mood. It suggests that clinical intervention targeting depression in the elderly should take into consideration repetitive thinking modes and the executive resources needed to disengage from rumination.

  18. Experimental validation for combustion analysis of GOTHIC code in 2-dimensional combustion chamber

    International Nuclear Information System (INIS)

    Lee, J. W.; Yang, S. Y.; Park, K. C.; Jung, S. H.

    2002-01-01

    In this study, the prediction capability of the GOTHIC code for hydrogen combustion phenomena was validated against the results of a two-dimensional premixed hydrogen combustion experiment performed by Seoul National University. The experimental chamber has about 24 liters of free volume (1 × 0.024 × 1 m³) and a 2-dimensional rectangular shape. The tests were performed with a 10% hydrogen/air gas mixture, using combinations of two igniter positions (top center, top corner) and two boundary conditions (bottom fully open, bottom right half open). Using the lumped-parameter and mechanistic combustion models in the GOTHIC code, the SNU experiments were simulated under the same conditions. The GOTHIC code predictions of the hydrogen combustion phenomena did not compare well with the experimental results. In the lumped-parameter simulation, the combustion time was predicted appropriately, but no local information on the combustion phenomena could be obtained. In the mechanistic combustion analysis, the predicted physical combustion phenomena of the gas mixture did not match the experimental ones. In the open-boundary cases, GOTHIC predicted a very long combustion time and could not simulate the flame front propagation appropriately. Although GOTHIC showed flame propagation in the adiabatic calculation, the predicted induction time of combustion was still very long compared with the experimental results. It was also found that the GOTHIC combustion model has weaknesses in simulating combustion at low hydrogen concentrations

  19. User Instructions for the Systems Assessment Capability, Rev. 0, Computer Codes Volume 2: Impact Modules

    International Nuclear Information System (INIS)

    Eslinger, Paul W.; Arimescu, Carmen; Kanyid, Beverly A.; Miley, Terri B.

    2001-01-01

    One activity of the Department of Energy's Groundwater/Vadose Zone Integration Project is an assessment of cumulative impacts from Hanford Site wastes on the subsurface environment and the Columbia River. Through the application of a system assessment capability (SAC), decisions for each cleanup and disposal action will be able to take into account the composite effect of other cleanup and disposal actions. The SAC has developed a suite of computer programs to simulate the migration of contaminants (analytes) present on the Hanford Site and to assess the potential impacts of the analytes, including dose to humans, socio-cultural impacts, economic impacts, and ecological impacts. The general approach to handling uncertainty in the SAC computer codes is a Monte Carlo approach. Conceptually, one generates a value for every stochastic parameter in the code (the entire sequence of modules from inventory through transport and impacts) and then executes the simulation, obtaining an output value, or result. This document provides user instructions for the SAC codes that generate human, ecological, economic, and cultural impacts
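
    The Monte Carlo treatment of uncertainty described above, draw a value for every stochastic parameter, run the full simulation, and keep one output per realisation, can be sketched as follows. The "model" and its parameters are illustrative stand-ins, not a SAC module:

```python
# Monte Carlo loop over stochastic parameters: each realisation samples all
# uncertain inputs, executes the (placeholder) simulation once, and records
# the result; the ensemble is then summarised statistically.
import random
import statistics

def simulate(kd, infiltration):
    # placeholder transport model: travel time grows with sorption (kd)
    # and shrinks with infiltration rate; units are illustrative only
    return 100.0 * (1.0 + kd) / infiltration

random.seed(42)
results = []
for _ in range(1000):
    kd = random.lognormvariate(0.0, 0.5)        # hypothetical sorption coefficient
    infiltration = random.uniform(0.5, 1.5)     # hypothetical rate, mm/yr
    results.append(simulate(kd, infiltration))

mean_t = statistics.mean(results)
p95 = sorted(results)[int(0.95 * len(results))]
```

    Reporting a high percentile alongside the mean is the usual way such ensembles feed into impact assessments, since skewed input distributions make the tail of the output the quantity of regulatory interest.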

  20. Neural modeling of prefrontal executive function

    Energy Technology Data Exchange (ETDEWEB)

    Levine, D.S. [Univ. of Texas, Arlington, TX (United States)

    1996-12-31

    Brain executive function is based in a distributed system whereby prefrontal cortex is interconnected with other cortical and subcortical loci. Executive function is divided roughly into three interacting parts: affective guidance of responses; linkage among working memory representations; and forming complex behavioral schemata. Neural network models of each of these parts are reviewed and fit into a preliminary theoretical framework.

  1. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress that the workgroup on Low-Density Parity-Check (LDPC) for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  2. 75 FR 1028 - Senior Executive Service Performance Review Board

    Science.gov (United States)

    2010-01-08

    ... CHEMICAL SAFETY AND HAZARD INVESTIGATION BOARD Senior Executive Service Performance Review Board... change in the membership of the Senior Executive Service Performance Review Board for the Chemical Safety... performance ratings of members of the Senior Executive Service (SES) and makes recommendations as to final...

  3. Structures and Relationships between the Business Executive and Information Technology Executive at the University: A Mixed Methods Study

    Science.gov (United States)

    Hollman, Angela K.

    2014-01-01

    This study uses an explanatory mixed methods methodology to attempt to determine the reporting relationships between business and IT executives within the university. The study also explores IT and business executives' thoughts on these relationships. Supporting research from organizational studies and business-IT alignment is combined in order to…

  4. Inter-Organisational Coordination in Ramp-Up Execution

    DEFF Research Database (Denmark)

    Christensen, Irene; Karlsson, Christer

    the degree of fragmentation in the process planning and execution. Resource dependence theory (RDT) is used as a central explanatory framework for the formation of inter-organisational interdependencies throughout the planning and execution of the ramp-up activities and milestones. This study aims at exploring inter...

  5. OFFSCALE: A PC input processor for the SCALE code system. The CSASIN processor for the criticality sequences

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-11-01

    OFFSCALE is a suite of personal computer input processor programs developed at Oak Ridge National Laboratory to provide an easy-to-use interface for modules in the SCALE-4 code system. CSASIN (formerly known as OFFSCALE) is a program in the OFFSCALE suite that serves as a user-friendly interface for the Criticality Safety Analysis Sequences (CSAS) available in SCALE-4. It is designed to assist a SCALE-4 user in preparing an input file for execution of criticality safety problems. Output from CSASIN generates an input file that may be used to execute the CSAS control module in SCALE-4. CSASIN features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up a CSAS input file and perform data checking. This capability increases productivity and decreases the chance of user error

  6. Executive Function and Reading Comprehension: A Meta-Analytic Review

    Science.gov (United States)

    Follmer, D. Jake

    2018-01-01

    This article presents a meta-analytic review of the relation between executive function and reading comprehension. Results (N = 6,673) supported a moderate positive association between executive function and reading comprehension (r = 0.36). Moderator analyses suggested that correlations between executive function and reading comprehension did not…

  7. ORIGEN-2.2, Isotope Generation and Depletion Code Matrix Exponential Method

    International Nuclear Information System (INIS)

    2002-01-01

1 - Description of problem or function: ORIGEN is a computer code system for calculating the buildup, decay, and processing of radioactive materials. ORIGEN2 is a revised version of ORIGEN and incorporates updates of the reactor models, cross sections, fission product yields, decay data, and decay photon data, as well as the source code. ORIGEN2.1 replaces ORIGEN and includes additional libraries for standard and extended-burnup PWR and BWR calculations, which are documented in ORNL/TM-11018. ORIGEN2.1 was first released in August 1991 and was replaced with ORIGEN2 Version 2.2 in June 2002. Version 2.2 was the first update to ORIGEN2 in over 10 years and was stimulated by a user discovering a discrepancy in the mass of fission products calculated using ORIGEN2 V2.1. Code modifications, as well as reducing the irradiation time step to no more than 100 days/step, reduced the discrepancy from ∼10% to 0.16%. The bug does not noticeably affect the fission product mass in typical ORIGEN2 calculations involving reactor fuels because essentially all of the fissions come from actinides that have explicit fission product yield libraries. Thus, most previous ORIGEN2 calculations that were otherwise set up properly should not be affected. 2 - Method of solution: ORIGEN uses a matrix exponential method to solve a large system of coupled, linear, first-order ordinary differential equations with constant coefficients. ORIGEN2 has been variably dimensioned to allow the user to tailor the size of the executable module to the problem size and/or the available computer space. Dimensioned arrays have been set large enough to handle almost any size problem, using virtual memory capabilities available on most mainframe and 386/486-based PCs.
The user is provided with much of the framework necessary to put some of the arrays to several different uses, call for the subroutines that perform the desired operations, and provide a mechanism to execute multiple ORIGEN2 problems with a single
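
    The matrix exponential method described above can be illustrated with a minimal sketch: a two-nuclide decay chain A → B, where the balance equations dN/dt = A·N are solved as N(t) = expm(A·t)·N(0) via a truncated Taylor series. The decay constant and time are illustrative values, not ORIGEN2 library data.

    ```python
    import math

    def mat_mul(X, Y):
        n = len(X)
        return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    def mat_exp(A, terms=30):
        """Truncated Taylor series: expm(A) = sum_k A^k / k!"""
        n = len(A)
        result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # identity
        power = [row[:] for row in result]
        for k in range(1, terms):
            power = mat_mul(power, A)
            result = [[result[i][j] + power[i][j] / math.factorial(k)
                       for j in range(n)] for i in range(n)]
        return result

    # Decay chain A -> B with illustrative decay constant lam (1/day), time t (days)
    lam, t = 0.1, 5.0
    At = [[-lam * t, 0.0],
          [ lam * t, 0.0]]
    E = mat_exp(At)
    n0 = [1.0, 0.0]  # start with pure nuclide A
    nt = [sum(E[i][j] * n0[j] for j in range(2)) for i in range(2)]
    # Analytic check: N_A(t) = exp(-lam*t), N_B(t) = 1 - exp(-lam*t)
    ```

    For a single decay chain the series converges quickly; ORIGEN2's contribution is making this tractable for the full coupled system of hundreds of nuclides.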

  8. Dissociation in undergraduate students: disruptions in executive functioning.

    Science.gov (United States)

    Giesbrecht, Timo; Merckelbach, Harald; Geraerts, Elke; Smeets, Ellen

    2004-08-01

    The concept of dissociation refers to disruptions in attentional control. Attentional control is an executive function. Few studies have addressed the link between dissociation and executive functioning. Our study investigated this relationship in a sample of undergraduate students (N = 185) who completed the Dissociative Experiences Scale and the Random Number Generation Task. We found that minor disruptions in executive functioning were related to a subclass of dissociative experiences, notably dissociative amnesia and the Dissociative Experiences Scale Taxon. However, the two other subscales of the Dissociative Experiences Scale, measuring depersonalization and absorption, were unrelated to executive functioning. Our findings suggest that a failure to inhibit previous responses might contribute to the pathological memory manifestations of dissociation.

  9. ANCON: A code for the evaluation of complex fault trees in personal computers

    International Nuclear Information System (INIS)

    Napoles, J.G.; Salomon, J.; Rivero, J.

    1990-01-01

Performing probabilistic safety analysis has been recognized worldwide as one of the more effective ways of further enhancing the safety of Nuclear Power Plants. The evaluation of fault trees plays a fundamental role in these analyses. Existing limitations in the RAM and execution speed of personal computers (PCs) have so far restricted their use in the analysis of complex fault trees. Starting from new approaches to the data structure, the ANCON code can evaluate complex fault trees on a PC, allowing the user to carry out a more comprehensive analysis of the considered system in reduced computing time.
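
    The core of fault tree evaluation can be sketched as follows, assuming independent basic events with hypothetical failure probabilities (this is a generic illustration, not ANCON's actual algorithm or data structures):

    ```python
    # Hypothetical fault tree: TOP = OR(AND(pump_a, pump_b), valve)

    def gate_and(*ps):
        """P(all events occur) for independent events: product of probabilities."""
        prob = 1.0
        for p in ps:
            prob *= p
        return prob

    def gate_or(*ps):
        """P(at least one event occurs) for independent events: 1 - prod(1 - p)."""
        prob = 1.0
        for p in ps:
            prob *= (1.0 - p)
        return 1.0 - prob

    # Illustrative basic-event failure probabilities
    basic = {"pump_a": 1e-2, "pump_b": 1e-2, "valve": 1e-3}

    top = gate_or(gate_and(basic["pump_a"], basic["pump_b"]), basic["valve"])
    # top = 1 - (1 - 1e-4)(1 - 1e-3), dominated by the single valve failure
    ```

    Real fault tree codes must additionally handle repeated events, for which this simple bottom-up multiplication is no longer exact and minimal cut set methods are used instead.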

  10. ROUTING FINANCIAL FLOWS AT THE EXECUTION OF THE MUNICIPALITY BUDGET

    Directory of Open Access Journals (Sweden)

    Mariya V. Fedotova

    2018-05-01

Full Text Available The economic development of each administrative-territorial unit is an important factor in the economic development of the Russian Federation. In turn, the effective execution of the budget at each level of the budget system remains invariably topical, influencing the development of both centralized and decentralized finance. The paramount characteristic of the organization of a municipality's budget execution is the ability to ensure the timely and full execution of its commitments. The budget process depends directly on both the revenue and expenditure sides. Cash gaps in local budget execution are quite common and usually inevitable: revenues arrive unevenly during the financial year while budgetary expenditures are seasonal, which threatens the execution of planned budgets and creates a need to raise funds. The article deals with the execution of the cash balance plan, cash flow in the single budget account, debt, and sources of deficit financing. The analysis of financial flows in the execution of municipal budgets reveals problems in the execution of budget revenues and allows assessing the scale of emerging cash gaps. The review concludes by recognizing an objective need to develop approaches to assessing the effectiveness of budget execution at all its stages, and to monitor that effectiveness so that timely and effective measures can ensure high-quality execution.

  11. Ten Years of Change in Executive Education.

    Science.gov (United States)

    Bolt, James F.

    1993-01-01

    As recently as the 1980s, most companies did not pay much attention to executive education. In the 1990s, many see executive education as a must for revamping competitive strategies, increasing productivity, improving quality, reducing cycle time, and revitalizing corporate culture. (Author/JOW)

  12. Understanding the Executive Functioning Heterogeneity in Schizophrenia

    Science.gov (United States)

    Raffard, Stephane; Bayard, Sophie

    2012-01-01

Schizophrenia is characterized by heterogeneous brain abnormalities involving cerebral regions implicated in executive functioning. The dysexecutive syndrome is one of the most prominent and functionally significant cognitive features of schizophrenia. Nevertheless, it is not clear to what extent executive deficits are heterogeneous in schizophrenia…

  13. Unraveling Executive Functioning in Dual Diagnosis.

    Science.gov (United States)

    Duijkers, Judith C L M; Vissers, Constance Th W M; Egger, Jos I M

    2016-01-01

In mental health, the term dual-diagnosis is used for the co-occurrence of Substance Use Disorder (SUD) with another mental disorder. These co-occurring disorders can have a shared cause, and can cause or intensify each other's expression. Forming a threat to health and society, dual-diagnosis is associated with relapses in addiction-related behavior and a destructive lifestyle, due to a persistent failure to control impulses and the maintenance of inadequate self-regulatory behavior in daily life. Thus, several aspects of executive functioning, such as inhibitory, shifting and updating processes, seem impaired in dual-diagnosis. Executive (dys-)function is currently even seen as a shared underlying key component of most mental disorders. However, the number of studies on diverse aspects of executive functioning in dual-diagnosis is limited. In the present review, a systematic overview of various aspects of executive functioning in dual-diagnosis is presented, striving for a prototypical profile of patients with dual-diagnosis. Looking at empirical results, inhibitory and shifting processes appear to be impaired for SUD combined with schizophrenia, bipolar disorder or cluster B personality disorders. Studies involving updating process tasks for dual-diagnosis were limited. More research that zooms in on the full diversity of these executive functions is needed in order to strengthen these findings. Detailed insight into the profile of strengths and weaknesses that underlies one's behavior and is related to diagnostic classifications can lead to tailor-made assessment and indications for treatment, pointing out which aspects of one's self-regulative abilities need attention and/or training.

  14. Questionnaire of Executive Function for Dancers: An Ecological Approach

    Science.gov (United States)

    Wong, Alina; Rodriguez, Mabel; Quevedo, Liliana; de Cossio, Lourdes Fernandez; Borges, Ariel; Reyes, Alicia; Corral, Roberto; Blanco, Florentino; Alvarez, Miguel

    2012-01-01

    There is a current debate about the ecological validity of executive function (EF) tests. Consistent with the verisimilitude approach, this research proposes the Ballet Executive Scale (BES), a self-rating questionnaire that assimilates idiosyncratic executive behaviors of classical dance community. The BES was administrated to 149 adolescents,…

  15. Executive Function and Adaptive Behavior in Muenke Syndrome.

    Science.gov (United States)

    Yarnell, Colin M P; Addissie, Yonit A; Hadley, Donald W; Guillen Sacoto, Maria J; Agochukwu, Nneamaka B; Hart, Rachel A; Wiggs, Edythe A; Platte, Petra; Paelecke, Yvonne; Collmann, Hartmut; Schweitzer, Tilmann; Kruszka, Paul; Muenke, Maximilian

    2015-08-01

To investigate executive function and adaptive behavior in individuals with Muenke syndrome using validated instruments with a normative population and unaffected siblings as controls. Participants in this cross-sectional study included individuals with Muenke syndrome (P250R mutation in FGFR3) and their mutation-negative siblings. Participants completed validated assessments of executive functioning (Behavior Rating Inventory of Executive Function [BRIEF]) and adaptive behavior skills (Adaptive Behavior Assessment System, Second Edition [ABAS-II]). Forty-four individuals with a positive FGFR3 mutation (median age, 9 years; range, 7 months to 52 years) were enrolled. In addition, 10 unaffected siblings served as controls (5 males, 5 females; median age, 13 years; range, 3-18 years). For the General Executive Composite scale of the BRIEF, 32.1% of the cohort had scores greater than +1.5 SD, signifying potential clinical significance. For the General Adaptive Composite of the ABAS-II, 28.2% of affected individuals scored in the 3rd-8th percentile of the normative population, and 56.4% were below the average category (General Executive Composite and the ABAS-II General Adaptive Composite. Individuals with Muenke syndrome are at an increased risk for developing adaptive and executive function behavioral changes compared with a normative population and unaffected siblings. Published by Elsevier Inc.

  16. Mismanagement Reasons of the Projects Execution Phase

    Directory of Open Access Journals (Sweden)

    Hatem Khaleefah Al-Agele

    2017-10-01

Full Text Available The execution phase is the most dangerous and most resource-draining phase of the project life cycle; it therefore needs to be monitored and controlled by specialists in order to overcome obstructions and achieve the project goals. The study aims to detect the actual reasons behind mismanagement of the execution phase. The study begins with a theoretical part, which deals with the concepts of project, project selection, project management, and project processes. The field part consists of three techniques: 1- brainstorming, 2- open interviews with experts, and 3- a designed questionnaire (with 49 reasons). These reasons resulted from the brainstorming and the interviews with experts, in order to find the real reasons behind mismanagement of the execution phase. The most important reasons proven by the study to negatively impact management of the execution phase were: inability of the company to meet project requirements because it is a specialized and/or large project; multiple sources of decision and overlap in powers; inadequate planning; inaccurate estimation of cost; delayed cash flows by owners; poor performance of the project manager; an inefficient decision-making process; and the negative impact of people in the project area. Finally, the study submits a set of recommendations which will contribute to overcoming the obstructions to successful management of the execution phase.

  17. New quantum codes constructed from quaternary BCH codes

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  18. Executive Orders-Barack Obama

    Data.gov (United States)

    National Archives and Records Administration — Executive orders are official documents, numbered consecutively, through which the President of the United States manages the operations of the Federal Government....

  19. Development of the next generation code system as an engineering modeling language (1)

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Uto, Nariaki; Kasahara, Naoto; Nagura, Fuminori; Ishikawa, Makoto; Ohira, Masanori; Kato, Masayuki

    2002-11-01

In fast reactor development, numerical simulation using analytical codes plays an important role in complementing theory and experiment. It is necessary that the engineering models and analysis methods can be flexibly changed, because the phenomena to be investigated become more complicated due to the diversity of research needs. Moreover, there are large problems in combining physical properties and engineering models from many different fields. In this study, the goal is to develop a flexible and general-purpose analysis system in which the physical properties and engineering models are represented as a programming language or diagrams that are easily understandable by humans and executable by computers. The authors named this concept the Engineering Modeling Language (EML). This report describes the results of an investigation of the latest computer technologies and software development techniques which seem usable for the realization of an analysis code system for nuclear engineering as an EML. (author)

  20. Development of a severe accident training simulator using a MELCOR code

    International Nuclear Information System (INIS)

    Kim, Ko Ryu; Jeong, Kwang Sub; Ha, Jae Joo; Jung, Won Dae

    2002-03-01

Severe accidents at nuclear power plants are, despite their rareness, very important from a safety standpoint because of the huge damage they cause when they occur. For the appropriate execution of a severe accident strategy, more information for decision-making is required because of the uncertainties involved in severe accidents. Earlier, the NRC raised concerns over severe accident training in the report NUREG/CR-477, and accordingly, the development of effective training tools for severe accidents was emphasized. Although such training tools were requested by industry, few were developed, owing to the uncertainties in severe accidents, the lack of analysis computer codes, and technical limitations. SATS, the severe accident training simulator, has been developed as a multi-purpose tool for severe accident training. SATS uses the calculation results of MELCOR, an integral severe accident analysis code, and, with the help of SL-GMS graphic tools, provides dynamic displays of severe accident phenomena on the terminal of an IBM PC. It is intended to have two main features: one is to provide graphic displays that represent severe accident phenomena, and the other is to process and simulate severe accident strategies given by plant operators and TSC staff. Severe accident strategies are basically composed of series of operations of available pumps, valves, and other equipment. Whenever executing strategies with SATS, the trainee should follow HyperKAMG, the online version of the recently developed severe accident guidance (KAMG). Severe accident strategies are closely related to accident scenarios. TLOFW and LOCA, two representative severe accident scenarios of Uljin 3,4, have been developed as built-in scenarios of SATS. Although SATS has some minor problems at this time, we expect SATS will be a good severe accident training tool after the appropriate addition of accident scenarios. Moreover, we also expect SATS will be a good advisory tool for severe accident research.

  1. 47 CFR 54.704 - The Administrator's Chief Executive Officer.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false The Administrator's Chief Executive Officer. 54... Administrator shall nominate by consensus a Chief Executive Officer. The Board of Directors shall submit the... Administrator's Chief Executive Officer. (3) If the Board of Directors does not reach consensus on a nominee or...

  2. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

The entanglement-assisted formalism generalizes the standard stabilizer formalism and can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  3. MANAGERIAL PROBLEMS CONFRONTED BY EXECUTIVE CHEFS IN HOTELS

    Directory of Open Access Journals (Sweden)

    Kemal BIRDIR

    2014-07-01

Full Text Available The study was conducted to determine the managerial problems confronted by executive chefs working at 4- and 5-star hotels in Turkey. A survey developed by the researchers was employed as the data collection tool. Answers given by participants were analyzed using t-tests and ANOVA in order to determine whether executive chefs' opinions, expressed as average scores, differed significantly depending on such variables as their age, gender, educational status, and the star status of the hotel in which they worked. The study results showed that the most important problem confronting executive chefs was “finding educated/trained kitchen personnel.” On the specific problem, “responsibility and authority are not clear within the kitchen,” there was a significant difference of opinion between male and female executive chefs. Moreover, there was a significant difference of opinion, dependent upon the star status of the hotels within which the chefs worked, on the problem of whether or not “the working hours of kitchen personnel are too long.” The findings suggest that there are important problems confronted by executive chefs, and that male and female executive chefs have different opinions on the magnitudes of some specific problems. Whereas there are various reports and similar publications discussing problems faced by executive chefs, the present study is the first in the literature to solely explore the managerial problems experienced in a kitchen context.

  4. TOWARD THE DOMINANCE OF THE EXECUTIVE

    Directory of Open Access Journals (Sweden)

    Danica Fink-Hafner

    2013-01-01

Full Text Available This article examines the recent processes of globalization and, within this framework, Europeanization, with a focus on the changes in national political systems, particularly in post-communist EU member states, due to the pressures of these processes. The main thesis is that national executives have been gaining power in relation to the legislature due to international pressures. The international financial and economic crisis has added to this trend as countries become more financially dependent on international centers of power, which demand efficient economic liberalization from national executives as a precondition for the required international loans. The case study of Slovenia is presented from a comparative perspective, in some aspects being a deviant case, so as to offer new theoretical insights into the mechanisms of strengthening national executives.

  5. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  6. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.
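
    As a small illustration of codes "defined solely by the graphs they reside on," the sketch below checks the defining consistency condition of Kitaev's toric code on a 3×3 torus: every X-type vertex (star) operator and Z-type plaquette operator must commute, which holds exactly when their edge supports overlap on an even number of edges. The lattice labeling conventions here are our own, not taken from the paper.

    ```python
    from itertools import product

    L = 3  # linear size of the periodic (torus) lattice

    def star(x, y):
        """Edges incident to vertex (x, y): support of the X-type vertex operator."""
        return {(x, y, 'h'), ((x - 1) % L, y, 'h'),
                (x, y, 'v'), (x, (y - 1) % L, 'v')}

    def boundary(x, y):
        """Edges bounding plaquette (x, y): support of the Z-type plaquette operator."""
        return {(x, y, 'h'), (x, (y + 1) % L, 'h'),
                (x, y, 'v'), ((x + 1) % L, y, 'v')}

    # An X-string and a Z-string commute iff their supports share an even
    # number of edges; check this for every vertex/plaquette pair.
    all_commute = all(
        len(star(vx, vy) & boundary(px, py)) % 2 == 0
        for vx, vy, px, py in product(range(L), repeat=4)
    )
    ```

    Adjacent star/boundary pairs overlap on exactly two edges and distant pairs on none, so the stabilizer group is abelian, which is what makes the code well-defined.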

  7. Central executive involvement in children's spatial memory.

    Science.gov (United States)

    Ang, Su Yin; Lee, Kerry

    2008-11-01

    Previous research with adults found that spatial short-term and working memory tasks impose similar demands on executive resources. We administered spatial short-term and working memory tasks to 8- and 11-year-olds in three separate experiments. In Experiments 1 and 2 an executive suppression task (random number generation) was found to impair performances on a short-term memory task (Corsi blocks), a working memory task (letter rotation), and a spatial visualisation task (paper folding). In Experiment 3 an articulatory suppression task only impaired performance on the working memory task. These results suggest that short-term and working memory performances are dependent on executive resources. The degree to which the short-term memory task was dependent on executive resources was expected to be related to the amount of experience children have had with such tasks. Yet we found no significant age-related suppression effects. This was attributed to differences in employment of cognitive strategies by the older children.

  8. Executive functioning complaints and escitalopram treatment response in late-life depression.

    Science.gov (United States)

    Manning, Kevin J; Alexopoulos, George S; Banerjee, Samprit; Morimoto, Sarah Shizuko; Seirup, Joanna K; Klimstra, Sibel A; Yuen, Genevieve; Kanellopoulos, Theodora; Gunning-Dixon, Faith

    2015-05-01

    Executive dysfunction may play a key role in the pathophysiology of late-life depression. Executive dysfunction can be assessed with cognitive tests and subjective report of difficulties with executive skills. The present study investigated the association between subjective report of executive functioning complaints and time to escitalopram treatment response in older adults with major depressive disorder (MDD). 100 older adults with MDD (58 with executive functioning complaints and 42 without executive functioning complaints) completed a 12-week trial of escitalopram. Treatment response over 12 weeks, as measured by repeated Hamilton Depression Rating Scale scores, was compared for adults with and without executive complaints using mixed-effects modeling. Mixed effects analysis revealed a significant group × time interaction, F(1, 523.34) = 6.00, p = 0.01. Depressed older adults who reported executive functioning complaints at baseline demonstrated a slower response to escitalopram treatment than those without executive functioning complaints. Self-report of executive functioning difficulties may be a useful prognostic indicator for subsequent speed of response to antidepressant medication. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  9. WCET Analysis of Java Bytecode Featuring Common Execution Environments

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Thomsen, Bent; Frost, Christian

    2011-01-01

We present a novel tool for statically determining the Worst Case Execution Time (WCET) of Java Bytecode-based programs called Tool for Execution Time Analysis of Java bytecode (TetaJ). This tool differentiates itself from existing tools by separating the individual constituents of the execution environment into independent components. The prime benefit is that it can be used for execution environments featuring common embedded processors and software implementations of the JVM. TetaJ employs a model checking approach for statically determining WCET where the Java program, the JVM, and the hardware...

  10. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double-weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that EDW yields much better performance than the Hadamard and MFH codes.
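
    The "ideal cross-correlation" property can be illustrated with hypothetical weight-3 binary codewords (these are illustrative sequences, not the actual EDW construction from the paper): any two distinct codewords overlap in at most one chip position, which limits multiple-access interference between users.

    ```python
    # Illustrative 0/1 spectral codewords of weight 3 (hypothetical example,
    # not the EDW code matrix itself)
    codes = [
        [1, 1, 0, 1, 0, 0],
        [0, 1, 1, 0, 1, 0],
        [0, 0, 1, 1, 0, 1],
    ]

    def cross_correlation(a, b):
        """In-phase cross-correlation: number of chip positions where both are 1."""
        return sum(x & y for x, y in zip(a, b))

    # Auto-correlation peaks equal the code weight
    weights = [cross_correlation(c, c) for c in codes]

    # Maximum cross-correlation over all distinct codeword pairs
    max_cross = max(cross_correlation(codes[i], codes[j])
                    for i in range(len(codes))
                    for j in range(len(codes)) if i != j)
    ```

    A large gap between the auto-correlation peak (the weight) and the maximum cross-correlation is what lets the receiver separate its intended user's spectrum from the others.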

  11. GOTHIC: Gravitational oct-tree code accelerated by hierarchical time step controlling

    Science.gov (United States)

    Miki, Yohei; Umemura, Masayuki

    2017-04-01

The tree method is a widely implemented algorithm for collisionless N-body simulations in astrophysics well suited for GPU(s). Adopting hierarchical time stepping can accelerate N-body simulations; however, it is infrequently implemented and its potential remains untested in GPU implementations. We have developed a Gravitational Oct-Tree code accelerated by HIerarchical time step Controlling named GOTHIC, which adopts both the tree method and the hierarchical time step. The code adopts some adaptive optimizations by monitoring the execution time of each function on-the-fly and minimizes the time-to-solution by balancing the measured time of multiple functions. Results of performance measurements with realistic particle distributions performed on NVIDIA Tesla M2090, K20X, and GeForce GTX TITAN X, which are representative GPUs of the Fermi, Kepler, and Maxwell generations, show that the hierarchical time step achieves a speedup by a factor of around 3-5 times compared to the shared time step. The measured elapsed time per step of GOTHIC is 0.30 s or 0.44 s on GTX TITAN X when the particle distribution represents the Andromeda galaxy or the NFW sphere, respectively, with 2^24 = 16,777,216 particles. The averaged performance of the code corresponds to 10-30% of the theoretical single precision peak performance of the GPU.
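
    Hierarchical (block) time stepping can be sketched as follows: each particle is assigned a power-of-two fraction of the global step and is only updated on the sub-steps aligned with its own step size, so slowly evolving particles cost far fewer force evaluations than under a shared step. The level assignments below are illustrative, not GOTHIC's actual scheduling.

    ```python
    # Block time stepping sketch: particle i integrates with dt_max / 2**levels[i]
    dt_max = 1.0
    levels = [0, 1, 3]        # illustrative per-particle step levels
    n_sub = 2 ** max(levels)  # number of finest sub-steps per global step

    updates = [0] * len(levels)
    for sub in range(n_sub):
        for i, lev in enumerate(levels):
            stride = 2 ** (max(levels) - lev)  # sub-steps between updates
            if sub % stride == 0:
                updates[i] += 1  # (the kick-drift of particle i would go here)

    # A shared time step at the finest resolution would update every
    # particle on every sub-step:
    shared_cost = n_sub * len(levels)
    hier_cost = sum(updates)
    ```

    Here the hierarchical schedule performs 11 updates instead of 24, and the saving grows with the spread of time step levels across the particle population.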

  12. BPELPower—A BPEL execution engine for geospatial web services

    Science.gov (United States)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities in handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.
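
    Two of the workflow control patterns such an engine must support, the BPEL <sequence> and <flow> activities, can be sketched with plain callables. This is a hypothetical sketch, not the BPELPower API, and the <flow> branches are run serially here rather than concurrently; the activity names are invented placeholders for geospatial service calls.

    ```python
    # Minimal sketch of BPEL-style control patterns over Python callables.

    def sequence(*activities):
        """<sequence>: run activities strictly in order."""
        def run(ctx):
            for act in activities:
                act(ctx)
        return run

    def flow(*activities):
        """<flow>: order-independent branches; all must complete before the
        flow does (true BPEL engines run them concurrently)."""
        def run(ctx):
            for act in activities:
                act(ctx)
        return run

    log = []
    workflow = sequence(
        lambda ctx: log.append("WFS: fetch features"),
        flow(lambda ctx: log.append("WPS: buffer"),
             lambda ctx: log.append("WPS: reproject")),
        lambda ctx: log.append("publish result"),
    )
    workflow({})  # the dict stands in for shared workflow variables
    ```

    Supporting the 22 control patterns mentioned above amounts to providing many more such combinators (pick, while, compensation handlers, ...), plus the GML-aware data handling that distinguishes a geospatial engine.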

  13. Executive and Perceptual Distraction in Visual Working Memory

    Science.gov (United States)

    2017-01-01

    The contents of visual working memory are likely to reflect the influence of both executive control resources and information present in the environment. We investigated whether executive attention is critical in the ability to exclude unwanted stimuli by introducing concurrent potentially distracting irrelevant items to a visual working memory paradigm, and manipulating executive load using simple or more demanding secondary verbal tasks. Across 7 experiments varying in presentation format, timing, stimulus set, and distractor number, we observed clear disruptive effects of executive load and visual distraction, but relatively minimal evidence supporting an interactive relationship between these factors. These findings are in line with recent evidence using delay-based interference, and suggest that different forms of attentional selection operate relatively independently in visual working memory. PMID:28414499

  14. Neuroanatomical Substrates of Executive Functions: Beyond Prefrontal Structures

    Science.gov (United States)

    Bettcher, Brianne M.; Mungas, Dan; Patel, Nihar; Elofson, Jonathan; Dutt, Shubir; Wynn, Matthew; Watson, Christa L.; Stephens, Melanie; Walsh, Christine M.; Kramer, Joel H.

    2016-01-01

    Executive functions are often considered lynchpin “frontal lobe tasks”, despite accumulating evidence that a broad network of anterior and posterior brain structures supports them. Using a latent variable modeling approach, we assessed whether prefrontal grey matter volumes independently predict executive function performance when statistically differentiated from global atrophy and individual non-frontal lobar volume contributions. We further examined whether fronto-parietal white matter microstructure underlies and independently contributes to executive functions. We developed a latent variable model to decompose lobar grey matter volumes into a global grey matter factor and specific lobar volumes (i.e. prefrontal, parietal, temporal, occipital) that were independent of global grey matter. We then added mean fractional anisotropy (FA) for the superior longitudinal fasciculus (dorsal portion), corpus callosum, and cingulum bundle (dorsal portion) to models that included grey matter volumes related to cognitive variables in previous analyses. Results suggested that the 2-factor model (shifting/inhibition, updating/working memory) plus an information processing speed factor best explained our executive function data in a sample of 202 community dwelling older adults, and was selected as the base measurement model for further analyses. Global grey matter was related to the executive function and speed variables in all four lobar models, but independent contributions of the frontal lobes were not significant. In contrast, when assessing the effect of white matter microstructure, cingulum FA made significant independent contributions to all three executive function and speed variables and corpus callosum FA was independently related to shifting/inhibition and speed. Findings from the current study indicate that while prefrontal grey matter volumes are significantly associated with cognitive neuroscience measures of shifting/inhibition and working memory in healthy

  15. GeNN: a code generation framework for accelerated brain simulations

    Science.gov (United States)

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses about brain function and testing their consistency and plausibility. An ongoing challenge for simulating realistic models, however, is computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open-source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface that does not require in-depth technical knowledge from the users. We present performance benchmarks showing that a 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance-based Hodgkin-Huxley neurons, but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.
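
GeNN's central idea, generating simulation code from a model description rather than interpreting the model at runtime, can be sketched in miniature. This toy generator emits a Python update function for a leaky integrate-and-fire neuron; it is an illustration of the code-generation approach only, not GeNN's actual CUDA output or API (GeNN works from C-like code snippets in the model description):

```python
# Toy model description: dynamics, spike condition, and reset as code strings.
model = {
    "dynamics": "V += dt * (-(V - Vrest) + I) / tau",
    "threshold": "V >= Vthresh",
    "reset": "V = Vreset",
}

def generate_update(model):
    """Generate source for an update step, then compile it into a function."""
    src = (
        "def update(V, I, dt, tau, Vrest, Vthresh, Vreset):\n"
        f"    {model['dynamics']}\n"
        f"    spiked = {model['threshold']}\n"
        "    if spiked:\n"
        f"        {model['reset']}\n"
        "    return V, spiked\n"
    )
    namespace = {}
    exec(src, namespace)  # compile the generated source
    return namespace["update"]

update = generate_update(model)
V, spiked = update(V=-55.0, I=30.0, dt=1.0, tau=20.0,
                   Vrest=-65.0, Vthresh=-50.0, Vreset=-65.0)
# One step: V moves from -55.0 mV to -54.0 mV, below threshold, so no spike.
```

The payoff of generation in a real framework is that the emitted code can be specialized and compiled for the target hardware (here, GPUs) without per-step interpretation overhead.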

  16. Executive function in different groups of university students

    OpenAIRE

    Prosen, Simona; Smrtnik Vitulić, Helena

    2015-01-01

    The present study analyses the executive function (EF) skills of 369 students of primary education (n = 116), preschool education (n = 72), social pedagogy (n = 54), and biology (n = 128). It explores how the different groups of students use selected executive skills and whether there are any differences between the groups in this respect. Eleven EF skills were self-assessed using the Executive Skills Questionnaire for Students (Dawson & Guare, 2010). All of the groups of students experien...

  17. Multitasking the three-dimensional transport code TORT on CRAY platforms

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    The multitasking options in the three-dimensional neutral-particle transport code TORT, originally implemented for Cray's CTSS operating system, are revived and extended to run on Cray Y-MP and C90 computers under the UNICOS operating system. These include two coarse-grained domain decompositions: across octants, and across directions within an octant, termed Octant Parallel (OP) and Direction Parallel (DP), respectively. Parallel performance of the DP is significantly enhanced by increasing the task grain size and reducing load imbalance via dynamic scheduling of the discrete angles among the participating tasks. Substantial wall-clock speedup factors, approaching 4.5 using 8 tasks, have been measured in a time-sharing environment; they generally depend on the test problem specifications, the number of tasks, and machine loading during execution
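
The dynamic scheduling of discrete angles among tasks described above can be mimicked with a shared work queue: each task pulls the next angle when it becomes free, so uneven per-angle costs do not idle whole tasks. This is a generic Python-threads sketch, not TORT's Cray multitasking code:

```python
import queue
import threading

def dynamic_schedule(work_items, worker_fn, num_tasks):
    """Hand items to whichever task is free next, reducing load imbalance."""
    todo = queue.Queue()
    for item in work_items:
        todo.put(item)
    results, lock = [], threading.Lock()

    def task():
        while True:
            try:
                item = todo.get_nowait()   # claim the next unprocessed item
            except queue.Empty:
                return                     # no work left for this task
            r = worker_fn(item)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=task) for _ in range(num_tasks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# e.g. "sweep" 8 discrete angles using 3 tasks
angles = list(range(8))
done = dynamic_schedule(angles, lambda a: a * a, num_tasks=3)
```

Compared with a static split of angles across tasks, the queue assigns work adaptively, which is the load-balancing effect the abstract credits for the improved DP speedup.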

  18. The coupled code system TORT-TD/ATTICA3D for 3-D transient analysis of pebble-bed HTGR

    International Nuclear Information System (INIS)

    Seubert, A.; Sureda, A.; Lapins, J.; Buck, M.; Laurien, E.; Bader, J.; EnBW Kernkraft GmbH, Philippsburg

    2012-01-01

    This paper describes the time-dependent 3-D discrete-ordinates-based coupled code system TORT-TD/ATTICA3D and its application to HTGRs of pebble-bed type. TORT-TD/ATTICA3D is built as a single executable and adopts the so-called internal coupling approach. Three-dimensional distributions of temperatures from ATTICA3D and power density from TORT-TD are efficiently exchanged by direct memory access of array elements via interface routines. Applications of TORT-TD/ATTICA3D to three transients based on the PBMR-400 benchmark (total and partial control rod withdrawal and cold helium ingress) and to the full-power steady state of the HTR-10 are presented. For the partial control rod withdrawal, 3-D effects of local neutron flux redistribution are clearly identified. The results are very promising and demonstrate that the coupled code system TORT-TD/ATTICA3D may represent a key component in a future comprehensive 3-D code system for HTGRs of pebble-bed type. (orig.)

  19. 24 CFR 983.204 - When HAP contract is executed.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false When HAP contract is executed. 983... When HAP contract is executed. (a) PHA inspection of housing. (1) Before execution of the HAP contract... into a HAP contract for any contract unit until the PHA has determined that the unit complies with the...

  20. A probabilistic assessment code system for derivation of clearance levels of radioactive materials. PASCLR user's manual

    International Nuclear Information System (INIS)

    Takahashi, Tomoyuki; Takeda, Seiji; Kimura, Hideo

    2001-01-01

    It has been recognized that some types of radioactive material arising from the development and utilization of nuclear energy need not be subject to regulatory control, because they can give rise only to trivial radiation hazards. The process of removing such materials from regulatory control is called 'clearance', and the corresponding radionuclide concentrations are called 'clearance levels'. In the Nuclear Safety Commission's discussion, a deterministic approach was applied to derive the clearance levels, which are the concentrations of radionuclides in a cleared material equivalent to an individual dose criterion. Realistic parameter values were selected where possible; where realistic values could not be defined, reasonably conservative values were selected. Additionally, stochastic approaches were used to validate the results obtained by the deterministic calculations. We have developed a computer code system, PASCLR (Probabilistic Assessment code System for derivation of Clearance Levels of Radioactive materials), which uses the Monte Carlo technique to carry out the stochastic calculations. This report describes the structure of the PASCLR code and provides user information for its execution. (author)
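
The stochastic approach PASCLR implements, Monte Carlo sampling of uncertain parameters around a dose calculation to get a distribution of clearance levels, can be sketched generically. The toy dose model, distributions, and dose criterion below are invented for illustration and are not PASCLR's actual pathway models or parameter values:

```python
import random
import statistics

def dose_per_unit_concentration(intake_rate, dose_coeff, dilution):
    # Toy dose model: annual dose (mSv) per unit activity concentration.
    return intake_rate * dose_coeff / dilution

def monte_carlo_clearance(n=10000, dose_criterion=0.01, seed=42):
    """Sample uncertain parameters; return the median derived clearance level."""
    rng = random.Random(seed)
    levels = []
    for _ in range(n):
        intake = rng.lognormvariate(0.0, 0.3)    # sampled uncertain parameters
        coeff = rng.lognormvariate(-4.0, 0.2)
        dilution = rng.uniform(5.0, 15.0)
        dose = dose_per_unit_concentration(intake, coeff, dilution)
        # Concentration at which this sampled scenario meets the criterion.
        levels.append(dose_criterion / dose)
    return statistics.median(levels)

level = monte_carlo_clearance()
```

A deterministic run with "realistic" point values gives one clearance level; the sampled distribution shows how conservative that single value is, which is the validation role the abstract describes.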

  1. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek

    2010-01-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, especially when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as a participant. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon the occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
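
A reflex engine of the kind described above is, at its core, a state machine that changes state and fires a mitigation action when a fault event arrives. The states, event names, and transitions below are invented for illustration, not taken from the LQCD framework:

```python
class ReflexEngine:
    """Minimal fault-management state machine: OK -> DEGRADED -> FAILED."""

    TRANSITIONS = {
        ("OK", "high_load"): "DEGRADED",
        ("DEGRADED", "load_normal"): "OK",
        ("DEGRADED", "node_down"): "FAILED",
        ("OK", "node_down"): "FAILED",
    }

    def __init__(self):
        self.state = "OK"
        self.actions = []   # reflexive mitigation actions taken so far

    def observe(self, event):
        """Consume one monitoring event; act only on a state change."""
        new_state = self.TRANSITIONS.get((self.state, event), self.state)
        if new_state != self.state:
            self.actions.append(f"{self.state}->{new_state}: mitigate {event}")
            self.state = new_state

engine = ReflexEngine()
for ev in ["high_load", "load_normal", "node_down"]:
    engine.observe(ev)
# engine.state is now "FAILED", with one mitigation action per transition.
```

In the hierarchical architecture, many such engines run per node or per participant and propagate their state changes upward to coarser-grained engines.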

  2. Executive and Memory Function in Adolescents Born Very Preterm

    Science.gov (United States)

    Ment, Laura; Allan, Walter; Schneider, Karen; Vohr, Betty R.

    2011-01-01

    BACKGROUND: Many preterm children display school difficulties, which may be mediated by impairment in executive function and memory. OBJECTIVE: To evaluate executive and memory function among adolescents born preterm compared with term controls at 16 years. METHODS: A total of 337 of 437 (77%) adolescents born in 1989 to 1992 with a birth weight executive function and memory tasks. Multiple regression analyses were used to compare groups and to identify associations between selected factors and outcomes among preterm subjects. RESULTS: Adolescents born preterm, compared with term controls, showed deficits in executive function in the order of 0.4 to 0.6 SD on tasks of verbal fluency, inhibition, cognitive flexibility, planning/organization, and working memory as well as verbal and visuospatial memory. After exclusion of adolescents with neurosensory disabilities and full-scale IQ executive dysfunction, as measured with the Behavior Rating Inventory of Executive Function, on the Metacognition Index (odds ratio [OR]: 2.5 [95% confidence interval (CI): 1.2–5.1]) and the Global Executive Composite (OR: 4.2 [95% CI: 1.6–10.9]), but not on the Behavioral Regulation index (OR: 1.5 [95% CI: 0.7–3.5]). Among adolescents born preterm, severe brain injury on neonatal ultrasound and lower maternal education were the most consistent factors associated with poor outcomes. CONCLUSIONS: Even after exclusion of preterm subjects with significant disabilities, adolescents born preterm in the early 1990s were at increased risk of deficits in executive function and memory. PMID:21300680

  3. Program Execution on Reconfigurable Multicore Architectures

    Directory of Open Access Journals (Sweden)

    Sanjiva Prasad

    2016-06-01

    Full Text Available Based on the two observations that diverse applications perform better on different multicore architectures, and that different phases of an application may have vastly different resource requirements, Pal et al. proposed a novel reconfigurable hardware approach for executing multithreaded programs. Instead of mapping a concurrent program to a fixed architecture, the architecture adaptively reconfigures itself to meet the application's concurrency and communication requirements, yielding significant improvements in performance. Based on our earlier abstract operational framework for multicore execution with hierarchical memory structures, we describe execution of multithreaded programs on reconfigurable architectures that support a variety of clustered configurations. Such reconfiguration may not preserve the semantics of programs due to the possible introduction of race conditions arising from concurrent accesses to shared memory by threads running on the different cores. We present an intuitive partial ordering notion on the cluster configurations, and show that the semantics of multithreaded programs is always preserved for reconfigurations "upward" in that ordering, whereas semantics preservation for arbitrary reconfigurations can be guaranteed for well-synchronised programs. We further show that a simple approximate notion of efficiency of execution on the different configurations can be obtained using the notion of amortised bisimulations, and extend it to dynamic reconfiguration.

  4. Career Patterns of Supply Chain Executives

    DEFF Research Database (Denmark)

    Flöthmann, Christoph; Hoberg, Kai

    2017-01-01

    This exploratory study analyzes the careers of 307 supply chain executives (SCEs). Motivated by career theory, our findings create new knowledge about the educational backgrounds and career paths that lead to SCE positions. Based on an optimal matching analysis, we are able to distinguish among six career patterns for SCEs. They differ in terms of the individuals' previous professional experience, educational background, and the time they needed to arrive in an executive position. By characterizing the backgrounds and career paths of SCEs, we show that supply chain management (SCM) is truly a cross-functional profession. Our findings suggest that previous staff responsibility appears to be a more important hiring criterion than extensive SCM experience. While 56% of the executives had prior staff responsibility, only 12% of the cumulated careers were actually spent inside the SCM function.

  5. 75 FR 62501 - Senior Executive Service Performance Review Board: Update

    Science.gov (United States)

    2010-10-12

    ... AGENCY FOR INTERNATIONAL DEVELOPMENT Senior Executive Service Performance Review Board: Update... Development, Office of Inspector General's Senior Executive Service Performance Review Board. DATES: September... reference-- USAID OIG Senior Executive Service (SES) Performance Review Board). SUPPLEMENTARY INFORMATION: 5...

  6. Embedding Temporal Constraints For Coordinated Execution in Habitat Automation

    Science.gov (United States)

    Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles

    2013-01-01

    Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.
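
Converting temporal constraints to dispatchable form is commonly done by computing the all-pairs shortest-path closure of the simple temporal network (STN), after which each timepoint's execution window can be read off locally, without propagation at execution time. A minimal sketch of that standard technique follows (it is not the NASA procedure system itself, and a real dispatcher would also prune dominated edges):

```python
def apsp(n, edges):
    """Floyd-Warshall closure of an STN given as (u, v, w) distance edges,
    where edge (u, v, w) encodes the constraint t_v - t_u <= w."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        d[u][v] = min(d[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# Timepoints: 0 = start, 1 = activity A, 2 = activity B.
edges = [
    (0, 1, 10), (1, 0, -5),   # A executes 5..10 time units after start
    (1, 2, 4), (2, 1, -2),    # B follows A by 2..4 time units
]
d = apsp(3, edges)
# After closure, B's window relative to start is [-d[2][0], d[0][2]] = [7, 14],
# so an executive can dispatch B anywhere in that window with no propagation.
```

The duration flexibility mentioned in the abstract corresponds to the slack inside these windows: variations that stay within a window need no replanning.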

  7. Remembering the Future of Centralized Control-Decentralized Execution

    National Research Council Canada - National Science Library

    Sheets, Patrick

    2003-01-01

    ... concepts which should drive system development. To realize the significance of the USAF C2 tenet of "centralized control-decentralized execution," one must understand how C2 is executed, in contingency theaters of operation...

  8. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
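
The pattern the DLL implements, write an input file, run the external application, read its output file back, is a common code-coupling idiom. A generic Python sketch of the same loop follows; the file names, file format, and external command are placeholders, not GoldSim's or DLLExternalCode's actual interface:

```python
import os
import subprocess
import tempfile

def run_external_code(inputs, command):
    """Write inputs to a file, run the external code, read outputs back."""
    with tempfile.TemporaryDirectory() as work:
        in_path = os.path.join(work, "model.in")
        out_path = os.path.join(work, "model.out")
        # 1. Serialize the inputs in a simple "name = value" format.
        with open(in_path, "w") as f:
            for name, value in inputs.items():
                f.write(f"{name} = {value}\n")
        # 2. The external application reads model.in and writes model.out.
        subprocess.run(command + [in_path, out_path], check=True)
        # 3. Parse the outputs the external application produced.
        outputs = {}
        with open(out_path) as f:
            for line in f:
                if line.strip():
                    name, value = line.split("=")
                    outputs[name.strip()] = float(value)
        return outputs
```

For example, `run_external_code({"flow": 1.5}, ["./mysolver"])` would drive a hypothetical solver that accepts an input path and an output path on its command line.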

  9. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at bit rates similar to those of HD video encoded with the well-established standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book is aimed at both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  10. Executive Compensation and Principal-Agent Theory.

    OpenAIRE

    Garen, John E

    1994-01-01

    The empirical literature on executive compensation generally fails to specify a model of executive pay on which to base hypotheses regarding its determinants. In contrast, this paper analyzes a simple principal-agent model to determine how well it explains variations in CEO incentive pay and salaries. Many findings are consistent with the basic intuition of principal-agent models that compensation is structured to trade off incentives with insurance. However, statistical significance for some...

  11. 78 FR 28243 - Senior Executive Service; Performance Review Board; Members

    Science.gov (United States)

    2013-05-14

    ... NATIONAL CAPITAL PLANNING COMMISSION Senior Executive Service; Performance Review Board; Members AGENCY: National Capital Planning Commission. ACTION: Notice of Members of Senior Executive Service... Senior Executive Service. The PRB established for the National Capital Planning Commission also makes...

  12. 76 FR 29013 - Senior Executive Service; Performance Review Board; Members

    Science.gov (United States)

    2011-05-19

    ... NATIONAL CAPITAL PLANNING COMMISSION Senior Executive Service; Performance Review Board; Members AGENCY: National Capital Planning Commission. ACTION: Notice of Members of Senior Executive Service... Senior Executive Service. The PRB established for the National Capital Planning Commission also makes...

  13. Executive and perceptual distraction in visual working memory.

    Science.gov (United States)

    Allen, Richard J; Baddeley, Alan D; Hitch, Graham J

    2017-09-01

    The contents of visual working memory are likely to reflect the influence of both executive control resources and information present in the environment. We investigated whether executive attention is critical in the ability to exclude unwanted stimuli by introducing concurrent potentially distracting irrelevant items to a visual working memory paradigm, and manipulating executive load using simple or more demanding secondary verbal tasks. Across 7 experiments varying in presentation format, timing, stimulus set, and distractor number, we observed clear disruptive effects of executive load and visual distraction, but relatively minimal evidence supporting an interactive relationship between these factors. These findings are in line with recent evidence using delay-based interference, and suggest that different forms of attentional selection operate relatively independently in visual working memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Code Betal to calculation Alpha/Beta activities in environmental samples; Programa de ordenador Betal para el calculo de la actividad Beta/Alfa de muestras ambientales

    Energy Technology Data Exchange (ETDEWEB)

    Romero, L.; Travesi, A.

    1983-07-01

    A code, BETAL, written in FORTRAN IV, was developed to automate the calculation and presentation of results of total alpha-beta activity measurements in environmental samples. The code performs the calculations needed to transform activities measured in total counts into pCi/L, taking into account the efficiency of the detector used and the other necessary parameters. Furthermore, it estimates the standard deviation of the result and calculates the lower limit of detection for each measurement. The code operates interactively through a screen-operator dialogue, prompting for the data needed to calculate the activity in each case. It can be executed from any terminal with a screen and keyboard (on a computer that accepts FORTRAN IV) with a printer connected to that computer. (Author) 5 refs.
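
The conversion the abstract describes, gross counts to pCi/L given detector efficiency, plus a lower limit of detection, can be sketched as follows. The LLD here uses the common Currie-style 4.66·sqrt(background) expression; that exact form is an assumption, since the report's own formula is not given, as are the sample numbers:

```python
import math

PCI_PER_BQ = 27.027  # 1 Bq corresponds to about 27.027 pCi

def activity_pci_per_liter(gross_counts, bkg_counts, count_time_s,
                           efficiency, volume_l):
    """Net count rate -> activity concentration in pCi/L."""
    net_rate = (gross_counts - bkg_counts) / count_time_s   # counts/s
    bq = net_rate / efficiency                              # decays/s
    return bq * PCI_PER_BQ / volume_l

def lld_pci_per_liter(bkg_counts, count_time_s, efficiency, volume_l):
    """Currie-style lower limit of detection (assumed form)."""
    lld_rate = 4.66 * math.sqrt(bkg_counts) / count_time_s
    return lld_rate / efficiency * PCI_PER_BQ / volume_l

# e.g. 1200 gross counts, 200 background counts, 1 h count, 30% efficiency, 1 L
a = activity_pci_per_liter(1200, 200, 3600, 0.30, 1.0)   # about 25 pCi/L
```

A result below the LLD for the same counting conditions would be reported as "not detected" rather than as a measured activity.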

  15. Converter of a continuous code into the Grey code

    International Nuclear Information System (INIS)

    Gonchar, A.I.; Trubnikov, V.R.

    1979-01-01

    Described is a converter of a continuous code into the Grey code, used in a 12-bit precision amplitude-to-digital converter to decrease the digital component of spectrometer differential nonlinearity to +0.7% over 98% of the measured band. The conversion of a continuous code corresponding to the input signal amplitude into the Grey code exploits the regularity with which ones and zeroes recycle in each bit of the Grey code as the number of pulses of the continuous code changes continuously. The converter is built from 155-series elements; the continuous-code pulse rate at the converter input is 25 MHz.
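
The regularity the record refers to is the defining property of the (Gray) code: as the input count increments, exactly one bit of the codeword changes. The record's converter realizes this in 155-series hardware; in software the standard conversion is a single XOR with a right shift, sketched here for reference:

```python
def binary_to_gray(n):
    """Adjacent integers map to Gray codewords differing in exactly one bit."""
    return n ^ (n >> 1)

def gray_to_binary(g):
    """Inverse conversion: fold the shifted XOR back down."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

codes = [binary_to_gray(i) for i in range(8)]
# codes == [0, 1, 3, 2, 6, 7, 5, 4]: each neighbor differs in one bit.
```

The single-bit-change property is what suppresses the large transient errors a plain binary counter would inject into an amplitude-to-digital conversion.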

  16. Evaluating the Effects of Executive Learning and Development on Organisational Performance: Implications for Developing Senior Manager and Executive Capabilities

    Science.gov (United States)

    Akrofi, Solomon

    2016-01-01

    In spite of decades of research into high-performance work systems, very few studies have examined the relationship between executive learning and development and organisational performance. In an attempt to close this gap, this study explores the effects of a validated four-dimensional executive learning and development measure on a composite…

  17. Genetic analyses of the stability of executive functioning during childhood.

    NARCIS (Netherlands)

    Polderman, T.J.C.; Posthuma, D.; de Sonneville, L.M.J.; Stins, J.F.; Verhulst, F.C.; Boomsma, D.I.

    2007-01-01

    Executive functioning is an umbrella term for several related cognitive functions like selective- and sustained attention, working memory, and inhibition. Little is known about the stability of executive functioning during childhood. In this study the longitudinal stability of executive functioning

  18. News from the Staff Association Executive Committee

    CERN Multimedia

    Staff Association

    2018-01-01

    On 17 April, the Staff Council elected the Executive Committee of the Staff Association and the members of the Bureau. First of all, why elect a new Executive Committee in April 2018 after the election of December 2017 (Echo No. 281)? Quite simply because a Crisis Executive Committee with a provisional Bureau had been elected for the period from 1 January to 16 April 2018, with defined and restricted objectives (Echo No. 283). Therefore, on 17 April, G. Roy presented for election a list of 12 persons, including five members for the Bureau, who agreed to continue their work within the Executive Committee, based on an intensive programme with the following main axes: Crèche and School, in particular the establishment of a foundation; Concertation: review and relaunch of the concertation process; Finalisation of the 2015 five-yearly review; Preparation and start of the 2020 five-yearly review; Actuarial reviews of the Pension Fund and the CHIS; Internal enquiries and...

  19. Guidelines for selecting codes for ground-water transport modeling of low-level waste burial sites. Volume 2. Special test cases

    International Nuclear Information System (INIS)

    Simmons, C.S.; Cole, C.R.

    1985-08-01

    This document was written for the National Low-Level Waste Management Program to provide guidance for managers and site operators who need to select ground-water transport codes for assessing shallow-land burial site performance. The guidance given in this report also serves the needs of applications-oriented users who work under the direction of a manager or site operator. The guidelines are published in two volumes designed to support the needs of users having different technical backgrounds. An executive summary, published separately, gives managers and site operators an overview of the main guideline report. Volume 1, titled "Guideline Approach," consists of Chapters 1 through 5 and a glossary. Chapters 2 through 5 provide the more detailed discussions about the code selection approach. This volume, Volume 2, consists of four appendices reporting on the technical evaluation test cases designed to help verify the accuracy of ground-water transport codes. 20 refs

  20. [Memory processes and executive functioning: novel trends for research].

    Science.gov (United States)

    Collette, Fabienne; Angel, Lucie

    2015-01-01

    The existence of processes common to memory systems and executive functioning has been evidenced by studies in the domains of cerebral neuroimaging, individual differences (mainly in normal aging) and, to a lesser extent, neuropsychology. Executive functioning depends on a large antero-posterior brain network, some regions of which (the middle dorsolateral and ventrolateral cortex, the dorsal anterior cingulate cortex) are involved in a series of executive processes, but also in the encoding and retrieval of information in episodic and short-term memory. One consequence of lesions in frontal areas is to impair the strategic organization of to-be-processed information (an executive process), which leads to lower memory capacity in frontal patients. Moreover, executive abilities influence both memory efficiency and the associated brain networks, even in people without brain pathology. These data attest to the importance of the relationship between executive and memory processes for optimal cognitive functioning. Recent advances in neuroimaging and electrophysiological data acquisition and analysis techniques should allow us to better characterize and understand how these relationships operate. © Société de Biologie, 2016.

  1. Executive Functioning: Relationship with High School Student Role Performance

    Directory of Open Access Journals (Sweden)

    Donna P. Mann

    2015-10-01

    Full Text Available BACKGROUND. Student role performance for academic success in secondary education is underrepresented in the occupational therapy literature, despite the persistently high dropout rate in the United States (Stillwell & Sable, 2013). Executive dysfunction is one of many possible contributors to difficulties in the classroom (Dirette & Kolak, 2004) and is a better indicator of school performance than IQ (Diamond, 2012). This research examined executive functioning of both alternative and traditional high school students to determine if there is a relationship between executive function and academic success as measured by cumulative grade point average. METHOD. 132 high school students from three different school settings were given the Behavioral Rating Inventory of Executive Function-Self Report (BRIEF-SR). The Global Executive Composite (GEC) and individual subscale scores were compared to GPA. RESULTS. No significant difference in GEC scores was found among settings. Subscale scores for “inhibition” and “task completion” were significantly different in the alternative school setting. A weak negative correlation was seen between the GEC and GPA. However, academically unsuccessful students scored statistically lower on the GEC. CONCLUSION. Global executive dysfunction was not predicted by setting but was seen in academically unsuccessful students.

  2. 76 FR 57947 - Senior Executive Service Performance Review Board Membership

    Science.gov (United States)

    2011-09-19

    ... AND EFFICIENCY Senior Executive Service Performance Review Board Membership AGENCY: Council of the... of Personnel Management, each agency is required to establish one or more Senior Executive Service... appraisal of a senior executive's performance by the supervisor, along with any recommendations to the...

  3. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R&D division of Électricité de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  4. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by several authors. However, determining the number of shared entangled pairs required to construct an entanglement-assisted quantum code is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, we construct four families of entanglement-assisted quantum codes that satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.
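
The entanglement-assisted quantum Singleton bound invoked here states, in the regime d ≤ (n+2)/2, that an [[n, k, d; c]] code must satisfy k ≤ n + c - 2(d - 1); codes meeting it with equality are the MDS codes the abstract describes. A minimal check, sketched in Python (the [[8, 1, 5; 1]] parameter set below is a hypothetical illustration, not one of the paper's constructions):

```python
def meets_ea_singleton_bound(n, k, d, c):
    """Return True if an [[n, k, d; c]] entanglement-assisted quantum code
    saturates the EA quantum Singleton bound k <= n + c - 2(d - 1),
    which is stated for minimum distances d <= (n + 2) / 2."""
    if d > (n + 2) / 2:
        raise ValueError("bound stated only for d <= (n + 2) / 2")
    return k == n + c - 2 * (d - 1)

# Hypothetical parameter set: [[8, 1, 5; 1]] would saturate the bound,
# since 8 + 1 - 2 * (5 - 1) = 1.
```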

  5. Composable Framework Support for Software-FMEA Through Model Execution

    Science.gov (United States)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

Performing Failure Modes and Effects Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.

  6. Managing locality in grand challenge applications: a case study of the gyrokinetic toroidal code

    Energy Technology Data Exchange (ETDEWEB)

    Marin, G; Jin, G; Mellor-Crummey, J [Department of Computer Science, Rice University, Houston, TX 77005 (United States)

    2008-07-15

    Achieving high performance with grand challenge applications on today's large-scale parallel systems requires tailoring applications for the characteristics of the modern microprocessor architectures. As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, we studied and tuned the Gyrokinetic Toroidal Code (GTC), a particle-in-cell code for simulating turbulent transport of particles and energy in burning plasma, developed at Princeton Plasma Physics Laboratory. In this paper, we present a performance study of the application that revealed several opportunities for improving performance by enhancing its data locality. We tuned GTC by performing three kinds of transformations: static data structure reorganization to improve spatial locality, loop nest restructuring for better temporal locality, and dynamic data reordering at run-time to enhance both spatial and temporal reuse. Experimental results show that these changes improve execution time by more than 20% on large parallel systems, including a Cray XT4.
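
Of the three locality transformations described above, the dynamic run-time reordering is the easiest to illustrate: particles are periodically sorted by the grid cell they currently occupy, so that particles deposited into the same cell are adjacent in memory. The sketch below is a simplified illustration in Python, not GTC code; the 2D grid parameters are invented.

```python
def cell_index(x, y, cells_per_row, cell_size):
    """Map a particle position to a linear grid-cell index."""
    return int(y // cell_size) * cells_per_row + int(x // cell_size)

def reorder_particles(particles, cells_per_row=4, cell_size=1.0):
    """Sort particles by cell index so the deposition loop that follows
    touches each cell's particles contiguously (better spatial reuse)."""
    return sorted(
        particles,
        key=lambda p: cell_index(p[0], p[1], cells_per_row, cell_size),
    )

# Two particles in cell 0 and two in cell 3, interleaved in memory:
particles = [(3.5, 0.2), (0.1, 0.1), (3.7, 0.3), (0.4, 0.2)]
ordered = reorder_particles(particles)
# After reordering, same-cell particles are adjacent.
```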

  7. Managing locality in grand challenge applications: a case study of the gyrokinetic toroidal code

    International Nuclear Information System (INIS)

    Marin, G; Jin, G; Mellor-Crummey, J

    2008-01-01

    Achieving high performance with grand challenge applications on today's large-scale parallel systems requires tailoring applications for the characteristics of the modern microprocessor architectures. As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, we studied and tuned the Gyrokinetic Toroidal Code (GTC), a particle-in-cell code for simulating turbulent transport of particles and energy in burning plasma, developed at Princeton Plasma Physics Laboratory. In this paper, we present a performance study of the application that revealed several opportunities for improving performance by enhancing its data locality. We tuned GTC by performing three kinds of transformations: static data structure reorganization to improve spatial locality, loop nest restructuring for better temporal locality, and dynamic data reordering at run-time to enhance both spatial and temporal reuse. Experimental results show that these changes improve execution time by more than 20% on large parallel systems, including a Cray XT4

  8. Executive function influences sedentary behavior: A longitudinal study

    Directory of Open Access Journals (Sweden)

    Paul D. Loprinzi

    2016-10-01

Background: No study has evaluated the effects of executive function on follow-up sedentary behavior, which was this study's purpose. Methods: A longitudinal design was employed among 18 young adult college students (Mage = 23.7 years; 88.9% female). Accelerometer-determined sedentary behavior and physical activity, along with executive function, were assessed at baseline. Approximately 8 weeks later, accelerometer-determined sedentary behavior and physical activity were re-assessed. Executive function was assessed using the Parametric Go/No-Go (PGNG) computer task. From this, 2 primary executive function outcome parameters were evaluated: the Simple Rule and the Repeating Rule. Results: After adjusting for baseline sedentary behavior, age, gender, body mass index and baseline moderate-to-vigorous physical activity (MVPA), for every 25% increase in the number of correctly identified targets for the Repeating Rule at the baseline assessment, participants engaged in 91.8 fewer minutes of sedentary behavior at the follow-up assessment (β = -91.8; 95% CI: -173.5, -10.0; P = 0.03). Results were unchanged when also adjusting for total baseline or follow-up physical activity. Conclusion: Greater executive function is associated with less follow-up sedentary behavior.

  9. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
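
The concatenation principle studied here can be sketched with a deliberately tiny toy (a 3x repetition inner code with a trivial identity outer code; this is an illustration of the general idea, not the modulation block codes or the interleaved RS (255,223) code under investigation): the inner code cleans up channel errors symbol by symbol before the outer decoder sees them.

```python
# Toy concatenated-coding sketch: inner code = 3x repetition with
# majority-vote decoding; outer code = identity (real systems would use
# a strong outer code such as RS(255,223)).

def inner_encode(bit):
    return [bit] * 3

def inner_decode(bits):
    # Majority vote corrects any single bit flip within a 3-bit group.
    return 1 if sum(bits) >= 2 else 0

def encode(message_bits):
    return [b for bit in message_bits for b in inner_encode(bit)]

def decode(channel_bits):
    return [inner_decode(channel_bits[i:i + 3])
            for i in range(0, len(channel_bits), 3)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[1] ^= 1                 # flip one channel bit
assert decode(tx) == msg   # the inner code corrects the single error
```

The overall rate of a concatenated system is the product of the component rates; here 1 × 1/3, whereas the report's outer code alone contributes 223/255.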

  10. 77 FR 65685 - Senior Executive Service Performance Review Board; Membership

    Science.gov (United States)

    2012-10-30

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9747-4] Senior Executive Service Performance Review Board... performance review boards. This board shall review and evaluate the initial appraisal of a senior executive's... performance of the senior executive. Members of the 2012 EPA Performance Review Board are: Benita Best-Wong...

  11. 78 FR 77125 - Senior Executive Service Performance Review Board; Membership

    Science.gov (United States)

    2013-12-20

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9904-20-OARM] Senior Executive Service Performance Review... review boards. This board shall review and evaluate the initial appraisal of a senior executive's... performance of the senior executive. Members of the 2013 EPA Performance Review Board are: Benita Best-Wong...

  12. Neurobehavioral Abnormalities Associated with Executive Dysfunction after Traumatic Brain Injury

    Directory of Open Access Journals (Sweden)

    Rodger Ll. Wood

    2017-10-01

Objective: This article will address how anomalies of executive function after traumatic brain injury (TBI) can translate into altered social behavior that has an impact on a person's capacity to live safely and independently in the community. Method: Review of literature on executive and neurobehavioral function linked to cognitive ageing in neurologically healthy populations and late neurocognitive effects of serious TBI. Information was collated from internet searches involving MEDLINE, PubMed, PsycINFO and Google Scholar as well as the authors' own catalogs. Conclusions: The conventional distinction between cognitive and emotional-behavioral sequelae of TBI is shown to be superficial in the light of increasing evidence that executive skills are critical for integrating and appraising environmental events in terms of cognitive, emotional and social significance. This is undertaken through multiple fronto-subcortical pathways within which it is possible to identify a predominantly dorsolateral network that subserves executive control of attention and cognition (so-called cold executive processes) and orbito-frontal/ventro-medial pathways that underpin the hot executive skills that drive much of behavior in daily life. TBI frequently involves disruption to both sets of executive functions, but research is increasingly demonstrating the role of hot executive deficits underpinning a wide range of neurobehavioral disorders that compromise relationships, functional independence and mental capacity in daily life.

  13. Real-time Executive for a basic principle simulator

    International Nuclear Information System (INIS)

    Buerger, L.; Szegi, Zs.; Vegh, E.

    1987-09-01

A basic principle simulator for WWER-440 type nuclear power plants is under development at the Central Research Institute for Physics, Budapest. So far the technological models of both the primary and secondary circuits are ready, and this paper presents the Real-time Executive and the on-line operating environment which controls the simulator. This executive system contains eight programs, and the detailed structure of its data base is presented. The control of the execution of the model programs, their timing and the error recoveries are also discussed. (author) 5 refs

  14. Programming real-time executives in higher order language

    Science.gov (United States)

    Foudriat, E. C.

    1982-01-01

    Methods by which real-time executive programs can be implemented in a higher order language are discussed, using HAL/S and Path Pascal languages as program examples. Techniques are presented by which noncyclic tasks can readily be incorporated into the executive system. Situations are shown where the executive system can fail to meet its task scheduling and yet be able to recover either by rephasing the clock or stacking the information for later processing. The concept of deadline processing is shown to enable more effective mixing of time and information synchronized systems.
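
The recovery behaviour described above can be sketched as follows (a hypothetical Python model, not HAL/S or Path Pascal code): when a task overruns its frame, the executive either rephases the clock to the next frame boundary after completion, while stacking records of the missed frames for later processing.

```python
def next_frame(now, frame_len, deferred, task_done_at):
    """Decide how a cyclic executive recovers when the current task
    finishes at `task_done_at`, given a frame that started at `now`.

    Returns (start of next frame, updated stack of missed-frame records).
    """
    deadline = now + frame_len
    if task_done_at <= deadline:
        # On time: the next frame starts exactly on schedule.
        return deadline, deferred
    # Overrun: rephase the clock to the first frame boundary after
    # completion, and stack the start times of the frames that were missed.
    missed = (task_done_at - now) // frame_len
    deferred = deferred + [now + i * frame_len for i in range(1, missed + 1)]
    return now + (missed + 1) * frame_len, deferred
```

For example, with 10 ms frames starting at t = 0, a task finishing at t = 25 causes the frames at t = 10 and t = 20 to be stacked and the clock to be rephased to t = 30.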

  15. Executive functions as predictors of math learning disabilities

    NARCIS (Netherlands)

    Toll, S.W.M.; van der Ven, S.H.G.; Kroesbergen, E.H.; van Luit, J.E.H.

    2011-01-01

    In the past years, an increasing number of studies have investigated executive functions as predictors of individual differences in mathematical abilities. The present longitudinal study was designed to investigate whether the executive functions shifting, inhibition, and working memory differ

  16. SSC-K code user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Y M; Lee, Y B; Chang, W P; Hahn, D

    2000-07-01

The Super System Code of KAERI (SSC-K) is a best-estimate system code for analyzing a variety of off-normal or accident conditions in the heat transport system of a pool-type LMR design. It is being developed at the Korea Atomic Energy Research Institute (KAERI) on the basis of SSC-L, originally developed at BNL to analyze loop-type LMR transients. SSC-K can handle both loop- and pool-type LMR designs. SSC-K contains detailed mechanistic models of transient thermal, hydraulic, neutronic, and mechanical phenomena to describe the response of the reactor core, coolant, fuel elements, and structures to accident conditions. This report provides an overview of recent model developments for the SSC-K computer code, focusing on phenomenological model descriptions for new thermal, hydraulic, neutronic, and mechanical modules. A comprehensive description of the models for pool-type reactors is given in Chapters 2 and 3: the steady-state plant characterization, prior to the initiation of a transient, is described in Chapter 2, and the transient counterparts are discussed in Chapter 3. In Chapter 4, a discussion of the intermediate heat exchanger (IHX) is presented. The IHX model of SSC-K is similar to that used in SSC-L, except for some changes required for the pool-type configuration of the reactor vessel. In Chapter 5, an electromagnetic (EM) pump is modeled as a component. There are two pump choices available in SSC-K: a centrifugal pump, which was originally embedded in SSC-L, and an EM pump, which was introduced for the KALIMER design. In Chapter 6, a model of the passive safety decay heat removal system (PSDRS) is discussed, which removes decay heat through the reactor and containment vessel walls to the ambient air heat sink. In Chapter 7, models for various reactivity feedback effects are discussed. Reactivity effects of importance in fast reactors include the Doppler effect, effects of sodium density changes, and effects of dimensional changes in core geometry.
Finally, in Chapter 8

  17. Executable research compendia in geoscience research infrastructures

    Science.gov (United States)

    Nüst, Daniel

    2017-04-01

From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source provide scientists with unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [3]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format that closes the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entities (iv) Self-consistency: ERCs remove dependence on ephemeral sources (v) Execution: ERC services create and execute a packaged analysis but integrate with
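
The coherence use listed under (i) amounts to a completeness check before re-execution. A minimal sketch, assuming an invented set of required compendium parts (the names below are illustrative, not the actual ERC specification):

```python
# Hypothetical required parts of a compendium; a real ERC definition
# would enumerate these in its specification.
REQUIRED = {"metadata", "runtime_image", "workflow", "expected_output"}

def validate_compendium(contents):
    """Return the set of required parts missing from a compendium;
    an empty set means the compendium is complete and can be
    re-executed and its results compared against expected_output."""
    return REQUIRED - set(contents)
```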

  18. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  19. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

The TASS 1.0 code has been developed at KAERI for initial and reload non-LOCA safety analysis of the operating PWRs, as well as of the PWRs under construction, in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. The TASS code has been programmed in FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady-state simulation as well as for non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). The malfunctions of control systems, components and operator actions, and the transients caused by these malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal-hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analyses for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  20. Assessing executive functions in preschoolers using Shape School Task

    Directory of Open Access Journals (Sweden)

    Marta Nieto

    2016-09-01

Over the last two decades, there has been a growing interest in the study of the development of executive functions in preschool children due to their relationship with different cognitive, psychological, social and academic domains. Early detection of individual differences in executive functioning can have major implications for basic and applied research. Consequently, there is a key need for assessment tools adapted to preschool skills: Shape School has been shown to be a suitable task for this purpose. Our study uses Shape School as the main task to analyze the development of inhibition, task-switching and working memory in a sample of 304 preschoolers (age range 3.25-6.50 years). Additionally, we include cognitive tasks for the evaluation of verbal variables (vocabulary, word reasoning and short-term memory) and performance variables (picture completion and symbol search), so as to analyze their relationship with executive functions. Our results show age-associated improvements in executive functions and the cognitive variables assessed. Furthermore, correlation analyses reveal positive relationships between executive functions and the other cognitive variables. More specifically, using structural equation modeling and including direct and indirect effects of age, our results suggest that executive functions largely explain performance on verbal and performance tasks. These findings provide further information to support research that considers preschool age to be a crucial period for the development of executive functions and their relationship with other cognitive processes.

  1. Financial performance and remuneration of executive directors of Brazilian

    Directory of Open Access Journals (Sweden)

    Larissa Degenhart

    2017-09-01

This study aimed to examine whether there is a relationship between financial performance and the remuneration of executive directors of Brazilian companies. To this end, a descriptive, documentary and quantitative study was conducted. The period reviewed was the years 2011 to 2015. The study population consisted of the Brazilian companies listed on the BM&FBovespa, and the sample consisted of the companies that presented all the variables used in each year surveyed, totaling 219 companies. Data were analyzed through Spearman correlation analysis and linear regression, performed using the SPSS statistical software. The results showed that the variables Total Asset Profitability (ROA) and company size had a significant and positive relationship with the fixed, variable and total remuneration of executive directors. For the scenario analyzed, these results indicate that the compensation of executive officers is higher when ROA is high and that, with respect to company size, large companies pay their executives more than smaller companies do. It can therefore be concluded that there is a relationship between financial performance and the fixed, variable and total remuneration of the executive directors of Brazilian companies listed on the BM&FBovespa. In addition, this research contributes to the understanding of the amounts paid to executive officers, demonstrating that company performance is reflected in the remuneration of executive directors, so that they act to raise the company's economic and financial results.

  2. Executive function and bilingualism in young and older adults

    Directory of Open Access Journals (Sweden)

    Shanna Kousaie

    2014-07-01

Research suggests that being bilingual results in advantages on executive control processes and disadvantages on language tasks relative to monolinguals. Furthermore, the executive function advantage is thought to be larger in older than in younger adults, suggesting that bilingualism may buffer against age-related changes in executive function. However, there are potential confounds in some of the previous research, as well as inconsistencies in the literature. The goal of the current investigation was to examine the presence of a bilingual advantage in executive control and a bilingual disadvantage on language tasks in the same sample of young and older monolingual anglophones, monolingual francophones, and French/English bilinguals. Participants completed a series of executive function tasks, including a Stroop task, a Simon task, a sustained attention to response task (SART), the Wisconsin Card Sorting Test (WCST), and the digit span subtest of the Wechsler Adult Intelligence Scale, and language tasks, including the Boston Naming Test (BNT) and category and letter fluency. The results do not demonstrate an unequivocal advantage for bilinguals on executive function tasks and raise questions about the reliability, robustness and/or specificity of previous findings. The results also did not demonstrate a disadvantage for bilinguals on language tasks. Rather, they suggest that there may be an influence of the language environment. It is concluded that additional research is required to fully characterize any language group differences in both executive function and language tasks.

  3. Cognitive predictors and age-based adverse impact among business executives.

    Science.gov (United States)

    Klein, Rachael M; Dilchert, Stephan; Ones, Deniz S; Dages, Kelly D

    2015-09-01

Age differences on measures of general mental ability and specific cognitive abilities were examined in 2 samples of job applicants to executive positions as well as a mix of executive/nonexecutive positions to determine which predictors might lead to age-based adverse impact in making selection and advancement decisions. Generalizability of the pattern of findings was also investigated in 2 samples from the general adult population. Age was negatively related to general mental ability, with older executives scoring lower than younger executives. For specific ability components, the direction and magnitude of age differences depended on the specific ability in question. Older executives scored higher on verbal ability, a measure most often associated with crystallized intelligence. This finding generalized across samples examined in this study. Also, consistent with findings that fluid abilities decline with age, older executives scored somewhat lower on figural reasoning than younger executives, and much lower on a letter series test of inductive reasoning. Other measures of inductive reasoning, such as Raven's Advanced Progressive Matrices, also showed similar age group mean differences across settings. Implications for employee selection and adverse impact on older job candidates are discussed. (c) 2015 APA, all rights reserved.

  4. Motor resonance facilitates movement execution: an ERP and kinematic study

    Directory of Open Access Journals (Sweden)

    Mathilde Ménoret

    2013-10-01

Action observation, simulation and execution share neural mechanisms that allow for a common motor representation. It is known that when these overlapping mechanisms are simultaneously activated by action observation and execution, motor performance is influenced by observation and vice versa. To understand the neural dynamics underlying this influence and to measure how variations in brain activity impact the precise kinematics of motor behaviour, we coupled kinematic and electrophysiological recordings of participants while they performed and observed congruent or non-congruent actions or during action execution alone. We found that the movement velocities and trajectory deviations of the executed actions increased during the observation of congruent actions compared to the observation of non-congruent actions or action execution alone. This facilitation was also discernible in the participants' motor-related potentials, which were transiently more negative in the congruent condition around the onset of the executed movement, which occurred 300 ms after the onset of the observed movement. This facilitation seemed to depend not only on spatial congruency but also on the optimal temporal relationship of the observation and execution events.

  5. [Memory and the executive functions].

    Science.gov (United States)

    Tirapu-Ustárroz, J; Muñoz-Céspedes, J M

    The terms 'executive functioning' or 'executive control' refer to a set of mechanisms involved in the optimization of cognitive processes to guide them towards the resolution of complex problems. Both the frontal lobes, acting as structure, and the executive processes, acting as function, work with memory contents, operating on information located in the diencephalic structures and in the medial temporal lobe. Generally, we can state that many studies find an association between frontal damage and specific memory deficits such as working memory deficit, metamemory problems, source amnesia, or difficulties with prospective memory. This paper is a critical review of the working memory concept and proposes a new term: the attentional operative system that works with memory contents. Concerning metamemory, the frontal lobes are essential for monitoring processes in general and for 'feeling of knowing' judgements in particular. Patients with prefrontal damage show serious problems remembering the source of information: the information itself is correctly remembered, but the spatio-temporal context in which it was learned has been forgotten. Finally, prospective memory deals with remembering to do something at a particular moment in the future and carrying out the plan previously drawn up.

  6. Executive Functions in Youth With Spastic Cerebral Palsy

    NARCIS (Netherlands)

    Pirila, Silja; van der Meere, Jaap J.; Rantanen, Kati; Jokiluoma, Maria; Eriksson, Kai

    Depending on the criteria used, between 35% and 53% of the participants with cerebral palsy fulfilled the criteria for clinically relevant executive function problems as defined by Conners' (1994) Continuous Performance Test. Executive function problems were noticed mainly in participants with bilateral

  7. 78 FR 44563 - Senior Executive Service (SES) Performance Review Board

    Science.gov (United States)

    2013-07-24

    ... FEDERAL LABOR RELATIONS AUTHORITY Senior Executive Service (SES) Performance Review Board AGENCY... Management, one or more PRBs. The PRB shall review and evaluate the initial appraisal of a senior executive's performance by the supervisor, along with any response by the senior executive, and make recommendations to...

  8. 77 FR 51523 - Senior Executive Service Performance Review Board Membership

    Science.gov (United States)

    2012-08-24

    ... COUNCIL OF THE INSPECTORS GENERAL ON INTEGRITY AND EFFICIENCY Senior Executive Service Performance... required to establish one or more Senior Executive Service (SES) performance review boards. The purpose of these boards is to review and evaluate the initial appraisal of a senior executive's performance by the...

  9. ADAMS executive and operating system

    Science.gov (United States)

    Pittman, W. D.

    1981-01-01

    The ADAMS Executive and Operating System is described: a multitasking environment under which a variety of data reduction, display and utility programs are executed, and which provides a high level of isolation between programs, allowing them to be developed and modified independently. The Airborne Data Analysis/Monitor System (ADAMS) was developed to provide a real-time data monitoring and analysis capability onboard Boeing commercial airplanes during flight testing. It inputs sensor data, computes airplane performance data by applying transforms to the collected sensor data, and presents this data to test personnel via various display media. Current utilization and future development are addressed.

  10. The Executive as Integrator.

    Science.gov (United States)

    Cohn, Hans M.

    1983-01-01

    Argues that although the executive has many tasks, he or she must view internal organizational integration as a primary task, making use of organizational charts, job descriptions, statements of goals and objectives, evaluations, and feedback devices. (RH)

  11. 29 CFR 541.402 - Executive and administrative computer employees.

    Science.gov (United States)

    2010-07-01

    ... LABOR REGULATIONS DEFINING AND DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer...

  12. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  13. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree also brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  14. Optimization and parallelization of the thermal–hydraulic subchannel code CTF for high-fidelity multi-physics applications

    International Nuclear Information System (INIS)

    Salko, Robert K.; Schmidt, Rodney C.; Avramova, Maria N.

    2015-01-01

    Highlights: • COBRA-TF was adopted by the Consortium for Advanced Simulation of LWRs. • We have improved code performance to support running large-scale LWR simulations. • Code optimization has led to reductions in execution time and memory usage. • An MPI parallelization has reduced full-core simulation time from days to minutes. - Abstract: This paper describes major improvements to the computational infrastructure of the CTF subchannel code so that full-core, pincell-resolved (i.e., one computational subchannel per real bundle flow channel) simulations can now be performed in much shorter run-times, either in stand-alone mode or as part of coupled-code multi-physics calculations. These improvements support the goals of the Department Of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL) Energy Innovation Hub to develop high fidelity multi-physics simulation tools for nuclear energy design and analysis. A set of serial code optimizations—including fixing computational inefficiencies, optimizing the numerical approach, and making smarter data storage choices—are first described and shown to reduce both execution time and memory usage by about a factor of ten. Next, a “single program multiple data” parallelization strategy targeting distributed memory “multiple instruction multiple data” platforms utilizing domain decomposition is presented. In this approach, data communication between processors is accomplished by inserting standard Message-Passing Interface (MPI) calls at strategic points in the code. The domain decomposition approach implemented assigns one MPI process to each fuel assembly, with each domain being represented by its own CTF input file. The creation of CTF input files, both for serial and parallel runs, is also fully automated through use of a pressurized water reactor (PWR) pre-processor utility that uses a greatly simplified set of user input compared with the traditional CTF input. To run CTF in
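The one-MPI-rank-per-assembly decomposition described above implies a fixed communication pattern: each rank exchanges crossflow boundary data only with the ranks owning laterally adjacent assemblies. A toy sketch of deriving that pattern from a core map (plain Python standing in for the MPI bookkeeping; `neighbour_ranks` is a hypothetical helper, not CTF code):

```python
# Sketch (not CTF's actual code) of the communication pattern implied by a
# one-rank-per-assembly domain decomposition: each rank must exchange
# crossflow boundary data with its lateral (N/S/E/W) neighbours in the
# core map. In the real code these exchanges are MPI send/receive calls.

def neighbour_ranks(core_map, rank):
    """core_map: 2-D list of ranks (None = no assembly at that position).
    Returns the set of ranks laterally adjacent to `rank`."""
    rows, cols = len(core_map), len(core_map[0])
    out = set()
    for r in range(rows):
        for c in range(cols):
            if core_map[r][c] != rank:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    n = core_map[rr][cc]
                    if n is not None and n != rank:
                        out.add(n)
    return out
```

For a 2x2 core map `[[0, 1], [2, 3]]`, rank 0 exchanges data with ranks 1 and 2 only.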

  15. Experimental validation for combustion analysis of GOTHIC 6.1b code in 2-dimensional premixed combustion experiments

    International Nuclear Information System (INIS)

    Lee, J. Y.; Lee, J. J.; Park, K. C.

    2003-01-01

In this study, the prediction capability of the GOTHIC code for hydrogen combustion phenomena was validated against the results of a two-dimensional premixed hydrogen combustion experiment performed by Seoul National University. The experimental results confirmed the propagation characteristics of the hydrogen flame, such as the buoyancy effect and the flame front shape. The combustion time of the tests was about 0.1 sec. In the GOTHIC analyses, the code could predict the overall hydrogen flame propagation characteristics, but the buoyancy effect and flame shape did not compare well with the experimental results. In particular, when the flame propagated toward the dead-end, GOTHIC predicted that the flame was not affected by the flow, which caused quite different flame propagation results from the experiment. Moreover, the combustion time in the analyses was about 1 sec, ten times longer than the experimental result. To obtain more reasonable analysis results, the combustion model parameters in the GOTHIC code must be applied appropriately and the hydrogen flame characteristics be reflected in solving the governing equations

  16. A hardware-oriented concurrent TZ search algorithm for High-Efficiency Video Coding

    Science.gov (United States)

    Doan, Nghia; Kim, Tae Sung; Rhee, Chae Eun; Lee, Hyuk-Jae

    2017-12-01

    High-Efficiency Video Coding (HEVC) is the latest video coding standard, in which the compression performance is double that of its predecessor, the H.264/AVC standard, while the video quality remains unchanged. In HEVC, the test zone (TZ) search algorithm is widely used for integer motion estimation because it effectively searches the good-quality motion vector with a relatively small amount of computation. However, the complex computation structure of the TZ search algorithm makes it difficult to implement it in the hardware. This paper proposes a new integer motion estimation algorithm which is designed for hardware execution by modifying the conventional TZ search to allow parallel motion estimations of all prediction unit (PU) partitions. The algorithm consists of the three phases of zonal, raster, and refinement searches. At the beginning of each phase, the algorithm obtains the search points required by the original TZ search for all PU partitions in a coding unit (CU). Then, all redundant search points are removed prior to the estimation of the motion costs, and the best search points are then selected for all PUs. Compared to the conventional TZ search algorithm, experimental results show that the proposed algorithm significantly decreases the Bjøntegaard Delta bitrate (BD-BR) by 0.84%, and it also reduces the computational complexity by 54.54%.
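The redundant-point elimination described above can be sketched as follows. This is a simplification: the cost function here depends only on the search point, whereas a real motion cost also depends on each PU's pixels, so treat it as an illustration of the shared-evaluation idea rather than the proposed hardware algorithm:

```python
# Minimal sketch (assumed cost model, not the paper's hardware design) of
# the shared-search-point idea: collect the candidate motion-vector points
# of every PU partition in a CU, evaluate each unique point once, then
# pick the best point per PU, instead of re-costing overlapping points.

def concurrent_search(pu_points, cost):
    """pu_points: {pu_id: [candidate (dx, dy) points]}.
    cost(point) -> motion cost. Returns {pu_id: best point}."""
    unique = {p for pts in pu_points.values() for p in pts}
    cache = {p: cost(p) for p in unique}          # each point costed once
    return {pu: min(pts, key=lambda p: cache[p])
            for pu, pts in pu_points.items()}
```

With overlapping candidate lists, the cached evaluation is what saves work relative to running an independent TZ search per PU.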

  17. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  18. Implementing an Executive-Function Syllabus: Operational Issues

    Directory of Open Access Journals (Sweden)

    Russell Jay Hendel

    2016-08-01

Full Text Available A recent approach to pedagogic challenge, in contrast to the hierarchy approaches of Bloom, Anderson, Gagne, Van Hiele, Marzano, Webb and many others, identifies pedagogic challenge with executive function: pedagogy is defined as challenging if it addresses executive function. Executive function, in turn, is defined by the presence of multiple modalities of topic approach and a multi-parameter development of the topic. This paper discusses operational issues in implementing a teaching methodology based on multi-parameter problems. It advocates teaching a multi-parameter topic using a step-by-step incremental approach, introducing one parameter at a time. Examples are presented from trigonometry, actuarial mathematics, statistics and (biblical) literary analysis. The paper also discusses the use of the incremental approach for problem creation and remediation.

  19. DECOVALEX - Mathematical models of coupled T-H-M processes for nuclear waste repositories. Executive summary for Phases I,II and III

    International Nuclear Information System (INIS)

    Jing, L.; Stephansson, O.; Tsang, C.F.; Kautsky, F.

    1996-06-01

    This executive summary presents the motivation, structure, objectives, methodologies and results of the first stage of the international DECOVALEX project - DECOVALEX I (1992-1995). The acronym stands for Development of Coupled Models and their Validation against Experiment in Nuclear Waste Isolation, and the project is an international effort to develop mathematical models, numerical methods and computer codes for coupled thermo-hydro-mechanical processes in fractured rocks and buffer materials for geological isolation of spent nuclear fuel and other radioactive wastes, and validate them against laboratory and field experiments. 24 refs

  20. Amino acid codes in mitochondria as possible clues to primitive codes

    Science.gov (United States)

    Jukes, T. H.

    1981-01-01

Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, presumably caused by a mutation in a tRNA gene.
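The CUN reassignment mentioned above can be written out as data: in the universal code all four CUN codons encode leucine, while in yeast mitochondria (NCBI translation table 3) they encode threonine.

```python
# The CUN family box under the universal code vs. the yeast mitochondrial
# code (NCBI translation table 3): the same four codons, reassigned from
# leucine to threonine.

UNIVERSAL = {"CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu"}
YEAST_MITO = {codon: "Thr" for codon in UNIVERSAL}   # the reassigned box

def translate(codon, table):
    """Look up one codon in the given (partial) genetic-code table."""
    return table[codon]
```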

  1. Hot and cold executive functions in youth with psychotic symptoms.

    Science.gov (United States)

    MacKenzie, L E; Patterson, V C; Zwicker, A; Drobinin, V; Fisher, H L; Abidi, S; Greve, A N; Bagnell, A; Propper, L; Alda, M; Pavlova, B; Uher, R

    2017-12-01

    Psychotic symptoms are common in children and adolescents and may be early manifestations of liability to severe mental illness (SMI), including schizophrenia. SMI and psychotic symptoms are associated with impairment in executive functions. However, previous studies have not differentiated between 'cold' and 'hot' executive functions. We hypothesized that the propensity for psychotic symptoms is specifically associated with impairment in 'hot' executive functions, such as decision-making in the context of uncertain rewards and losses. In a cohort of 156 youth (mean age 12.5, range 7-24 years) enriched for familial risk of SMI, we measured cold and hot executive functions with the spatial working memory (SWM) task (total errors) and the Cambridge Gambling Task (decision-making), respectively. We assessed psychotic symptoms using the semi-structured Kiddie Schedule for Affective Disorders and Schizophrenia interview, Structured Interview for Prodromal Syndromes, Funny Feelings, and Schizophrenia Proneness Instrument - Child and Youth version. In total 69 (44.23%) youth reported psychotic symptoms on one or more assessments. Cold executive functioning, indexed with SWM errors, was not significantly related to psychotic symptoms [odds ratio (OR) 1.36, 95% confidence interval (CI) 0.85-2.17, p = 0.204). Poor hot executive functioning, indexed as decision-making score, was associated with psychotic symptoms after adjustment for age, sex and familial clustering (OR 2.37, 95% CI 1.25-4.50, p = 0.008). The association between worse hot executive functions and psychotic symptoms remained significant in sensitivity analyses controlling for general cognitive ability and cold executive functions. Impaired hot executive functions may be an indicator of risk and a target for pre-emptive early interventions in youth.

  2. Healthcare. Executive Summary

    Science.gov (United States)

    Carnevale, Anthony P.; Smith, Nicole; Gulish, Artem; Beach, Bennett H.

    2012-01-01

    This executive summary highlights several findings about healthcare. These are: (1) Healthcare is 18 percent of the U.S. economy, twice as high as in other countries; (2) There are two labor markets in healthcare: high-skill, high-wage professional and technical jobs and low-skill, low-wage support jobs; (3) Demand for postsecondary education in…

  3. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g., factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  4. Prefrontal cortex executive processes affected by stress in health and disease.

    Science.gov (United States)

    Girotti, Milena; Adler, Samantha M; Bulin, Sarah E; Fucich, Elizabeth A; Paredes, Denisse; Morilak, David A

    2017-07-06

    Prefrontal cortical executive functions comprise a number of cognitive capabilities necessary for goal directed behavior and adaptation to a changing environment. Executive dysfunction that leads to maladaptive behavior and is a symptom of psychiatric pathology can be instigated or exacerbated by stress. In this review we survey research addressing the impact of stress on executive function, with specific focus on working memory, attention, response inhibition, and cognitive flexibility. We then consider the neurochemical pathways underlying these cognitive capabilities and, where known, how stress alters them. Finally, we review work exploring potential pharmacological and non-pharmacological approaches that can ameliorate deficits in executive function. Both preclinical and clinical literature indicates that chronic stress negatively affects executive function. Although some of the circuitry and neurochemical processes underlying executive function have been characterized, a great deal is still unknown regarding how stress affects these processes. Additional work focusing on this question is needed in order to make progress on developing interventions that ameliorate executive dysfunction. Published by Elsevier Inc.

  5. Aqueous Transport Code Revisions Using Geographic Information Systems

    International Nuclear Information System (INIS)

    Chen, K.F.

    2003-01-01

STREAM II, developed at the Savannah River Site (SRS) for execution on a personal computer, is an emergency response code that predicts downstream pollutant concentrations for releases from the SRS area to the Savannah River for emergency response management decision making. The STREAM II code consists of pre-processor, calculation, and post-processor modules. The pre-processor module provides a graphical user interface (GUI) for inputting the initial release data. The GUI passes the user-specified data to the calculation module, which calculates the pollutant concentrations at downstream locations and the transport times. The calculation module of STREAM II adopts the transport module of the WASP5 code. WASP5 is a US Environmental Protection Agency water quality analysis program that simulates pollutant transport and fate through surface water using a finite difference method to solve the transport equation. The calculated downstream pollutant concentrations and travel times are passed to the post-processor for display on the computer screen in graphical and tabular forms. To minimize the user's effort in an emergency situation, the required input parameters are limited to the time and date of release, type of release, location of release, amount and duration of release, and the calculation units. The user, however, could only select one of seventeen predetermined locations. Hence, STREAM II could not be used for situations in which the release location differs from the seventeen predetermined locations. To eliminate this limitation, STREAM II has been revised to allow users to select the release location anywhere along the specified SRS main streams or the Savannah River by mouse-selection from a map displayed on the computer monitor. The required modifications to STREAM II using geographic information systems (GIS) software are discussed in this paper
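As a rough illustration of the finite-difference transport calculation that WASP5 performs (this upwind scheme is a textbook sketch, not WASP5's actual discretization), a 1-D advection step moves a pollutant pulse downstream at the flow velocity:

```python
# Minimal 1-D upwind finite-difference advection sketch (illustrative only):
# dc/dt + u * dc/dx = 0 moves a pollutant concentration profile downstream
# at velocity u while conserving total mass.

def advect(c, u, dx, dt, steps):
    """Advance concentrations c by `steps` upwind time steps."""
    assert u * dt / dx <= 1.0          # CFL stability condition
    c = list(c)
    for _ in range(steps):
        cn = c[:]
        for i in range(1, len(c)):
            cn[i] = c[i] - u * dt / dx * (c[i] - c[i - 1])
        cn[0] = 0.0                     # clean water at the inflow boundary
        c = cn
    return c
```

With a Courant number of exactly 1 the scheme is exact, so a unit pulse simply shifts one cell downstream per step.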

  6. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    needed for repository modeling are severely lacking. In addition, most of existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  7. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    needed for repository modeling are severely lacking. In addition, most of existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  8. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  9. Validating Avionics Conceptual Architectures with Executable Specifications

    Directory of Open Access Journals (Sweden)

    Nils Fischer

    2012-08-01

Full Text Available Current avionics systems specifications, developed after conceptual design, have a high degree of uncertainty. Since specifications are not sufficiently validated in the early development process and no executable specification exists at aircraft level, system designers cannot evaluate the impact of their design decisions at aircraft or aircraft application level. At the end of the development process of complex systems, e.g. aircraft, an average of about 65 per cent of all specifications have to be changed because they are incorrect, incomplete or too vaguely described. In this paper, a model-based design methodology together with a virtual test environment is described that makes complex high-level system specifications executable and testable during the very early levels of system design. An aircraft communication system and its system context is developed to demonstrate the proposed early validation methodology. Executable specifications for early conceptual system architectures enable system designers to couple functions, architecture elements, resources and performance parameters, often called non-functional parameters. An integrated executable specification at Early Conceptual Architecture Level is developed and used to determine the impact of different system architecture decisions on system behavior and overall performance.

  10. EXECUTIVE STOCK OPTION EXERCISING BEHAVIOR: EVIDENCE FROM BURSA MALAYSIA

    OpenAIRE

    Ahmad Ibn Ibrahimy; Dr. Rubi Ahmad

    2012-01-01

In spite of the fact that shareholders exercise their options when they are in the money, separation of ownership and control creates the necessity of verifying the adoption of corporate governance mechanisms such as the Executive Stock Option Scheme (ESOs). This study examines the relationship between Executive Stock Option (ESO) exercises and firm performance to explore the exercise patterns of Malaysian executives and whether there are any differences in option trading. We found a significant positive ...

  11. Design, experiments and Relap5 code calculations for the perseo facility

    International Nuclear Information System (INIS)

    Ferri, Roberta; Achilli, Andrea; Cattadori, Gustavo; Bianchi, Fosco; Meloni, Paride

    2005-01-01

Research on innovative safety systems for light water reactors, addressed to heat removal by in-pool immersed heat exchangers, led to the design, build-up and testing of the PERSEO facility at SIET laboratories. The research started with the CEA-ENEA proposal of improving the GE-SBWR isolation condenser system by moving the triggering valve from the high-pressure primary side of the reactor to the low-pressure pool side. A new configuration of the system was defined, with the heat exchanger contained in a small pool connected at the bottom and top to a large water reservoir pool, the triggering valve being located on the pool bottom connecting pipe. ENEA funded the whole activity, which included the definition and build-up of a new heat exchanger pool at SIET laboratories, on the basis of the already existing PANTHERS IC-PCC facility and the new plant requirements. The heat exchanger connections to the pressure vessel were maintained. An experimental campaign was executed at full scale and full thermal-hydraulic conditions to investigate the behaviour and performance of the plant in steady and unsteady conditions. The Relap5 code was utilised during all phases of the research: for defining the heat exchanger pool dimensions and for pre-test and post-test analyses. The Cathare code was also applied for pre-test and post-test analyses. This paper deals with the experimental and calculated results, limited to the Relap5 code

  12. Executive summary

    International Nuclear Information System (INIS)

    1981-02-01

    This paper is an 'executive summary' of work undertaken to review proposals for transport, handling and emplacement of high level radioactive wastes in an underground repository, appropriate to the U.K. context, with particular reference to: waste block size and configuration; self-shielded or partially-shielded block; stages of disposal; transportation within the repository; emplacement in vertical holes or horizontal tunnels; repository access by adit, incline or shaft; and costs. The paper contains a section on general conclusions and recommendations. (U.K.)

  13. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

Full Text Available Abstract Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they generally escape detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements.
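A much-simplified version of the entropy score used above can be computed directly from a codon alignment. The paper works with the posterior distribution of entropy under a codon-substitution null model; the sketch below only computes the plain empirical entropy of each codon column, where unusually low values hint at extra constraint:

```python
# Empirical per-codon-column entropy of an alignment (a simplification of
# the paper's posterior entropy score, for illustration only).
import math
from collections import Counter

def codon_columns(seqs):
    """Split aligned, equal-length coding sequences into codon columns."""
    n = len(seqs[0])
    return [[s[i:i + 3] for s in seqs] for i in range(0, n, 3)]

def entropy(column):
    """Shannon entropy (bits) of the codons observed in one column."""
    counts = Counter(column)
    total = len(column)
    return -sum((k / total) * math.log2(k / total) for k in counts.values())
```

A perfectly conserved column scores 0 bits; a column where every sequence shows a different codon scores log2 of the number of sequences.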

  14. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.

  15. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points E PC with multiplicity γ(w), where w is the weight of

  16. Fast resolution of the neutron diffusion equation through public domain Ode codes

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, V.M.; Vidal, V.; Garayoa, J. [Universidad Politecnica de Valencia, Departamento de Sistemas Informaticos, Valencia (Spain); Verdu, G. [Universidad Politecnica de Valencia, Departamento de Ingenieria Quimica y Nuclear, Valencia (Spain); Gomez, R. [I.E.S. de Tavernes Blanques, Valencia (Spain)

    2003-07-01

    The time-dependent neutron diffusion equation is a partial differential equation with source terms. The resolution method usually includes discretizing the spatial domain, obtaining a large system of linear, stiff ordinary differential equations (ODEs), whose resolution is computationally very expensive. Some standard techniques use a fixed time step to solve the ODE system. This can result in errors (if the time step is too large) or in long computing times (if the time step is too small). To speed up the resolution method, two well-known public domain codes have been selected: DASPK and FCVODE, which are powerful codes for the resolution of large systems of stiff ODEs. These codes can estimate the error after each time step and, depending on this estimate, select the next time step and, possibly, the integration method to be used in the next step. With these mechanisms, it is possible to keep the overall error below the chosen tolerances and, when the system behaves smoothly, to take large time steps, increasing the execution speed. In this paper we address the use of the public domain codes DASPK and FCVODE for the resolution of the time-dependent neutron diffusion equation. The efficiency of these codes depends largely on the preconditioning of the large systems of linear equations that must be solved. Several preconditioners have been programmed and tested; the multigrid method was found to be the best of those tested. Also, DASPK performed better than FCVODE, being more robust for our problem. We can conclude that the use of specialized codes for solving large systems of ODEs can drastically reduce the computational work needed for the solution; combined with appropriate preconditioners, the reduction is even greater. This approach has other crucial advantages, since it allows the user to specify the allowed error, which cannot be done in fixed-step implementations; this, of course
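    The step-size control described above can be illustrated in miniature. The sketch below is not DASPK or FCVODE (those use implicit BDF methods with Newton iteration and sophisticated order/step selection); it is a minimal adaptive backward-Euler integrator with step-doubling error estimation, applied to an assumed stiff test ODE y' = -1000(y - sin t) + cos t, whose exact solution is y = sin t. The mechanism is the same in spirit: estimate the local error after each step, reject and shrink the step when the estimate exceeds the tolerance, and grow the step when the system behaves smoothly.

    ```python
    import math

    def be_step(t, y, h):
        # One backward-Euler step for y' = -1000*(y - sin(t)) + cos(t).
        # The implicit equation is linear in y, so it can be solved exactly.
        t1 = t + h
        return (y + h * (1000.0 * math.sin(t1) + math.cos(t1))) / (1.0 + 1000.0 * h)

    def integrate(t0, y0, t_end, tol=1e-6, h=1e-4):
        """Adaptive integration: local error is estimated by step doubling
        (one full step vs. two half steps), and h is adjusted to keep the
        estimate below tol, as adaptive stiff-ODE codes do in principle."""
        t, y = t0, y0
        accepted = 0
        while t < t_end:
            h = min(h, t_end - t)                # do not overshoot the end time
            y_full = be_step(t, y, h)            # one full step of size h
            y_half = be_step(t + h / 2.0,
                             be_step(t, y, h / 2.0), h / 2.0)  # two half steps
            err = abs(y_half - y_full)           # local error estimate
            if err <= tol:
                t += h
                y = y_half                       # keep the more accurate value
                accepted += 1
            # Grow or shrink the step; exponent 1/2 matches an order-1 method
            # (local error ~ h^2), with a 0.8 safety factor and growth caps.
            h *= min(5.0, max(0.1, 0.8 * math.sqrt(tol / max(err, 1e-16))))
        return y, accepted
    ```

    For example, `integrate(0.0, 0.0, 1.0)` returns a value close to sin(1) while taking far fewer (and variable-size) steps than a fixed-step method would need for the same accuracy; an explicit fixed-step method at this stiffness would need h below about 2e-3 merely to remain stable.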

  17. Executive cognitive impairment detected by simple bedside testing ...

    African Journals Online (AJOL)

    Aims. Cognitive impairment in people with type 2 diabetes is a barrier to successful disease management. We sought to determine whether impaired executive function as detected by a battery of simple bedside cognitive tests of executive function was associated with inadequate glycaemic control. Methods. People with ...

  18. What's next? : operational support for business process execution

    NARCIS (Netherlands)

    Schonenberg, M.H.

    2012-01-01

    In the last decade, flexibility has become increasingly important in the area of business process management. Information systems that support the execution of the process are required to work in a dynamic environment that imposes changing demands on the execution of the process. In academia and

  19. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished for the establishment of a self-reliable, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant application. The education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications

  20. 21 CFR 1305.12 - Procedure for executing DEA Forms 222.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Procedure for executing DEA Forms 222. 1305.12... I AND II CONTROLLED SUBSTANCES DEA Form 222 § 1305.12 Procedure for executing DEA Forms 222. (a) A purchaser must prepare and execute a DEA Form 222 simultaneously in triplicate by means of interleaved...