Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
Real-time, model-based deduction has recently emerged as a vital component in AI's toolbox for developing highly autonomous reactive systems. Yet one of the current hurdles to developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden-state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief-state update and control sequence generation.
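The belief-state update this abstract refers to can be illustrated with a minimal sketch. This is not the PHCA encoding or Reactive Burton's algorithm; the two-state model, its probabilities, and all names below are illustrative assumptions.

```python
# Hypothetical sketch: one step of belief-state update over a hidden-state
# Markov model, the kind of computation a model-based executive performs.
# The two-state model (nominal/failed) and all numbers are made up.

def belief_update(belief, transition, likelihood):
    """One Bayesian update step: predict through the transition model,
    then condition on an observation and renormalize."""
    n = len(belief)
    # Prediction: push the belief through the transition model.
    predicted = [sum(belief[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    # Correction: weight by the observation likelihood and renormalize.
    unnorm = [predicted[j] * likelihood[j] for j in range(n)]
    z = sum(unnorm)
    return [p / z for p in unnorm]

# States: 0 = nominal, 1 = failed (illustrative).
T = [[0.99, 0.01],
     [0.00, 1.00]]
obs_likelihood = [0.2, 0.9]   # P(observed symptom | state), assumed
belief = belief_update([1.0, 0.0], T, obs_likelihood)
```

Even though the prior strongly favors the nominal state, a symptom that is far more likely under failure shifts noticeable mass toward the failed state after a single update.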
Full Text Available The system of budget planning, programming, development and execution of the Ministry of Defence of the Republic of Croatia (henceforth referred to by its Croatian acronym, SPPIIP) is the basic system for the strategic management of defence resources, through which an effective and rational distribution of available resources is conducted, based on the goals of national security of the Republic of Croatia. This system sets the principles of transparency and democratic management of defence resources while respecting the specificities of the defence system. The SPPIIP allows for decision making based on complete information about alternatives and the choice of the most economical and most efficient way to reach the goal. It unites the strategic plan, program and budget. It consists of four continuous, independent and interconnected phases: planning, programming, development and the execution of the budget. The processes of the phases are dynamic and cyclic. In addition to the SPPIIP, the Defence Resources Management Model (DRMM; Croatian acronym: MURO) has also been developed. This is an analytic tool which serves as a decision support system in the SPPIIP. The DRMM is a complex computer model showing graph and tabular overviews over a multi-year period. The model examines three areas: the strength of the forces, expenses and defence programs. The purpose of the model is cost and strength analysis and the analysis of compromise and feasibility, i.e. how sensitive the programs are to fiscal movements in the sphere of the MoD budget in the course of a multi-year cycle, until a certain project ends. The analysis results are an easily understandable basis for decision making. The SPPIIP and the DRMM are mutually independent systems, but they complement each other well. The SPPIIP uses the DRMM in designing and resource allocation based on the goals set. The quality of the DRMM depends on the amount and quality of data in its database. The DRMM can be used as a basis for
execution to halt immediately, leading to conservatism. 5.2 Searching for optimal, risk-bounded cRMPL programs Because cRMPL supports programs that...the timing requirements with probabilistic guarantees without undue conservatism. 6.1 Problem Statement In field deployment on critical missions, the...the space of potential solution policies in domains with non-destructive constraint violations, leading to conservatism. A CC-POMDP formulation, on
In previous work we developed a framework of computational models for the concurrent execution of functions on different levels of abstraction. It shows that the traditional sequential execution of functions is just one possible implementation of an abstract computational model that allows for the concurrent execution of functions. We use this framework as a basis for the development of abstract computational models that allow for the concurrent execution of objects.
Van Gorp, P.M.E.; Eshuis, H.; Petriu, D.C.; Rouquette, N.
In the business process management community, transformations for process models are usually programmed using imperative languages (such as Java). The underlying mapping rules tend to be documented using informal visual rules whereas they tend to be formalized using mathematical set constructs. In
Van Gorp, P.M.E.; Eshuis, H.
In the business process management community, transformations for process models are usually programmed using imperative languages. The underlying mapping rules tend to be documented using informal visual rules whereas they tend to be formalized using mathematical set constructs. In the Graph and
Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav
Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
Full Text Available Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell’s behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modelling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
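The "executable model" idea above can be made concrete with a toy Boolean network: a program whose synchronous execution either reproduces a discretized expression time series or does not. Synthesis, as surveyed, would search over candidate update rules until such a check passes. The network, its rules, and the data below are all invented for illustration; they are not from the surveyed work.

```python
# Illustrative sketch (not one of the surveyed tools): a tiny three-gene Boolean
# network executed synchronously, then checked against a hypothetical
# discretized gene-expression time series.

def step(state):
    """One synchronous update of all three genes (rules are assumptions)."""
    a, b, c = state
    return (not c,        # a is repressed by c
            a,            # b copies a with one step of delay
            a and b)      # c requires both a and b

def trajectory(state, n):
    """Execute the model for n steps, recording every state."""
    states = [state]
    for _ in range(n):
        state = step(state)
        states.append(state)
    return states

# Hypothetical discretized expression data the model should reproduce.
observed = [(True, False, False), (True, True, False),
            (True, True, True), (False, True, True)]
assert trajectory((True, False, False), 3) == observed
```

If the check fails for some candidate rule set, that rule set is rejected; enumerating or constraint-solving over rule sets is the essence of synthesis from data.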
John K Tsotsos
Full Text Available What are the computational tasks that an executive controller for visual attention must solve? This question is posed in the context of the Selective Tuning model of attention. The range of required computations goes beyond top-down bias signals or region-of-interest determinations, and must deal with overt and covert fixations, process timing and synchronization, information routing, memory, matching control to task, spatial localization, priming, and coordination of bottom-up with top-down information. During task execution, results must be monitored to ensure they match expectations. This description includes the kinds of elements that are common in the control of any kind of complex machine or system. We seek a mechanistic integration of the above, in other words, algorithms that accomplish control. Such algorithms operate on representations, transforming a representation of one kind into another, which then forms the input to yet another algorithm. Cognitive Programs (CPs) are hypothesized to capture exactly such representational transformations via stepwise sequences of operations. CPs, an updated and modernized offspring of Ullman's Visual Routines, impose an algorithmic structure on the set of attentional functions and play a role in the overall shaping of attentional modulation of the visual system so that it provides its best performance. This requires that we consider the visual system as a dynamic, yet general-purpose processor tuned to the task and input of the moment. This differs dramatically from the almost universal cognitive and computational views, which regard vision as a passively observing module to which simple questions about percepts can be posed, regardless of task. Differing from Visual Routines, CPs explicitly involve the critical elements of Visual Task Executive, Visual Attention Executive, and Visual Working Memory. Cognitive Programs provide the software that directs the actions of the Selective Tuning model of visual
Research purpose: The purpose of this article is to address the training and development needs of these consulting psychologists by presenting a competence executive coaching model for the planning, implementation and evaluation of executive coaching interventions. Research design, approach and method: The study was conducted while one of the authors was involved in teaching doctoral students in consulting psychology and executive coaching, specifically in the USA. The approach involved a literature review of executive coaching models and a qualitative study using focus groups to develop and evaluate the competence executive coaching model. Main findings: The literature review provided scant evidence of competence executive coaching models and there seems to be a specific need for this in the training of coaches in South Africa. Hence the model that was developed is an attempt to provide trainers with a structured model for the training of coaches. Contribution/value-add: The uniqueness of this competence model is not only described in terms of the six distinct coaching intervention phases, but also the competencies required in each.
Barrera, Mark A.; Karriker, Timothy W.
MBA Professional Report The purpose of this MBA project was to review the current Masters of Executive Management education curriculum at NPS. An internal analysis of the current program was conducted to fully understand the strategic goals of the program and the existing curriculum. An environmental scan of current and potential military customers was conducted to assess requirements for junior executive education and determine whether the MEM program corresponds with these requiremen...
Luckow, Kasper Søe; Păsăreanu, Corina S.; Dwyer, Matthew B.
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also... Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
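The quantitative objective described here, resolving nondeterminism so as to maximize the probability of reaching a target, has a standard dynamic-programming core over Markov decision processes. The sketch below shows that core with value iteration on a made-up five-state MDP; it is not the paper's symbolic technique, and every state, action, and probability is an assumption.

```python
# Sketch: maximum reachability probability in an MDP via value iteration.
# The optimal scheduler picks, in each state, the action achieving the max.
# The toy MDP below is invented for illustration.

def max_reach_prob(states, actions, trans, target, iters=1000):
    """Iterate V(s) = max_a sum_{s'} P(s'|s,a) * V(s') to a fixed point,
    with V(s) = 1 on target states."""
    v = {s: (1.0 if s in target else 0.0) for s in states}
    for _ in range(iters):
        for s in states:
            if s in target:
                continue
            v[s] = max(sum(p * v[t] for t, p in trans[(s, a)])
                       for a in actions[s])
    return v

states = ["init", "risky", "safe", "goal", "fail"]
actions = {"init": ["a", "b"], "risky": ["go"], "safe": ["go"], "fail": ["stay"]}
trans = {
    ("init", "a"):    [("risky", 1.0)],
    ("init", "b"):    [("safe", 1.0)],
    ("risky", "go"):  [("goal", 0.9), ("fail", 0.1)],
    ("safe", "go"):   [("goal", 0.5), ("fail", 0.5)],
    ("fail", "stay"): [("fail", 1.0)],
}
v = max_reach_prob(states, actions, trans, {"goal"})
# Scheduler choosing "a" in "init" attains reachability probability 0.9.
```

A scheduler synthesized from this fixed point simply records, per state, the maximizing action; the cited work does the analogous computation over program paths rather than an explicit state graph.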
The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is very well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities. Initial FSP research will focus on two critical areas: 1) the plasma edge and 2) whole device modeling including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable, but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions cannot be avoided, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to their needs. The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics and interactions with the plasma wall
Full Text Available The development of the SJ Framework for session-based distributed programming is part of recent and ongoing research into integrating session types and practical, real-world programming languages. SJ programs featuring session types (protocols) are statically checked by the SJ compiler to verify the key property of communication safety, meaning that parties engaged in a session only communicate messages, including higher-order communications via session delegation, that are compatible with the message types expected by the recipient. This paper presents current work on security aspects of the SJ Framework. Firstly, we discuss our implementation experience from improving the SJ Runtime platform with security measures to protect and augment communication safety at runtime. We implement a transport component for secure session execution that uses a modified TLS connection with authentication based on the Secure Remote Password (SRP) protocol. The key technical point is the delicate treatment of secure session delegation to counter a previous vulnerability. We find that the modular design of the SJ Runtime, based on the notion of an Abstract Transport for session communication, supports rapid extension to utilise additional transports whilst separating this concern from the application-level session programming task. In the second part of this abstract, we formally prove the target security properties by modelling the extended SJ delegation protocols in the pi-calculus.
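The communication-safety property described above rests on protocol duality: each send on one endpoint must be matched by a receive of the same message type on the other. The fragment below is a deliberately simplified sketch of that idea only; it is not SJ's type checker, and it does not model delegation, branching, or higher-order sessions.

```python
# Toy duality check behind session-type communication safety.
# Protocols are flat lists of (direction, message type) pairs:
# "!" = send, "?" = receive. All protocols below are invented examples.

def dual(protocol):
    """Flip each action: a send ('!') becomes a receive ('?') and vice versa."""
    flip = {"!": "?", "?": "!"}
    return [(flip[op], msg_type) for op, msg_type in protocol]

def compatible(client, server):
    """Two endpoints are compatible when their protocols are exact duals."""
    return dual(client) == server

client = [("!", "int"), ("!", "str"), ("?", "bool")]   # send int, send str, receive bool
server = [("?", "int"), ("?", "str"), ("!", "bool")]
assert compatible(client, server)
assert not compatible(client, client)   # an endpoint is not its own dual
```

A static checker performing this comparison at compile time guarantees that no run of the session can end in a type mismatch, which is the essence of the communication safety the SJ compiler verifies over its much richer session-type language.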
Full Text Available Executive Information Systems are designed to improve the quality of strategic-level management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence Tools. But in order to build analytic reports for Executive Information Systems (EIS) in an organization, we need to design a multidimensional model based on the business model of the organization. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.
Full Text Available Based on the two observations that diverse applications perform better on different multicore architectures, and that different phases of an application may have vastly different resource requirements, Pal et al. proposed a novel reconfigurable hardware approach for executing multithreaded programs. Instead of mapping a concurrent program to a fixed architecture, the architecture adaptively reconfigures itself to meet the application's concurrency and communication requirements, yielding significant improvements in performance. Based on our earlier abstract operational framework for multicore execution with hierarchical memory structures, we describe execution of multithreaded programs on reconfigurable architectures that support a variety of clustered configurations. Such reconfiguration may not preserve the semantics of programs due to the possible introduction of race conditions arising from concurrent accesses to shared memory by threads running on the different cores. We present an intuitive partial ordering notion on the cluster configurations, and show that the semantics of multithreaded programs is always preserved for reconfigurations "upward" in that ordering, whereas semantics preservation for arbitrary reconfigurations can be guaranteed for well-synchronised programs. We further show that a simple approximate notion of efficiency of execution on the different configurations can be obtained using the notion of amortised bisimulations, and extend it to dynamic reconfiguration.
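The "upward" ordering on cluster configurations mentioned above can be sketched concretely if a configuration is modelled as a partition of cores into clusters: moving upward then means merging clusters, never splitting them. This encoding is our illustrative assumption, not the paper's formal definition, and the semantics-preservation claim itself is only summarized here, not proved by this check.

```python
# Illustrative sketch of an "upward" ordering on clustered core configurations.
# A configuration is a partition of core ids into clusters (sets of ints).
# Config `before` is below `after` when every cluster of `before` lies
# entirely inside a single cluster of `after` (i.e., `after` only merges).

def is_upward(before, after):
    return all(any(cluster <= merged for merged in after)
               for cluster in before)

# Four cores: two 2-core clusters merged into one 4-core cluster (upward),
# versus splitting a cluster apart (not upward).
two_clusters = [{0, 1}, {2, 3}]
one_cluster = [{0, 1, 2, 3}]
split = [{0}, {1}, {2, 3}]

assert is_upward(two_clusters, one_cluster)
assert not is_upward(one_cluster, split)
```

Under the paper's result, reconfigurations in the direction `is_upward` accepts preserve program semantics for all programs, while the reverse direction is safe only for well-synchronised programs.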
Web Operating Systems can be seen as an extension of traditional Operating Systems where the addresses used to manage files and execute programs (via the basic load/execution mechanism) are extended from local filesystem path-names to URLs. A first consequence is that, as in traditional web technologies, executing a program at a given URL can be done in two modalities: either the execution is performed client-side at the invoking machine (and relative URL addressing in the executed...
Business process modeling for the Virginia Department of Transportation : a demonstration with the integrated six-year improvement program and the statewide transportation improvement program : executive summary.
This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...
Wham, Robert M.; Martin, Sherman
This Pu-238 Supply Program Project Execution Plan (PEP) summarizes critical information and processes necessary to manage the program. The PEP is the primary agreement regarding planning and objectives between the Department of Energy Office of Nuclear Energy (DOE NE-75), the Oak Ridge National Laboratory Site Office (OSO) and the Oak Ridge National Laboratory (ORNL). The acquisition executive (AE) will approve the PEP. The PEP is a living document that will be reviewed and revised periodically until the project is complete. The purpose of the project is to reestablish the capability to produce plutonium-238 (Pu-238) domestically. This capability consists primarily of procedures, processes, and design information, not capital assets. As such, the project is not subject to the requirements of DOE O 413.3B, but it will be managed using the project management principles and best practices defined there. It is likely that some capital assets will need to be acquired to complete tasks within the project. As these are identified, project controls and related processes will be updated as necessary. Because the project at its initiation was envisioned to require significant capital assets, Critical Decision 0 (CD-0) was conducted in accordance with DOE O 413.3B, and the mission need was approved on December 9, 2003, by William Magwood IV, director of the Office of Nuclear Energy (NE), Science and Technology, DOE. No date was provided for project start-up at that time. This PEP is consistent with the strategy described in the June 2010 report to Congress, Start-up Plan for Plutonium-238 Production for Radioisotope Power Systems.
Levine, D.S. [Univ. of Texas, Arlington, TX (United States)
Brain executive function is based in a distributed system whereby prefrontal cortex is interconnected with other cortical and subcortical loci. Executive function is divided roughly into three interacting parts: affective guidance of responses; linkage among working memory representations; and forming complex behavioral schemata. Neural network models of each of these parts are reviewed and fit into a preliminary theoretical framework.
Dubas, Khalid M.; Ghani, Waqar I.; Davis, Stanley; Strong, James T.
A study assessed the market orientation of the executive Master's in Business Administration (MBA) program at Saint Joseph's University (Pennsylvania) in terms of 12 skills and knowledge areas that reflect effective managerial performance and the student-executives' perceptions of program strengths and weaknesses in delivering these skills…
Loomis, G.; Osborne, D.; Ancho, M.
This report provides an executive summary of the Cryofracture demonstration program performed at Nuclear Remedial Technologies Corporation under contract to EG&G Idaho, Inc., for the Department of Energy (DOE). Cryofracture is a size-reducing process whereby objects are frozen to liquid nitrogen temperatures and crushed in a large hydraulic press. Materials at cryogenic temperatures have low ductility and are easily size reduced by fracturing. The main application being investigated for the DOE is retrieved buried and stored transuranic (TRU) waste. Six 55-gallon drums and six 2 ft x 2 ft x 8 ft boxes containing simulated waste with tracers were subjected to the Cryofracture process. Data were obtained on (a) cool-down time, (b) yield strength of the containers, (c) size distribution of the waste before and after the Cryofracture process, (d) volume reduction of the waste, and (e) sampling of air and surface dusts for spread of tracers to evaluate potential contamination spread. The Cryofracture process was compared to conventional shredders, and detailed cost estimates were established for construction of a Cryofracture facility at the Idaho National Engineering Laboratory. Although cost estimates are higher for Cryofracture than for conventional shredding, the potential for fire and explosion during conventional shredding would incur additional costs to preclude these events. These additional costs are unknown and would require considerable research and development. 4 refs., 6 figs., 7 tabs
ECNO (Event Coordination Notation) is a notation for modelling the behaviour of a software system on top of some object-oriented data model. ECNO has two main objectives: on the one hand, ECNO should allow modelling the behaviour of a system on the domain level; on the other hand, it should be possible to completely generate code from ECNO and the underlying object-oriented domain models. Today, there are several approaches that would allow to do this. But most of them would require that the data models and the behaviour models use the same technology and the code is generated together... that was written manually. In this paper, we rephrase the main concepts of ECNO. The focus of this paper, however, is on the architecture of the ECNO execution engine and its programming framework. We will show how this framework allows us to integrate ECNO with object-oriented models, how it works without any...
Pasareanu, Corina; Visser, Willem
Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
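What it means for a loop invariant to verify a program can be illustrated on a toy loop: the invariant must hold on entry, be preserved by the loop body, and imply the postcondition on exit. The sketch below checks these three obligations by brute-force enumeration over small inputs; the described framework discharges them symbolically (and discovers the invariant via iterative strengthening) rather than by enumeration, and the program, invariant, and bounds here are assumptions for illustration.

```python
# Toy program under analysis (assumed for illustration):
#   s = 0; i = 0
#   while i < n: i += 1; s += i
#   assert s == n * (n + 1) // 2

def invariant(i, s, n):
    """Candidate loop invariant: i stays in range and s is the i-th
    triangular number."""
    return 0 <= i <= n and s == i * (i + 1) // 2

def check_invariant(max_n=10):
    """Brute-force the three proof obligations for all n up to max_n."""
    for n in range(max_n + 1):
        # 1) The invariant holds on loop entry (i = 0, s = 0).
        assert invariant(0, 0, n)
        # 2) The invariant is preserved by one iteration of the body.
        for i in range(n):
            s = i * (i + 1) // 2
            if invariant(i, s, n):
                assert invariant(i + 1, s + i + 1, n)
        # 3) On exit (i == n), the invariant implies the postcondition.
        s = n * (n + 1) // 2
        assert invariant(n, s, n) and s == n * (n + 1) // 2
    return True

check_invariant()
```

Replace the candidate with a weaker guess such as `0 <= i <= n` alone and obligation 3 no longer pins down `s`; strengthening the invariant until all three obligations go through is exactly the iteration the framework automates.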
Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
Foudriat, E. C.
Methods by which real-time executive programs can be implemented in a higher order language are discussed, using HAL/S and Path Pascal languages as program examples. Techniques are presented by which noncyclic tasks can readily be incorporated into the executive system. Situations are shown where the executive system can fail to meet its task scheduling and yet be able to recover either by rephasing the clock or stacking the information for later processing. The concept of deadline processing is shown to enable more effective mixing of time and information synchronized systems.
Ambrosino, Robert J.
This executive summary provides a brief description of the Model Adoption Exchange Payment System (MAEPS), a unique payment system aimed at improving the delivery of adoption exchange services throughout the United States. Following a brief introductory overview, MAEPS is described in terms of (1) its six components (registration, listing,…
Full Text Available Cognitive training has been shown to improve executive functions in middle childhood and adulthood. However, fewer studies have targeted the preschool years – a time when executive functions undergo rapid development. The present study tested the effects of a short four-session executive function training program in 54 four-year-olds. The training group significantly improved their working memory from pre-training relative to an active control group. Notably, this effect extended to a task sharing few surface features with the trained tasks, and continued to be apparent three months later. In addition, the benefits of training extended to a measure of mathematical reasoning three months later, indicating that training executive functions during the preschool years has the potential to convey benefits that are both long-lasting and wide-ranging.
“Lean and Efficient Software: Whole-Program Optimization of Executables” Project Summary Report #5 (Report Period: 7/1/2015 to 9/30/2015)
“Lean and Efficient Software: Whole-Program Optimization of Executables” Final Technical Report (Phase I, Base Period), 30-06-2014 to 31-12-2016, dated 12/31/2016. Evan Driscoll and Tom Johnson, GrammaTech, Inc., 531 Esty Street, Ithaca, NY 14850.
Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira
Implementing static analyses of machine-level executable code is labor intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs doing static analyses. Further, we report on important lessons learned on the benefits and drawbacks while using the following technologies: using the Scala programming language as target of code generation, using XML-Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint like tool. Finally, we report on the use of Prolog for writing model transformations.
Kubesch, Sabine; Walk, Laura; Spitzer, Manfred; Kammer, Thomas; Lainburg, Alyona; Heim, Rudiger; Hille, Katrin
Physical activity is not only beneficial to physical health but also to cognitive functions. In particular, executive functions that are closely related to learning achievement can be improved by acute and recurring physical activity. We examined the effects of a single 30-min physical education program in contrast to a 5-min movement break on…
This report provides an Executive Summary of the various elements of the Materials Sciences Program which is funded by the Division of Materials Sciences, Office of Basic Energy Sciences, U.S. Department of Energy at Sandia National Laboratories, New Mexico.
Gail Harlamoff is Executive Director of the Life Lab Science Program, a nationally recognized, award-winning nonprofit science and environmental organization located on the UC Santa Cruz campus. Founded in 1979, Life Lab helps schools develop gardens and implement curricula to enhance students’ learning about science, math, and the natural world. The program has trained tens of thousands of educators in more than 1400 schools across the country. Life Lab’s specialized initiatives inc...
Engineers increasingly face the hard problem of building sophisticated real-time systems while time to market keeps shrinking. Object-oriented modeling supported by the UML standard brings effective solutions to such problems. However, the possibilities for specifying the real-time aspects of an application are not yet fully satisfactory. Indeed, existing industrial proposals give good answers to the concurrency-specification problem, but they remain limited with regard to specifying the quantitative real-time properties of an application. This work aims to construct a complete and consistent UML methodology based on a profile dedicated to modeling and prototyping automotive embedded systems. This profile contains all the extensions needed to express the quantitative real-time properties of an application easily. Moreover, thanks to the formalization of UML protocol state machines, real-time concepts have been well integrated into the object-oriented paradigm. The main result of this deep integration is that a user is now able to model real-time systems through the classical object-oriented view, i.e. without needing any specific expertise in the real-time area. In order to meet an industrial requirement, namely system prototyping (a key point for the car industry), the ACCORD/UML approach also allows building executable models of an application. For that purpose, the method supplies a set of rules allowing ambiguous points of UML semantics to be removed and semantic variation points to be completed, so as to obtain a complete and coherent global model of an application that is executable. The work on UML extension and the formalization of its use carried out throughout this thesis also supplied a complete and unambiguous modeling framework for the development of automotive electronic systems. This is also a base particularly well suited to tackling other facets of systems development, such as automatic and optimized code generation, validation, simulation or testing. (author) [fr
Karin Zazo Ortiz
Full Text Available ABSTRACT Introduction: Dysfunction in the basal ganglia circuits is a determining factor in the physiopathology of the classic signs of Parkinson's disease (PD), and hypokinetic dysarthria is commonly related to PD. Regarding speech disorders associated with PD, the latest four-level framework of speech complicates the traditional view of dysarthria as a motor execution disorder. Based on findings that dysfunctions in the basal ganglia can cause speech disorders, and on the premise that the speech deficits seen in PD are related not to an execution motor disorder alone but also to a disorder at the motor programming level, the main objective of this study was to investigate the presence of sensorimotor programming disorders (besides the execution disorders previously described) in PD patients. Methods: A cross-sectional study was conducted in a sample of 60 adults matched for gender, age and education: 30 adult patients diagnosed with idiopathic PD (PDG) and 30 healthy adults (CG). All types of articulation errors were reanalyzed to investigate the nature of these errors. Interjections, hesitations and repetitions of words or sentences (during discourse) were considered typical disfluencies; blocking and episodes of palilalia (words or syllables) were analyzed as atypical disfluencies. We analysed features including successive self-initiated trials, phoneme distortions, self-correction, repetition of sounds and syllables, prolonged movement transitions, and additions or omissions of sounds and syllables, in order to identify programming and/or execution failures. Orofacial agility was also investigated. Results: The PDG had worse performance on all sensorimotor speech tasks. All PD patients had hypokinetic dysarthria. Conclusion: The clinical characteristics found suggest both execution and programming sensorimotor speech disorders in PD patients.
Reliable models for assessing human exposures are important for understanding health risks from chemicals. The Stochastic Human Exposure and Dose Simulation model for multimedia, multi-route/pathway chemicals (SHEDS-Multimedia), developed by EPA’s Office of Research and Developm...
Full Text Available In recent years, efforts have been made to delineate a stable and unitary framework in which the problems of logical parallel processing can find solutions, at least at the level of imperative languages. The results obtained so far are not commensurate with these efforts. This paper aims to be a small contribution to them. We propose an overview of parallel programming, parallel execution and collaborative systems.
Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco
Performing Failure Mode and Effects Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive, even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
Full Text Available With the rapid development of embedded systems, system security has become more and more important. Most embedded systems are at risk from a series of software attacks, such as buffer overflow attacks and Trojan viruses. In addition, with the rapid growth in the number of embedded systems and their wide application, embedded hardware attacks are also increasing. This paper presents a new hardware-assisted security mechanism that protects a program's code and data by monitoring its normal execution. The mechanism mainly monitors three types of information: the start/end addresses of the program's basic blocks; a lightweight hash value for each basic block; and the address of the next basic block. These parameters are extracted by additional tools running on a PC, and the information is stored in the security module. During normal program execution, the security module compares the real-time state of the program with the stored information. If an anomaly is detected, it triggers the appropriate security response, suspending the program and jumping to a specified location. The module has been tested and validated on an SOPC with an OR1200 processor. The experimental analysis shows that the proposed mechanism can defend against a wide range of common software and physical attacks with low performance penalties and minimal overheads.
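The monitoring scheme above can be sketched in software. This is an illustrative toy model, not the paper's hardware design: the table layout, the addresses and the hash function are all assumptions made for the example.

```python
def toy_hash(instructions):
    """Lightweight stand-in for the per-block hash computed offline."""
    h = 0
    for insn in instructions:
        h = (h * 31 + insn) & 0xFFFFFFFF
    return h

# Reference table extracted offline by the PC-side tools:
# block start address -> (expected hash, set of legal successor addresses).
REFERENCE = {
    0x100: (toy_hash([0xA1, 0xB2]), {0x200, 0x300}),
    0x200: (toy_hash([0xC3]), {0x100}),
}

def check_block(start_addr, instructions, next_addr):
    """Return True iff the executed block matches the stored profile."""
    entry = REFERENCE.get(start_addr)
    if entry is None:
        return False                       # unknown block: raise alarm
    expected_hash, successors = entry
    if toy_hash(instructions) != expected_hash:
        return False                       # block code was tampered with
    return next_addr in successors         # legal control transfer?

# A tampered block or a jump outside the recorded successors would trigger
# the security response (suspend the program, jump to a safe handler).
assert check_block(0x100, [0xA1, 0xB2], 0x200)
assert not check_block(0x100, [0xA1, 0xFF], 0x200)
```

In the real mechanism these checks happen in hardware alongside the processor; the sketch only shows the comparison logic.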
Vora, V. P.; Mahmassani, H. S.
This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.
Whittle, Sarah; Pantelis, Christos; Testa, Renee; Tiego, Jeggan; Bellgrove, Mark
Executive attention refers to the goal-directed control of attention. Existing models of executive attention distinguish between three correlated, but empirically dissociable, factors related to selectively attending to task-relevant stimuli (Selective Attention), inhibiting task-irrelevant responses (Response Inhibition), and actively maintaining goal-relevant information (Working Memory Capacity). In these models, Selective Attention and Response Inhibition are moderately strongly correlate...
Gelman, Andrew [Principal Investigator
The research allows more effective model building. By allowing researchers to fit complex models to large datasets in a scalable manner, our algorithms and software enable more effective scientific research. In the new area of “big data,” it is often necessary to fit “big models” to adjust for systematic differences between sample and population. For this task, scalable and efficient model-fitting tools are needed, and these have been achieved with our new Hamiltonian Monte Carlo algorithm, the no-U-turn sampler, and our new C++ program, Stan. In layman’s terms, our research enables researchers to create improved mathematical models for large and complex systems.
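The core numerical step of Hamiltonian Monte Carlo, which the no-U-turn sampler builds on, is the leapfrog integrator. The following is a textbook sketch of that step, not Stan's implementation; the target (a standard normal, whose negative log-density gradient is simply q) and the step sizes are assumptions for illustration.

```python
def leapfrog(q, p, step, n_steps, grad):
    """Kick-drift-kick leapfrog: integrate Hamiltonian dynamics for
    position q and momentum p, with grad giving d/dq of -log p(q)."""
    p = p - 0.5 * step * grad(q)           # initial half kick
    for _ in range(n_steps - 1):
        q = q + step * p                   # full drift
        p = p - step * grad(q)             # full kick
    q = q + step * p
    p = p - 0.5 * step * grad(q)           # final half kick
    return q, p

grad = lambda q: q                         # -d/dq log N(0, 1)
q, p = leapfrog(1.0, 0.0, 0.1, 50, grad)

# The Hamiltonian (potential + kinetic energy) is nearly conserved,
# which is what gives HMC its high Metropolis acceptance rates.
H0 = 0.5 * 1.0 ** 2 + 0.5 * 0.0 ** 2
H1 = 0.5 * q ** 2 + 0.5 * p ** 2
assert abs(H1 - H0) < 0.01
```

The no-U-turn sampler adds on top of this an adaptive rule for choosing the trajectory length, removing the need to hand-tune `n_steps`.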
Rhatigan, Jennifer L. (Editor)
This document (Volume I) provides an executive summary of the lessons learned from the Constellation Program. A companion Volume II provides more detailed analyses for those seeking further insight and information. In this volume, Section 1.0 introduces the approach in preparing and organizing the content to enable rapid assimilation of the lessons. Section 2.0 describes the contextual framework in which the Constellation Program was formulated and functioned that is necessary to understand most of the lessons. Context of a former program may seem irrelevant in the heady days of new program formulation. However, readers should take some time to understand the context. Many of the lessons would be different in a different context, so the reader should reflect on the similarities and differences in his or her current circumstances. Section 3.0 summarizes key findings developed from the significant lessons learned at the program level that appear in Section 4.0. Readers can use the key findings in Section 3.0 to peruse for particular topics, and will find more supporting detail and analyses in Section 4.0 in a topical format. Appendix A contains a white paper describing the Constellation Program formulation that may be of use to readers wanting more context or background information. The reader will no doubt recognize some very similar themes from previous lessons learned, blue-ribbon committee reviews, National Academy reviews, and advisory panel reviews for this and other large-scale human spaceflight programs, including Apollo, Space Shuttle, Shuttle/Mir, and the ISS. This could represent an inability to learn lessons from previous generations; however, it is more likely that similar challenges persist in the Agency structure and approach to program formulation, budget advocacy, and management. Perhaps the greatest value of these Constellation lessons learned can be found in viewing them in context with these previous efforts to guide and advise the Agency and its
Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek
Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as a participant. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete, data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
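The hierarchy of reflex engines described above can be sketched as state machines that either mitigate a fault locally or escalate it to their parent. This is a minimal illustration under our own assumptions; the class name, states and fault labels are invented for the example and are not the authors' API.

```python
class ReflexEngine:
    """Toy reflex engine: handle known faults locally, escalate the rest."""
    def __init__(self, name, parent=None):
        self.name, self.parent, self.state = name, parent, "NOMINAL"

    def report(self, fault):
        if fault == "high_load":           # a fault this level can mitigate
            self.state = "MITIGATING"
            return f"{self.name}: rescheduling participant"
        if self.parent is not None:        # escalate up the hierarchy
            return self.parent.report(fault)
        self.state = "FAILED"              # root engine: last resort
        return f"{self.name}: workflow aborted"

# Two-level hierarchy: a node-level engine under a cluster-level engine.
top = ReflexEngine("cluster")
node = ReflexEngine("node-7", parent=top)
assert node.report("high_load") == "node-7: rescheduling participant"
assert node.report("disk_failure") == "cluster: workflow aborted"
```

The real framework attaches such engines to monitoring rules checked before, during and after each participant's execution.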
Martínez Martínez, David; Alenyà Ribas, Guillem; Torras, Carme
Task learning in robotics requires repeatedly executing the same actions in different states to learn the model of the task. However, in real-world domains, there are usually sequences of actions that, if executed, may produce unrecoverable errors (e.g. breaking an object). Robots should avoid repeating such errors when learning, and thus explore the state space in a more intelligent way. This requires identifying dangerous action effects to avoid including such actions in the generated plans...
Sheridan, M R; Flowers, K A; Hurrell, J
Programming and execution of arm movements in Parkinson's disease were investigated in choice and simple reaction time (RT) situations in which subjects made aimed movements at a target. A no-aiming condition was also studied. Reaction time was fractionated using surface EMG recording into premotor (central) and motor (peripheral) components. Premotor RT was found to be greater for parkinsonian patients than normal age-matched controls in the simple RT condition, but not in the choice condition. This effect did not depend on the parameters of the impending movement. Thus, paradoxically, parkinsonian patients were not inherently slower at initiating aiming movements from the starting position, but seemed unable to use advance information concerning motor task demands to speed up movement initiation. For both groups, low velocity movements took longer to initiate than high velocity ones. In the no-aiming condition parkinsonian RTs were markedly shorter than when aiming, but were still significantly longer than control RTs. Motor RT was constant across all conditions and was not different for patient and control subjects. In all conditions, parkinsonian movements were around 37% slower than control movements, and their movement times were more variable, the differences showing up early on in the movement, that is, during the initial ballistic phase. The within-subject variability of movement endpoints was also greater in patients. The motor dysfunction displayed in Parkinson's disease involves a number of components: (1) a basic central problem with simply initiating movements, even when minimal programming is required (no-aiming condition); (2) difficulty in maintaining computed forces for motor programs over time (simple RT condition); (3) a basic slowness of movement (bradykinesia) in all conditions; and (4) increased variability of movement in both time and space, presumably caused by inherent variability in force production.
Begum, R.; Pathak, N.; Hasnain, S.E.; Sah, N.K.; Athar, M.
Apoptosis or programmed cell death is a highly conserved genetically controlled response of metazoan cells to commit suicide. Non apoptotic programmed cell death seems to operate in single celled eukaryotes implying that evolution of PCD has preceded the evolution of multicellularity. PCD plays a crucial role in the regulation of cellular and tissue homeostasis and any aberrations in apoptosis leads to several diseases including cancer, neurodegenerative disorders and AIDS. The mechanisms by which apoptosis is controlled are varied. In some cells, members of bcl-2 family or p53 are crucial for regulating the apoptosis programme, whereas in other cells Fas ligand is more important. bcl-2 family members have a prime role in the regulation of cell death at all stages including development, whereas cell death during development is independent of p53. bcl-2 family members being localized on the outer mitochondrial membrane, control the mitochondrial homeostasis and cytochrome c redistribution and thereby regulate the cell death process. p53 promotes DNA damage mediated cell death after growth arrest and failed DNA repair. Caspases play a key role in the execution of cell death by mediating highly specific cleavages of crucial cellular proteins collectively manifesting the apoptotic phenotype. Protein inhibitors like crm A, p35 and IAPs could prevent/control apoptosis induced by a broad array of cell death stimuli by several mechanisms specially interfering in caspase activation or caspase activity. Among endonucleases, caspase activated DNase (CAD) plays a crucial role in DNA fragmentation, a biochemical hallmark of apoptosis. As regulation of cell death seems to be as complex as regulation of cell proliferation, multiple kinase mediated regulatory mechanisms might control the apoptotic process. Thus, in spite of intensive research over the past few years, the field of apoptosis still remains fertile to unravel among others, the molecular mechanisms of cytochrome c
Begum, R.; Pathak, N.; Hasnain, S.E.; Sah, N.K. [National Inst. of Immunology, New Delhi (India). Eukaryotic Gene Expression Lab.; Taneja, T.K.; Mohan, M. [National Inst. of Immunology, New Delhi (India). Eukaryotic Gene Expression Lab.]|[Dept. of Medical Elementology and Toxicology, New Delhi (India); Athar, M. [Dept. of Medical Elementology and Toxicology, New Delhi (India)
Apoptosis or programmed cell death is a highly conserved genetically controlled response of metazoan cells to commit suicide. Non apoptotic programmed cell death seems to operate in single celled eukaryotes implying that evolution of PCD has preceded the evolution of multicellularity. PCD plays a crucial role in the regulation of cellular and tissue homeostasis and any aberrations in apoptosis leads to several diseases including cancer, neurodegenerative disorders and AIDS. The mechanisms by which apoptosis is controlled are varied. In some cells, members of bcl-2 family or p53 are crucial for regulating the apoptosis programme, whereas in other cells Fas ligand is more important. bcl-2 family members have a prime role in the regulation of cell death at all stages including development, whereas cell death during development is independent of p53. bcl-2 family members being localized on the outer mitochondrial membrane, control the mitochondrial homeostasis and cytochrome c redistribution and thereby regulate the cell death process. p53 promotes DNA damage mediated cell death after growth arrest and failed DNA repair. Caspases play a key role in the execution of cell death by mediating highly specific cleavages of crucial cellular proteins collectively manifesting the apoptotic phenotype. Protein inhibitors like crm A, p35 and IAPs could prevent/control apoptosis induced by a broad array of cell death stimuli by several mechanisms specially interfering in caspase activation or caspase activity. Among endonucleases, caspase activated DNase (CAD) plays a crucial role in DNA fragmentation, a biochemical hallmark of apoptosis. As regulation of cell death seems to be as complex as regulation of cell proliferation, multiple kinase mediated regulatory mechanisms might control the apoptotic process. Thus, in spite of intensive research over the past few years, the field of apoptosis still remains fertile to unravel among others, the molecular mechanisms of cytochrome c
Igor G. Fedorov
Full Text Available Executable business process models, like programs, require evidence that they finish free of defects. Methods based on the formalism of Petri nets are widely used: a business process is represented as a Petri net, and its properties are established by analysing the properties of the net. The aim is to study methods for mapping an executable business process model onto a Petri net. Analysis of the properties of the resulting model allows us to prove a number of important properties: that the net is free-choice and clean, without looping.
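The Petri-net semantics underlying this kind of analysis can be sketched in a few lines. This is a minimal illustration of standard place/transition firing rules, not the authors' specific mapping; the place names model two sequential tasks of a hypothetical business process.

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds a token."""
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    """Firing consumes one token per input place, produces one per output."""
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# Two sequential process tasks: start --t1--> mid --t2--> done
t1 = (["start"], ["mid"])
t2 = (["mid"], ["done"])

m = {"start": 1}
assert enabled(m, t1) and not enabled(m, t2)   # t2 must wait for t1
m = fire(fire(m, t1), t2)
assert m["done"] == 1                          # the process terminates cleanly
```

Proving soundness (clean termination, no looping) then amounts to checking reachability properties of markings like the final one above.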
Rensink, Arend; Zambon, Eduardo
In this report we present a type graph that models all executable constructs of the Java programming language. Such a model is useful for any graph-based technique that relies on a representation of Java programs as graphs. The model can be regarded as a common representation to which all Java
Rensink, Arend; Zambon, Eduardo; Lee, D.; Lopes, A.; Poetzsch-Heffter, A.
In this work we present a type graph that models all executable constructs of the Java programming language. Such a model is useful for any graph-based technique that relies on a representation of Java programs as graphs. The model can be regarded as a common representation to which all Java syntax
Laursen, Johan Sund; Ellekilde, Lars Peter; Schultz, Ulrik Pagh
Programming robotic assembly for industrial small-batch production is challenging; hence, it is vital to increase robustness and reduce development effort in order to achieve flexible robotic automation. A human who has made an assembly error will often simply undo the process until the error is ...
package, the Automated Cost Estimating Integrated Tools (ACEIT). Using development cost estimation modeling techniques, the team also estimates... using the Automated Cost Estimating Integrated Tool (ACEIT). An SQL database, known as the Program Financial Management System, currently used by the AH... ACEIT: Automated Cost Estimating Integrated Tools; AFOTEC: Air Force Operational Test and Evaluation Center; AFSOC: Air Force Special Operations Command
The Executive Assistant also performs a variety of office-management-related tasks, offers ... Operational and administrative activities of the VP's Office ... PPB staff at Headquarters and reports any excess leave issues to the attention of the VP.
Baliś, B.; Bubak, M.
Records of past application executions are particularly important in the case of loosely-coupled, workflow driven scientific applications which are used to conduct in silico experiments, often on top of Grid infrastructures. In this paper, we propose an ontology-based model for storing and querying
Herbert, Luke Thomas; Herbert-Hansen, Zaza Nadja Lee
When designing safety critical systems there is a need for verification of safety properties while ensuring system operations have a specific performance profile. We present a novel application of model checking to derive execution strategies, sequences of decisions at workflow branch points...... which optimise a set of reward variables, while simultaneously observing constraints which encode any required safety properties and accounting for the underlying stochastic nature of the system. By evaluating quantitative properties of the generated adversaries we are able to construct an execution...
Perunovic, Zoran; Staffensen, Lasse
The paper discusses design and execution of OM module in an intensive program for top executives. The participants are working as consultants in six different host companies on developing growth strategies. The OM module is designed to enable the participants to develop operations strategy that s...
Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs of less than 10 kLOC, but has been successfully applied to finding errors in concurrent programs of up to 100 kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (it supports Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.
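The explicit-state search at the core of a model checker like JPF can be illustrated in a few lines: enumerate reachable states, remember which have been visited, and return a trace from the initial state to the first error state found. This is a toy sketch of the general technique, not JPF's API; the example "program" is an invented counter.

```python
from collections import deque

def model_check(initial, successors, is_error):
    """Breadth-first explicit-state search with state matching."""
    visited = {initial}
    queue = deque([(initial, [initial])])
    while queue:
        state, trace = queue.popleft()
        if is_error(state):
            return trace                   # counterexample trace for debugging
        for nxt in successors(state):
            if nxt not in visited:         # state matching prunes revisits
                visited.add(nxt)
                queue.append((nxt, trace + [nxt]))
    return None                            # no error state is reachable

# Toy "program": a counter that may increment or double; error at value 6.
trace = model_check(1,
                    lambda s: [s + 1, s * 2] if s < 10 else [],
                    lambda s: s == 6)
assert trace == [1, 2, 3, 6]
```

The state-explosion problem mentioned in the abstract is visible here: each nondeterministic choice multiplies the frontier, which is why JPF relies on symmetry and partial order reductions to keep `visited` manageable.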
Purpose The importance of shared mental models in teamwork has been explored in a diverse array of artificial work groups. This study extends such research to explore the role of mental models in the strategic decision-making of a real-world senior management group. Design/Methodology Data were collected from an intact group of senior healthcare executives (N=13) through semi-structured interviews, meeting observations and internal document analysis. Participants responded to intervi...
... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Approval of Noise Compatibility Program for Chicago Executive Airport, Prospect Heights and Wheeling, IL AGENCY: Federal Aviation Administration, DOT. ACTION: Notice. SUMMARY: The Federal Aviation Administration (FAA) announces its findings on...
Gschwind, Michael K
Mechanisms for generating and executing programs for a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA) are provided. A computer program product comprising a computer recordable medium having a computer readable program recorded thereon is provided. The computer readable program, when executed on a computing device, causes the computing device to receive one or more instructions and execute the one or more instructions using logic in an execution unit of the computing device. The logic implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA), based on data stored in a vector register file of the computing device. The vector register file is configured to store both scalar and floating point values as vectors having a plurality of vector elements.
Universities and other higher education institutions in Europe offer a vast and increasing number of academic degree programs in the broad field of Public Administration. A subset of these programs is those offering postgraduate degrees to experienced students being already employed by public or private organisations. These executive programs are…
Geissler, Gary L.
Despite continued growth in the number of Executive MBA (EMBA) Programs in the U. S. and worldwide, previous research concerning the marketing of EMBA Programs has been very limited. Here, the author investigates ways to successfully market an EMBA Program at a southern U. S. university. Extensive exploratory research was conducted among current…
Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio
Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Eve...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....
This document presents the activities realized in the mining and petroleum fields during the five-year plan (1979-1983). It covers mining and petroleum research and mining production.
Itoh, Nobuhide; Itoh, Goroh; Shibata, Takayuki
The authors are conducting experience-based engineering education programs for elementary and junior high school students with the aim of giving them a chance to experience mechanical production. As part of this endeavor, we planned and conducted a program called “Fabrication of Original Magnet Plates by Casting” for elementary school students. This program included a course for teaching natural laws and logical thinking methods. Prior to the program, a preliminary version was run with school teachers to gather comments and modify the program accordingly. The children responded enthusiastically to the production process, which realizes their own ideas, but it was found that the course on natural laws and logical methods needs to be improved to draw their interest and attention. We will continue to plan more effective programs, deepening ties with the local community.
1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
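The ordinary least products (geometric mean) regression recommended above has a closed form: the slope is the ratio of the standard deviations, sd(y)/sd(x), with the sign of the correlation, and the line passes through the means. The sketch below follows that standard OLP definition; the function name and data are ours, not the paper's, and the bootstrap CIs that smatr provides are omitted.

```python
def olp_regression(x, y):
    """Ordinary least products (Model II) regression: treats x and y
    symmetrically, appropriate when both variables carry error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)          # sum of squares in x
    syy = sum((yi - my) ** 2 for yi in y)          # sum of squares in y
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = (1 if sxy >= 0 else -1) * (syy / sxx) ** 0.5
    intercept = my - slope * mx                    # line through the means
    return slope, intercept

slope, intercept = olp_regression([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
assert 1.8 < slope < 2.1
```

Unlike ordinary least squares, this slope does not change role when x and y are swapped (it simply inverts), which is the symmetry Model II situations require.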
This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children…
Bull, John S (Editor)
The National Space Strategy approved by the President and Congress in 1984 sets for NASA a major goal of conducting effective and productive space applications and technology programs which contribute materially toward United States leadership and security. To contribute to this goal, OAST supports the Nation's civil and defense space programs and overall economic growth. OAST objectives are to ensure timely provision of new concepts and advanced technologies, to support both the development of NASA missions in space and the space activities of industry and other organizations, to utilize the strengths of universities in conducting the NASA space research and technology program, and to maintain the NASA centers in positions of strength in critical space technology areas. In line with these objectives, NASA has established a new program in space automation and robotics that will result in the development and transfer of automation technology to increase the capabilities, productivity, and safety of NASA space programs including the Space Station, automated space platforms, lunar bases, Mars missions, and other deep space ventures. The NASA/OAST Automation and Robotics program is divided into two parts. Ames Research Center has the lead role in developing and demonstrating System Autonomy capabilities for space systems that need to make their own decisions and do their own planning. The Jet Propulsion Laboratory has the lead role for Telerobotics (that portion of the program that has a strong human operator component in the control loop and some remote handling requirement in space). This program is intended to be a working document for NASA Headquarters, Program Offices, and implementing Project Management.
PROGRAM MANAGEMENT COURSE - STUDENT STUDY PROGRAM, Fort Belvoir, Virginia, 22060. THE ROLE OF LEADERSHIP IN NAVY PROGRAM OFFICES, STUDY REPORT PMC 73-1 ... 1. Men at the Top, Osborne Elliott. 2. "Understanding Leadership", W. C. H. Prentice, HBR Sep-Oct 1966. 3. "A Theory of Human Motivation" ... Drucker, LMI reprint. 9. Technical Management Institute Thinkpiece, "Leadership". 10. "The Effective Manager", Peter Drucker, LMI reprint. 11. Survival In
This five-year program plan describes the goals and philosophy of the US Department of Energy's (DOE) Biofuels Systems Division (BSD) program and the BSD's major research and development (R&D) activities for fiscal years (FY) 1992 through 1996. The plan represents a consensus among government and university researchers, fuel and automotive manufacturers, and current and potential users of alternative fuels and fuel additives produced from biomass. It defines the activities that are necessary to produce versatile, domestic, economical, renewable liquid fuels from biomass feedstocks. The BSD program focuses on the production of alternative liquid fuels for transportation: fuels such as ethanol, methanol, biodiesel, and fuel additives for reformulated gasoline. These fuels can be produced from many plant materials and from a significant portion of the wastes generated by municipalities and industry. Together these raw materials and wastes, or feedstocks, are called biomass.
Busing, Michael E.; Palocsay, Susan W.
Master of business administration (MBA) programs are under intense pressure to improve efficiencies, lower tuition, and offer refreshed curriculum that is of high quality and regarded as relevant by the marketplace. In light of this environment, the authors propose a conceptual framework for effectively employing operations management (OM)…
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.
Hwang, John T.
NASA's OpenMDAO framework facilitates constructing complex models and computing their derivatives for multidisciplinary design optimization. Decomposing a model into components that follow a prescribed interface enables OpenMDAO to assemble multidisciplinary derivatives from the component derivatives using what amounts to the adjoint method, direct method, chain rule, global sensitivity equations, or any combination thereof, using the MAUD architecture. OpenMDAO also handles the distribution of processors among the disciplines by hierarchically grouping the components, and it automates the data transfer between components that are on different processors. These features have made OpenMDAO useful for applications in aircraft design, satellite design, wind turbine design, and aircraft engine design, among others. This paper presents new algorithms for OpenMDAO that enable reconfigurable model execution. This concept refers to dynamically changing, during execution, one or more of: the variable sizes, solution algorithm, parallel load balancing, or set of variables-i.e., adding and removing components, perhaps to switch to a higher-fidelity sub-model. Any component can reconfigure at any point, even when running in parallel with other components, and the reconfiguration algorithm presented here performs the synchronized updates to all other components that are affected. A reconfigurable software framework for multidisciplinary design optimization enables new adaptive solvers, adaptive parallelization, and new applications such as gradient-based optimization with overset flow solvers and adaptive mesh refinement. Benchmarking results demonstrate the time savings for reconfiguration compared to setting up the model again from scratch, which can be significant in large-scale problems. Additionally, the new reconfigurability feature is applied to a mission profile optimization problem for commercial aircraft where both the parametrization of the mission profile and the
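The derivative-assembly idea described above can be sketched in miniature: each component supplies its own partial derivative, and the framework chains them into a total derivative. The sketch below is an illustrative simplification in plain Python (the `make_component` and `run_chain` names are invented here); it is not OpenMDAO's actual API.

```python
# Illustrative sketch: assembling a total derivative from component
# partials with the chain rule, for a serial chain of components.

def make_component(f, dfdx):
    """Bundle a function with its analytic partial derivative."""
    return {"f": f, "dfdx": dfdx}

def run_chain(components, x):
    """Evaluate a serial chain of components and the total derivative
    of the final output with respect to the initial input (chain rule)."""
    value, total_deriv = x, 1.0
    for comp in components:
        total_deriv = comp["dfdx"](value) * total_deriv  # d(out)/dx so far
        value = comp["f"](value)
    return value, total_deriv

# y = x**2, then z = 3*y + 1  ->  dz/dx = 3 * 2x
square = make_component(lambda v: v * v, lambda v: 2.0 * v)
affine = make_component(lambda v: 3.0 * v + 1.0, lambda v: 3.0)

z, dz_dx = run_chain([square, affine], 2.0)
print(z, dz_dx)  # 13.0 12.0
```

At x = 2 the chain gives z = 13 and dz/dx = 12, matching 3 * 2x; the adjoint and direct methods mentioned above generalize this bookkeeping to coupled, non-serial models.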
CCMR News Article CCMR hosted a version of its biannual “Executive Program in Defense Decision Making” offering for 21 international military and civilian participants, from November 6-17, 2017. Often described as CCMR’s “flagship” course, this curriculum has been offered at the Naval Postgraduate School (NPS) in Monterey, California for over 20 years.
Roy J. What Determines Economic Growth? Economic Review – Second Quarter 1993 [References: Barro (1991); Mankiw, Romer, and Weil (1992); De Long... NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. THESIS. Approved for public release: distribution unlimited. ECONOMIC SECURITY... DATES COVERED: Master's Thesis. 4. TITLE AND SUBTITLE: Economic Security Environment and Implementation of Planning, Programming, Budgeting, Execution
Samara, George A.; Simmons, Jerry A.
This report presents an Executive Summary of the various elements of the Materials Sciences and Engineering Program which is funded by the Division of Materials Sciences and Engineering, Office of Basic Energy Sciences, U.S. Department of Energy at Sandia National Laboratories, New Mexico. A general programmatic overview is also presented.
Summers, Don; Riley, Chris; Cremaldi, Lucien; Sanders, David
Over the past decade, UNIX workstations have provided a very powerful program development environment. However, workstations are more expensive than PCs and Macintoshes and require a system manager for day-to-day tasks such as disk backup, adding users, and setting up print queues. Native commercial software for system maintenance and "PC applications" has been lacking under UNIX. Apple's new Rhapsody operating system puts the current MacOS on a NeXT UNIX foundation and adds an enhanced NeXTS...
Hunn, B. D.; Diamond, S. C.; Bennett, G. A.; Tucker, E. F.; Roschke, M. A.
A set of computer programs, called Cal-ERDA, is described that is capable of rapid and detailed analysis of energy consumption in buildings. A new user-oriented input language, named the Building Design Language (BDL), has been written to allow simplified manipulation of the many variables used to describe a building and its operation. This manual provides the user with the information necessary to understand in detail the Cal-ERDA set of computer programs. The computer programs described include: an EXECUTIVE processor to create computer system control commands; a BDL processor to analyze input instructions, execute computer system control commands, perform assignments and data retrieval, and control the operation of the LOADS, SYSTEMS, PLANT, ECONOMICS, and REPORT programs; a LOADS analysis program that calculates peak (design) zone and hourly loads and the effect of the ambient weather conditions, the internal occupancy, lighting, and equipment within the building, as well as variations in the size, location, orientation, construction, walls, roofs, floors, fenestrations, attachments (awnings, balconies), and shape of a building; a Heating, Ventilating, and Air-Conditioning (HVAC) SYSTEMS analysis program capable of modeling the operation of HVAC components (fans, coils, economizers, humidifiers, etc.) in 16 standard configurations, operated according to various temperature and humidity control schedules; a PLANT program that models the operation of boilers, chillers, electrical generation equipment (diesel or turbines), heat storage apparatus (chilled or heated water), and solar heating and/or cooling systems; an ECONOMICS analysis program that calculates life-cycle costs; and a REPORT program that produces tables of user-selected variables and arranges them according to user-specified formats. A set of WEATHER ANALYSIS programs manipulates, summarizes and plots weather data. Libraries of weather data, schedule data, and building data were prepared.
Herrera, Marisa L.
This study applies the literature on leadership framing to the globalization of higher education to understand the development of the Global Executive MBA program at a large university. The purpose of the study was to provide administrators, educators and university leaders an understanding as to how to respond to globalization and, secondly, to…
... PROPERTY 38-SALE OF PERSONAL PROPERTY Implementation of the Federal Asset Sales Program § 102-38.360 What must an executive agency do to implement the eFAS program? (a) An executive agency must review the effectiveness of all sales solutions, and compare them to the effectiveness (e.g., cost, level of service, and...
Schönherr, Sebastian; Forer, Lukas; Weißensteiner, Hansi; Kronenberg, Florian; Specht, Günther; Kloss-Brandstätter, Anita
The MapReduce framework enables a scalable processing and analyzing of large datasets by distributing the computational load on connected computer nodes, referred to as a cluster. In Bioinformatics, MapReduce has already been adopted to various case scenarios such as mapping next generation sequencing data to a reference genome, finding SNPs from short read data or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from usage of currently available and useful software solutions. Here we present Cloudgene, a freely available platform to improve the usability of MapReduce programs in Bioinformatics by providing a graphical user interface for the execution, the import and export of data and the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all time and data transfer times are therefore minimized. Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding any computational overhead to existing programs. This platform gives developers the opportunity to focus on the actual implementation task and provides scientists a platform with the aim to hide the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g. Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatic programs using MapReduce and two systems are integrated and have been successfully deployed. Cloudgene is
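The MapReduce model that Cloudgene wraps can be illustrated with a minimal, single-machine sketch: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The example data and function names below are invented for illustration; a real cluster distributes these phases across nodes.

```python
# Minimal single-machine sketch of the MapReduce idea (no Hadoop):
# map emits key/value pairs, shuffle groups by key, reduce aggregates.
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to every input record, collecting emitted pairs."""
    pairs = []
    for record in records:
        pairs.extend(mapper(record))
    return pairs

def shuffle(pairs):
    """Group all emitted values by their key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Aggregate each key's group of values into one result."""
    return {key: reducer(values) for key, values in groups.items()}

# Hypothetical example: count SNP identifiers across short-read records.
reads = ["rs42 rs7", "rs7 rs7", "rs42"]
mapper = lambda line: [(token, 1) for token in line.split()]
counts = reduce_phase(shuffle(map_phase(reads, mapper)), sum)
print(counts)  # {'rs42': 2, 'rs7': 3}
```

The matching-strings-in-genotype-files use case mentioned above follows exactly this shape, with the mapper and reducer swapped for domain-specific functions.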
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.
Bergen, Benjamin Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
This is the PDF of a powerpoint presentation from a teleconference on Los Alamos programming models. It starts by listing their assumptions for the programming models and then details a hierarchical programming model at the System Level and Node Level. Then it details how to map this to their internal nomenclature. Finally, a list is given of what they are currently doing in this regard.
Gan, T.; Tarboton, D. G.; Dash, P. K.; Gichamo, T.; Horsburgh, J. S.
Web based apps, web services and online data and model sharing technology are becoming increasingly available to support research. This promises benefits in terms of collaboration, platform independence, transparency and reproducibility of modeling workflows and results. However, challenges still exist in real application of these capabilities and the programming skills researchers need to use them. In this research we combined hydrologic modeling web services with an online data and model sharing system to develop functionality to support reproducible hydrologic modeling work. We used HydroDS, a system that provides web services for input data preparation and execution of a snowmelt model, and HydroShare, a hydrologic information system that supports the sharing of hydrologic data, model and analysis tools. To make the web services easy to use, we developed a HydroShare app (based on the Tethys platform) to serve as a browser based user interface for HydroDS. In this integration, HydroDS receives web requests from the HydroShare app to process the data and execute the model. HydroShare supports storage and sharing of the results generated by HydroDS web services. The snowmelt modeling example served as a use case to test and evaluate this approach. We show that, after the integration, users can prepare model inputs or execute the model through the web user interface of the HydroShare app without writing program code. The model input/output files and metadata describing the model instance are stored and shared in HydroShare. These files include a Python script that is automatically generated by the HydroShare app to document and reproduce the model input preparation workflow. Once stored in HydroShare, inputs and results can be shared with other users, or published so that other users can directly discover, repeat or modify the modeling work. This approach provides a collaborative environment that integrates hydrologic web services with a data and model sharing
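The interaction pattern described, a browser-based app preparing requests for a model-execution web service, can be sketched as follows using only the standard library. The service root, operation name, and parameters are hypothetical; the request is only constructed, not sent.

```python
# Hedged sketch of preparing a web-service call of the kind a HydroShare
# app might send to a HydroDS-style service. The endpoint and parameter
# names are invented; nothing is transmitted over the network here.
from urllib.parse import urlencode

def prepare_service_call(base_url, operation, params):
    """Build the GET URL a browser-based app would send to a model service."""
    query = urlencode(sorted(params.items()))  # sort for a reproducible URL
    return "%s/%s?%s" % (base_url.rstrip("/"), operation, query)

url = prepare_service_call(
    "https://hydro-ds.example.org/api",   # hypothetical service root
    "run_snowmelt_model",
    {"watershed": "logan", "start": "2015-10-01", "end": "2016-06-30"},
)
print(url)
```

Recording such generated URLs in an automatically produced Python script is one simple way to make the input-preparation workflow reproducible, as the abstract describes.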
Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong
A safety checklist is a type of cognitive tool that reinforces the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversight and ignorance. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to the patient context are increasingly developed. However, the current hard-coded approach of implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort for informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The results show that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We proposed a novel meta-model for dynamic checklists with the purpose of facilitating their creation. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model was validated by implementing a use case in the system.
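As a rough illustration of what a dynamic-checklist model enables, the sketch below attaches a patient-context condition to each checklist item, so the rendered list adapts to the patient. All class and field names are invented for this sketch; this is not the paper's meta-model.

```python
# Toy sketch of a context-adaptive checklist: each item carries a
# predicate over the patient context, and rendering filters on it.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ChecklistItem:
    text: str
    applies: Callable[[dict], bool] = lambda ctx: True  # default: always shown

@dataclass
class Checklist:
    name: str
    items: List[ChecklistItem] = field(default_factory=list)

    def render(self, patient_ctx):
        """Keep only the items whose context condition holds."""
        return [item.text for item in self.items if item.applies(patient_ctx)]

# Hypothetical peri-operative checklist with one context-dependent item.
cabg = Checklist("CABG peri-operative", [
    ChecklistItem("Confirm patient identity"),
    ChecklistItem("Check insulin protocol", lambda c: c.get("diabetic", False)),
])
print(cabg.render({"diabetic": False}))  # ['Confirm patient identity']
```

Expressing the condition as data rather than hard-coded logic is the point the abstract makes: clinical experts can author items and conditions without informaticists re-coding the system.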
Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Norvig, Peter (Technical Monitor)
Writing autonomous software is complex, requiring the coordination of functionally and technologically diverse software modules. System and mission engineers must rely on specialists familiar with the different software modules to translate requirements into application software. Also, each module often encodes the same requirement in different forms. The results are high costs and reduced reliability due to the difficulty of tracking discrepancies in these encodings. In this paper we describe a unified approach to planning and execution that we believe provides a unified representational and computational framework for an autonomous agent. We identify the four main components whose interplay provides the basis for the agent's autonomous behavior: the domain model, the plan database, the plan running module, and the planner modules. This representational and problem solving approach can be applied at all levels of the architecture of a complex agent, such as Remote Agent. In the rest of the paper we briefly describe the Remote Agent architecture. The new agent architecture proposed here aims at achieving the full Remote Agent functionality. We then give the fundamental ideas behind the new agent architecture and point out some implication of the structure of the architecture, mainly in the area of reactivity and interaction between reactive and deliberative decision making. We conclude with related work and current status.
Huang, Lindsey M
The purpose of this executive summary was to provide an overview of key findings from By the Numbers: 30th Report on Physician Assistant Educational Programs in the United States. The 2014 Program Survey is a Web-based survey and is administered annually to all member physician assistant (PA) program directors. This executive summary will focus on 4 of the 7 sections of the survey instrument: general, financial, program personnel, and students. The typical PA program's sponsoring institution is private and in a nonacademic health center. Most PA programs (93.0%) offer a master's degree as the primary or highest credential. The average total program budget was $2,221,751 (SD=$2,426,852). The average total resident tuition was $64,961, and the average total nonresident tuition was $75,964. Overall, 181 programs reported 1843 program faculty. Of those, 1467 were identified as core faculty and 376 were identified as adjunct faculty. A typical first-year PA student is 26 years old (SD=2.51), female (70.3%, n=5898), non-Hispanic (89.3%, n=3631), White (79.9%, n=3712), and has an overall undergraduate and science grade point average (GPA) of 3.52 (SD=0.14) and 3.47 (SD=0.16), respectively. In 2014, there were approximately 7556 graduates from 164 responding programs. By gaining a better understanding of the characteristics of PA programs and their faculty and students, policy makers can be better informed. Physician assistant educators and stakeholders are encouraged to use this information to advance and advocate for the profession.
This paper discusses quality planning in quality management. In quality planning, the quality objectives, quality responsibilities, and procedures shall be developed; graded supervision shall be exercised; a quality assurance program shall be established; and requirements on resources and documents shall be defined. At the same time, we shall also strengthen the enforcement of program documentation, supervise and inspect the implementation results, establish detailed check indicators, and set out requirements on how to improve the quality of products. (author)
Cort, Katherine A.; Hostick, Donna J.; Belzer, David B.; Livingston, Olga V.
The purpose of this report is to compile information and conclusions gathered as part of three separate tasks undertaken as part of the overall project, “Modeling EERE Deployment Programs,” sponsored by the Planning, Analysis, and Evaluation office within the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address improvements to modeling in the near term, and note gaps in knowledge where future research is needed.
Full Text Available Abstract Background The MapReduce framework enables a scalable processing and analyzing of large datasets by distributing the computational load on connected computer nodes, referred to as a cluster. In Bioinformatics, MapReduce has already been adopted to various case scenarios such as mapping next generation sequencing data to a reference genome, finding SNPs from short read data or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from usage of currently available and useful software solutions. Results Here we present Cloudgene, a freely available platform to improve the usability of MapReduce programs in Bioinformatics by providing a graphical user interface for the execution, the import and export of data and the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all time and data transfer times are therefore minimized. Conclusions Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding any computational overhead to existing programs. This platform gives developers the opportunity to focus on the actual implementation task and provides scientists a platform with the aim to hide the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g. Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatic programs using MapReduce and two systems are
Embree, Jennifer L; Meek, Julie; Ebright, Patricia
The purpose of this article is to describe the business case framework used to guide doctor of nursing practice (DNP) program enhancements and to discuss methods used to gain chief nurse executives' (CNEs) perspectives for desired curricular and experiential content for doctor of nursing practice nurses in health care system executive roles. Principal results of CNE interview responses were closely aligned to the knowledge, skills and/or attitudes identified by the national leadership organizations. Major conclusions of this article are that curriculum change should include increased emphasis on leadership, implementation science, and translation of evidence into practice methods. Business, information and technology management, policy, and health care law content would also need to be re-balanced to facilitate DNP graduates' health care system level practice. Copyright © 2017 Elsevier Inc. All rights reserved.
Full Text Available Like any other government, the Indonesian government has the role of protecting the security of its citizens via the established police unit. However, the executive unit is often unable to respond in a timely manner due to the huge data size. For this reason, an executive information system (EIS) is established in order to provide the necessary information to leverage the decision-making process. This work intends to establish and evaluate the executive information system and its support in facilitating the efforts to fight crime in Indonesian territory. The EIS prototype is established and evaluated on the basis of the six information system success factors, where the required data are collected by means of a questionnaire. The results suggest that the factors of system quality, information quality, ease of use, user satisfaction, and individual and organizational impact are very significant.
Rowland, J.A. [Dallas Mining Services Pty Ltd., Wollongong, NSW (Australia)
Ventilation surveys and the development of a properly tuned ventilation model are important components of a modern underground mine safety management system to ensure the safety of miners. Such systems in Australia revolve around the routine application of risk-based logic. However, the risks in ventilation systems are always changing, so designers of ventilation circuits use ventilation modeling software as a key tool to facilitate the structured process. This paper emphasized the importance of measuring the underground circuit and replicating the measurements in a working model. The most commonly used modeling program in Australia is the Ventsim software, which is available in a fully graphical 3D configuration as well as a 2D version. The value of the mine ventilation survey lies in the ability of the data to be accurately replicated in a mine ventilation model. As such, much thought must be given to the ventilation survey scope of work and overall process. The surveys must satisfy operational needs and must delineate the circuit to a level that will allow a model to be accurately assembled in order to determine when minor or major ventilation circuit adjustments are needed. 1 ref., 10 figs.
Program and project evaluation models can be extremely useful in project planning and management. The aim is to ask the right questions as early as possible in order to foresee and deal with unwanted program effects in time, as well as to encourage the positive elements of the project's impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of interventions on small or large social groups. This article introduces some of the most recently used evaluation models.
Shahmoradi, Leila; Ahmadi, Maryam; Sadoughi, Farahnaz; Piri, Zakieh; Gohari, Mahmood Reza
A knowledge management audit (KMA) is the first phase in knowledge management implementation. Incomplete or incomprehensive execution of the KMA has caused many knowledge management programs to fail. A study was undertaken to investigate how KMAs are performed systematically in organizations and present a comprehensive model for performing KMAs based on a systematic review. Studies were identified by searching electronic databases such as Emerald, LISA, and the Cochrane library and e-journals such as the Oxford Journal and hand searching of printed journals, theses, and books in the Tehran University of Medical Sciences digital library. The sources used in this study consisted of studies available through the digital library of the Tehran University of Medical Sciences that were published between 2000 and 2013, including both Persian- and English-language sources, as well as articles explaining the steps involved in performing a KMA. A comprehensive model for KMAs is presented in this study. To successfully execute a KMA, it is necessary to perform the appropriate preliminary activities in relation to the knowledge management infrastructure, determine the knowledge management situation, and analyze and use the available data on this situation.
Jeffrey J. P. Tsai
It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through the validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve their performance is to execute and reason over the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A logic-based specification is first processed by a data dependency analysis technique that finds all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, so it reduces the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.
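The two evaluation strategies that the hybrid model combines can be sketched on ground propositional rules: bottom-up (forward chaining) derives every consequence to a fixpoint, while top-down (backward chaining) explores only the branches relevant to one goal and can fail fast on a dead branch. The rules below are a made-up toy example, not the FRORL language.

```python
# Toy ground rules: each rule is (body_atoms, head_atom).
facts = {"a", "b"}
rules = [({"a", "b"}, "c"), ({"c"}, "d"), ({"e"}, "f")]

def bottom_up(facts, rules):
    """Forward chaining: derive everything to a fixpoint."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in known and body <= known:
                known.add(head)
                changed = True
    return known

def top_down(goal, facts, rules, seen=None):
    """Backward chaining: prove only the requested goal."""
    seen = seen or set()
    if goal in facts:
        return True
    if goal in seen:            # guard against cycles
        return False
    seen = seen | {goal}
    return any(all(top_down(sub, facts, rules, seen) for sub in body)
               for body, head in rules if head == goal)

print(sorted(bottom_up(facts, rules)))   # ['a', 'b', 'c', 'd']
print(top_down("d", facts, rules), top_down("f", facts, rules))  # True False
```

Note how the top-down query for "f" fails as soon as the unprovable subgoal "e" is reached, without deriving anything else; exploiting that early failure while keeping bottom-up's completeness is, in spirit, what the hybrid model does.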
Hull, Richard; Thiemann, Peter; Wadler, Philip
The world-wide web raises a variety of new programming challenges. To name a few: programming at the level of the web browser, data-centric approaches, and attempts to automatically discover and compose web services. This seminar brought together researchers from the web programming and web services communities and strove to engage them in communication with each other. The seminar was held in an unusual style, in a mixture of short presentations and in-depth discussio...
In this paper the following tasks are considered: evaluation of the lifetime of Kozloduy NPP units 3 and 4; the programme for assuring the units' lifetime; and the renewal of units 3 and 4. The main activities for the programme implementation are described and the obtained results are presented. In conclusion, the executed activities of the programme for assuring the lifetime of units 3 and 4 of Kozloduy NPP convincingly prove that the lifetime of structures, systems and components is duly assured, and that those structures, systems and components will remain in service safely, cost-effectively and reliably until the end of the 30-year design lifetime. For some of them this has been proved even for 35 and 40 years. Programme activities continued during 2005, although an early shutdown of units 3 and 4 is possible
Tsai, Jeffrey J P
Parallel processing is a very important technique for improving the performance of various software development and maintenance activities. The purpose of this book is to introduce important techniques for the parallel execution of high-level specifications of software systems. These techniques are very useful for the construction, analysis, and transformation of reliable large-scale and complex software systems. Contents: Current Approaches; Overview of the New Approach; FRORL Requirements Specification Language and Its Decomposition; Rewriting and Data Dependency, Control Flow Analysis of a Lo
The mathematical background for a multiport-network-solving program is described. A method for accurately numerically modeling an arbitrary, continuous, multiport transmission line is discussed. A modification to the transmission-line equations to accommodate multiple rf drives is presented. An improved model for the radio-frequency quadrupole (RFQ) accelerator that corrects previous errors is given. This model permits treating the RFQ as a true eight-port network for simplicity in interpreting the field distribution and ensures that all modes propagate at the same velocity in the high-frequency limit. The flexibility of the multiport model is illustrated by simple modifications to otherwise two-dimensional systems that permit modeling them as linear chains of multiport networks
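The cascading of network sections that underlies such a solver can be illustrated with ordinary two-port ABCD matrices: the matrix of a lossless transmission-line section is standard, and sections chain by matrix multiplication. This is a textbook-level sketch, far simpler than the report's eight-port RFQ model.

```python
# Two-port ABCD cascade sketch for transmission-line sections.
import cmath

def line_abcd(z0, beta_l):
    """ABCD matrix of a lossless line: [[cos bl, jZ0 sin bl], [j sin bl / Z0, cos bl]]."""
    c, s = cmath.cos(beta_l), cmath.sin(beta_l)
    return [[c, 1j * z0 * s],
            [1j * s / z0, c]]

def cascade(m1, m2):
    """Chain two two-port networks by 2x2 matrix multiplication."""
    return [[m1[0][0] * m2[0][0] + m1[0][1] * m2[1][0],
             m1[0][0] * m2[0][1] + m1[0][1] * m2[1][1]],
            [m1[1][0] * m2[0][0] + m1[1][1] * m2[1][0],
             m1[1][0] * m2[0][1] + m1[1][1] * m2[1][1]]]

# Two quarter-wave sections (beta*l = pi/2 each) cascade to a half-wave
# section, whose ABCD matrix is minus the identity: the line repeats its load.
quarter = line_abcd(50.0, cmath.pi / 2)
half = cascade(quarter, quarter)
print(round(half[0][0].real), round(half[0][1].real))  # -1 0
```

A multiport solver generalizes this idea from 2x2 matrices to larger port counts, which is what allows the RFQ to be treated as a true eight-port network.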
This annual report of the Executive Director, UN Environment Program, to the Governing Council of the United Nations focuses on four topics of global significance: schistosomiasis, pesticide resistance, noise pollution, and tourism. The four topics, while not the only urgent ones, are important contemporary problems associated with the impacts of development and environmental management. The pressures of man's efforts to increase the agricultural base through irrigation and chemical pesticides have resulted in an unprecedented spread of infectious schistosomiasis and pesticide pollution. Industrialization and urbanization have raised noise levels until they represent serious health hazards. International tourism has grown to such proportions that hundreds of millions of individual trips are taken by the general public each year. The positive and negative aspects of these developments are examined in hopes of stimulating discussions that will lead to more desirable planning and management. 81 references. (DCK)
This report summarizes enhancements and modifications to PROGRAM FDTD executable on the Cray X-MP computer system. Specifically, the tasks defined and performed under this effort are: revision of the material encoding/decoding scheme to allow material type specification on an individual cell basis; modification of the I/O buffering scheme to maximize the use of available central memory and minimize the number of physical I/O accesses; user interface enhancements, providing enhanced input/output features for greater flexibility; increased modularity, dividing the code into additional modules for ease of maintenance and future enhancements; and assistance in the conversion and testing of FDTD to Floating Point Systems scientific computers and associated peripheral devices.
Kraybill, Matthew L; Suchy, Yana
Assessing functional independence is an important part of making diagnostic decisions and treatment recommendations but is often complicated by the limitations of self-report and behavioral measures. Alternatively, it may be worthwhile to investigate neurocognitive correlates of incipient functional declines including using tests of executive functioning (EF) and motor programming (MP). The current study examined an electronic MP task and pitted it against other assessment instruments to evaluate its relative utility in assessing both EF and functional independence. Participants were 72 community-dwelling older adults. Results of this study showed that the MP task was correlated with other measures of EF, an efficient and reliable predictor of functionality, useful for identifying at-risk patients, and comparable to a longer battery in terms of sensitivity and specificity.
Scheibler, Thorsten; Leymann, Frank
One of the predominant problems IT companies face today is Enterprise Application Integration (EAI). Most infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance among non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way, so they can be used to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually; they are not intended as artefacts that can be executed directly. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. To this end, we introduce a continuous tool chain beginning at the design phase and ending with the execution of an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is used for modelling and actually executing an integration scenario.
Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.
In this paper an executable generic process model is proposed for combined verbal and non-verbal communication processes and their interaction. The model has been formalised by three-levelled partial temporal models, covering both the material and mental processes and their relations. The generic
Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.; Dignum, F.; Greaves, M.
In this paper an executable generic process model is proposed for combined verbal and non-verbal communication processes and their interaction. The model has been formalised by three-levelled partial temporal models, covering both the material and mental processes and their relations. The generic
The design, development and analysis of the 7.3 MW MOD-5A wind turbine generator covering work performed between July 1980 and June 1984 is discussed. The report is divided into four volumes: Volume 1 summarizes the entire MOD-5A program, Volume 2 discusses the conceptual and preliminary design phases, Volume 3 describes the final design of the MOD-5A, and Volume 4 contains the drawings and specifications developed for the final design. Volume 1, the Executive Summary, summarizes all phases of the MOD-5A program. The performance and cost of energy generated by the MOD-5A are presented. Each subsystem - the rotor, drivetrain, nacelle, tower and foundation, power generation, and control and instrumentation subsystems - is described briefly. The early phases of the MOD-5A program, during which the design was analyzed and optimized, and new technologies and materials were developed, are discussed. Manufacturing, quality assurance, and safety plans are presented. The volume concludes with an index of volumes 2 and 3.
Jennifer Lynn Bizon
Executive functions supported by prefrontal cortical systems provide essential control and planning mechanisms to guide goal-directed behavior. As such, age-related alterations in executive functions can mediate profound and widespread deficits on a diverse array of neurocognitive processes. Many of the critical neuroanatomical and functional characteristics of the prefrontal cortex are preserved in rodents, allowing for meaningful cross-species comparisons relevant to the study of cognitive aging. In particular, as rodents lend themselves to genetic, cellular, and biochemical approaches, rodent models of executive function stand to significantly contribute to our understanding of the critical neurobiological mechanisms that mediate decline of executive processes across the lifespan. Moreover, rodent analogues of executive functions that decline in human aging represent an essential component of a targeted, rational approach for developing and testing effective treatment and prevention therapies for age-related cognitive decline. This paper reviews behavioral approaches used to study executive function in rodents, with a focus on those assays that share a foundation in the psychological and neuroanatomical constructs important for human aging. A particular emphasis is placed on behavioral approaches used to assess working memory and cognitive flexibility, which are sensitive to decline with age across species and for which strong rodent models currently exist. In addition, other approaches in rodent behavior that have potential for providing analogues to functions that reliably decline in human aging (e.g., information processing speed) are discussed.
Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram; Manzano Franco, Joseph B.; Vishnu, Abhinav; Hoisie, Adolfy
Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has comparable performance to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Using this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe that these lessons will be beneficial to researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.
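The gap the study reports between static scheduling and work stealing can be illustrated with a toy makespan simulation. This is a hedged Python sketch with invented task costs and worker counts, not the authors' HPC implementation:

```python
import collections

def run_static(tasks, n_workers):
    """Static scheduling: tasks are pre-partitioned round-robin;
    a worker that finishes early stays idle."""
    chunks = [tasks[i::n_workers] for i in range(n_workers)]
    # Makespan is the finish time of the most loaded worker.
    return max(sum(chunk) for chunk in chunks)

def run_work_stealing(tasks, n_workers):
    """Work stealing: an idle worker steals from the tail of the
    deque of the peer with the most remaining work."""
    deques = [collections.deque(tasks[i::n_workers]) for i in range(n_workers)]
    clock = [0] * n_workers  # accumulated busy time per worker
    while any(deques):
        w = min(range(n_workers), key=lambda i: clock[i])  # next idle worker
        if deques[w]:
            clock[w] += deques[w].popleft()       # run own task
        else:
            victim = max(range(n_workers), key=lambda i: len(deques[i]))
            if deques[victim]:
                clock[w] += deques[victim].pop()  # steal from the tail
    return max(clock)
```

With a skewed workload, the idle worker drains the loaded worker's deque and the makespan shrinks; with perfectly balanced work the two strategies coincide, mirroring the intuition behind the reported 50 percent improvement.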
McLean, Marsha R; Morahan, Page S; Dannels, Sharon A; McDade, Sharon A
To explore whether geographic mobility is associated with career advancement of women in U.S. medical schools who are entering mid- to executive-level positions. Using an existing dataset of 351 participants in academic medicine who attended the Executive Leadership in Academic Medicine (ELAM) Program for Women (1996-2005) (adjusted to 345 participants in some analyses because data on initial faculty rank were missing), the authors conducted a quantitative study in 2009 to determine whether geographic mobility was associated with administrative promotion for those who relocated geographically (from employer while attending ELAM to employer at last job of record). Twenty-four percent of women (83/345) relocated geographically (movers) after attending ELAM. Moving had a positive association with career advancement (P = .001); odds for promotion were 168% higher for movers than for stayers [odds ratio Exp(β) = 2.684]. Movers attained higher administrative positions (P = .003), and more movers (60%) were promoted at the most recent job compared with stayers (40%) (P = .0001). Few movers changed city size; 70% already resided in large or urban cities where most medical schools are located. Age was not a barrier to mobility. Career advancement was not related to research reputation (National Institutes of Health grant award ranking) of participants' schools (either at time of attending ELAM or post-ELAM). Similar to findings outside academic medicine, 24% of women classified as geographic "movers" among midcareer faculty in medical schools attained career advantages. Psychosocial and socioeconomic factors underlying women's relocation decisions require additional study.
Beard, John; Yaprak, Attila
A content analysis model for assessing advertising themes and messages generated primarily for United States markets to overcome barriers in the cultural environment of international markets was developed and tested. The model is based on three primary categories for generating, evaluating, and executing advertisements: rational, emotional, and…
Willemsen, Timo; Feenstra, Anton; Groth, Paul
The amount of biological data exposed in semantic formats is steadily increasing. In particular, pathway information (a model of how molecules interact within a cell) from databases such as KEGG and WikiPathways are available in a standard RDF-based format BioPAX. However, these models are
Kluge, Florian; Schoeberl, Martin; Ungerer, Theo
The logical execution time (LET) model increases the compositionality of real-time task sets. Removal or addition of tasks does not influence the communication behavior of other tasks. In this work, we extend a multicore operating system running on a time-predictable multicore processor to support the LET model. For communication between tasks we use message passing on a time-predictable network-on-chip to avoid the bottleneck of shared memory. We report our experiences and present results on the costs in terms of memory and execution time.
Dasgupta, Aniruddha; Krishna, Aneesh; Ghose, Aditya K.
Agent-oriented conceptual modeling notations are highly effective in representing requirements from an intentional stance and answering questions such as what goals exist, how key actors depend on each other, and what alternatives must be considered. In this chapter, we review an approach to executing i* models by translating these into a set of interacting agents implemented in the CASO language, and suggest how we can perform reasoning with requirements modeled (both functional and non-functional) using i* models. We particularly incorporate deliberation into the agent design, which allows us to benefit from the complementary representational capabilities of the two frameworks.
The purpose of this research is to examine the relationship between the elements of the Eight "S" model that affect strategic implementation and the results achieved by companies. The main research question, to which the author sought an answer, was whether there was a relationship between individual elements that affect strategy implementation and the effects it brings in revenue growth. The survey covered 200 of the best-ranked Polish companies (where revenues constituted one of the ranking criteria) in which the level of strategic implementation was considered satisfactory. Testing of the research hypotheses has shown that the factors defined as Resources and Shared Values have a minor impact on the strategy implementation. The research has also shown that there is an additional element that could be incorporated into the model: the system of informal communication. In addition, the paper describes the interrelations between elements of the model.
Jørgensen, Jens Bæk; Christensen, Søren
UML is applied in the design of a pervasive healthcare middleware system for the hospitals in Aarhus County, Denmark. It works well for the modelling of static aspects of the system, but with respect to describing the behaviour, UML is not sufficient. This paper explains why and, as a remedy, su...
Walk, Laura M; Evers, Wiebke F; Quante, Sonja; Hille, Katrin
Executive functions (EFs) play a critical role in cognitive and social development. During preschool years, children show not only rapid improvement in their EFs, but also appear sensitive to developmentally appropriate interventions. EMIL is a training program for German preschool teachers that was developed and implemented to improve the EFs of preschoolers. The aim of the present study was to evaluate its effects on the EFs of children between three and six years old. The teacher training (eight sessions, 28.5 hours) was implemented in four preschools. The EFs of children of the intervention group (n = 72, 32 girls, Mage = 48 months) and the control group of four other matched preschools (n = 61, 27 girls, Mage = 48 months) were tested before, during, and after the intervention using different measures assessing working memory, inhibitory control, and cognitive flexibility. The intervention group showed significant gains on three out of seven EF tests (behavioral inhibition, visual-spatial working memory, and combined EFs) compared to the control group. Post hoc analyses for children with low initial EFs scores revealed that participation in the intervention led to significant gains in inhibitory control, visual-spatial working memory, and phonological working memory as well as a marginally significant difference for combined EFs. However, effect sizes were rather small. The results suggest that teacher training can lead to significant improvements in preschoolers' EFs. Although preliminary, the results could contribute to the discussion on how teacher training can facilitate the improvement of EFs in preschool children.
Zaid, Farid; Berbner, Rainer; Steinmetz, Ralf
Business processes executed using compositions of distributed Web Services are susceptible to different fault types. The Web Services Business Process Execution Language (BPEL) is widely used to execute such processes. While BPEL provides fault handling mechanisms to handle functional faults like invalid message types, it still lacks a flexible native mechanism to handle non-functional exceptions associated with violations of QoS levels that are typically specified in a governing Service Level Agreement (SLA). In this paper, we present an approach to complement BPEL's fault handling, where expected QoS levels and necessary recovery actions are specified declaratively in the form of Event-Condition-Action (ECA) rules. Our main contribution is leveraging BPEL's standard event model, which we use as an event space for the created ECA rules. We validate our approach by an extension to an open source BPEL engine.
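A hedged sketch of the Event-Condition-Action idea follows; the rule names, payload fields, and SLA threshold are invented for illustration and are not BPEL syntax or the authors' rule language:

```python
from dataclasses import dataclass
from typing import Callable

# An ECA rule listens on the process's event space and fires a
# recovery action when a QoS condition from the SLA is violated.
@dataclass
class EcaRule:
    event: str                          # e.g. "invoke_completed" (hypothetical)
    condition: Callable[[dict], bool]   # predicate over the event payload
    action: Callable[[dict], str]       # recovery action, returns a label

def dispatch(rules, event, payload):
    """Fire every rule matching the event whose condition holds;
    return the list of triggered recovery actions."""
    return [r.action(payload) for r in rules
            if r.event == event and r.condition(payload)]

# Example SLA clause: an invoke must answer within 2000 ms, else retry.
rules = [EcaRule("invoke_completed",
                 lambda p: p["latency_ms"] > 2000,
                 lambda p: f"retry:{p['partner']}")]
```

The point of the separation is that QoS policy lives in declarative rules while the BPEL process itself stays unchanged; swapping the SLA means swapping the rule set.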
In recent years, the combination of several filtering techniques for the development of anti-spam systems has gained enormous popularity. However, although the accuracy achieved by these models has increased considerably, their use has brought new challenges such as the need to reduce the excessive use of computational resources, to increase filtering speed, and to adjust the weights used for the combination of several filtering techniques. In order to achieve this goal we have refined several aspects, including: (i) the design and development of small technical improvements to increase the overall performance of the filter, (ii) the application of genetic algorithms to increase filtering accuracy, and (iii) the use of scheduling algorithms to improve filtering throughput.
Gray, Justin S.; Briggs, Jeffery L.
The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
Model Driven Engineering (MDE) places models at the heart of the software engineering process. MDE helps managing the complexity of software systems and improving the quality of the development process. The Model Driven Architecture (MDA) initiative from the Object Management Group (OMG) defines a framework for building design flows in the context of MDE. MDA relies heavily on formalisms which are normalized by the OMG, such as UML for modeling, QVT for model transformations and so on. This work deals with the execution semantics of the UML language applied to embedded real-time applications. In this context, the OMG has a norm which defines an execution model for a subset of UML called fUML (foundational UML subset). This execution model gives a precise semantics to UML models, which can be used for analyzing models, generating code, or verifying transformations. The goal of this PhD thesis is to define and build an execution engine for UML models of embedded real-time systems, which takes into account the explicit hypothesis made by the designer about the execution semantics at a high level of abstraction, in order to be able to execute models as early as possible in the design flow of a system. To achieve this goal, we have extended the fUML execution model along three important axes with regard to embedded real-time systems: - Concurrency: fUML does not provide any mechanism for handling concurrent activities in its execution engine. We address this issue by introducing an explicit scheduler which allows us to control the execution of concurrent tasks. - Time: fUML does not provide any means to handle time. By adding a clock to the model of execution, we can take into account the elapsed time as well as temporal constraints on the execution of activities. - Profiles: fUML does not take profiles into account, which makes it difficult to personalize the execution engine with new semantic variants. The execution engine we propose allows the use of UML models with
.... As public servants, whether elected or non-elected, Executive Branch employees are expected to make decisions and spend tax-payer dollars in ways that promote the overall interests of the American public...
In his State of the Union address on January 31, 1990, President Bush set a goal for US students to be number one in the world in mathematics and science achievement by the year 2000. The Teachers Academy for Mathematics and Science in Chicago is an experiment of unprecedented boldness and scale that can provide a means to the President's goal, both for the Chicago area and as a national model. This document covers organization and governance, program activities, future training goals, and evaluation programs.
Gil, Y.; Duffy, C.
This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a pdf, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future landuse or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a-priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
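The model-guided idea can be illustrated with a minimal sketch: abstract the protocol as a finite state machine, then search it for the shortest message sequence that reaches a deep state, which a symbolic executor could then explore from. All states and messages below are invented for the sketch, not taken from the paper:

```python
from collections import deque

# Hypothetical FSM for a toy stateful protocol: (state, message) -> next state.
FSM = {
    ("INIT",    "HELLO"): "GREETED",
    ("GREETED", "AUTH"):  "AUTHED",
    ("AUTHED",  "OPEN"):  "SESSION",
    ("SESSION", "DATA"):  "SESSION",
    ("SESSION", "CLOSE"): "INIT",
}

def guide_to_state(start, target):
    """BFS over the FSM for the shortest message sequence reaching
    `target`; returns None if the state is unreachable.  Guiding
    symbolic execution along this prefix avoids blindly fuzzing
    from the initial state."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, msgs = queue.popleft()
        if state == target:
            return msgs
        for (src, msg), dst in FSM.items():
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append((dst, msgs + [msg]))
    return None
```

In the paper's setting the prefix would be replayed concretely to drive the implementation into the deep state before symbolic exploration of the remaining paths begins.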
Mark, S.; Khomchenko, S.; Shifrin, M.; Haviv, Y.; Schwartz, J.R.; Orion, I.
We at the Negev Monte Carlo Research Center (NMCRC) have developed a powerful new interface for writing and executing FLUKA input files: TVF-NMCRC. With the TVF tool a FLUKA user has the ability to easily write an input file without requiring any previous experience. The TVF-NMCRC tool is a LINUX program that has been verified for the most common LINUX-based operating systems, and is suitable for the latest version of FLUKA (FLUKA 2006.3).
Willcock, J J; Lumsdaine, A; Quinlan, D J
Tabled execution is a generalization of memoization developed by the logic programming community. It not only saves results from tabled predicates, but also stores the set of currently active calls to them; tabled execution can thus provide meaningful semantics for programs that seemingly contain infinite recursions with the same arguments. In logic programming, tabled execution is used for many purposes, both for improving the efficiency of programs and for making tasks simpler and more direct to express than with normal logic programs. However, tabled execution is only infrequently applied in mainstream functional languages such as Scheme. We demonstrate an elegant implementation of tabled execution in Scheme, using a mix of continuation-passing style and mutable data. We also show the use of tabled execution in Scheme for a problem in formal language and automata theory, demonstrating that tabled execution can be a valuable tool for Scheme users.
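The core mechanism, a memo table plus a set of active calls that cuts otherwise-infinite recursions, can be sketched outside Scheme as well. The following Python decorator is a hedged, single-pass approximation (real tabling engines additionally iterate answers to a fixpoint, which this sketch omits), not the paper's continuation-passing implementation:

```python
import functools

def tabled(fn):
    """Cache results like memoization, but also track the set of active
    calls: a recursive re-entry with the same argument yields an empty
    answer set instead of looping forever."""
    table, active = {}, set()

    @functools.wraps(fn)
    def wrapper(arg):
        if arg in table:
            return table[arg]
        if arg in active:            # recursive re-entry: cut the cycle
            return frozenset()
        active.add(arg)
        try:
            table[arg] = fn(arg)
        finally:
            active.discard(arg)
        return table[arg]
    return wrapper

# Reachability in a cyclic graph: naive recursion would never terminate.
GRAPH = {"a": ["b"], "b": ["c", "a"], "c": []}

@tabled
def reachable(node):
    out = {node}
    for succ in GRAPH[node]:
        out |= reachable(succ)
    return frozenset(out)
```

The call reachable("a") terminates despite the a -> b -> a cycle because the re-entrant call to reachable("a") is detected in the active set and contributes nothing new.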
Douglas M. Pase
Many programming models for massively parallel machines exist, and each has its advantages and disadvantages. In this article we present a programming model that combines features from other programming models that (1) can be efficiently implemented on present and future Cray Research massively parallel processor (MPP) systems and (2) are useful in constructing highly parallel programs. The model supports several styles of programming: message-passing, data parallel, global address (shared data), and work-sharing. These styles may be combined within the same program. The model includes features that allow a user to define a program in terms of the behavior of the system as a whole, where the behavior of individual tasks is implicit from this systemic definition. (In general, features marked as shared are designed to support this perspective.) It also supports an opposite perspective, where a program may be defined in terms of the behaviors of individual tasks, and a program is implicitly the sum of the behaviors of all tasks. (Features marked as private are designed to support this perspective.) Users can exploit any combination of either set of features without ambiguity and thus are free to define a program from whatever perspective is most appropriate to the problem at hand.
Dekmezian, Mhair; Beal, Stacy G; Damashek, Mary Jane; Benavides, Raul; Dhiman, Neelam
Successful performance and execution of rapid diagnostics in a clinical laboratory hinges heavily on careful validation, accurate and timely communication of results, and real-time quality monitoring. Laboratories must develop strategies to integrate diagnostics with stewardship and evidence-based clinical practice guidelines. We present a collaborative SUCCESS model for execution and monitoring of rapid sepsis diagnostics to facilitate timely treatment. Six months after execution of the Verigene Gram-Positive Blood Culture (BC-GP) and the AdvanDx PNA-FISH assays, data were collected on 579 and 28 episodes of bacteremia and fungemia, respectively. Clinical testing was executed using a SUCCESS model comprising the following components: stewardship, utilization of resources, core strategies, concierge services, education, support, and surveillance. Stewardship needs were identified by evaluating the specialty services benefiting from new testing. Utilization of resources was optimized by reviewing current treatment strategies and antibiogram and formulary options. Core strategies consisted of input from infectious disease leadership, pharmacy, and laboratory staff. Concierge services included automated Micro-eUpdate and physician-friendly actionable reports. Education modules were user-specific, and support was provided through a dedicated 24/7 microbiology hotline. Surveillance was performed by daily audit by the director. Using the SUCCESS model, the turnaround time for the detailed report with actionable guidelines to the physician was ∼3 hours from the time of culture positivity. The overall correlation between rapid methods and culture was 94% (546/579). Discrepant results were predominantly contaminants such as a coagulase-negative staphylococci or viridans streptococci in mixed cultures. SUCCESS is a cost-effective and easily adaptable model for clinical laboratories with limited stewardship resources.
Anderson, Matthew; Kaiser, Hartmut; Neilsen, David; Sterling, Thomas
The addition of nuclear and neutrino physics to general relativistic fluid codes allows for a more realistic description of hot nuclear matter in neutron star and black hole systems. This additional microphysics requires that each processor have access to large tables of data, such as equations of state, and in large simulations the memory required to store these tables locally can become excessive unless an alternative execution model is used. In this talk we present neutron star evolution results obtained using a message driven multi-threaded execution model known as ParalleX as an alternative to using a hybrid MPI-OpenMP approach. ParalleX provides the user a new way of computation based on message-driven flow control coordinated by lightweight synchronization elements which improves scalability and simplifies code development. We present the spectrum of radial pulsation frequencies for a neutron star with the Shen equation of state using the ParalleX execution model. We present performance results for an open source, distributed, nonblocking ParalleX-based tabulated equation of state component capable of handling tables that may even be too large to read into the memory of a single node.
O. V. Lavrukhin
Purpose. The aim of this research is to develop an intelligent technology for determining the optimal route for handling freight trains on the basis of technical and technological parameters. This will allow the station duty officer to receive operational, informed decisions regarding the execution of train operations within the railway station. Methodology. The main elements of the research are the technical and technological parameters of the train station during train operation. The methods of neural networks, used to form a self-teaching automated system, were put at the basis of the generated model of train operation execution. Findings. The presented model of train operation execution at the railway station is realized on the basis of artificial neural networks, using a supervised («teacher») learning algorithm, in the Matlab environment. Matlab is also used for the immediate implementation of the intelligent automated control system of train operation, designed for integration into the automated workplace of the station duty officer. The developed system is also useful to integrate into the workplace of the traffic controller; this proposal is viable where centralized traffic control is available on a separate section of railway track. Originality. A model of train station operation during train execution with elements of artificial intelligence was formed. It provides informed decisions to the station duty officer concerning the choice of a rational and safe option for the reception and non-stop running of trains, with the ability to self-learn and adapt to changing conditions; this is achieved by the principles of neural network functioning. Practical value. A model of the intelligent management system for process control, determining the optimal reception route for different categories of trains, was formed. In operational mode it offers the possibility
Air War College Research Report No. AU-AWC-35-011, AFCOMS: Does This SQA Need an Executive ... high incidence of safety and physical security violations. Serious deficiencies existed in the financial management area. Accounting errors were not
Full Text Available To facilitate analysis and understanding of biological systems, large-scale data are often integrated into models using a variety of mathematical and computational approaches. Such models describe the dynamics of the biological system and can be used to study the changes in the state of the system over time. For many model classes, such as discrete or continuous dynamical systems, there exist appropriate frameworks and tools for analyzing system dynamics. However, the heterogeneous information that encodes and bridges molecular and cellular dynamics, inherent to fine-grained molecular simulation models, presents significant challenges to the study of system dynamics. In this paper, we present an algorithmic information theory based approach for the analysis and interpretation of the dynamics of such executable models of biological systems. We apply a normalized compression distance (NCD) analysis to the state representations of a model that simulates immune decision making and immune cell behavior. We show that this analysis successfully captures the essential information in the dynamics of the system, which results from a variety of events including proliferation, differentiation, or perturbations such as gene knock-outs. We demonstrate that this approach can be used for the analysis of executable models, regardless of the modeling framework, and for making experimentally quantifiable predictions.
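The NCD used in this analysis has a standard definition that is easy to sketch. The snippet below uses zlib as the compressor and invented state strings; the paper's actual compressor and state encoding may differ.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(s) is the compressed length of s."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Serialized model states that share structure compress well together,
# so their NCD is small; unrelated states yield a larger NCD.
# These state strings are hypothetical illustrations.
state_a = b"cellA:proliferate;cellB:differentiate;" * 50
state_b = b"cellA:proliferate;cellB:differentiate;" * 50
state_c = bytes(range(256)) * 8
```

Because NCD needs only a generic compressor, it applies to state dumps from any executable-model framework, which is the property the paper exploits.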
Denardo, Eric V
Introduction to sequential decision processes covers use of dynamic programming in studying models of resource allocation, methods for approximating solutions of control problems in continuous time, production control, more. 1982 edition.
Caroline de Oliveira Cardoso
Full Text Available ABSTRACT Objective: The goal of this study was to describe the construction process and content validity evidence of an early, preventive intervention program for stimulating executive functions (EF) in Elementary School children within the school environment. Methods: The process followed the recommended steps for creating neuropsychological instruments: an internal phase of program organization, with a literature search and analysis of materials available in the classroom; program construction; analysis by expert judges; and data integration and program finalization. To determine the level of agreement among the judges, a Content Validity Index (CVI) was calculated. Results: Content validity was evidenced by the agreement among the experts with regard to the program, both in general and for each activity. All steps taken were deemed necessary because they contributed to the identification of positive aspects and possible flaws in the process. Conclusion: The steps also helped to adapt stimuli and improve program tasks and activities. The methodological procedures implemented in this study can be adopted by other researchers to create or adapt neuropsychological stimulation and rehabilitation programs. Furthermore, the methodological approach allows the reader to understand, in detail, the technical and scientific rigor adopted in devising this program.
Full Text Available The paper analyses the effectiveness of the leadership models and behaviours used by sales executives. Although the impact of the characteristics and behaviours most common in sales has been researched thoroughly, recent research offers only very limited results comparing the most successful leadership personality traits and behaviours across company situations and contexts. In this paper we analyse two different situations: a dynamic environment and a stable environment. The context in this paper is area sales. The paper explains the most successful leadership style used in sales, its characteristics, and why they are also very useful in a sales process. Additionally, we analyse the effectiveness of these characteristics in the two different company situations, exclusively in the sales context, and ask whether certain characteristics can be defined as the most effective in every situation and context. Based on a literature review, we develop a theoretical overview that clearly shows the best leadership traits and behaviours for each of the two situations in sales. The theoretical frameworks are adjusted and confirmed with three senior sales executives from three different industries. The results of this paper provide sales executives with useful and easy-to-understand information about the advantages and disadvantages of the different leadership traits and behaviours depending on the context and situation.
Curran, R. T.
A flight computer functional executive design for the reusable shuttle is presented. The design is given in the form of functional flowcharts and prose description. Techniques utilized in the regulation of process flow to accomplish activation, resource allocation, suspension, termination, and error masking based on process primitives are considered. Preliminary estimates of main storage utilization by the Executive are furnished. Conclusions and recommendations for timely, effective software-hardware integration in the reusable shuttle avionics system are proposed.
In 1993, I tested a radio-controlled airplane designed by Jim Walker of Brigham Young University for low-elevation aerial photography. Model-air photography retains most of the advantages of standard aerial photography --- the photographs can be used to detect lineaments, to map roads and buildings, and to construct stereo pairs to measure topography --- and it is far less expensive. Proven applications on the Oak Ridge Reservation include: updating older aerial records to document new construction; using repeated overflights of the same area to capture seasonal changes in vegetation and the effects of major storms; and detecting waste trench boundaries from the color and character of the overlying grass. Aerial photography is only one of many possible applications of radio-controlled aircraft. Currently, I am funded by the Department of Energy's Office of Technology Development to review the state of the art in microavionics, both military and civilian, to determine ways this emerging technology can be used for environmental site characterization. Being particularly interested in geophysical applications, I am also collaborating with electrical engineers at Oak Ridge National Laboratory to design a model plane that will carry a 3-component flux-gate magnetometer and a global positioning system, which I hope to test in the spring of 1994
Martin FG Schaffernicht
Full Text Available This article contributes a reference set of causal attributions made by vineyard executives in Chile, where increasing costs and stagnating prices challenge the vineyards' profits. The investigation was motivated by the question of how executives interpret the industry's mid-term future and how they reflect on steering their companies. Based on in-depth interviews, causal maps were elaborated to represent the executives' mental models. These are represented as sequences of attributions, connecting variables by causal links. It was found that some mental models guide policies intended to increase prices, whereas other models suggest taking prices as given and controlling costs. The collection of causal attributions of the vineyard executives (CAVE) has been made publicly available. As a result, CAVE can be used by other management scholars to elicit other executives' mental models and increase the available data base. Since such research will be cumulative, a minimum size for meaningful statistical analysis can be reached, opening up an avenue for improving the design of business policies. CAVE can also serve executives and consultants in constructing causal argumentations and business policies. Future research and development of supporting software are called for. Keywords: Mental models, Strategy, Business model
Full Text Available Globally, blended learning (BL) technologies have been increasingly applied in a variety of fields, in both the public and private sectors. In recent years, universities, businesses, and other organizations have employed blended learning methods and technologies in training and re-training professionals in the workforce. In Malaysia, the increasing use of blended learning to enhance learning and enrich soft skills among professionals in the workplace is evident. The advancement of technology has opened many new avenues and tools for learning and teaching, and it is the coalescing of these various technologies with particular pedagogy or andragogy that has helped to popularize BL. However, when an institution makes the critical choice of delivery methods, the university needs to consider various success factors. One in particular is a student-centered approach, which entails understanding the students as the beneficiaries of learning and the support system they need to help them learn. This qualitative study reports in detail the experience of a small group of students undertaking Executive Diplomas at the Executive Development Centre (EDC), Universiti Utara Malaysia, as they progress through their Executive program. This paper looks at learning experiences as described by the learners: it is their story, their experience, and their perspective. This study suggests that BL offers a comfortable middle ground and has much potential in higher education in Malaysia. It is a pedagogical alternative that could play a significant role not only in teaching Business Communication but also in promoting lifelong learning initiatives in Malaysia in a more meaningful and inviting way. Although this study shows that BL contributed significant and meaningful learning, particularly for adult learners, more definitive studies are needed. Such information can be used to guide policy makers
Executive Energy Leadership Academy: NREL's Executive Energy Leadership Academy is a nationally renowned program for non-technical business, governmental, and other leaders. The Leadership Program is designed for community and industry leaders with an
The U.S. Department of Energy's Office of Propulsion Systems provides support for an Electrochemical Energy Storage Program that includes research and development (R&D) on advanced rechargeable batteries and fuel cells. A major goal of this program is to develop electrochemical power sources suitable for application in electric vehicles (EVs). The program centers on advanced systems that offer the potential for high performance and low life-cycle costs, both of which are necessary to permit significant penetration into commercial markets. The DOE Electrochemical Energy Storage Program is divided into two projects: the Electric Vehicle Advanced Battery Systems (EVABS) Development Program and the Exploratory Technology Research (ETR) Program. The EVABS Program management responsibility has been assigned to Sandia National Laboratories (SNL); Lawrence Berkeley Laboratory (LBL) is responsible for management of the ETR Program. The EVABS and ETR Programs include an integrated matrix of R&D efforts designed to advance progress on selected candidate electrochemical systems. The United States Advanced Battery Consortium (USABC), a tripartite undertaking between DOE, the U.S. automobile manufacturers and the Electric Power Research Institute (EPRI), was formed in 1991 to accelerate the development of advanced batteries for consumer EVs. The role of the ETR Program is to perform supporting research on the advanced battery systems under development by the USABC and EVABS Program, and to evaluate new systems with potentially superior performance, durability and/or cost characteristics. The specific goal of the ETR Program is to identify the most promising electrochemical technologies and transfer them to the USABC, the battery industry and/or the EVABS Program for further development and scale-up. This report summarizes the research, financial and management activities relevant to the ETR Program in CY 1993
Jørgensen, Jens Bæk
Executable Use Cases (EUCs) is a model-based approach to requirements engineering. In the introduction to this paper, we briefly discuss how EUCs may be used as a supplement to Model-Driven Development (MDD). Then we present the EUC approach in more detail. An EUC can describe and link user-level requirements and more technical software specifications. In MDD, user-level requirements are not always explicitly described; it is sufficient for MDD that a specification, or platform-independent model, of the software that we are going to develop is provided. Therefore, a combination of EUCs and MDD may have the potential to cover the full software engineering path from user-level requirements via specifications to implementations of running computer systems.
Bignoux, Stephane; Sund, Kristian J.
Studies of learning and student satisfaction in the context of online university programs have largely neglected programs catering specifically to business executives. Such executives have typically been away from higher education for a number of years and have collected substantial practical experience in the subject matters they are taught. Their expectations in terms of both content and delivery may therefore be different from those of non-executive students. We explore perceptions of the quality of tutoring in the context of an online executive MBA program through participant interviews. We find that, in addition to some of the tutor behaviors already discussed in the literature, executive students look specifically for practical industry knowledge and experience in tutors when judging how effective a tutor is. This has implications for both the recruitment and training of online executive MBA tutors.
Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); New, Joshua Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
A Functional Mock-up Interface (FMI) defines a standardized interface to be used in computer simulations to develop complex cyber-physical systems. FMI implementation by a software modeling tool enables the creation of a simulation model that can be interconnected, or the creation of a software library called a Functional Mock-up Unit (FMU). This report describes an FMU wrapper implementation that imports FMUs into a C++ environment and uses an Euler solver that executes FMUs in parallel using Open Multi-Processing (OpenMP). The purpose of this report is to elucidate the runtime performance of the solver when a multi-component system is imported as a single FMU (for the whole system) or as multiple FMUs (for different groups of components as sub-systems). This performance comparison is conducted using two test cases: (1) a simple, multi-tank problem; and (2) a more realistic use case based on the Modelica Buildings Library. In both test cases, the performance gains are promising when each FMU consists of a large number of states and state events that are wrapped in a single FMU. Load balancing is demonstrated to be a critical factor in speeding up parallel execution of multiple FMUs.
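The structure of the parallel Euler loop can be sketched in outline. This is a conceptual stand-in, not the FMI C API or the report's OpenMP code: a toy tank model plays the role of an FMU's doStep, and a thread pool stands in for the OpenMP worker team.

```python
from concurrent.futures import ThreadPoolExecutor

class TankComponent:
    """Toy stand-in for one FMU: a tank draining at a constant rate."""
    def __init__(self, level, rate):
        self.level = level
        self.rate = rate

    def do_step(self, dt):
        # Explicit Euler update: x(t + dt) = x(t) + dt * dx/dt
        self.level = max(0.0, self.level - self.rate * dt)

def simulate(components, t_end, dt, workers=4):
    """Advance all components in lock-step; each Euler step fans the
    component updates out to a worker pool, mirroring the parallel loop."""
    n_steps = int(round(t_end / dt))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(n_steps):
            # Uneven per-component cost here is exactly where load
            # balancing matters in the multi-FMU configuration.
            list(pool.map(lambda c: c.do_step(dt), components))
    return [c.level for c in components]

tanks = [TankComponent(1.0, 0.1), TankComponent(2.0, 0.5)]
levels = simulate(tanks, t_end=1.0, dt=0.1)
```

The report's finding that a single FMU with many states outperforms many small FMUs corresponds here to the per-step synchronization cost of the pool dominating when each do_step is cheap.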
Social intelligence is the ability to understand others and the social context effectively and thus to interact with people successfully. Research has suggested that the theory of mind (ToM) and executive function may play important roles in explaining social intelligence. The specific aim of the present study was to test with structural equation modeling (SEM) the hypothesis that performance on ToM tasks is more associated with social intelligence in the elderly than is performance on executive functions. One hundred and seventy-seven participants (age 56-96) completed ToM, executive function, and other basic cognition tasks, and were rated with social intelligence scales. The SEM results showed that ToM and executive function were strongly correlated (0.54); however, only the path coefficient from ToM to social intelligence, and not from executive function, was significant (0.37). ToM performance, but not executive function, was strongly correlated with social intelligence among elderly individuals. ToM and executive function might play different roles in social behavior during normal aging; however, based on the present results, it is possible that ToM might play an important role in social intelligence.
Design and field execution of seismic experiments are described that recorded the characteristics of seismic signals from single and multiple explosions conducted at the Nevada Test Site in Yucca Flat, Nevada. Most of the data were obtained from small-scale underground explosions (total yields ranged from a fraction of a pound to 100 lb of explosives) that were designed to permit characterization of seismic signals as a function of explosive-source configuration. Other data were from explosions conducted in the area by others: two underground nuclear detonations with yields below 40 kt each and several surface explosions whose yields ranged from 700 lb to 100 tons. The project included a comprehensive study of the Yucca lake bed, close-range recording of seismic signals from explosions, and excavation of cavities generated by small-scale high-explosive charges. 60 figures, 14 tables
Rust, B.W.; Mankin, J.B.
Due to the recent emphasis on mathematical modeling, many ecologists are using mathematics and computers more than ever, and engineers, mathematicians and physical scientists are now included in ecological projects. However, the individual ecologist, with intuitive knowledge of the system, still requires the means to critically examine and adjust system models. An interactive program was developed with the primary goal of allowing an ecologist with minimal experience in either mathematics or computers to develop a system model. It has also been used successfully by systems ecologists, engineers, and mathematicians. This program was written in FORTRAN for the DEC PDP-10, a remote terminal system at Oak Ridge National Laboratory. However, with relatively minor modifications, it can be implemented on any remote terminal system with a FORTRAN IV compiler, or equivalent. This program may be used to simulate any phenomenon which can be described as a system of ordinary differential equations. The program allows the user to interactively change system parameters and/or initial conditions, to interactively select a set of variables to be plotted, and to model discontinuities in the state variables and/or their derivatives. One of the most useful features to the non-computer specialist is the ability to interactively address the system parameters by name and to interactively adjust their values between simulations. These and other features are described in greater detail
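A minimal modern analogue of the interactive simulator described above can be sketched as follows: integrate an arbitrary ODE system with explicit Euler, then adjust a named parameter and re-run. The logistic-growth model and parameter names here are illustrative examples, not systems from the original FORTRAN program.

```python
def integrate(f, y0, t0, t1, dt, params):
    """Explicit Euler integration of the system dy/dt = f(t, y, params)."""
    n_steps = int(round((t1 - t0) / dt))
    t, y = t0, list(y0)
    for _ in range(n_steps):
        dydt = f(t, y, params)
        y = [yi + dt * di for yi, di in zip(y, dydt)]
        t += dt
    return y

def logistic(t, y, p):
    # dN/dt = r * N * (1 - N / K)
    return [p["r"] * y[0] * (1.0 - y[0] / p["K"])]

# Parameters are addressed by name and adjusted between simulations,
# echoing the interactive workflow the summary describes.
params = {"r": 0.5, "K": 100.0}
n_slow = integrate(logistic, [1.0], 0.0, 5.0, 0.01, params)[0]
params["r"] = 1.0  # "adjust between simulations"
n_fast = integrate(logistic, [1.0], 0.0, 5.0, 0.01, params)[0]
```

Swapping `logistic` for any other right-hand-side function simulates a different phenomenon, which is the generality the original program aimed for.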
Choi, Hye Lim
This study examined the effectiveness of the Executive Master of Business Administration (EMBA) degree program in terms of transfer of knowledge and leadership practices. Based on a review of literature related to adult learning theories, EMBA programs, the importance of evaluation practices, and leadership practices, this study was designed to…
Bennedsen, Jens; Schulte, Carsten
This article reports on an experiment undertaken in order to evaluate the effect of a program visualization tool for helping students to better understand the dynamics of object-oriented programs. The concrete tool used was BlueJ's debugger and object inspector. The study was done as a control-group experiment in an introductory programming…
Gettens, Katelyn M; Gorin, Amy A
Weight loss maintenance is a complex, multifaceted process that presents a significant challenge for most individuals who lose weight. A growing body of literature indicates a strong relationship between cognitive dysfunction and excessive body weight, and suggests that a subset of high-order cognitive processes known as executive functions (EF) likely play an important role in weight management. Recent reviews cover neuropsychological correlates of weight status yet fail to address the role of executive function in the central dilemma of successful weight loss maintenance. In this paper, we provide an overview of the existing literature examining executive functions as they relate to weight status and initial weight loss. Further, we propose a novel conceptual model of the relationships between EF, initial weight loss, and weight loss maintenance, mapping specific executive functions onto strategies known to be associated with both phases of the weight control process. Implications for the development of more efficacious weight loss maintenance interventions are discussed.
Russell, M.L.; McCardell, R.K.; Broughton, J.M.
The purpose of the TMI-2 Accident Evaluation Program Sample Acquisition and Examination (TMI-2 AEP SA and E) program is to develop and implement a test and inspection plan that completes the current-condition characterization of (a) the TMI-2 equipment that may have been damaged by the core damage events and (b) the TMI-2 core fission product inventory. The characterization program includes both sample acquisitions and examinations and in-situ measurements. Fission product characterization involves locating the fission products as well as determining their chemical form and determining material association
Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.
The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
Nicholas, Marjorie; Sinotte, Michele P; Helm-Estabrooks, Nancy
Learning how to use a computer-based communication system can be challenging for people with severe aphasia even if the system is not word-based. This study explored cognitive and linguistic factors relative to how they affected individual patients' ability to communicate expressively using C-Speak Aphasia (CSA), an alternative communication computer program that is primarily picture-based. Ten individuals with severe non-fluent aphasia received at least six months of training with CSA. To assess carryover of training, untrained functional communication tasks (i.e., answering autobiographical questions, describing pictures, making telephone calls, describing a short video, and two writing tasks) were repeatedly probed in two conditions: (1) using CSA in addition to natural forms of communication, and (2) using only natural forms of communication, e.g., speaking, writing, gesturing, drawing. Four of the 10 participants communicated more information on selected probe tasks using CSA than they did without the computer. Response to treatment was also examined in relation to baseline measures of non-linguistic executive function skills, pictorial semantic abilities, and auditory comprehension. Only nonlinguistic executive function skills were significantly correlated with treatment response.
Марія Костянтинівна СУХОНОС
Full Text Available A conceptual model of the life cycle of a program is proposed. The model is based on the value approach and uses, as its resulting index, a category of complex structural value. It renders the life cycle of the program in terms of time versus result and assumes four basic phases: initiation, planning, executing and closing. The model also formalizes the interconnection between the processes of program integration management and the management of the program's community and subprocesses. The choice of a value approach for forming the resulting index is motivated by the variety and complexity of program results, which makes finding a single evaluation criterion difficult. A mechanism for assessing the value of the program is worked out. It consists of four steps and uses conventional methods (decomposition and expert estimates). Points are assumed as the unit of measurement, with a rating scale whose maximum score is one hundred points; a complex value assessed at one hundred points represents the full result of the program. This is critically important in the current and final evaluation of the program.
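The assessment mechanism (decomposition plus expert estimates on a hundred-point scale) can be sketched as a weighted scoring routine. The phase names, weights, and scores below are invented for illustration; the article does not specify them.

```python
def program_value(components):
    """components maps a phase name to (weight, [expert scores, 0..100]);
    weights are assumed to sum to 1. Returns the aggregate value in points,
    where 100 points corresponds to the full complex structural value."""
    return sum(weight * sum(scores) / len(scores)   # average the expert estimates
               for weight, scores in components.values())

# Hypothetical decomposition by life-cycle phase, with two experts per phase.
components = {
    "initiation": (0.2, [80, 90]),
    "planning":   (0.3, [70, 75]),
    "executing":  (0.4, [60, 80]),
    "closing":    (0.1, [90, 100]),
}
value = program_value(components)
```

Running the same routine at intermediate milestones gives the current evaluation; running it at closeout gives the final one.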
This Report on Activities and Programs for Countering Proliferation and NBC Terrorism is submitted to the United States Congress as required by the 1994 National Defense Authorization Act (NDAA) (as amended...
The overall objective of the urban maglev transit technology development program is to develop magnetic levitation technology that is a cost effective, reliable, and environmentally sound transit option for urban mass transportation in the United Sta...
Anderson, Joseph S; Warnick, David A
...: the implementation of Program Budget Decision 753, the deterioration of the matrix support structure, the aging of the civilian workforce, and the influx of new organizations resulting from the Base...
Donald Kerwin; Robert Warren
The Obama administration has developed two broad programs to defer immigration enforcement actions against undocumented persons living in the United States: (1) Deferred Action for Parents of Americans and Lawful Permanent Residents (DAPA); and (2) Deferred Action for Childhood Arrivals (DACA). The DACA program, which began in August 2012, was expanded on November 20, 2014. DAPA and the DACA expansion (hereinafter referred to as “DACA-plus”) are currently under review by the US Supreme Court ...
Vladuca, G.; Deberth, C.
The program GIGMF computes the differential and integrated statistical-model cross sections for reactions proceeding through a compound nuclear stage. The computational method is based on the Hauser-Feshbach-Wolfenstein theory, modified to include the modern version of Tepel et al. Although the program was written for a PDP-15 computer with 16K of high-speed memory, many reaction channels can be taken into account, with the following restrictions: the projectile spin must be less than 2, and the maximum spin momenta of the compound nucleus cannot be greater than 10. These restrictions are due solely to the storage allotments and may be easily relaxed. The energy of the impinging particle, the target and projectile masses, the spins and parities of the projectile, target, emergent and residual nuclei, the maximum orbital momentum, and the transmission coefficients for each reaction channel are the input parameters of the program. (author)
Bervoets, Joachim; Jonkman, Lisa M; Mulkens, Sandra; de Vries, Hein; Kok, Gerjo
BACKGROUND: Executive functions are higher cognitive control functions, which are essential to physical and psychological well-being, academic performance, and healthy social relationships. Executive functions can be trained, albeit without broad transfer to date. Broad transfer entails the
Full Text Available Executive functions (EF) have been defined as a series of higher-order cognitive processes that allow the control of thought, behavior and affect in the service of achieving a goal. Such processes undergo lengthy postnatal development, maturing fully only by the end of adolescence. In this article we review some of the main models of EF development during childhood. The aim of this work is to describe the state of the art on the topic, identifying the main theoretical difficulties and methodological limitations associated with the different proposed paradigms. Finally, some suggestions are given for coping with such difficulties, emphasizing that the development of an ontology of EF could be a viable alternative to counter them. We believe that future research should direct its efforts toward the development of that ontology.
A general overview of the Waste Isolation Pilot Plant transuranic wastes experimental characterization program is presented. Objectives and outstanding concerns of this program are discussed. Characteristics of transuranic wastes are also described. Concerns for the terminal isolation of such wastes in a deep bedded salt facility are divided into two phases, those during the short-term operational phase of the facility, and those potentially occurring in the long-term, after decommissioning of the repository. An inclusive summary covering individual studies, their importance to the Waste Isolation Pilot Plant, investigators, general milestones, and comments are presented
This is an executive summary of a report on the Hawaii Energy Strategy Program. The topics of the report include a description of the program, covering an overview, objectives, policy statement, and purpose; energy strategy policy development; energy strategy projects; the current energy situation; modeling Hawaii's energy future; energy forecasts; reducing energy demand; scenario assessment; and recommendations.
Full Text Available In recent years a number of organisations have implemented executive information systems (EIS) in order to improve the performance of their executives’ jobs. Although the use of EIS is important in executives’ work, the majority of executives are unwilling to use EIS applications because of their design flaws. By using the social factors, habits and facilitating conditions variables from Triandis’ framework, this paper extends the Technology Acceptance Model (TAM) to derive useful variables to address the problem of the low usage of EIS by executives. This paper reports on research in progress in Australia on the adoption and usage of EIS by executives. The preliminary results suggest that executives’ experience with EIS relates positively to their experience with computer-based information systems. The results also suggest a high degree of perceived usefulness and perceived ease of use, as well as positive attitudes towards using EIS. Further, the results suggest that executives consider social factors when using EIS in their work. Moreover, the results suggest that facilitating conditions such as the EIS development process, the EIS management process and the organisational environment are strongly related to the adoption and usage of EIS by executives. Finally, the results suggest a higher degree of EIS usage by middle managers than by top-level managers, whom an EIS was originally meant to support.
Luckow, Kasper Søe; Thomsen, Bent; Frost, Christian
We present a novel tool for statically determining the Worst Case Execution Time (WCET) of Java Bytecode-based programs called Tool for Execution Time Analysis of Java bytecode (TetaJ). This tool differentiates itself from existing tools by separating the individual constituents of the execution environment into independent components. The prime benefit is that it can be used for execution environments featuring common embedded processors and software implementations of the JVM. TetaJ employs a model checking approach for statically determining WCET where the Java program, the JVM, and the hardware...
Billy G. Pemberton
The Forest and Rangeland Renewable Resources Planning Act of 1974 directs the Secretary of Agriculture to prepare an assessment of the nation's renewable resources and a program that will assure an adequate future supply of these resources. Responsibility for this work is assigned to the Forest Service. An inter-university symposium was held in 1976 to evaluate...
The BES Materials Sciences Program has the central theme of Scientifically Tailored Materials. The major objective of this program is to combine Sandia's expertise and capabilities in the areas of solid state sciences, advanced atomic-level diagnostics and materials synthesis and processing science to produce new classes of tailored materials as well as to enhance the properties of existing materials for US energy applications and for critical defense needs. Current core research in this program includes the physics and chemistry of ceramics synthesis and processing, the use of energetic particles for the synthesis and study of materials, tailored surfaces and interfaces for materials applications, chemical vapor deposition sciences, artificially-structured semiconductor materials science, advanced growth techniques for improved semiconductor structures, transport in unconventional solids, atomic-level science of interfacial adhesion, high-temperature superconductors, and the synthesis and processing of nano-size clusters for energy applications. In addition, the program includes the following three smaller efforts initiated in the past two years: (1) Wetting and Flow of Liquid Metals and Amorphous Ceramics at Solid Interfaces, (2) Field-Structured Anisotropic Composites, and (3) Composition-Modulated Semiconductor Structures for Photovoltaic and Optical Technologies. The latter is a joint effort with the National Renewable Energy Laboratory. Separate summaries are given of individual research areas.
Bennedsen, Jens B.; Schulte, Carsten
-group experiment in an introductory programming course. The results of the experiment show that the students who used BlueJ's debugger did not perform statistically significantly better than the students not using it; both groups profited about the same amount from the exercises given in the experiment. We discuss...
Hepp, Stefan; Schoeberl, Martin
Method inlining is an important optimization. It is especially important for languages, like Java, where small setter and getter methods are considered good programming style. In this paper we present and explore WCET-driven inlining of Java methods. We use the WCET analysis tool for the Java processor JOP to guide...
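As an illustration of the transformation such an inliner performs, here is a minimal sketch; the class, method names, and numbers are invented for illustration and are not taken from the JOP/WCET tooling itself:

```java
// Sketch of WCET-driven inlining: a small getter is a prime candidate
// because replacing the call with the field access removes the
// method-invocation overhead from the worst-case path.
public class InlineSketch {
    public static final class Point {
        private final int x;
        public Point(int x) { this.x = x; }
        public int getX() { return x; }   // tiny accessor, typical inlining target
    }

    // Before inlining: each iteration pays an invokevirtual on getX().
    public static int sumViaGetter(Point[] pts) {
        int s = 0;
        for (Point p : pts) s += p.getX();
        return s;
    }

    // After inlining: the body of getX() is substituted at the call site,
    // so the loop's WCET bound no longer includes call/return cost.
    public static int sumInlined(Point[] pts) {
        int s = 0;
        for (Point p : pts) s += p.x;     // direct field access (legal here: same top-level class)
        return s;
    }
}
```

Both variants compute the same sum; a WCET analyzer would simply derive a tighter bound for the second form.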
Biag, Manuelito; Raab, Erin; Hofstedt, Mary
Targeting students in grades K-8, Art in Action's program consists of 12 age-appropriate lessons per year led by parent and teacher volunteers. The curriculum is based on historically significant artists and their works of art. Through semi-structured discussions, students examine a variety of masterpieces, learning about the artist as well as…
MacAllum, Keith; Taylor, Susan Hubbard; Johnson, Amy Bell
The Lansing Area Manufacturing Partnership (LAMP) is an academically rigorous, business/labor-driven school-to-career program in Lansing, Michigan, that includes business, union, school, and parent partners and provides participating students with work-based learning experiences for 2.5 hours every day throughout their senior year. LAMP's…
Merrill, Lisa; Kang, David; Siman, Nina; Soltani, Jasmine
The iMentor College Ready Program is a model that combines school-based mentoring with technology and aspects of whole school reform. The program aims to create strong relationships between low-income youth and college-educated mentors--and to leverage these relationships to help students develop the mindsets, skills, and knowledge necessary to…
McGinnis, C.P.; Eisenhower, B.M.; Reeves, M.E.; DePaoli, S.M.; Stinton, L.H.; Harrington, E.H.
The Hazardous Waste Development, Demonstration, and Disposal (HAZWDDD) Program Plan provides a strategy for management of hazardous and mixed wastes generated by the five Department of Energy (DOE) installations managed by Martin Marietta Energy Systems, Inc. (Energy Systems). This integrated corporate plan is based on the individual installation plans, which identify waste streams, facility capabilities, problem wastes, future needs, and funding needs. Using this information, the corporate plan identifies common concerns and technology/facility needs over the next 10 years. The overall objective of this corporate plan is to ensure that treatment, storage, and disposal (TSD) needs for all hazardous and mixed wastes generated by Energy Systems installations have been identified and planned for. Specific objectives of the program plan are to (1) identify all hazardous and mixed waste streams; (2) identify hazardous and mixed waste TSD requirements; (3) identify any unresolved technical issues preventing implementation of the strategy; (4) develop schedules for studies, demonstrations, and facilities to resolve the issues; and (5) define the interfaces with the Low-Level Waste Disposal Development and Demonstration (LLWDDD) Program. 10 refs., 7 figs
Somogyi, Z.; Ramamohanarao, K.; Vaghani, J. (Univ. of Melbourne, Parkville (Australia))
The authors present the first backtracking algorithm for stream AND-parallel logic programs. It relies on compile-time knowledge of the data flow graph of each clause to let it figure out efficiently which goals to kill or restart when a goal fails. This crucial information, which they derive from mode declarations, was not available at compile-time in any previous stream AND-parallel system. They show that modes can increase the precision of the backtracking algorithm, though their algorithm allows this precision to be traded off against overhead on a procedure-by-procedure and call-by-call basis. The modes also allow their algorithm to handle efficiently programs that manipulate partially instantiated data structures and an important class of programs with circular dependency graphs. On code that does not need backtracking, the efficiency of their algorithm approaches that of the committed-choice languages; on code that does need backtracking its overhead is comparable to that of the independent AND-parallel backtracking algorithms.
The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problems, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, this model is elaborated to produce the required program outputs; third, the resulting program is transformed to run efficiently in the execution environment. The first two stages deal in network structures of sequential processes; only the third is concerned with procedure hierarchies. (orig.)
A comprehensive German introduction to the structure and operation of a relatively comfortable real-time operating system for process control, VORTEX-I for the V73 of the Varian Company. This is a disc-oriented single-user system with parallel foreground and background operation. The complete organization of the system nucleus is described (a separate external report exists for input/output). For the more experienced user of the V73, the existing facilities for interacting with the operating system (operator commands, interrupts, and program calls) are described exactly with regard to their internal working. (orig./WB)
McDade, Sharon A; Richman, Rosalyn C; Jackson, Gregg B; Morahan, Page S
This study measured the impact of participation by women academics in the Executive Leadership in Academic Medicine (ELAM) program as part of a robust evaluation agenda. The design is a classic pre/post, within-group, self-report study. The survey elicits self-perceptions about leadership in ten constructs: knowledge of leadership, management, and organizational theory; environmental scanning; financial management; communication; networking and coalition building; conflict management; general leadership; assessment of strengths and weaknesses; acceptance of leadership demands; and career advancement sophistication. The post surveys additionally inquire about perceived program usefulness. Data were collected from 79 participants (1997-98, 1998-99, and 2000-01 classes). Response rates were nearly 100% (pre) and 69% to 76% (post). Statistically significant increases in leadership capabilities were identified across all ten leadership constructs. Gains were large in knowledge of leadership and organizational theory, environmental scanning, financial management, and general leadership. Gains in career-building knowledge were large to moderate. More modest were gains in communication, networking, and conflict management. There were significant correlations between each leadership construct and perceived usefulness of the program. Significant improvements were reported on all leadership constructs, even when participants viewed themselves as already skilled. While it cannot be concluded that participation in ELAM directly and solely caused all improvements, it seems unlikely that midcareer women faculty would improve on all ten constructs in the 11 months after program completion by natural maturation alone. Future research will investigate whether the changes are due to ELAM or other factors, and assess whether participants show more rapid advancement into leadership than comparable women not participating in ELAM.
Tillema, M.; van den Bergh, H.; Rijlaarsdam, G.; Sanders, T.
Current theory about writing states that the quality of (meta)cognitive processing (i.e. planning, text production, revising, et cetera) is, at least partly, determined by the temporal distribution of (meta)cognitive activities across task execution. Put simply, the quality of task execution is
The Obama administration has developed two broad programs to defer immigration enforcement actions against undocumented persons living in the United States: (1) Deferred Action for Parents of Americans and Lawful Permanent Residents (DAPA); and (2) Deferred Action for Childhood Arrivals (DACA). The DACA program, which began in August 2012, was expanded on November 20, 2014. DAPA and the DACA expansion (hereinafter referred to as “DACA-plus”) are currently under review by the US Supreme Court and subject to an active injunction. This paper offers a statistical portrait of the intended direct beneficiaries of DAPA, DACA, and DACA-plus. It finds that potential DAPA, DACA, and DACA-plus recipients are deeply embedded in US society, with high employment rates, extensive US family ties, long tenure, and substantial rates of English-language proficiency. The paper also notes various groups that would benefit indirectly from the full implementation of DAPA and DACA or, conversely, would suffer from the removal of potential beneficiaries of these programs. For example, all those who would rely on the retirement programs of the US government will benefit from the high employment rates and relative youth of the DACA population, while many US citizens who rely on the income of a DAPA-eligible parent would fall into poverty or extreme poverty should that parent be removed from the United States. This paper offers an analysis of potential DAPA and DACA beneficiaries. In an earlier study, the authors made the case for immigration reform based on long-term trends related to the US undocumented population, including potential DAPA and DACA beneficiaries (Warren and Kerwin 2015). By contrast, this paper details the degree to which these populations have become embedded in US society. It also compares persons eligible for the original DACA program with those eligible for DACA-plus. As stated, the great majority of potential DAPA and DACA recipients enjoy strong family
Richman, R C; Morahan, P S; Cohen, D W; McDade, S A
Women are persistently underrepresented in the higher levels of academic administration despite the fact that they have been entering the medical profession in increasing numbers for at least 20 years and now make up a large proportion of the medical student body and fill a similar proportion of entry level positions in medical schools. Although there are no easy remedies for gender inequities in medical schools, strategies have been proposed and implemented both within academic institutions and more broadly to achieve and sustain the advancement of women faculty to senior level positions. Substantial, sustained efforts to increase programs and activities addressing the major obstacles to advancement of women must be put in place so that the contributions of women can be fully realized and their skills fittingly applied in meeting the medical education and healthcare needs of all people in the 21st century.
Zweizig, Douglas; Hopkins, Dianne McAfee
This document presents the executive summary of an evaluation of Library Power, a program of the DeWitt Wallace-Reader's Digest Fund to enhance and elevate the role of libraries in public schools. The report begins with an examination of Library Power's core components (collection development, facilities refurbishing, flexible scheduling,…
Pureza, Janice R.; Fonseca, Rochele P.
Introduction: The importance of executive functions (EF) in childhood development, and their role as indicators of health, well-being, and professional and academic success, has been demonstrated by several studies in the literature. EF are cognitive processes that control and manage behavior to achieve specific goals and include skills such as planning, inhibition, cognitive flexibility, (executive) attention, and the central executive component of working memory (WM). In the context of education, EF are crucial for continued learning and efficient academic performance due to their involvement in several components of the educational process. Objective: The aim of this article was to describe the development and content validity of the CENA Program for Educational Training on the Neuropsychology of Learning, with an emphasis on executive functions and attention. Methods: The study involved seven specialists (four responsible for evaluating the program, and three involved in brainstorming) and was carried out in three stages: background research on neuropsychology and education; program development through author brainstorming; and evaluation by expert judges. Results: The goals, language, and methods of the CENA Program were considered adequate, attesting to its content validity as a school-based neuropsychological intervention. Conclusion: Teacher training in school neuropsychology may be an important area for future investment and contribute to academic achievement and student development in the Brazilian education system. PMID:29213497
Hoeyer Mortensen, K. [Aarhus Univ., Computer Science Dept. (Denmark); Pinci, V. [Meta Software Corporation, Cambridge, MA (United States)
In this paper we describe a modelling project to improve a nuclear waste management program in charge of the creation of a new system for the permanent disposal of nuclear waste. SADT (Structural Analysis and Design Technique) is used in order to provide a work-flow description of the functions to be performed by the waste management program. This description is then translated into a number of Coloured Petri Nets (CPN or CP-nets) corresponding to different program functions where additional behavioural inscriptions provide basis for simulation. Each of these CP-nets is simulated to produce timed event charts that are useful for understanding the behaviour of the program functions under different scenarios. Then all the CPN models are linked together to form a single stand-alone application that is useful for validating the interaction and cooperation between the different program functions. A technique for linking executable CPN models is developed for supporting large modelling projects and parallel development of independent CPN models. (au) 11 refs.
Heroux, Michael [US Dept. of Energy, Washington, DC (United States); Lethin, Richard [US Dept. of Energy, Washington, DC (United States)
Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation, and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale, and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make the design, prototyping, and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and the results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.
Bradford, Robert W.; Harrison, Denise
"We have a new strategy to grow our organization." Developing the plan is just the start. Implementing it in the organization is the real challenge. Many organizations don't fail due to lack of strategy; they struggle because it isn't effectively implemented. After working with hundreds of companies on strategy development, Denise and Robert have distilled the critical areas where organizations need to focus in order to enhance profitability through superior execution. If these questions are important to your organization, you'll find useful answers in the following articles: Do you find yourself overwhelmed by too many competing priorities? How do you limit how many strategic initiatives/projects your organization is working on at one time? How do you balance your resource requirements (time and money) with the availability of these resources? How do you balance your strategic initiative requirements with the day-to-day requirements of your organization?
Raver, C. Cybele; Blair, Clancy; Willoughby, Michael
In a predominantly low-income, population-based longitudinal sample of 1,259 children followed from birth, results suggest that chronic exposure to poverty and the strains of financial hardship were each uniquely predictive of young children’s performance on measures of executive functioning. Results suggest that temperament-based vulnerability serves as a statistical moderator of the link between poverty-related risk and children’s executive functioning. Implications for models of ecology and biology in shaping the development of children’s self-regulation are discussed. PMID:22563675
Fedotova , Irina; Krause , Bernd; Siemens , Eduard
Part 6: Embedded and Real Time Systems. This paper describes the application of statistical analysis to the timing behavior of a generic real-time task model. Using a specific ARM Cortex-A series processor and an empirical approach to retrieving time values, an algorithm to predict the upper bounds for the time acquisition operation has been formulated. For the experimental verification of the algorithm, we have used the robust Measurement-Based Probabili...
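The measurement-based idea can be reduced to a minimal sketch: collect execution-time samples and report an empirical quantile as the predicted bound. This is a deliberate simplification of the paper's approach (which involves probabilistic modeling of extreme values); the quantile choice and method name below are assumptions.

```java
import java.util.Arrays;

// Empirical upper-bound prediction from observed execution times:
// sort the samples and report the q-quantile as the predicted bound.
public class TimingBoundSketch {
    public static long upperBound(long[] samplesNs, double q) {
        long[] sorted = samplesNs.clone();
        Arrays.sort(sorted);
        // index of the smallest sample that at least a fraction q of
        // the observations do not exceed
        int idx = (int) Math.ceil(q * sorted.length) - 1;
        return sorted[Math.max(idx, 0)];
    }
}
```

With q = 1.0 this degenerates to the observed maximum; real measurement-based probabilistic timing analysis instead extrapolates beyond the observed maximum to bound rare events.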
Energy-efficient disinfection of sewage sludge, permitting its use as a fertilizer and soil conditioner in areas open to public access or on certain food chain crops, is possible using the process technology developed by Sandia National Laboratories under DOE and EPA joint support. This process accomplishes disinfection by gamma ray irradiation with cesium-137, a by-product isotope recovered from reprocessing of defense production waste. Disinfection with cesium-137 gamma irradiation provides an energy-efficient option for the Nation's cities to beneficially utilize sewage sludge, while at the same time conserving energy by utilizing a radioisotope, traditionally considered waste, in a beneficial manner. While the Sandia sludge irradiation technology has successfully completed its research and development phase, a major consideration remains: the introduction of a new technology into a marketplace which traditionally is skeptical of new products or process technologies until their performance is well proven. This document analyzes the factors important to market introduction of this new technology, develops options, and recommends a program strategy for transfer of the Sandia sludge irradiation technology to the marketplace by developing public awareness and acceptance, and by stimulating private sector commercialization interest
In the late 1980s and early 1990s, Lawrence Livermore National Laboratory was deeply engrossed in determining the next-generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in the mid-1970s first for the CDC 7600 and later extended from stack-based vector operation to memory-to-memory operations for the Cray 1s, lasted approximately 20 years (see Slide 5). The Cray vector era was deemed an extremely long-lived era, as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600) with vector unit utilization increasing incrementally over time. The other attributes of the Cray vector era at LLNL were that we developed, supported, and maintained the operating system (LTSS and later NLTSS), communications protocols (LINCS), compilers (Civic Fortran77 and Model), operating system tools (e.g., batch system, job control scripting, loaders, debuggers, editors, graphics utilities, you name it), and math and highly machine-optimized libraries (e.g., SLATEC and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed the COS and UNICOS operating systems and environments on their own. In the late 1970s and early 1980s, two trends appeared that made the Cray vector programming model (described above, including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low-cost CMOS microprocessors and their attendant departmental and mini-computers, and later workstations and personal computers. With the widespread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments. The other interesting advance in the period is that systems were being
Thomsen, Bent; Luckow, Kasper Søe; Thomsen, Lone Leth
frameworks, we have in recent years pursued an agenda of translating hard-real-time embedded safety critical programs written in the Safety Critical Java Profile  into networks of timed automata  and subjecting those to automated analysis using the UPPAAL model checker . Several tools have been...... built and the tools have been used to analyse a number of systems for properties such as worst case execution time, schedulability and energy optimization [12–14,19,34,36,38]. In this paper we will elaborate on the theoretical underpinning of the translation from Java programs to timed automata models...... and briefly summarize some of the results based on this translation. Furthermore, we discuss future work, especially relations to the work in [16,24] as Java recently has adopted first class higher order functions in the form of lambda abstractions....
Havelund, Klaus; Norvig, Peter (Technical Monitor)
This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once, and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
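The kind of defect such a dynamic race detector targets can be shown with a minimal sketch (an invented example, not taken from the paper's PathFinder implementation): two threads performing an unsynchronized read-modify-write on a shared field, next to a race-free variant.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class RaceSketch {
    static int unsafeCount = 0;

    static void joinQuietly(Thread t) {
        try { t.join(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }

    // Racy: both threads update a plain int with no common lock -- exactly
    // the access pattern a runtime race detector flags from observing one run.
    public static int racyIncrements(int perThread) {
        unsafeCount = 0;
        Runnable r = () -> { for (int i = 0; i < perThread; i++) unsafeCount++; };
        Thread a = new Thread(r), b = new Thread(r);
        a.start(); b.start(); joinQuietly(a); joinQuietly(b);
        return unsafeCount;              // may be less than 2 * perThread
    }

    // Race-free: the same logic using an atomic read-modify-write.
    public static int safeIncrements(int perThread) {
        AtomicInteger count = new AtomicInteger();
        Runnable r = () -> { for (int i = 0; i < perThread; i++) count.incrementAndGet(); };
        Thread a = new Thread(r), b = new Thread(r);
        a.start(); b.start(); joinQuietly(a); joinQuietly(b);
        return count.get();              // always exactly 2 * perThread
    }
}
```

The point of runtime analysis is that the racy variant may happen to produce the correct total on the observed run; a lockset-style detector still reports the unprotected accesses, predicting that other runs can go wrong.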
National Aeronautics and Space Administration — In response to a need for cost-effective small satellite missions, Odyssey Space Research is proposing the development of a common flight software executive and a...
Faure, Jean-Baptiste; Marques-Carneiro, José E; Akimana, Gladys; Cosquer, Brigitte; Ferrandon, Arielle; Herbeaux, Karine; Koning, Estelle; Barbelivien, Alexandra; Nehlig, Astrid; Cassel, Jean-Christophe
Temporal lobe epilepsy is a relatively frequent, invalidating, and often refractory neurologic disorder. It is associated with cognitive impairments that affect memory and executive functions. In the rat lithium-pilocarpine temporal lobe epilepsy model, memory impairment and anxiety disorder are classically reported. Here we evaluated sustained visual attention in this model of epilepsy, a function not frequently explored. Thirty-five Sprague-Dawley rats were subjected to lithium-pilocarpine status epilepticus. Twenty of them received a carisbamate treatment for 7 days, starting 1 h after status epilepticus onset. Twelve controls received lithium and saline. Five months later, attention was assessed in the five-choice serial reaction time task, a task that tests visual attention and inhibitory control (impulsivity/compulsivity). Neuronal counting was performed in brain regions of interest to the functions studied (hippocampus, prefrontal cortex, nucleus basalis magnocellularis, and pedunculopontine tegmental nucleus). Lithium-pilocarpine rats developed motor seizures. When they were able to learn the task, they exhibited attention impairment and a tendency toward impulsivity and compulsivity. These disturbances occurred in the absence of neuronal loss in structures classically related to attentional performance, although they seemed to better correlate with neuronal loss in hippocampus. Globally, rats that received carisbamate and developed motor seizures were as impaired as untreated rats, whereas those that did not develop overt motor seizures performed like controls, despite evidence for hippocampal damage. This study shows that attention deficits reported by patients with temporal lobe epilepsy can be observed in the lithium-pilocarpine model. Carisbamate prevents the occurrence of motor seizures, attention impairment, impulsivity, and compulsivity in a subpopulation of neuroprotected rats. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.
To quickly respond to diverse product demands, mixed-model assembly lines are widely adopted in discrete manufacturing industries. Besides the complexity in material distribution, mixed-model assembly involves a variety of components, different process plans, and fast production changes, which greatly increase the difficulty of agile production management. Aiming at breaking through the bottlenecks in existing production management, a novel RFID-enabled manufacturing execution system (MES), featuring real-time and wireless information interaction capability, is proposed to identify various manufacturing objects including WIPs, tools, and operators, etc., and to trace their movements throughout the production processes. However, being subject to constraints in terms of safety stock, machine assignment, setup, and scheduling requirements, the optimization of the RFID-enabled MES model for production planning and scheduling issues is an NP-hard problem. A new heuristic generalized Lagrangian decomposition approach has been proposed for model optimization, which decomposes the model into three subproblems: computation of the optimal configuration of RFID sensor networks, optimization of production planning subject to machine setup cost and safety stock constraints, and optimization of scheduling for minimized overtime. RFID signal processing methods that can resolve unreliable, redundant, and missing tag events are also described in detail. The model validity is discussed through algorithm analysis and verified through numerical simulation. The proposed design scheme has important reference value for applications of RFID in multiple manufacturing fields, and also lays a vital research foundation for leveraging digital and networked manufacturing systems towards intelligence.
Dannels, Sharon A; Yamagata, Hisashi; McDade, Sharon A; Chuang, Yu-Chuan; Gleason, Katharine A; McLaughlin, Jean M; Richman, Rosalyn C; Morahan, Page S
The Hedwig van Ameringen Executive Leadership in Academic Medicine (ELAM) program provides an external yearlong development program for senior women faculty in U.S. and Canadian medical schools. This study aims to determine the extent to which program participants, compared with women from two comparison groups, aspire to leadership, demonstrate mastery of leadership competencies, and attain leadership positions. A pre-/posttest methodology and longitudinal structure were used to evaluate the impact of ELAM participation. Participants from two ELAM cohorts were compared with women who applied but were not accepted into the ELAM program (NON) and women from the Association of American Medical Colleges (AAMC) Faculty Roster. The AAMC group was a baseline for midcareer faculty; the NON group allowed comparison for leadership aspiration. Baseline data were collected in 2002, with follow-up data collected in 2006. Sixteen leadership indicators were considered: administrative leadership attainment (four indicators), full professor academic rank (one), leadership competencies and readiness (eight), and leadership aspirations and education (three). For 15 of the indicators, ELAM participants scored higher than AAMC and NON groups, and for one indicator they scored higher than only the AAMC group (aspiration to leadership outside academic health centers). The differences were statistically significant for 12 indicators and were distributed across the categories. These included seven of the leadership competencies, three of the administrative leadership attainment indicators, and two of the leadership aspirations and education indicators. These findings support the hypothesis that the ELAM program has a beneficial impact on ELAM fellows in terms of leadership behaviors and career progression.
Green, Lawrence L.
One focus area of the National Aeronautics and Space Administration (NASA) is to improve aviation safety. Runway safety is one such thrust of investigation and research. The two primary components of this runway safety research are in runway incursion (RI) and runway excursion (RE) events. These are adverse ground-based aviation incidents that endanger crew, passengers, aircraft and perhaps other nearby people or property. A runway incursion is the incorrect presence of an aircraft, vehicle or person on the protected area of a surface designated for the landing and take-off of aircraft; one class of RI events simultaneously involves two aircraft, such as one aircraft incorrectly landing on a runway while another aircraft is taking off from the same runway. A runway excursion is an incident involving only a single aircraft defined as a veer-off or overrun off the runway surface. Within the scope of this effort at NASA Langley Research Center (LaRC), generic RI, RE and combined (RI plus RE, or RUNSAFE) event models have each been developed and implemented as a Bayesian Belief Network (BBN). Descriptions of runway safety issues from the literature searches have been used to develop the BBN models. Numerous considerations surrounding the process of developing the event models have been documented in this report. The event models were then thoroughly reviewed by a Subject Matter Expert (SME) panel through multiple knowledge elicitation sessions. Numerous improvements to the model structure (definitions, node names, node states and the connecting link topology) were made by the SME panel. Sample executions of the final RUNSAFE model have been presented herein for baseline and worst-case scenarios. Finally, a parameter sensitivity analysis for a given scenario was performed to show the risk drivers. The NASA and LaRC research in runway safety event modeling through the use of BBN technology is important for several reasons. These include: 1) providing a means to clearly
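The conditional queries a BBN-based safety model answers can be illustrated with a toy network. The sketch below uses plain-Python inference by enumeration; the node names and probabilities are invented for illustration and are not drawn from the RUNSAFE model.

```python
# Minimal Bayesian-network inference by enumeration over a toy three-node
# chain: Weather -> Visibility -> Incursion. All numbers are hypothetical.
from itertools import product

# P(Weather), P(Visibility | Weather), P(Incursion | Visibility)
p_weather = {"bad": 0.2, "good": 0.8}
p_vis = {("bad", "low"): 0.7, ("bad", "high"): 0.3,
         ("good", "low"): 0.1, ("good", "high"): 0.9}
p_inc = {("low", True): 0.05, ("low", False): 0.95,
         ("high", True): 0.01, ("high", False): 0.99}

def joint(w, v, i):
    """Chain-rule factorization of the joint distribution."""
    return p_weather[w] * p_vis[(w, v)] * p_inc[(v, i)]

def p_incursion_given_weather(w):
    """P(Incursion=True | Weather=w), summing out Visibility."""
    num = sum(joint(w, v, True) for v in ("low", "high"))
    den = sum(joint(w, v, i) for v, i in product(("low", "high"), (True, False)))
    return num / den  # den equals P(Weather=w); dividing renormalizes

print(round(p_incursion_given_weather("bad"), 4))   # worst-case scenario
print(round(p_incursion_given_weather("good"), 4))  # baseline scenario
```

A parameter sensitivity analysis, as described above, amounts to repeating such queries while perturbing the conditional probability tables.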
Swindle, D.W. Jr.
The most recent progress, results, and plans for future work on the US Enrichment Safeguards Program's principal development activities are summarized. Nineteen development activities with potential International Atomic Energy Agency (IAEA) safeguards applications are reported. Part 1 presents Executive Summaries for these; each includes information on (1) the purpose and scope of the development activity; (2) the potential IAEA safeguards application and/or use if adopted; (3) significant development work, results, and/or conclusions to date; and, where appropriate, (4) future activities and plans for continued work. Development activities cover: measurement technology for limited-frequency-unannounced-access strategy inspections; an integrated data acquisition system; an enrichment-monitoring system; a load-cell-based weighing system for UF6 cylinder mass verifications; vapor-phase versus liquid-phase sampling of UF6 cylinders; tamper-safing hardware and systems; an alternative approach to IAEA nuclear material balance verifications resulting from intermittent inspections; a UF6 sample bottle enrichment analyzer; a crated waste assay monitor; and a compact 252Cf shuffler for UF6 measurements.
Buerger, L.; Szegi, Zs.; Vegh, E.
A basic-principle simulator for WWER-440 type nuclear power plants is under development at the Central Research Institute for Physics, Budapest. The technological models of both the primary and secondary circuits are complete; this paper presents the Real-time Executive and the on-line operating environment that controls the simulator. The executive system contains eight programs, and the detailed structure of its database is presented. The control of the execution of the model programs, their timing, and error recovery are also discussed. (author) 5 refs
Energy efficiency is an important goal of modern computing, with direct impact on system operational cost, reliability, usability, and environmental sustainability. This dissertation describes the design and implementation of two innovative programming languages for constructing energy-aware systems. First, it introduces ET, a strongly typed programming language to promote and facilitate energy-aware programming, with a novel type system design called Energy Types. Energy Types is built upon a key insight into today's energy-efficient systems and applications: despite the popular perception that energy and power can only be described in joules and watts, real-world energy management is often based on discrete phases and modes, which in turn can be reasoned about very effectively by type systems. A phase characterizes a distinct pattern of program workload, and a mode represents an energy state the program is expected to execute in. Energy Types is designed to reason about energy phases and energy modes, bringing programmers into the optimization of energy management. Second, the dissertation develops Eco, an energy-aware programming language centered on sustainability. A sustainable program built with Eco adaptively adjusts its own behavior to stay on a given energy budget, avoiding both a deficit, which would lead to battery drain or CPU overheating, and a surplus, which could have been used to improve the quality of the program output. Sustainability is viewed as a form of supply and demand matching, and a sustainable program consistently maintains the equilibrium between supply and demand. ET is implemented as a prototype compiler for smartphone programming on Android, and Eco is implemented as a minimal extension to Java. Programming practices and benchmarking experiments in these two new languages showed that ET can lead to significant energy savings for Android apps and that Eco can efficiently promote battery awareness and temperature awareness in real
Khaleel, Ibrahim Adamu
Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility-oriented and process-oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…
Madsen, Ole Lehrmann; Møller-Pedersen, Birger
SIMULA was a language for modeling and programming that provided a unified approach to modeling and programming, in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of a separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we…
D. Zhang (Dan)
This dissertation focuses on how executive compensation is designed and its implications for corporate finance and government regulation. Chapter 2 analyzes several proposals to restrict CEO compensation and calibrates two models of executive compensation that describe how firms would
Lowest Price, Technically Acceptable Evaluation Criteria Used in the November 2014 Request for Proposal for the Program Executive Office Soldier Systems Engineering and Technical Assistance (SETA) Contract
GS-0895-09  Industrial Engineering Technician                    23.04   7.56
GS-0810-09  Civil Engineering Developmental (Programming)        23.04   7.56
GS-0810-11  Civil Engineering – Programming                      27.88   9.14
GS-0810-12  Civil Engineering – Design                           33.41  10.96
GS-2210-09  Info Tech Spec (Network Systems/Customer Support)    23.04…
Lammerts, Lieke; Schaafsma, Frederieke G; van Mechelen, Willem; Anema, Johannes R
A process evaluation of a participatory supportive return-to-work program, aimed at workers without a (permanent) employment contract who are sick-listed due to a common mental disorder, revealed that this program was executed less successfully than similar programs evaluated in earlier studies. The program consisted of a participatory approach, integrated care, and direct placement in competitive employment. The aim of this study was to gain a better understanding of the execution of the program by evaluating stakeholders' perceptions. In the absence of an employer, the program was applied by the Dutch Social Security Agency, in collaboration with vocational rehabilitation agencies. Together with the sick-listed workers, these were the main stakeholders. Our research questions involved stakeholders' perceptions of the function(s) of the program, and their perceptions of barriers to and facilitators of a successful execution of the program within the Dutch social security sector. Semi-structured interviews were held with five sick-listed workers, eight professionals of the Social Security Agency, and two case managers of vocational rehabilitation agencies. Interview topics were related to experiences with different components of the program. Selection of respondents was based on purposive sampling and continued until data saturation was reached. Content analysis was applied to identify patterns in the data. Two researchers developed a coding system, based on predefined topics and themes emerging from the data. Although the perceived functions of some components of the program were as intended, all stakeholders stressed that the program often had not resulted in return to work. Perceived barriers to a successful execution were related to poor collaboration between the Dutch Social Security Agency, vocational rehabilitation agencies, and healthcare providers, the type of (health) problems experienced, time constraints, and limited job opportunities. For future implementation
Sigaloff, C.L.; Nabben, E.H. (Iselien); Bergsma, E.
The purpose of this paper is to provide an alternative model of a leadership-development program. Design/methodology/approach: A leadership-development program based on a "closure-type description" instead of an "input-type description" (Varela) was designed and executed for an organization. The
Claudio Gil Soares de Araújo
The aim of this study was to relate flexibility improvements from attendance in a supervised exercise program (SEP) to possible improvements in the execution of daily actions by adults. The sample consisted of 20 subjects, the majority of them cardiac patients, with an average age of 58 ± 9 years, actively participating in an SEP, selected intentionally. The Flexitest was used to determine flexibility. In addition, the subjects answered an 11-question questionnaire, completed between three and 18 months after beginning the program, assessing the relative difficulty of daily actions and the subjects' opinion of their improvement in those actions since starting the SEP. After the SEP, improvements were observed in the execution of all 11 daily actions, in global passive flexibility, and in six individual movements of the Flexitest (p < 0.05). There was a significant correlation between the differences in questionnaire responses and the variations in global flexibility (r = 0.45; p < 0.04), and an inverse relationship between variations in weight and in flexibility (r = -0.66; p < 0.05). It was concluded that participation in the SEP facilitated the execution of daily actions.
For user-friendliness, many software systems offer progress indicators for long-duration tasks. A typical progress indicator continuously estimates the remaining task execution time as well as the portion of the task that has been finished. Building a machine learning model often takes a long time, but no existing machine learning software supplies a non-trivial progress indicator. Similarly, running a data mining algorithm often takes a long time, but no existing data mining software provides one either. In this article, we consider the problem of offering progress indicators for machine learning model building and data mining algorithm execution. We discuss the goals and challenges intrinsic to this problem. Then we describe an initial framework for implementing such progress indicators and two advanced, potential uses of them, with the goal of inspiring future research on this topic. PMID:29177022
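A minimal sketch of such a progress indicator, assuming a roughly constant processing rate (a simplification; real model-building cost per unit of work is rarely uniform). The class and method names are illustrative, not from any existing ML library.

```python
import time

class ProgressIndicator:
    """Estimates completed fraction and remaining time for a long task,
    assuming work is processed at a roughly constant rate."""

    def __init__(self, total_units):
        self.total = total_units
        self.done = 0
        self.start = time.monotonic()

    def update(self, units):
        """Record that `units` more units of work have finished."""
        self.done += units

    def fraction_done(self):
        return self.done / self.total

    def remaining_seconds(self):
        """Extrapolate remaining time from the observed rate so far."""
        elapsed = time.monotonic() - self.start
        if self.done == 0 or elapsed == 0:
            return float("inf")  # no rate estimate available yet
        rate = self.done / elapsed
        return (self.total - self.done) / rate

pi = ProgressIndicator(total_units=4)
for chunk in range(4):
    time.sleep(0.01)          # stand-in for one unit of model-building work
    pi.update(1)
print(pi.fraction_done())     # 1.0
```

The advanced uses mentioned above (e.g. load management) would build on the same rate estimate.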
van der Niet, Anneke G.; Hartman, Esther; Smith, Joanne; Visscher, Chris
Objectives: The relationship between physical fitness and academic achievement in children has received much attention, however, whether executive functioning plays a mediating role in this relationship is unclear. The aim of this study therefore was to investigate the relationships between physical
The paper presents the results of numerical calculations for a diaphragm wall model executed in the Poznań clay formation. Two selected FEM codes were applied, Plaxis and Abaqus. A geological description of the Poznań clay formation in Poland, as well as the geotechnical conditions at a construction site in the Warsaw city area, is presented. The constitutive models of clay implemented in both Plaxis and Abaqus are discussed. The parameters of the Poznań clay constitutive models were assumed based on the authors' experimental tests. The results of the numerical analyses were compared, taking into account the measured values of horizontal displacements.
Olesen, Mads Chr.
Software programs are proliferating throughout modern life, to a point where even the simplest appliances such as lightbulbs contain software, in addition to the software embedded in cars and airplanes. The correct functioning of these programs is therefore of the utmost importance, for the quality...
Caspersen, Michael Edelgaard; Bennedsen, Jens; Larsen, Kasper Dalgaard
Predicting the success of students participating in introductory programming courses has been an active research area for more than 25 years. Until recently, no variables or tests have had any significant predictive power. However, Dehnadi and Bornat claim to have found a simple test for programming aptitude that cleanly separates programming sheep from non-programming goats. We briefly present their theory and test instrument. We have repeated their test in our local context in order to verify and perhaps generalise their findings, but we could not show that the test predicts students' success in our introductory programming course. Based on this failure of the test instrument, we discuss various explanations for our differing results and suggest a research method from which it may be possible to generalise local results in this area. Furthermore, we discuss and criticize Dehnadi and Bornat…
Chen, Der-San; Dang, Yu
An accessible treatment of the modeling and solution of integer programming problems, featuring modern applications and software. In order to fully comprehend the algorithms associated with integer programming, it is important to understand not only how algorithms work, but also why they work. Applied Integer Programming features a unique emphasis on this point, focusing on problem modeling and solution using commercial software. Taking an application-oriented approach, this book addresses the art and science of mathematical modeling related to the mixed integer programming (MIP) framework and
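To make the modeling idea concrete, here is a toy 0-1 knapsack instance, a classic MIP: binary decision variables, a linear objective, and one linear constraint. Commercial solvers use branch-and-bound and cutting planes; the exhaustive enumeration below works only at toy scale but exercises the same model. The data is invented for illustration.

```python
from itertools import product

# Maximize sum(v*x) subject to sum(w*x) <= capacity, with x binary.
values = [10, 13, 8, 7]
weights = [5, 6, 4, 3]
capacity = 10

best_value, best_x = max(
    (sum(v * x for v, x in zip(values, xs)), xs)
    for xs in product((0, 1), repeat=len(values))   # all 2^n binary vectors
    if sum(w * x for w, x in zip(weights, xs)) <= capacity
)
print(best_value, best_x)   # 21 (0, 1, 1, 0)
```

A branch-and-bound solver reaches the same optimum while pruning most of the 2^n assignments, which is why it scales far beyond this sketch.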
Rabinovici, Gil D.; Stephens, Melanie L.; Possin, Katherine L.
Purpose of Review: Executive functions represent a constellation of cognitive abilities that drive goal-oriented behavior and are critical to the ability to adapt to an ever-changing world. This article provides a clinically oriented approach to classifying, localizing, diagnosing, and treating disorders of executive function, which are pervasive in clinical practice. Recent Findings: Executive functions can be split into four distinct components: working memory, inhibition, set shifting, and fluency. These components may be differentially affected in individual patients and act together to guide higher-order cognitive constructs such as planning and organization. Specific bedside and neuropsychological tests can be applied to evaluate components of executive function. While dysexecutive syndromes were first described in patients with frontal lesions, intact executive functioning relies on distributed neural networks that include not only the prefrontal cortex, but also the parietal cortex, basal ganglia, thalamus, and cerebellum. Executive dysfunction arises from injury to any of these regions, their white matter connections, or neurotransmitter systems. Dysexecutive symptoms therefore occur in most neurodegenerative diseases and in many other neurologic, psychiatric, and systemic illnesses. Management approaches are patient specific and should focus on treatment of the underlying cause in parallel with maximizing patient function and safety via occupational therapy and rehabilitation. Summary: Executive dysfunction is extremely common in patients with neurologic disorders. Diagnosis and treatment hinge on familiarity with the clinical components and neuroanatomic correlates of these complex, high-order cognitive processes. PMID:26039846
Piovesana, Adina; Ross, Stephanie; Lloyd, Owen; Whittingham, Koa; Ziviani, Jenny; Ware, Robert S; McKinlay, Lynne; Boyd, Roslyn N
To examine the efficacy of a multi-modal web-based therapy program, Move it to improve it (Mitii™) delivered at home to improve Executive Functioning (EF) in children with an acquired brain injury (ABI). Randomised Waitlist controlled trial. Home environment. Sixty children with an ABI were matched in pairs by age and intelligence quotient then randomised to either 20-weeks of Mitii™ training or 20 weeks of Care As Usual (waitlist control; n=30; 17 males; mean age=11y, 11m (±2y, 6m); Full Scale IQ=76.24±17.84). Fifty-eight children completed baseline assessments (32 males; mean age=11.87±2.47; Full Scale IQ=75.21±16.76). Executive functioning was assessed on four domains: attentional control, cognitive flexibility, goal setting, and information processing using subtests from the Wechsler Intelligence Scale for Children (WISC-IV), Delis-Kaplan Executive Functioning System (D-KEFS), Comprehensive Trail Making Test (CTMT), Tower of London (TOL), and Test of Everyday Attention for Children (Tea-Ch). Executive functioning performance in everyday life was assessed via parent questionnaire (Behaviour Rating Inventory of Executive Functioning; BRIEF). No differences were observed at baseline measures. Groups were compared at 20-weeks using linear regression with no significant differences found between groups on all measures of EF. Out of a potential total dose of 60 hours, children in the Mitii™ group completed a mean of 17 hours of Mitii™ intervention. Results indicate no additional benefit to receiving Mitii™ compared to standard care. Mitii™, in its current form, was not shown to improve EF in children with ABI.
Havelund, Klaus; Pressburger, Thomas
This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen in a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep-Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
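The core of an explicit-state model checker such as SPIN is a reachability search over the program's state space. A minimal sketch follows: a toy two-thread lock model, hypothetical and far simpler than PROMELA or the JAVA PATHFINDER translation, checked for a mutual-exclusion violation by breadth-first search.

```python
from collections import deque

def successors(state):
    """Toy transition system: two threads competing for one lock.
    state = (pc1, pc2, lock); pc 0 = idle, 1 = in critical section."""
    pc1, pc2, lock = state
    succs = []
    if pc1 == 0 and not lock: succs.append((1, pc2, True))   # t1 acquires
    if pc1 == 1:              succs.append((0, pc2, False))  # t1 releases
    if pc2 == 0 and not lock: succs.append((pc1, 1, True))   # t2 acquires
    if pc2 == 1:              succs.append((pc1, 0, False))  # t2 releases
    return succs

def violation_reachable(init, bad):
    """BFS over all reachable states; True if any state satisfies `bad`."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if bad(s):
            return True
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return False

# Safety property: both threads must never be in the critical section at once.
print(violation_reachable((0, 0, False), lambda s: s[0] == 1 and s[1] == 1))
```

SPIN adds temporal-logic properties, partial-order reduction, and state compression on top of exactly this kind of search.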
Bervoets, Joachim; Jonkman, Lisa M; Mulkens, Sandra; de Vries, Hein; Kok, Gerjo
Executive functions are higher cognitive control functions, which are essential to physical and psychological well-being, academic performance, and healthy social relationships. Executive functions can be trained, albeit without broad transfer, to this date. Broad transfer entails the translation of improved cognitive functions to daily life (behaviors). The intervention Train your Mind was designed to train executive functions among elementary school children aged 9 to 11 years, and obtain broad transfer in terms of enhanced physical activity, healthy eating, and socioemotional regulation. This paper aims to describe the cluster randomized trial to test the effectiveness of the Train your Mind intervention. Train your Mind was integrated into the existing school curriculum for 8 months (25 weeks excluding holidays). The effectiveness of the intervention was tested in a cluster randomized trial comprising 13 schools, 34 groups (school classes), and 800 children, using a battery of 6 computer tasks at pre- and postmeasurement. Each of the 3 core executive functions was measured by 2 tasks (Flanker and Go/No-Go; N-Back and Running Span; Attention Switching Task and Dots/Triangles). Moreover, we administered questionnaires that measure emotion-regulation, cognitive errors, physical activity, dietary habits, and the psycho-social determinants of diet and physical activity. Body mass index was also measured. Multilevel analyses will account for clustering at the school and group levels, and randomization took place at the school level. Results are currently being analyzed. The main purpose of this study is to test Train your Mind's effectiveness in enhancing executive functions. Second, we investigate whether increased executive functions lead to improved physical activity and healthy eating. If found effective, executive function training could easily be integrated into school curricula everywhere, and as such, boost health, academic performance, and emotion
Current performance-prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed, due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block and, analogously, different latencies and bandwidths with each "communication block". Unfortunately, this approach implies that the parameters must be evaluated for each algorithm: a heavy task involving experiment design, timing, statistics, pattern recognition, and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from executing the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
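The parameter evaluation that this pipeline automates amounts to fitting constants to timing data. A minimal sketch, assuming a block annotated with a complexity formula f(n) and a trace of (size, time) pairs; the function names and synthetic trace are illustrative, not the paper's tool.

```python
def fit_block(sizes, times, f):
    """Ordinary least squares fit of t ~ c * f(n) + overhead, where c is the
    per-block proportionality constant and `overhead` a fixed cost."""
    xs = [f(n) for n in sizes]
    n = len(xs)
    sx, sy = sum(xs), sum(times)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, times))
    c = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope: cost per f(n) unit
    overhead = (sy - c * sx) / n                    # intercept: fixed overhead
    return c, overhead

# Synthetic trace for a block annotated O(n^2): t = 2e-6 * n^2 + 0.01
sizes = [100, 200, 400, 800]
times = [2e-6 * n * n + 0.01 for n in sizes]
c, b = fit_block(sizes, times, lambda n: n * n)
print(round(c, 8), round(b, 4))
```

With real traces the fit is noisy, which is why the paper's interpreter layers statistics and pattern recognition on top of this basic regression.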
A measurement control program for the model plant is described. The discussion includes the technical basis for such a program, the application of measurement control principles to each measurement, and the use of special experiments to estimate measurement error parameters for difficult-to-measure materials. The discussion also describes the statistical aspects of the program, and the documentation procedures used to record, maintain, and process the basic data
Several studies have assessed the effects of computer-based cognitive programs (CCP) in the management of age-related cognitive decline, but the role of CCP remains controversial. Therefore, this systematic review evaluated the evidence on the efficacy of CCP for age-related cognitive decline in healthy older adults. Six electronic databases (through October 2014) were searched. The risk of bias was assessed using the Cochrane Collaboration tool. The standardized mean difference (SMD) and 95% confidence intervals (CI) of a random-effects model were calculated. Heterogeneity was assessed using the Cochran Q statistic and quantified with the I2 index. Twelve studies were included in the current review and were considered of moderate to high methodological quality. The aggregated results indicate that CCP improves memory performance (SMD, 0.31; 95% CI 0.16 to 0.45; p < 0.0001) and processing speed (SMD, 0.50; 95% CI 0.14 to 0.87; p = 0.007) but not executive function (SMD, -0.12; 95% CI -0.33 to 0.09; p = 0.27). Furthermore, there were long-term gains in memory performance (SMD, 0.59; 95% CI 0.13 to 1.05; p = 0.01). CCP may be a valid complementary and alternative therapy for age-related cognitive decline, especially for memory performance and processing speed. However, more studies with longer follow-ups are warranted to confirm the current findings.
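The SMD aggregated in such reviews is typically Cohen's d computed with a pooled standard deviation. A minimal sketch with made-up group statistics, not values taken from the included studies:

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference: group mean difference divided by the
    pooled standard deviation of the treatment and control groups."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical memory scores: training group vs. control
d = cohens_d(mean_t=52.0, sd_t=10.0, n_t=40, mean_c=49.0, sd_c=10.0, n_c=40)
print(round(d, 3))
```

A random-effects meta-analysis then weights each study's SMD by the inverse of its within-study plus between-study variance before pooling.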
Prins, Pier J M; Brink, Esther Ten; Dovis, Sebastiaan; Ponsioen, Albert; Geurts, Hilde M; de Vries, Marieke; van der Oord, Saskia
In the area of childhood attention-deficit hyperactivity disorder, there is an urgent need for new, innovative, and child-focused treatments. A computerized executive functioning training with game elements aimed at enhancing self-control was developed. The first results are promising, and the next steps involve replication with larger samples, evaluating transfer of training effects to daily life, and enhancing motivation through more gaming elements.
V. A. Gerasimova
Based on an analysis of scientific work on the competence-based approach in education, the authors establish the need for computer support of the planning and development stage of the main educational program. They develop a graph-based model for automatically forming the structure of the main educational program, propose an integrated criterion for discipline assessment, and develop a strategic map for complex discipline assessment. The theoretical research performed is a basis for creating an automated system to support the planning and development of the main educational program.
Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.
A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.
Qian, Ying; Chen, Min; Shuai, Lan; Cao, Qing-Jiu; Yang, Li; Wang, Yu-Feng
As medication does not normalize outcomes of children with attention deficit hyperactivity disorder (ADHD), especially in real-life functioning, nonpharmacological methods are important in this field. This randomized controlled clinical trial was designed to evaluate the effects of a comprehensive executive skill training program for school-aged children with ADHD in a relatively large sample. Children with ADHD (aged 6-12 years) were randomized to the intervention or waitlist group. A healthy control group was composed of gender- and age-matched healthy children. The intervention group received a 12-session training program for multiple executive skills. Executive function (EF), ADHD symptoms, and social functioning in the intervention and waitlist groups were evaluated at baseline and at the end of the final training session. The healthy controls (HCs) were assessed only once, at baseline. Repeated measures analyses of variance were used to compare EF, ADHD symptoms, and social function between the intervention and waitlist groups. Thirty-eight children with ADHD in the intervention group, 30 in the waitlist group, and 23 healthy children in the healthy control group were included in the final analysis. At posttreatment, the intervention group showed a significantly lower Behavior Rating Inventory of Executive Function (BRIEF) total score (135.89 ± 16.80 vs. 146.09 ± 23.92, P = 0.04) and monitoring score (18.05 ± 2.67 vs. 19.77 ± 3.10, P = 0.02), ADHD-IV overall score (41.11 ± 7.48 vs. 47.20 ± 8.47, P < …), … ADHD-IV overall score (F = 21.72, P < …), … ADHD-rating scale-IV, and WEISS Functional Impairment Scale-Parent form (WFIRS-P) among the intervention and waitlist groups at posttreatment and HCs at baseline. This randomized controlled study of executive skill training in a relatively large sample provided some evidence that the training could improve EF deficits, reduce problematic symptoms, and potentially enhance social functioning in school-aged children with ADHD.
Agency and program administrators and decisionmakers responsible for implementing early childhood intervention programs are becoming more interested in quantifying the costs and benefits of such programs...
J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)
Compiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD
Idzorek, J J
Experimental results of the aerodynamic performance of seven candidate diffusers are presented to assist in determining their suitability for joining an MHD channel to a steam generator at minimum spacing. The three dimensional diffusers varied in area ratio from 2 to 3.8 and wall half angle from 2 to 5 degrees. The program consisted of five phases: (1) tailoring a diffuser inlet nozzle to a 15 percent blockage; (2) comparison of isolated diffusers at enthalpy ratios 0.5 to 1.0 with respect to separation characteristics and pressure recovery coefficients; (3) recording the optimum diffuser exit flow distribution; (4) recording the internal flow distribution within the steam generator when attached to the diffuser; and (5) observing isolated diffuser exhaust dynamic characteristics. The 2 and 2-1/3 degree half angle rectangular diffusers showed recovery coefficients equal to 0.48 with no evidence of flow separation or instability. Diffusion at angles greater than these produced flow instabilities and with angles greater than 3 degrees random flow separation and reattachment.
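The recovery coefficient reported above follows the standard definition Cp = Δp / q_inlet, the static-pressure rise normalized by the inlet dynamic pressure. A minimal sketch with illustrative inlet conditions, not measured values from the test program:

```python
def pressure_recovery(p_in, p_out, rho, v_in):
    """Diffuser pressure-recovery coefficient:
    Cp = (p_exit - p_inlet) / (0.5 * rho * V_inlet^2)."""
    q = 0.5 * rho * v_in**2   # inlet dynamic pressure
    return (p_out - p_in) / q

# Hypothetical numbers: sea-level air, ~76 m/s inlet, 1675 Pa static rise
cp = pressure_recovery(p_in=101325.0, p_out=103000.0, rho=1.2, v_in=76.3)
print(round(cp, 2))   # 0.48, the same scale as the coefficient reported above
```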
van Nimwegen, N.; van der Erf, R.
The Demography Monitor 2008 gives a concise overview of current demographic trends and related developments in education, the labour market and retirement for the European Union and some other countries. This executive summary highlights the major findings of the Demography Monitor 2008 and further
Life cycle cost analyses were performed using the Life Cycle Cost in Design (LCCID) computer program. Project descriptions and DD1391 forms were prepared for four Energy Conservation Investment Program...
In order to streamline and simplify the methodologies required to obtain and process the requisite meteorological data for mesoscale meteorological models such as the Battlescale Forecast Model (BFM...
Park, Subin; Lee, Jong-Min; Baik, Young; Kim, Kihyun; Yun, Hyuk Jin; Kwon, Hunki; Jung, Yeon-Kyung; Kim, Bung-Nyun
The authors examined the effects of arts education on the cognition, behavior, and brain of children. Twenty-nine nonclinical children participated in a 15-week arts education program composed of either creative movement or musical arts. Children completed the Wisconsin Card Sorting Test, clinical scales, and brain magnetic resonance imaging before and after the intervention. Following program completion, performance on the Wisconsin Card Sorting Test, Children's Depression Inventory scores, and conduct disorder scores were significantly improved. Furthermore, cortical thickness in the left postcentral gyrus and superior parietal lobule was increased, and mean diffusivity values in the right posterior corona radiata and superior longitudinal fasciculus were decreased. Positive correlations between changes in cognitive measurements and changes in cortical thickness were observed. This preliminary study suggests a positive effect of arts education on executive functions in association with brain changes. However, these findings must be interpreted with caution due to the noncomparative study design. © The Author(s) 2015.
Extracting maximum performance of multi-core architectures is a difficult task primarily due to bandwidth limitations of the memory subsystem and its complex hierarchy. In this work, we study the implications of fork-join and data-driven execution models on this type of architecture at the level of task parallelism. For this purpose, we use a highly optimized fork-join based implementation of the FMM and extend it to a data-driven implementation using a distributed task scheduling approach. This study exposes some limitations of the conventional fork-join implementation in terms of synchronization overheads. We find that these are not negligible and their elimination by the data-driven method, with a careful data locality strategy, was beneficial. Experimental evaluation of both methods on state-of-the-art multi-socket multi-core architectures showed up to 22% speed-ups of the data-driven approach compared to the original method. We demonstrate that a data-driven execution of FMM not only improves performance by avoiding global synchronization overheads but also reduces the memory-bandwidth pressure caused by memory-intensive computations. © 2013 Springer-Verlag.
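The contrast between the two execution models can be sketched on a toy task graph: the fork-join version joins at a global barrier after every level of the graph, while the data-driven version fires each task as soon as its own inputs are ready. The graph, the work function, and the scheduling code below are illustrative stand-ins, not the FMM kernels or the distributed scheduler described in the abstract.

```python
import concurrent.futures as cf
import threading

# Toy DAG: each task consumes its dependencies' results; edges are dependencies.
DEPS = {'a': [], 'b': [], 'c': ['a', 'b'], 'd': ['a'], 'e': ['c', 'd']}

def work(name, inputs):
    return sum(inputs) + 1                     # placeholder computation

def fork_join(deps):
    """Level-by-level execution: a global join (barrier) after every level."""
    results, remaining = {}, dict(deps)
    with cf.ThreadPoolExecutor() as pool:
        while remaining:
            ready = [t for t, d in remaining.items() if all(x in results for x in d)]
            futs = {t: pool.submit(work, t, [results[x] for x in remaining[t]])
                    for t in ready}
            for t, f in futs.items():          # join: wait for the whole level
                results[t] = f.result()
                del remaining[t]
    return results

def data_driven(deps):
    """Each task fires as soon as its own inputs are ready - no global barrier."""
    results, lock = {}, threading.Lock()
    counts = {t: len(d) for t, d in deps.items()}        # unmet-dependency counters
    children = {t: [c for c, d in deps.items() if t in d] for t in deps}
    done = threading.Event()
    pool = cf.ThreadPoolExecutor()

    def run(t):
        res = work(t, [results[x] for x in deps[t]])
        with lock:
            results[t] = res
            ready = []
            for c in children[t]:
                counts[c] -= 1
                if counts[c] == 0:             # all inputs ready: fire immediately
                    ready.append(c)
            if len(results) == len(deps):
                done.set()
        for c in ready:
            pool.submit(run, c)

    for t, d in deps.items():
        if not d:                              # roots have no dependencies
            pool.submit(run, t)
    done.wait()
    pool.shutdown()
    return results
```

On this tiny graph both schedules produce identical results; the difference is only in when tasks may start, which is where the eliminated barrier overhead comes from.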
Arifin, Dadang; Yusuf, Edhi
This paper aims to develop a replacement model for the city bus fleet operated in Bandung City. The study is driven by real cases encountered by the Damri Company in its efforts to improve services to the public. The replacement model propounds two policy alternatives: first, to keep the vehicles in service; second, to replace them with new ones, taking into account operating costs, revenue, salvage value, and the acquisition cost of a new vehicle. A deterministic dynamic programming approach is used to solve the model, and the optimization process was executed heuristically using empirical data from Perum Damri. The output of the model is the replacement schedule and the best policy once a vehicle has passed its economic life. Based on the results, the technical life of a bus is approximately 20 years, while its economic life averages nine years; after a bus has been operated for nine years, managers should consider rejuvenating the fleet.
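The keep-versus-replace recursion behind such a model can be sketched with backward dynamic programming. All cost functions, lifetimes, and prices below are invented for illustration; the actual model uses Perum Damri's empirical operating costs, revenues, salvage values, and acquisition prices.

```python
from functools import lru_cache

HORIZON = 12          # planning horizon in years (illustrative)
MAX_AGE = 20          # technical life: a bus older than this must be replaced
ACQUISITION = 100.0   # acquisition cost of a new vehicle (arbitrary units)

def operate(age):
    """Annual operating cost net of revenue; grows as the vehicle ages."""
    return 10.0 + 4.0 * age

def salvage(age):
    """Resale value of an age-year-old vehicle; declines with age."""
    return max(0.0, 60.0 - 5.0 * age)

@lru_cache(maxsize=None)
def best(year, age):
    """Minimum total cost from `year` to the horizon, holding a vehicle of `age`."""
    if year == HORIZON:
        return -salvage(age)                       # sell the vehicle at the end
    keep = float('inf')
    if age < MAX_AGE:                              # past technical life: no keeping
        keep = operate(age) + best(year + 1, age + 1)
    replace = ACQUISITION - salvage(age) + operate(0) + best(year + 1, 1)
    return min(keep, replace)

def schedule(age0=0):
    """Recover the optimal keep/replace decision for each year."""
    plan, age = [], age0
    for year in range(HORIZON):
        keep = (operate(age) + best(year + 1, age + 1)
                if age < MAX_AGE else float('inf'))
        replace = ACQUISITION - salvage(age) + operate(0) + best(year + 1, 1)
        if replace < keep:
            plan.append((year, 'replace'))
            age = 1
        else:
            plan.append((year, 'keep'))
            age += 1
    return plan
```

A vehicle that has already reached its technical life is forced into the replace branch in year one, mirroring the paper's policy recommendation at the end of economic life.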
Lu, D R; Mao, F
A computer program, PharmK, was developed for pharmacokinetic modeling of experimental data. The program was written in the C language for the Macintosh operating system, taking advantage of its high-level user interface; the intention was to provide a user-friendly tool for users of Macintosh computers. An interactive algorithm based on the exponential stripping method is used for the initial parameter estimation. Nonlinear pharmacokinetic model fitting is based on the maximum likelihood estimation method and is performed by the Levenberg-Marquardt method using a chi-square criterion. Several methods are available to aid the evaluation of the fitting results. Pharmacokinetic data sets have been examined with the PharmK program, and the results are comparable with those obtained with other programs currently available for IBM PC-compatible and other types of computers.
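The pipeline the abstract describes, exponential stripping for an initial estimate followed by Levenberg-Marquardt refinement of a squared-error objective, can be sketched for a one-compartment model C(t) = C0·exp(−k·t). This is an illustrative re-implementation in Python, not PharmK's C code, and the unweighted least-squares objective stands in for the program's maximum-likelihood criterion.

```python
import math

def strip(times, conc):
    """Exponential stripping: a log-linear fit yields initial (C0, k)."""
    n = len(times)
    logs = [math.log(c) for c in conc]
    sx, sy = sum(times), sum(logs)
    sxx = sum(t * t for t in times)
    sxy = sum(t * y for t, y in zip(times, logs))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return math.exp(intercept), -slope             # C0, k

def sse(times, conc, c0, k):
    """Chi-square-style objective (unweighted sum of squared residuals)."""
    return sum((y - c0 * math.exp(-k * t)) ** 2 for t, y in zip(times, conc))

def fit_lm(times, conc, iters=50, lam=1e-3):
    """Levenberg-Marquardt refinement starting from the stripped estimate."""
    c0, k = strip(times, conc)
    for _ in range(iters):
        # Build the 2x2 normal equations (J^T J + lam*diag) * delta = J^T r.
        g0 = g1 = h00 = h01 = h11 = 0.0
        for t, y in zip(times, conc):
            e = math.exp(-k * t)
            r = y - c0 * e                          # residual
            j0, j1 = e, -c0 * t * e                 # d model/d C0, d model/d k
            g0 += j0 * r; g1 += j1 * r
            h00 += j0 * j0; h01 += j0 * j1; h11 += j1 * j1
        a00, a11 = h00 * (1.0 + lam), h11 * (1.0 + lam)
        det = a00 * a11 - h01 * h01
        d0 = (a11 * g0 - h01 * g1) / det
        d1 = (a00 * g1 - h01 * g0) / det
        if sse(times, conc, c0 + d0, k + d1) < sse(times, conc, c0, k):
            c0, k, lam = c0 + d0, k + d1, lam * 0.5  # accept step, relax damping
        else:
            lam *= 10.0                              # reject step, raise damping
    return c0, k
```

Because steps are only accepted when they lower the objective, the refined fit can never be worse than the stripped starting point.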
Lusk, Ewing; Butler, Ralph; Pieper, Steven C.
Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
Schmitz, Oliver; de Kok, Jean Luc; Karssenberg, Derek
Dynamic environmental models use a state transition function, external inputs and parameters to simulate the change of real-world processes over time. Modellers specify the state transition function and the external inputs required in the process calculation of each time step in a component model, a
Krishnan, Shankar M
There is a proliferation of medical devices across the globe for the diagnosis and therapy of diseases. Biomedical engineering (BME) plays a significant role in healthcare and advancing medical technologies thus creating a substantial demand for biomedical engineers at undergraduate and graduate levels. There has been a surge in undergraduate programs due to increasing demands from the biomedical industries to cover many of their segments from bench to bedside. With the requirement of multidisciplinary training within allottable duration, it is indeed a challenge to design a comprehensive standardized undergraduate BME program to suit the needs of educators across the globe. This paper's objective is to describe three major models of undergraduate BME programs and their curricular requirements, with relevant recommendations to be applicable in institutions of higher education located in varied resource settings. Model 1 is based on programs to be offered in large research-intensive universities with multiple focus areas. The focus areas depend on the institution's research expertise and training mission. Model 2 has basic segments similar to those of Model 1, but the focus areas are limited due to resource constraints. In this model, co-op/internship in hospitals or medical companies is included which prepares the graduates for the work place. In Model 3, students are trained to earn an Associate Degree in the initial two years and they are trained for two more years to be BME's or BME Technologists. This model is well suited for the resource-poor countries. All three models must be designed to meet applicable accreditation requirements. The challenges in designing undergraduate BME programs include manpower, facility and funding resource requirements and time constraints. Each academic institution has to carefully analyze its short term and long term requirements. In conclusion, three models for BME programs are described based on large universities, colleges, and
Carlson, Michelle C.; Saczynski, Jane S.; Rebok, George W.; Seeman, Teresa; Glass, Thomas A.; McGill, Sylvia; Tielsch, James; Frick, Kevin D.; Hill, Joel; Fried, Linda P.
Purpose: There is little empirical translation of multimodal cognitive activity programs in "real-world" community-based settings. This study sought to demonstrate in a short-term pilot randomized trial that such an activity program improves components of cognition critical to independent function among sedentary older adults at greatest risk.…
Afterschool Alliance, 2015
Afterschool programs continue to make advances when it comes to providing students with nutritious foods, keeping them physically fit and promoting health. Such programs have great potential to help prevent obesity and instill lifelong healthy habits, serving more than 10 million children and youth across America, with more than 19 million more…
The aim of this paper is to present a qualitative evaluation of three state-of-the-art parallel languages: OpenMP, Unified Parallel C (UPC) and Co-Array Fortran (CAF). OpenMP and UPC are explicit parallel programming languages based on the ANSI standard, while CAF is an implicit programming language. On the one hand, OpenMP is designed for shared-memory architectures and extends the base language with compiler directives that annotate the original source code. On the other hand, UPC and CAF are designed for distributed shared-memory architectures and extend the base language with new parallel constructs. We deconstruct each language into its basic components, show examples, make a detailed analysis, compare them, and finally draw some conclusions.
Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert
We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
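A minimal flavor of the propositional-logic approach: encode species activation rules as Boolean constraints and enumerate the satisfying assignments, optionally fixing some states from experimental observations. The network, species names, and rules below are invented toy examples; real applications would hand the logic statements to a SAT or integer-programming solver rather than brute-force enumeration.

```python
from itertools import product

# Hypothetical signaling network: ligand L and receptor R activate kinase K;
# inhibitor I blocks output O.
SPECIES = ['L', 'R', 'K', 'I', 'O']

def consistent(s):
    """Propositional constraints relating the species' on/off states."""
    return (s['K'] == (s['L'] and s['R'])           # K active iff L and R present
            and s['O'] == (s['K'] and not s['I']))  # O needs K and no inhibitor

def models(observations=None):
    """Enumerate all truth assignments satisfying the network constraints."""
    observations = observations or {}
    out = []
    for bits in product([False, True], repeat=len(SPECIES)):
        s = dict(zip(SPECIES, bits))
        if all(s[k] == v for k, v in observations.items()) and consistent(s):
            out.append(s)
    return out
```

Observing the output active (`models({'O': True})`) logically pins down every other species, which is exactly the kind of qualitative inference the logic formulation supports.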
The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed, describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives
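The combination of simulation and optimization the paragraph describes can be sketched as follows: material flows and interim-storage costs are simulated year by year for each candidate plan, while the reprocessing-plant size and commissioning year are chosen by enumerating the alternatives and keeping the lowest-cost feasible plan. Every quantity, discount rate, and unit cost below is invented for illustration.

```python
from itertools import product

DISCHARGE = [50.0] * 10              # spent fuel discharged per year (tHM)
PLANT_SIZES = [40.0, 60.0, 80.0]     # candidate reprocessing capacities (tHM/yr)
START_YEARS = [0, 2, 4]              # candidate commissioning years

def simulate_cost(size, start, rate=0.05):
    """Simulate storage inventories; return total discounted cost of the plan."""
    capital = 10.0 * size                      # capital scales with plant size
    store, cost = 0.0, capital / (1.0 + rate) ** start
    for year, d in enumerate(DISCHARGE):
        store += d                             # discharge into interim storage
        if year >= start:
            store -= min(store, size)          # reprocess up to plant capacity
        cost += 0.2 * store / (1.0 + rate) ** year   # interim-storage charge
    return cost if store == 0.0 else float('inf')    # plan must clear the backlog

def cheapest_plan():
    """Economic optimization: enumerate candidates, keep the lowest-cost one."""
    best = min(product(PLANT_SIZES, START_YEARS),
               key=lambda plan: simulate_cost(*plan))
    return best, simulate_cost(*best)
```

The simulation handles the physical fuel-cycle steps, and the outer enumeration plays the role of the optimization over plant size and commissioning schedule.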
Peisner-Feinberg, Ellen; Schaaf, Jennifer; Hildebrandt, Lisa; LaForett, Dore
The North Carolina Pre-Kindergarten Program (NC Pre-K) is a state-funded initiative for at-risk 4-year-olds, designed to provide a high quality, classroom-based educational program during the year prior to kindergarten entry. Children are eligible for NC Pre-K based on age, family income (at or below 75% of state median income), and other risk…
This paper is an 'executive summary' of work undertaken to review proposals for transport, handling and emplacement of high level radioactive wastes in an underground repository, appropriate to the U.K. context, with particular reference to: waste block size and configuration; self-shielded or partially-shielded block; stages of disposal; transportation within the repository; emplacement in vertical holes or horizontal tunnels; repository access by adit, incline or shaft; and costs. The paper contains a section on general conclusions and recommendations. (U.K.)
The total-system life-cycle cost (TSLCC) analysis for the Department of Energy's Civilian Radioactive Waste Management Program is an ongoing activity that helps determine whether the revenue-producing mechanism established by the Nuclear Waste Policy Act of 1982 is sufficient to cover the cost of the program. This report is an input into the third evaluation of the adequacy of the fee. The total-system cost for the reference waste-management program in this analysis is estimated to be 24 to 30 billion (1984) dollars. For the sensitivity cases studied in this report, the costs could be as high as 35 billion dollars and as low as 21 billion dollars. Because factors like repository location, the quantity of waste generated, transportation-cask technology, and repository startup dates exert substantial impacts on total-system costs, there are several tradeoffs between these factors, and these tradeoffs can greatly influence the total cost of the program. The total-system cost for the reference program described in this report is higher by 3 to 5 billion dollars, or 15 to 20%, than the cost for the reference program of the TSLCC analysis of April 1984. More than two-thirds of this increase is in the cost of repository construction and operation. These repository costs have increased because of changing design concepts, different assumptions about the effort required to perform the necessary activities, and a change in the source data on which the earlier analysis was based. Development and evaluation costs have similarly increased because of a net addition to the work content. Transportation costs have increased because of different assumptions about repository locations and several characteristics of the transportation system. It is expected that the estimates of total-system costs will continue to change in response to both an evolving program strategy and better definition of the work required to achieve the program objectives.
Heidarzadeh, Mohammad; Jodiery, Behzad; Mirnia, Kayvan; Akrami, Forouzan; Hosseini, Mohammad Bagher; Heidarabadi, Seifollah; HabibeLahi, Abbas
Intervention in early childhood development, as one of the social determinants of health, is important for reducing the social gap and inequity. In spite of increasingly developed neonatal intensive care wards and a decreasing neonatal mortality rate, there is no follow-up program in Iran. This study was carried out in 2012 to design a follow-up care program for high-risk infants, with the practical aim of creating a model for action for the whole country. This qualitative study was done by the Neonatal Department of the Deputy of Public Health in cooperation with the Pediatrics Health Research Center of Tabriz University of Medical Sciences, Iran. After a study of international documents, consensus on a program adapted for Iran was reached through focus group discussions and an in-person Delphi agreement technique. After compiling a primary draft that included evidence-based guidelines and an executive plan, 14 expert-panel sessions were held to finalize the program. The resulting follow-up care service package for high-risk infants was designed in three chapters: evidence-based clinical guidelines (eighteen main and thirteen subsidiary clinical guidelines); an executive plan (6 general, 6 follow-up, and 5 backup processes); and an education program, including general and special courses for caregivers and the follow-up team, and family education processes. We designed and finalized a follow-up care service package for high-risk infants, which should open the way to extending it to the whole country.
Karen H. Warren
Full Text Available PDDP, the parallel data distribution preprocessor, is a data parallel programming model for distributed memory parallel computers. PDDP implements high-performance Fortran-compatible data distribution directives and parallelism expressed by the use of Fortran 90 array syntax, the FORALL statement, and the WHERE construct. Distributed data objects belong to a global name space; other data objects are treated as local and replicated on each processor. PDDP allows the user to program in a shared memory style and generates codes that are portable to a variety of parallel machines. For interprocessor communication, PDDP uses the fastest communication primitives on each platform.
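The HPF-style semantics PDDP implements, BLOCK data distribution plus owner-computes evaluation of FORALL and WHERE, can be emulated sequentially to show the idea. This Python sketch is only an illustration of the semantics; PDDP itself compiles Fortran 90 array syntax into genuinely parallel code, and the four-"processor" layout below is an arbitrary choice.

```python
P = 4  # number of hypothetical processors

def block_ranges(n, p=P):
    """Contiguous index ranges, one per processor (HPF-style BLOCK layout)."""
    base, extra = divmod(n, p)
    ranges, start = [], 0
    for i in range(p):
        size = base + (1 if i < extra else 0)
        ranges.append(range(start, start + size))
        start += size
    return ranges

def forall_where(a, cond, update):
    """Owner-computes emulation of FORALL + WHERE: each processor updates
    only the elements it owns, and only where the mask condition holds."""
    out = list(a)
    for owned in block_ranges(len(a)):
        for i in owned:            # conceptually executed by the owning processor
            if cond(a[i]):
                out[i] = update(a[i])
    return out
```

For example, `forall_where(list(range(8)), lambda x: x % 2 == 0, lambda x: x * 10)` scales only the even-valued elements, mimicking a masked WHERE update over a distributed array.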
Evertson, Carolyn M.; And Others
A summary is presented of the final report, "Effective Classroom Management and Instruction: An Exploration of Models." The final report presents a set of linked investigations of the effects of training teachers in effective classroom management practices in a series of school-based workshops. Four purposes were addressed by the study: (1) to…
Stuit, Marco; Szirbik, Nick B.; O'Hare, GMP; Ricci, A; OGrady, MJ; Dirkenelli, O
Interaction refers to an abstract and intangible concept. In modelling, intangible concepts can be embodied and made explicit. This makes it possible to manipulate the abstractions and to build predictable designs. Business processes in organisations are in fact reducible to interactions, especially when
Lee, Kerry; Bull, Rebecca; Ho, Ringo M. H.
Although early studies of executive functioning in children supported Miyake et al.'s (2000) three-factor model, more recent findings supported a variety of undifferentiated or two-factor structures. Using a cohort-sequential design, this study examined whether there were age-related differences in the structure of executive functioning among…
The Roadside Inspection and Traffic Enforcement programs are two of FMCSA's most powerful safety tools. By continually examining the results of these programs, FMCSA can ensure that they are being executed effectively and are producing the desired ...
Christodoulou, Nikolaos A; Tousert, Nikolaos E; Georgiadi, Eleni Ch; Argyri, Katerina D; Misichroni, Fay D; Stamatakos, Georgios S
The plethora of available disease prediction models and the ongoing process of their application into clinical practice - following their clinical validation - have created new needs regarding their efficient handling and exploitation. Consolidation of software implementations, descriptive information, and supportive tools in a single place, offering persistent storage as well as proper management of execution results, is a priority, especially with respect to the needs of large healthcare providers. At the same time, modelers should be able to access these storage facilities under special rights, in order to upgrade and maintain their work. In addition, the end users should be provided with all the necessary interfaces for model execution and effortless result retrieval. We therefore propose a software infrastructure, based on a tool, model and data repository that handles the storage of models and pertinent execution-related data, along with functionalities for execution management, communication with third-party applications, user-friendly interfaces to access and use the infrastructure with minimal effort and basic security features.
... start or continue a profitable growth pattern and boost morale and motivation; a poor decision may bring the company to the brink of financial ... Hon. Frank Kendall issued a memorandum titled "Key Leadership Positions and Qualification Criteria" (Kendall, 2013). This memorandum provides a ... Chief Developmental Tester • Program Lead, Business Financial Manager ...
Greenberg, David; Dechausay, Nadine; Fraker, Carolyn
In 2007, New York City's Center for Economic Opportunity launched Opportunity NYC-Family Rewards, an experimental, privately funded, conditional cash transfer (CCT) program to help families break the cycle of poverty. Family Rewards provided payments to low-income families in six of the city's poorest communities for achieving specific goals…
Pasareanu, Corina S.; Rungta, Neha
Symbolic Pathfinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
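The core loop, collecting a path condition per program path, handing it to a solver, and turning each solution into a concrete test input, can be caricatured in Python. Here the "solver" is a brute-force search over a small integer domain and the path conditions are written out by hand; SPF itself extracts constraints automatically from Java bytecode and discharges them with off-the-shelf constraint solvers.

```python
# Toy program under test: branches on an unspecified integer input x.
def program(x):
    if x > 10:
        if x % 2 == 0:
            return 'big-even'
        return 'big-odd'
    return 'small'

# Hand-written path conditions, one per feasible execution path, paired with
# the return value expected along that path.
PATHS = [
    ([lambda x: x > 10, lambda x: x % 2 == 0], 'big-even'),
    ([lambda x: x > 10, lambda x: x % 2 != 0], 'big-odd'),
    ([lambda x: x <= 10],                      'small'),
]

def solve(constraints, domain=range(-50, 51)):
    """Stand-in for a constraint solver: find any input satisfying the path
    condition by brute-force search over a small domain."""
    for x in domain:
        if all(c(x) for c in constraints):
            return x
    return None                                # path condition unsatisfiable here

def generate_tests():
    """One concrete test input per satisfiable path: coverage by construction."""
    tests = []
    for constraints, expected in PATHS:
        x = solve(constraints)
        if x is not None:
            tests.append((x, expected))
    return tests
```

Each generated input is guaranteed to drive the program down its intended path, which is the coverage argument behind solver-backed test generation.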
Conducting transnational programs can be a very rewarding activity for a School, Faculty or University. Apart from raising the profile of the university, transnational programs can provide openings for business opportunities, consultative activities, and collaborative research. They can also be a costly exercise, placing an enormous strain on limited resources with little reward for the provider, and transnational ventures can become nonviable in a very short period of time due to unanticipated global economic trends. Transnational courses offered by Faculties of Business and Computing are commonplace; however, a growing number of health science programs, particularly nursing, are being offered transnationally. This paper presents an overview of several models employed for the delivery of transnational nursing courses and discusses several key issues pertaining to conducting courses outside the host university's country.
Dufresne, M.; Silvester, P.P.
Continuum plasma models often use a finite element (FE) formulation. Another approach is simulation models based on particle-in-cell (PIC) formulation. The model equations generally include four nonlinear differential equations specifying the plasma parameters. In simulation a large number of equations must be integrated iteratively to determine the plasma evolution from an initial state. The complexity of the resulting programs is a combination of the physics involved and the numerical method used. The data structure requirements of plasma programs are stated by defining suitable abstract data types. These abstractions are then reduced to data structures and a group of associated algorithms. These are implemented in an object oriented language (C++) as object classes. Base classes encapsulate data management into a group of common functions such as input-output management, instance variable updating and selection of objects by Boolean operations on their instance variables. Operations are thereby isolated from specific element types and uniformity of treatment is guaranteed. Creation of the data structures and associated functions for a particular plasma model is reduced merely to defining the finite element matrices for each equation, or the equations of motion for PIC models. Changes in numerical method or equation alterations are readily accommodated through the mechanism of inheritance, without modification of the data management software. The central data type is an n-relation implemented as a tuple of variable internal structure. Any finite element program may be described in terms of five relational tables: nodes, boundary conditions, sources, material/particle descriptions, and elements. Equivalently, plasma simulation programs may be described using four relational tables: cells, boundary conditions, sources, and particle descriptions
Grande Ronde Model Watershed Program Administration and Habitat Projects, Annual Progress Report, Project Period: Program Administration: January 1, 1997 - December 31, 1997 Habitat Projects: January 1, 1997 - March 31, 1998.
Noyes, Cecilia; Kuchenbecker, Lyle; Perry, Patty
This agreement provided funding for operation and administration of the Grande Ronde Model Watershed Program including staffing of an Executive Director, Program Planner, and clerical personnel. The contract covers maintaining program services, project planning, subwatershed plans (CRMP's), public involvement and education, interagency coordination/clearing house, monitoring, and technical support activities that have taken place in the Grande Ronde basin. Cost-share has been received from the Bureau of Reclamation and the Governor's Watershed Enhancement Board.
On 18 May 2001, the Finnish Parliament ratified the Decision in Principle on the final disposal facility for spent nuclear fuel at Olkiluoto, within the municipality of Eurajoki. The Municipality Council and the Government had made positive decisions earlier, at the end of 2000, and in compliance with the Nuclear Energy Act, Parliament's ratification was then required. The decision is valid for the spent fuel generated by the existing Finnish nuclear power plants and means that the construction of the final disposal facility is considered to be in line with the overall good of society. Earlier steps included, amongst others, the approval of the technical project by the Safety Authority. Future steps include construction of an underground rock characterisation facility, ONKALO (2003-2004), and application for separate construction and operating licences for the final disposal facility (from about 2010). How did this political and societal decision come about? The FSC Workshop provided the opportunity to present the history leading up to the Decision in Principle (DiP), and to examine future perspectives with an emphasis on stakeholder involvement. This Executive Summary gives an overview of the presentations and discussions that took place at the workshop. It presents, for the most part, a factual account of the individual presentations and of the discussions that took place. It relies importantly on the notes that were taken at the meeting. Most materials are elaborated upon in a fuller way in the texts that the various speakers and session moderators contributed for these proceedings. The structure of the Executive Summary follows the structure of the workshop itself. Complementary to this Summary, and also provided with this document, is a NEA Secretariat's perspective aiming to place the results of all discussions, feedback and site visit into an international perspective. (authors)
Mesquita, Isabel; Farias, Claudio; Hastie, Peter
The purpose of this study was to examine the impact of a hybrid Sport Education-Invasion Games Competence Model (SE-IGCM) unit application on students' improvements in decision making, skill execution and overall game performance, during a soccer season. Twenty-six fifth-grade students from a Portuguese public elementary school participated in a…
Berghausen, P.E. Jr.
Continued behavior observation is mandated by ANSI/ANS 3.3. This paper presents a model for behavior observation training that is in accordance with this standard and the recommendations contained in US NRC publications. The model includes seventeen major topics or activities. Ten of these are discussed: Pretesting of supervisor's knowledge of behavior observation requirements, explanation of the goals of behavior observation programs, why behavior observation training programs are needed (legal and psychological issues), early indicators of emotional instability, use of videotaped interviews to demonstrate significant psychopathology, practice recording behaviors, what to do when unusual behaviors are observed, supervisor rationalizations for noncompliance, when to be especially vigilant, and prevention of emotional instability
Applying the Model of the Interrelationship of Leadership Environments and Outcomes for Nurse Executives: a community hospital's exemplar in developing staff nurse engagement through documentation improvement initiatives.
Adams, Jeffrey M; Denham, Debra; Neumeister, Irene Ramirez
The Model of the Interrelationship of Leadership, Environments & Outcomes for Nurse Executives (MILE ONE) was developed on the basis of existing literature related to identifying strategies for simultaneous improvement of leadership, professional practice/work environments (PPWE), and outcomes. Through existing evidence, the MILE ONE identifies the continuous and dependent interrelationship of 3 distinct concept areas: (1) nurse executives influence PPWE, (2) PPWE influence patient and organizational outcomes, and (3) patient and organizational outcomes influence nurse executives. This article highlights the application of the MILE ONE framework to a community district hospital's clinical documentation performance improvement projects. Results suggest that the MILE ONE is a valid and useful framework yielding both anticipated and unexpected enhancements to leaders, environments, and outcomes.
Morris, Matthew; Kaiser, Mary Elizabeth; Bohlin, Ralph; Kurucz, Robert; ACCESS Team
A goal of the ACCESS program (Absolute Color Calibration Experiment for Standard Stars) is to enable greater discrimination between theoretical astrophysical models and observations, where the comparison is limited by systematic errors associated with the relative flux calibration of the targets. To achieve these goals, ACCESS has been designed as a sub-orbital, rocket-borne payload and ground calibration program to establish absolute flux calibration of stellar targets, including flight candidates as well as a selection of A and G stars from the CALSPEC database. Stellar atmosphere models were generated using the Atlas 9 and Atlas 12 Kurucz stellar atmosphere software. The effective temperature, log(g), metallicity, and reddening were varied, and the chi-squared statistic was minimized to obtain a best-fit model. A comparison of these models with the results of interpolation between grids of existing models will be presented. The impact of the flexibility of the Atlas 12 input parameters (e.g. solar metallicity fraction, abundances, microturbulent velocity) is being explored.
The Waveforms and Sonic boom Perception and Response (WSPR) Program was designed to test and demonstrate the applicability and effectiveness of techniques to gather data relating human subjective response to multiple low-amplitude sonic booms. It was in essence a practice session for future wider scale testing on naive communities, using a purpose built low-boom demonstrator aircraft. The low-boom community response pilot experiment was conducted in California in November 2011. The WSPR team acquired sufficient data to assess and evaluate the effectiveness of the various physical and psychological data gathering techniques and analysis methods.
Sattler, A.R.; Hunter, T.O.
This document presents plans for in-situ experiments in a specific location in southeastern New Mexico. Schedule and facility design were based on features of a representative local potash mine and on contract negotiations with mine owners. Subsequent WIPP program uncertainties have required a delay in the implementation of the activities discussed here; however, the relative schedule for various activities are appropriate for future planning. The document represents a matrix of in-situ activities to address relevant technical issues prior to the availability of a bedded salt repository.
Ben Abdallah, Nada M-B; Fuss, Johannes; Trusel, Massimo; Galsworthy, Michael J; Bobsin, Kristin; Colacicco, Giovanni; Deacon, Robert M J; Riva, Marco A; Kellendonk, Christoph; Sprengel, Rolf; Lipp, Hans-Peter; Gass, Peter
Deficits in executive functions are key features of schizophrenia. Rodent behavioral paradigms used so far to find animal correlates of such deficits require extensive effort and time. The puzzle box is a problem-solving test in which mice are required to complete escape tasks of increasing difficulty within a limited amount of time. Previous data have indicated that it is a quick but highly reliable test of higher-order cognitive functioning. We evaluated the use of the puzzle box to explore executive functioning in five different mouse models of schizophrenia: mice with prefrontal cortex lesions, mice with hippocampus lesions, mice treated sub-chronically with the NMDA-receptor antagonist MK-801, mice constitutively lacking the GluA1 subunit of AMPA-receptors, and mice over-expressing dopamine D2 receptors in the striatum. All mice displayed altered executive functions in the puzzle box, although the nature and extent of the deficits varied between the different models. Deficits were strongest in hippocampus-lesioned and GluA1 knockout mice, while more subtle deficits, specific to problem solving, were found in medial prefrontal-lesioned mice, MK-801-treated mice, and mice with striatal overexpression of D2 receptors. Data from this study demonstrate the utility of the puzzle box as an effective screening tool for executive functions in general and for schizophrenia mouse models in particular. Published by Elsevier Inc.
Chadalawada, Jayashree; Babovic, Vladan
One of the recent challenges for the hydrologic research community is the need to develop coupled systems that integrate hydrologic, atmospheric and socio-economic relationships. This calls for novel modelling frameworks that can accurately represent complex systems given the limited understanding of underlying processes, the increasing volume of data and high levels of uncertainty. Existing hydrological models vary in conceptualization and process representation, and each is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used to integrate alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in implementing an integrated modelling framework informed by prior understanding and data include the choice of technique for inducing knowledge from data, the identification of alternative structural hypotheses, the definition of rules and constraints for the meaningful, intelligent combination of model component hypotheses, and the definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture the dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow-duration-curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach
This report covers work on the BWR Radiation Assessment and Control (BRAC) Program from 1978 to 1982. The major activities during this report period were assessment of the radiation-level trends in BWRs, evaluation of the effects of forward-pumped heater drains on BWR water quality, installation and operation of a corrosion-product deposition loop in an operating BWR, and analysis of fuel-deposit samples from two BWRs. Radiation fields were found to be controlled by cobalt-60 and to vary from as low as 50 mR/hr to as high as 800 mR/hr on the recirculation-system piping. Detailed information on BWR corrosion films and system deposits is presented in the report. Additionally, the results of an oxygen-injection experiment and recontamination monitoring studies are provided.
Bruce, Grady D.
Critics of the overall value of the MBA have not systematically considered the attitudes of MBA students about the value of their degree. The author used data from a large sample of graduates (N = 16,268) to do so, and to explore predictors of overall degree value. The author developed separate regression models for full-time, part-time, and…
M. J. Mišić
Graphics processing units and similar accelerators have been used intensively in general-purpose computations for several years. In the last decade, GPU architecture and organization changed dramatically to support an ever-increasing demand for computing power. Along with changes in hardware, novel programming models have been proposed, such as NVIDIA's Compute Unified Device Architecture (CUDA) and the Khronos Group's Open Computing Language (OpenCL). Although numerous commercial and scientific applications have been developed using these two models, they still pose a significant challenge for less experienced users. Users from various scientific and engineering communities would like to speed up their applications without needing a deep understanding of a low-level programming model and the underlying hardware. In 2011, the OpenACC programming model was launched. Much like OpenMP for multicore processors, OpenACC is a high-level, directive-based programming model for manycore processors such as GPUs. This paper presents an analysis of the OpenACC programming model and its applicability in typical domains like image processing. Three simple image processing algorithms were implemented for execution on the GPU with OpenACC. The results were compared with their sequential counterparts and are briefly discussed.
Foster-Jorgensen, Karen; Harrington, Angela
This handbook is designed to assist childcare executive officers (CEOs) in managing the finances of their programs. The guide is divided into five sections. Section 1, "Financial Entrepreneurship," advocates the adoption of an entrepreneurial spirit in directors and recommends: (1) becoming the Chief Executive Officer of the program; (2) actively…
Friborg, Rune Møllegaard
and can usually benefit performance-wise from both multiprocessing, cluster and grid environments. PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency...... on multi-processing and cluster computing using PyCSP. Additionally, McStas is demonstrated to utilise grid computing resources using PyCSP. Finally, this thesis presents a new dynamic channel model, which has not yet been implemented for PyCSP. The dynamic channel is able to change the internal...... synchronisation mechanisms on-the-fly, depending on the location and number of channel-ends connected. Thus it may start out as a simple local pipe and evolve into a distributed channel spanning multiple nodes. This channel is a necessary next step for PyCSP to allow for complete freedom in executing CSP...
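The CSP channel abstraction the thesis builds on can be illustrated with a minimal Python sketch. The `Channel` class and its `read`/`write` methods below are illustrative stand-ins, not PyCSP's actual API:

```python
import threading
import queue

class Channel:
    """A minimal CSP-style channel: a blocking hand-off between processes.

    Illustrative sketch only, not PyCSP's actual API.
    """
    def __init__(self):
        self._q = queue.Queue(maxsize=1)

    def write(self, value):
        self._q.put(value)   # blocks while the one-slot buffer is full

    def read(self):
        return self._q.get()

def producer(chan, items):
    for item in items:
        chan.write(item)
    chan.write(None)         # poison pill signalling end of stream

def consumer(chan, out):
    while (item := chan.read()) is not None:
        out.append(item)

chan = Channel()
results = []
threads = [threading.Thread(target=producer, args=(chan, [1, 2, 3])),
           threading.Thread(target=consumer, args=(chan, results))]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [1, 2, 3]
```

The dynamic channel described in the thesis would swap the internal synchronisation mechanism (here a local queue) for a distributed one without changing this read/write interface.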
Humphrey, Gary [Fugro Geoconsulting Inc., Houston, TX (United States)
The objective of this project (and report) is to produce a guide to developing scientific, operational, and logistical plans for a future methane hydrate-focused offshore pressure coring program. This report focuses primarily on a potential coring program in the Walker Ridge 313 and Green Canyon 955 blocks, where previous investigations were undertaken as part of the 2009 Department of Energy JIP Leg II expedition; however, the approach to designing a pressure coring program that was utilized for this project may also serve as a useful model for planning pressure coring programs for hydrates in other areas. The initial portion of the report provides a brief overview of prior investigations related to gas hydrates in general and at the Walker Ridge 313 and Green Canyon 955 blocks in particular. The main content of the report provides guidance on the various criteria that will come into play when designing a pressure coring program.
Peter Backlund; Anthony Janetos; David Schimel; J. Hatfield; M. Ryan; S. Archer; D. Lettenmaier
This report is an assessment of the effects of climate change on U.S. land resources, water resources, agriculture, and biodiversity. It is one of a series of 21 Synthesis and Assessment Products being produced under the auspices of the U.S. Climate Change Science Program (CCSP), which coordinates the climate change research activities of U.S. government agencies. The...
Fournier, D.F. Jr.
The Model Low Energy Installation Program is a demonstration of an installation-wide, comprehensive energy conservation program that meets the Department of Defense (DoD) energy management goals of reducing energy usage and costs by at least 20%. It employs the required strategies for meeting these goals, quantifies the environmental compliance benefits resulting from energy conservation, and serves as a prototype for DoD-wide application. This project will develop both analysis tools and implementation procedures as well as demonstrate the effectiveness of a comprehensive, coordinated energy conservation program based on state-of-the-art technologies. A military installation is, in reality, a small- to medium-sized city. It generally has a complete utilities infrastructure including water supply and distribution, sewage collection and treatment, electrical supply and distribution, central heating and cooling plants with thermal distribution, and a natural gas distribution system. These utilities are quite extensive and actually consume about 10-15% of the energy on the facility, not counting the energy going into the central plants.
This is Volume 3 of three volumes of documentation of the International Nuclear Model (INM). This volume presents the Program Description of the International Nuclear Model, which was developed for the Nuclear and Alternate Fuels Division (NAFD), Office of Coal, Nuclear, Electric and Alternate Fuels, Energy Information Administration (EIA), US Department of Energy (DOE). The International Nuclear Model (INM) is a comprehensive model of the commercial nuclear power industry. It simulates economic decisions for reactor deployment and fuel management based on an input set of technical, economic, and scenario parameters. The technical parameters include reactor operating characteristics, fuel cycle timing and mass loss factors, and enrichment tails assays. Economic parameters include fuel cycle costs, financial data, and tax alternatives. INM has a broad range of scenario options covering, for example, process constraints, interregional activities, reprocessing, and fuel management selection. INM reports reactor deployment schedules, electricity generation, and fuel cycle requirements and costs. It also has specialized reports for extended burnup and permanent disposal. Companion volumes to Volume 3 are: Volume 1 - Model Overview, and Volume 2 - Data Base Relationships.
This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICCs and CVaR constraints), material on the Sharpe ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
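One of the risk functions named above, CVaR, admits a very small worked example in the discrete, equally-likely-scenario case. The loss figures below are made up for illustration:

```python
def cvar(losses, alpha):
    """Conditional Value-at-Risk: mean of the worst (1 - alpha) share of losses.

    Illustrative discrete version assuming equally likely loss scenarios.
    """
    ordered = sorted(losses, reverse=True)              # worst outcomes first
    k = max(1, int(round(len(ordered) * (1 - alpha))))  # number of tail scenarios
    return sum(ordered[:k]) / k

losses = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]  # equally likely loss scenarios
print(cvar(losses, alpha=0.8))  # mean of the worst 20%: (10 + 9) / 2 = 9.5
```

A CVaR constraint in an SLP model would bound this quantity over scenario-dependent losses rather than a fixed list.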
Rasmussen, Kourosh Marjani; Madsen, Claus A.; Poulsen, Rolf
The Danish mortgage market is large and sophisticated. However, most Danish mortgage banks advise private home-owners based on simple, if sensible, rules of thumb. In recent years a number of papers (from Nielsen and Poulsen in J Econ Dyn Control 28:1267–1289, 2004 over Rasmussen and Zenios in J Risk 10:1–18, 2007 to Pedersen et al. in Ann Oper Res, 2013) have suggested a model-based, stochastic programming approach to mortgage choice. This paper gives an empirical comparison of performance over the period 2000–2010 of the rules of thumb to the model-based strategies. While the rules of thumb … 0.3–0.9 %-points (depending on the borrower’s level of conservatism) compared to the rules of thumb without increasing the risk. The answer to the question in the title is thus affirmative.
Jing, L.; Stephansson, O. [Royal Inst. of Technology, Stockholm (Sweden). Engineering Geology; Tsang, C.F. [Lawrence Berkely National Laboratory, Berkeley, CA (United States). Earth Science Div.; Mayor, J.C. [ENRESA, Madrid (Spain); Kautzky, F. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden)] (eds.)
DECOVALEX is an international consortium of governmental agencies associated with the disposal of high-level nuclear waste in a number of countries. The consortium's mission is the DEvelopment of COupled models and their VALidation against EXperiments. Hence the acronym/name DECOVALEX. Currently, agencies from Canada, Finland, France, Germany, Japan, Spain, Switzerland, Sweden, United Kingdom, and the United States are in DECOVALEX. Emplacement of nuclear waste in a repository in geologic media causes a number of physical processes to be intensified in the surrounding rock mass due to the decay heat from the waste. The four main processes of concern are thermal, hydrological, mechanical and chemical. Interactions or coupling between these heat-driven processes must be taken into account in modeling the performance of the repository for such modeling to be meaningful and reliable. DECOVALEX III is organized around four tasks. The FEBEX (Full-scale Engineered Barriers EXperiment) in situ experiment being conducted at the Grimsel site in Switzerland is to be simulated and analyzed in Task 1. Task 2, centered around the Drift Scale Test (DST) at Yucca Mountain in Nevada, USA, has several sub-tasks (Task 2A, Task 2B, Task 2C and Task 2D) to investigate a number of the coupled processes in the DST. Task 3 studies three benchmark problems: a) the effects of thermal-hydrologic-mechanical (THM) coupling on the performance of the near-field of a nuclear waste repository (BMT1); b) the effect of upscaling THM processes on the results of performance assessment (BMT2); and c) the effect of glaciation on rock mass behavior (BMT3). Task 4 is on the direct application of THM coupled process modeling in the performance assessment of nuclear waste repositories in geologic media. This executive summary presents the motivation, structure, objectives, approaches, and the highlights of the main achievements and outstanding issues of the tasks studied in the DECOVALEX III project
Pittman, W. D.
The ADAMS Executive and Operating System is described: a multitasking environment under which a variety of data reduction, display, and utility programs are executed, providing a high level of isolation between programs and allowing them to be developed and modified independently. The Airborne Data Analysis/Monitor System (ADAMS) was developed to provide a real-time data monitoring and analysis capability onboard Boeing commercial airplanes during flight testing. It inputs sensor data from an airplane, derives performance data by applying transforms to the collected sensor data, and presents this data to test personnel via various display media. Current utilization and future development are addressed.
Domingos, Roberto P.; Schirru, Roberto; Martinez, Aquilino S.
This work presents the Genetic Programming paradigm and a nuclear application. A field of Artificial Intelligence based on the concepts of species evolution and natural selection, Genetic Programming can be understood as a self-programming process in which the computer is the main agent responsible for discovering a program able to solve a given problem. In the present case, the problem was to find a mathematical expression, in symbolic form, expressing the relation between the equivalent ratio of a fuel cell, the enrichment of the fuel elements, and the multiplication factor. Such an expression would avoid the repeated execution of reactor physics codes during core optimization. The results were compared with those obtained by different techniques such as Neural Networks and Linear Multiple Regression. Genetic Programming has been shown to perform as well as, and in some respects better than, Neural Networks and Linear Multiple Regression. (author). 10 refs., 8 figs., 1 tab.
Matzke, Orville R.
The purpose of this study was to formulate a linear programming model to simulate a foundation type support program and to apply this model to a state support program for the public elementary and secondary school districts in the State of Iowa. The model was successful in producing optimal solutions to five objective functions proposed for…
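A foundation-type support program of the kind simulated above guarantees each district a per-pupil spending level and has the state fund the gap left after a required local tax effort. A minimal sketch with hypothetical figures (the function name and parameter values are illustrative, not taken from the study):

```python
def foundation_aid(pupils, foundation_per_pupil, assessed_valuation, required_levy):
    """State aid under a foundation program: guaranteed cost minus local share.

    All figures are hypothetical; real programs add pupil weights and adjustments.
    """
    guaranteed = pupils * foundation_per_pupil          # total foundation cost
    local_share = assessed_valuation * required_levy    # required local tax effort
    return max(0.0, guaranteed - local_share)           # the state fills the gap

# A district of 1,000 pupils, $6,000 foundation level, $200M valuation, 2% levy:
print(foundation_aid(1000, 6000.0, 200_000_000.0, 0.02))  # 6,000,000 - 4,000,000 = 2,000,000.0
```

A linear programming model would optimize such aid allocations across many districts subject to a statewide budget constraint, which is where the objective functions mentioned in the abstract come in.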
The State-of-the-Art Report on Multi-scale Modelling of Nuclear Fuels describes the state of fundamental materials models that can represent fuel behaviour, the methodologies for obtaining material properties, and modelling principles as they can be implemented in fuel performance codes. This report, while far from being a detailed assessment of nuclear fuel modelling, provides a recognition of the approaches to the significant aspects of fuel modelling and examples of their application. Fuel behaviour phenomena are discussed that are applicable across the spectrum of fuel forms, from conventional LWR oxide pellets to MOX, carbide, and metal SFR fuel, to coated particle fuel for gas-cooled reactors. A key issue is microstructural evolution during burn-up, and the state of understanding of that phenomenon is considered at length. Covered in the discussions are the basic material properties of heat capacity, free energy, and thermal conductivity and diffusion. Also included are the more functional effects of restructuring, bubble formation, constituent redistribution, fuel and clad oxidation, and fuel clad and environmental interactions. Fuel fabrication is considered as are many material modelling challenges, such as representing injection casting of metallic fuels, as seen in the preparation of nuclear fuel. The last set of contributions covered the basic principles for modelling phenomena and determining fundamental materials properties, a look at the state of fuel performance codes and a last note about integrating across multiple scales. The state-of-the-art of modelling phenomena related to nuclear fuel has advanced significantly in recent years. The representation of atomic level behaviour is increasingly becoming more accurate as capabilities to utilise larger sets of atoms evolve and empirical potentials improve. At the mesoscale, models for transport and microstructure evolution have also progressed with new techniques that well represent restructuring. At
Owings, Angie; Graves, JoBeth; Johnson, Sherry; Gilliam, Craig; Gipson, Mike; Hakim, Hana
To prevent central line-associated bloodstream infections (CLABSIs), leadership line care rounds (LLCRs) used the engage, educate, execute, and evaluate improvement model to audit compliance, identify barriers and opportunities, empower patients and families, and engage leadership. Findings of excellence and improvement opportunities were communicated to unit staff and managers. LLCRs contributed to compliance with CLABSI prevention interventions. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Yoshimura, H.R.; Attaway, S.W.; Bronowski, D.R.; Uncapher, W.L.; Huerta, M.; Abbott, D.G.
This paper describes the drop testing of a one-third scale model transport cask system. Two casks were supplied by Transnuclear, Inc. (TN) to demonstrate dual-purpose shipping/storage casks. These casks will be used to ship spent fuel from DOE's West Valley demonstration project in New York to the Idaho National Engineering Laboratory (INEL) for a long-term spent fuel dry storage demonstration. As part of the certification process, one-third scale model tests were performed to obtain experimental data. Two 9-m (30-ft) drop tests were conducted on a mass model of the cask body and scaled balsa- and redwood-filled impact limiters. In the first test, the cask system was tested in an end-on configuration. In the second test, the system was tested in a slap-down configuration, where the axis of the cask was oriented at a 10 degree angle with the horizontal. Slap-down occurs for shallow-angle drops where the primary impact at one end of the cask is followed by a secondary impact at the other end. The objectives of the testing program were to (1) obtain deceleration and displacement information for the cask and impact limiter system, (2) obtain dynamic force-displacement data for the impact limiters, (3) verify the integrity of the impact limiter retention system, and (4) examine the crush behavior of the limiters. This paper describes both test results in terms of measured deceleration, post-test deformation measurements, and the general structural response of the system.
Amer, Abdelhalim; Maruyama, Naoya; Pericà s, Miquel; Taura, Kenjiro; Yokota, Rio; Matsuoka, Satoshi
Extracting maximum performance of multi-core architectures is a difficult task primarily due to bandwidth limitations of the memory subsystem and its complex hierarchy. In this work, we study the implications of fork-join and data-driven execution
Verburgh, Lot; Scherder, Erik J. A.; van Lange, Paul A.M.; Oosterlaan, Jaap
Executive functions might be important for successful performance in sports, particularly in team sports requiring quick anticipation and adaptation to continuously changing situations in the field. The executive functions motor inhibition, attention and visuospatial working memory were examined in highly talented soccer players. Eighty-four highly talented youth soccer players (mean age 11.9) and forty-two age-matched amateur soccer players (mean age 11.8) in the age range 8 to 16 years performed a Stop Signal task (motor inhibition), the Attention Network Test (alerting, orienting, and executive attention) and a visuospatial working memory task. The highly talented soccer players followed the talent development program of the youth academy of a professional soccer club and played at the highest national soccer competition for their age. The amateur soccer players played at a regular soccer club in the same geographical region as the highly talented soccer players and played in a regular regional soccer competition. Group differences were tested using analyses of variance. The highly talented group showed superior motor inhibition as measured by stop signal reaction time (SSRT) on the Stop Signal task and a larger alerting effect on the Attention Network Test, indicating an enhanced ability to attain and maintain an alert state. No group differences were found for orienting and executive attention and visuospatial working memory. A logistic regression model with group (highly talented or amateur) as dependent variable and executive function measures that significantly distinguished between groups as predictors showed that these measures differentiated highly talented soccer players from amateur soccer players with 89% accuracy. Highly talented youth soccer players outperform youth amateur players on suppressing ongoing motor responses and on the ability to attain and maintain an alert state; both may be essential for success in soccer. PMID:24632735
The staff of the Nuclear Regulatory Commission is performing nuclear power plant design certification reviews based on a design process plan that describes the human factors engineering (HFE) program elements that are necessary and sufficient to develop an acceptable detailed design specification and an acceptable implemented design. There are two principal reasons for this approach. First, the initial design certification applications submitted for staff review did not include detailed design information. Second, since human performance literature and industry experiences have shown that many significant human factors issues arise early in the design process, review of the design process activities and results is important to the evaluation of an overall design. However, current regulations and guidance documents do not address the criteria for design process review. Therefore, the HFE Program Review Model (HFE PRM) was developed as a basis for performing design certification reviews that include design process evaluations as well as review of the final design. A central tenet of the HFE PRM is that the HFE aspects of the plant should be developed, designed, and evaluated on the basis of a structured top-down system analysis using accepted HFE principles. The HFE PRM consists of ten component elements. Each element is divided into four sections: Background, Objective, Applicant Submittals, and Review Criteria. This report describes the development of the HFE PRM and gives a detailed description of each HFE review element
Broten, Thomas A.; Brown, David A.
Increased operational autonomy and reduced operating costs have become critical design objectives in next-generation NASA and DoD space programs. The objective is to develop a semi-automated system for intelligent spacecraft operations support. The Spacecraft Operations and Anomaly Resolution System (SOARS) is presented as a standardized, model-based architecture for performing High-Level Tasking, Status Monitoring and automated Procedure Execution Control for a variety of spacecraft. The particular focus is on the Procedure Execution Control module. A hierarchical procedure network is proposed as the fundamental means for specifying and representing arbitrary operational procedures. A separate procedure interpreter controls automatic execution of the procedure, taking into account the current status of the spacecraft as maintained in an object-oriented spacecraft model.
Winkler, S. M.; Affenzeller, M.; Wagner, S.
The use of genetic programming (GP) in nonlinear system identification enables the automated search for mathematical models that are evolved by an evolutionary process using the principles of selection, crossover and mutation. Due to the stochastic element that is intrinsic to any evolutionary process, GP cannot guarantee the generation of similar or even equal models in each GP process execution; still, if there is a physical model underlying to the data that are analyzed, then GP is expected to find these structures and produce somehow similar results. In this paper we define a function for measuring the syntactic similarity of mathematical models represented as structure trees; using this similarity function we compare the results produced by GP techniques for a data set representing measurement data of a BMW Diesel engine.
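A syntactic similarity of structure trees along the lines described above can be sketched as positional shared-node counting. This is an illustrative measure, not the function defined in the paper:

```python
def tree_size(node):
    """Number of nodes in a (label, children) expression tree."""
    label, children = node
    return 1 + sum(tree_size(c) for c in children)

def shared(a, b):
    """Count positionally matching nodes with equal labels, top-down."""
    (la, ca), (lb, cb) = a, b
    if la != lb:
        return 0
    return 1 + sum(shared(x, y) for x, y in zip(ca, cb))

def similarity(a, b):
    """Shared nodes normalized by the larger tree; 1.0 means identical trees."""
    return shared(a, b) / max(tree_size(a), tree_size(b))

# (x + y) * x   versus   (x + y) * 2, as (label, [children]) tuples:
t1 = ("*", [("+", [("x", []), ("y", [])]), ("x", [])])
t2 = ("*", [("+", [("x", []), ("y", [])]), ("2", [])])
print(similarity(t1, t2))  # 4 of 5 nodes match -> 0.8
```

Applied to the models evolved by independent GP runs, such a measure quantifies how consistently the runs recover the same underlying structure.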
A Spacecraft Position Optimal Tracking (SPOT) program was developed to process Global Positioning System (GPS) data, sent via telemetry from a spacecraft, to generate accurate navigation estimates of the vehicle position and velocity (state vector) using a Kalman filter. This program uses the GPS onboard receiver measurements to sequentially calculate the vehicle state vectors and provide this information to ground flight controllers. It is the first real-time ground-based shuttle navigation application using onboard sensors. The program is compact, portable, self-contained, and can run on a variety of UNIX or Linux computers. The program has a modular object-oriented design that supports application-specific plugins such as data corruption remediation pre-processing and remote graphics display. The Kalman filter is extensible to additional sensor types or force models. The Kalman filter design is also robust against data dropouts because it uses physical models for state and covariance propagation in the absence of data. The design of this program separates the functionalities of SPOT into six different executable processes. This allows the individual processes to be connected in an a la carte manner, making the feature set and executable complexity of SPOT adaptable to the needs of the user. Also, these processes need not be executed on the same workstation, which allows communication between SPOT processes running on different machines within the same Local Area Network (LAN). Thus, SPOT can be executed in a distributed sense, with the capability for a team of flight controllers to efficiently share the same trajectory information currently being computed by the program. SPOT is used in the Mission Control Center (MCC) for Space Shuttle Program (SSP) and International Space Station Program (ISSP) operations, and can also be used as a post-flight analysis tool. It is primarily used for situational awareness, and for contingency situations.
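The predict/update cycle that lets such a filter ride through data dropouts can be shown with a minimal scalar Kalman filter. The noise values and the static "physical model" below are placeholders, not SPOT's actual force models:

```python
# Minimal scalar Kalman filter: estimate a position from noisy measurements.
# A sketch of the predict/update cycle, not SPOT's actual implementation.

def predict(x, p, q):
    """Propagate state x and variance p; process noise q grows uncertainty."""
    return x, p + q          # static physical model: position unchanged

def update(x, p, z, r):
    """Blend prediction with measurement z (variance r) via the Kalman gain."""
    k = p / (p + r)          # gain: how much to trust the measurement
    return x + k * (z - x), (1 - k) * p

x, p = 0.0, 1.0              # initial estimate and its variance
for z in [1.2, 0.8, 1.1, None, 0.9]:   # None models a telemetry dropout
    x, p = predict(x, p, q=0.01)
    if z is not None:        # with no data, the filter just propagates
        x, p = update(x, p, z, r=0.25)
print(round(x, 2))
```

The dropout step is the key property called out above: with no measurement, the state estimate is carried forward by the physical model while its variance grows, so the next measurement is weighted accordingly.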
Shipman, D. L.
The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: types of messages, destinations, delivery durations, processing types, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.
Harrold, Marc W.
This paper describes and evaluates the use of a molecular modeling computer program (Alchemy II) in a pharmaceutical education program. Provided are the hardware requirements and basic program features as well as several examples of how this program and its features have been applied in the classroom. (GLR)
How Do Executive Functions Fit with the Cattell-Horn-Carroll Model? Some Evidence from a Joint Factor Analysis of the Delis-Kaplan Executive Function System and the Woodcock-Johnson III Tests of Cognitive Abilities
Floyd, Randy G.; Bergeron, Renee; Hamilton, Gloria; Parra, Gilbert R.
This study investigated the relations among executive functions and cognitive abilities through a joint exploratory factor analysis and joint confirmatory factor analysis of 25 test scores from the Delis-Kaplan Executive Function System and the Woodcock-Johnson III Tests of Cognitive Abilities. Participants were 100 children and adolescents…
Yusuf Adam Hilman
Full Text Available This study examines community empowerment as a response to poverty, made more interesting by its unusual target group: the community of "Janda" (widows). The research focuses on a community-based empowerment model in the "Janda" village, and its purpose is to measure the effectiveness and the ideal form of a community empowerment program model in Dadapan Village, Balong District, Ponorogo Regency. The research method is a qualitative descriptive approach, with community members, especially women with widow status in Dadapan Village, Balong District, Ponorogo Regency, as its object. Activities include (1) training in processed foods drawing on the existing agricultural potential, (2) building a kitchen granary on land around the community, and (3) training the widows in arts activities. The study concludes that the community empowerment activities carried out in Dadapan Village, Balong District, Ponorogo Regency are strongly focused on the widows, so the activities are expected to contribute to the lives of those in distress and to increase family independence; beyond the economic benefit, the participants are psychologically motivated to become empowered individuals.
Wallace, Steven O
The U.S. Executive Branch should change its fundamental structure at the operational level to achieve integrated planning and regional unity of effort through unified regional executors to synchronize...
Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and linear programming problems are closely related subjects, since any computing method devised for ...
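The relationship can be made concrete with the standard reduction of a two-player zero-sum game to a linear program; the matching-pennies payoff matrix and the use of SciPy's `linprog` below are illustrative choices, not taken from the paper.

```python
# The row player's optimal mixed strategy in a zero-sum game as an LP.
from scipy.optimize import linprog
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])   # row player's payoffs (matching pennies)

m, n = A.shape
# Variables: strategy x (m entries) and game value v. Maximize v, i.e.
# minimize -v, subject to (A^T x)_j >= v for every column j, sum(x) = 1.
c = np.zeros(m + 1)
c[-1] = -1.0
A_ub = np.hstack([-A.T, np.ones((n, 1))])   # v - (A^T x)_j <= 0
b_ub = np.zeros(n)
A_eq = np.append(np.ones(m), 0.0).reshape(1, -1)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * m + [(None, None)])
print(res.x[:m], res.x[-1])   # strategy ~[0.5, 0.5], game value ~0
```

By LP duality, the column player's optimal strategy solves the dual program, which is the formal content of the minimax theorem.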
Jing, L.; Stephansson, O.; Tsang, C.F.; Kautsky, F.
This executive summary presents the motivation, structure, objectives, methodologies and results of the first stage of the international DECOVALEX project - DECOVALEX I (1992-1995). The acronym stands for Development of Coupled Models and their Validation against Experiment in Nuclear Waste Isolation, and the project is an international effort to develop mathematical models, numerical methods and computer codes for coupled thermo-hydro-mechanical processes in fractured rocks and buffer materials for geological isolation of spent nuclear fuel and other radioactive wastes, and validate them against laboratory and field experiments. 24 refs
Wolt, Jeffrey D.
Presents a computer program for use in teaching ion speciation in soil solutions. Provides information on the structure of the program, execution, and software specifications. The program estimates concentrations of ion pairs, hydrolytic species, metal-organic complexes, and free ions in solutions. (Author/RT)
Kim, Yun Goo; Seong, Poong Hyun
The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays procedures on the computer screen in the form of a flow chart, and displays plant operating information along with procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error could lead to operator misjudgement and inadequate control. In this paper we present a model of the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri nets when both are applied to an Emergency Operating Computerized Procedure. A program that converts a Computerized Procedure (CP) to an STPN has also been developed. The formal verification and validation of CPs with STPN increases the safety of a nuclear power plant and provides the digital quality assurance means that are needed as the role and function of the CPS grow.
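The Petri-net firing rule on which such verification builds can be sketched in a few lines. This is an illustrative toy, not the paper's STPN formalism, and the place names are invented.

```python
# Basic Petri-net semantics: a transition is enabled when every input
# place holds a token; firing consumes input tokens and produces outputs.

def enabled(marking, inputs):
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, inputs, outputs):
    m = dict(marking)              # markings are immutable snapshots
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# A procedure step: "check pump" -> "pump verified" (hypothetical names).
marking = {"check_pump": 1, "pump_verified": 0}
t_inputs, t_outputs = ["check_pump"], ["pump_verified"]
assert enabled(marking, t_inputs)
marking = fire(marking, t_inputs, t_outputs)
print(marking)   # {'check_pump': 0, 'pump_verified': 1}
```

Formal verification then amounts to exploring all reachable markings and checking that no unsafe marking (for example, a skipped verification step) can ever be reached.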
Wu, Juan; Ge, Xueqian
Linear programming is an important branch of operational research and a mathematical method that helps people carry out scientific management. GAMS is an advanced simulation and optimization modeling language that combines many kinds of complex mathematical programming, such as linear programming (LP), nonlinear programming (NLP) and mixed-integer programming (MIP), with system simulation. In this paper, based on a linear programming model, the optimized investment decision-making for generation is simulated and analyzed. Finally, the optimal installed capacity of power plants and the final total cost are obtained, which provides a rational decision-making basis for optimized investments.
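A toy instance of such a generation-investment LP can be sketched with SciPy rather than GAMS; every capacity limit and cost figure below is invented for illustration.

```python
# Choose installed capacity of two plant types to meet demand at
# minimum total cost (hypothetical numbers throughout).
from scipy.optimize import linprog

cost = [120.0, 80.0]    # $/MW for [coal, gas] capacity (invented)
demand = 500.0          # MW of peak demand to cover (invented)

# Minimize cost subject to cap_coal + cap_gas >= demand, with a
# hypothetical 300 MW siting limit on gas-fired capacity.
res = linprog(c=cost,
              A_ub=[[-1.0, -1.0]], b_ub=[-demand],
              bounds=[(0, None), (0, 300.0)])
print(res.x, res.fun)   # build 300 MW gas, 200 MW coal
```

The solver fills the cheap gas capacity to its limit and covers the remainder with coal, which is exactly the merit-order logic a full GAMS investment model generalizes across many plant types and years.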
Full Text Available Effective Cooperative Extension programs produce important private and public value for individuals, families, businesses, and communities. However, the public value of Extension programming often goes unmeasured and unarticulated. Extension needs to reclaim its role as a key provider of public value for Land-Grant Universities through strong educational programs driven by infusing public value into all elements of the Extension Program Development Model. This article describes Extension’s public value movement including organizational, professional, program, and scholarship development efforts to enhance public good effectiveness articulation. Lessons learned, implications, and next steps for Extension’s public value success through a strong program development model are also shared.
Since 2007, the OECD Nuclear Energy Agency has been organising a series of workshops on Structural Materials for Innovative Nuclear Systems. The third meeting was held on 7-9 October 2013 in Idaho Falls (United States). The main objective of this workshop is to stimulate an exchange of information on current materials R and D programmes for different innovative nuclear systems. The main topics of the workshop covered fundamental studies, modelling and experiments on innovative structural materials, including cladding materials, for the range of advanced nuclear systems such as thermal/fast systems, sub-critical systems and fusion systems. During the workshop, the following topics were discussed: - Fundamental studies; - Metallic materials; - Ceramic materials; - Novel materials pathways; - Ion vs neutron irradiation. Fundamental studies focused on the identification of mechanisms driving the response of materials under the conditions expected in innovative nuclear systems. These mechanisms may act at the atomic or higher scales; the application of multi-scale approaches, together with the related problems of scale-bridging and numerical methods, was of special interest. Moreover, irradiation experiments and subsequent characterisation of materials with analytical techniques were included in the session where they aimed at better understanding the acting mechanisms or at drawing physics-based correlations. The sessions on metal alloys, ceramics and ceramic composites covered in-core and out-of-core applications, taking into account: data availability and gaps (considering also licensing issues); experimental and modelling needs for specific components or degradation modes; the link between R and D, standardisation and experimental protocols; and coolant effects and mechanical properties. Code development and implementation plans were also discussed. Application of SiC composites to LWR systems was of interest as an advanced concept. Novel materials pathways considered
Kemp, Andrew H; López, Santiago Rodríguez; Passos, Valeria M A; Bittencourt, Marcio S; Dantas, Eduardo M; Mill, José G; Ribeiro, Antonio L P; Thayer, Julian F; Bensenor, Isabela M; Lotufo, Paulo A
Research has linked high-frequency heart rate variability (HF-HRV) to cognitive function. The present study adopts a modern path modelling approach to understand potential causal pathways that may underpin this relationship. Here we examine the association between resting-state HF-HRV and executive function in a large sample of civil servants from Brazil (N=8114) recruited for the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil). HF-HRV was calculated from 10-min resting-state electrocardiograms. Executive function was assessed using the trail-making test (version B). Insulin resistance (a marker of type 2 diabetes mellitus) and carotid intima-media thickness (subclinical atherosclerosis) mediated the relationship between HRV and executive function seriatim. A limitation of the present study is its cross-sectional design; therefore, conclusions must be confirmed in longitudinal studies. Nevertheless, the findings support the possibility that HRV provides a 'spark' that initiates a cascade of adverse downstream effects that subsequently leads to cognitive impairment. Copyright © 2016 Elsevier B.V. All rights reserved.
Selle, Benny; Muttil, Nitin
Genetic Programming is able to systematically explore many alternative model structures of different complexity from available input and response data. We hypothesised that Genetic Programming can be used to test the structure of hydrological models and to identify dominant processes in hydrological systems. To test this, Genetic Programming was used to analyse a data set from a lysimeter experiment in southeastern Australia. The lysimeter experiment was conducted to quantify the deep percolation response under surface irrigated pasture to different soil types, watertable depths and water ponding times during surface irrigation. Using Genetic Programming, a simple model of deep percolation was recurrently evolved in multiple Genetic Programming runs. This simple and interpretable model supported the dominant process contributing to deep percolation represented in a conceptual model that was published earlier. Thus, this study shows that Genetic Programming can be used to evaluate the structure of hydrological models and to gain insight about the dominant processes in hydrological systems.
Ivanov, A.P.; Sizova, T.B.; Mikhejkina, N.D.; Sankovskij, G.A.; Tyufyagin, A.N.
A brief description is given of software for automated model development: an integrating modular programming system, a program module generator and a program module library providing thermal-hydraulic calculation of process dynamics in power unit equipment components and on-line control system operation simulation. Technical recommendations for model development are based on experience in the creation of concrete models of NPP power units. 8 refs., 1 tab., 4 figs
Selle, B.; Muttil, N.
Genetic Programming is able to systematically explore many alternative model structures of different complexity from available input and response data. We hypothesised that genetic programming can be used to test the structure of hydrological models and to identify dominant processes in hydrological systems. To test this, genetic programming was used to analyse a data set from a lysimeter experiment in southeastern Australia. The lysimeter experiment was conducted to quantify the deep percolation response under surface irrigated pasture to different soil types, water table depths and water ponding times during surface irrigation. Using genetic programming, a simple model of deep percolation was consistently evolved in multiple model runs. This simple and interpretable model confirmed the dominant process contributing to deep percolation represented in a conceptual model that was published earlier. Thus, this study shows that genetic programming can be used to evaluate the structure of hydrological models and to gain insight about the dominant processes in hydrological systems.
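The core idea, searching over alternative model structures to fit response data, can be sketched in much-simplified form as a random search over expression trees. This is illustrative only: it is not the authors' GP configuration, and the target function stands in for their lysimeter data.

```python
# Toy structure search over expression trees (GP without crossover).
import random

random.seed(1)
OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def random_tree(depth=2):
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", round(random.uniform(0, 2), 2)])
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if not isinstance(tree, tuple):
        return tree                     # numeric constant leaf
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

# Invented response: y = 2x (stand-in for a percolation response curve).
data = [(x, 2.0 * x) for x in range(10)]

def error(tree):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data)

best = min((random_tree() for _ in range(5000)), key=error)
print(best, error(best))
```

When a simple structure such as `("+", "x", "x")` is repeatedly recovered across independent runs, that consistency is the kind of evidence the paper uses to argue a dominant process is real rather than an artefact of one run.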
Shehane, Ronald; Sherman, Steven
This study examines detailed usage of online training videos that were designed to address specific course problems that were encountered in an online computer programming course. The study presents the specifics of a programming course where training videos were used to provide students with a quick start path to learning a new programming…
To address the economy's growing reliance on international business, San Diego State University has recently introduced a program in international commerce. The program was developed by packaging coursework in three existing areas: business administration, language training, and area studies. Although still in its infancy, the international…
Tamaqua Area School District, PA.
This publication describes, in three sections, a high school Communication Arts Curriculum (CAC) program designed to further students' communication skills as they participate in student-centered learning activities in the fine arts, the practical arts, and the performing arts. "Program Operation" includes a course outline and inventories for…
Guthrie, Steven P.
In two articles on outdoor programming models, Watters distinguished four models on a continuum ranging from the common adventure model, with minimal organizational structure and leadership control, to the guide service model, in which leaders are autocratic and trips are highly structured. Club programs and instructional programs were in between,…
Hill, Phyllis J.
A newly appointed woman dean discusses the value of a management development program involving a process of self-analysis and self-determination of leadership style and effectiveness (the University of Illinois "Executive Leadership Seminar"). (JT)
represent program families with infinite integers as so-called (finite-state) featured symbolic automata. Specifically designed model checking algorithms are then employed to verify safety of all programs from a family at once and pinpoint those programs that are unsafe (respectively, safe). We present … a prototype tool implementing this approach, and we illustrate it with several examples…
This document provides the interface specification, including related data models such as the state model, activity description, and resource and activity information, of an execution service matching the needs of the EMI production middleware stack composed of ARC, gLite and UNICORE components. This service is therefore referred to as the EMI Execution Service (or "ES" for short). This document is a continuation of the work previously known as the GENEVA, then AGU ("ARC, gLite, UNICORE"), then PGI execution service. As a starting point, v0.42 of the "PGI Execution Service Specification" (doc15839) was used.
Full Text Available Executive function is traditionally conceptualized as a set of abilities required to guide behavior toward goals. Here, an integrated theoretical framework for executive function is developed which has its roots in the notion of hierarchical mental models. Further following Duncan (2010a,b), executive function is construed as a hierarchical recursive system of test-operate-test-exit units (Miller, Galanter, and Pribram, 1960). Importantly, it is shown that this framework can be used to model the main regional prefrontal syndromes, which are characterized by apathetic, disinhibited and dysexecutive cognition and behavior, respectively. Implications of these considerations for the neuropsychological assessment of executive function are discussed.
New Sunshine Program for Fiscal 2000. International cooperative project for developing photovoltaic power system practicalization technology (International Energy Agency (IEA)/Cooperative Program on Photovoltaic Power Systems (PVPS) implementing agreement - Executive committee meeting); 2000 nendo New sunshine keikaku. Taiyoko hatsuden system jitsuyoka gijustu kaihatsu kokusai kyoryoku jigyo (IEA taiyoko hatsuden system kenkyu kyoryoku program jisshi kyotei shikko iinkai)
Cooperative endeavors of research and development, verification, analysis, information exchange, introduction acceleration, etc., were exerted through participation in the above-said PVPS program. At the 15th PVPS executive committee meeting held in this fiscal year, reconsideration was made about the commencement of new tasks, change of OAs (operating agents), change of participating countries, etc., whose current state was not correctly reflected in the existing implementation agreement. At the 16th PVPS executive committee meeting, discussions were made and conclusions were reached that the next executive committee meeting decide whether to change the chairman, that deliberation be made in 2003 to decide whether to hold the 4th IEA/PVPS executive conference in Japan, that the assessment of each of the tasks be carried out in fiscal 2001, and that Task I conduct studies about market implementation for the fruits of the research-centered activities in the past to hit the market, etc. Workshop meetings were held, where Australia, France, Italy, and Japan reported their PVPS research, development, and popularization efforts. (NEDO)
Full Text Available Shared‐memory and data‐parallel programming models are two important paradigms for scientific applications. Both models provide high‐level program abstractions, and simple and uniform views of network structures. The common features of the two models significantly simplify program coding and debugging for scientific applications. However, the underlying execution and overhead patterns are significantly different between the two models due to their programming constraints, and due to the different and complex structures of the interconnection networks and systems which support the two models. We performed this experimental study to present implications and comparisons of execution patterns on two commercial architectures. We implemented a standard electromagnetic simulation program (EM) and a linear system solver using the shared‐memory model on the KSR‐1 and the data‐parallel model on the CM‐5. Our objectives are to examine the execution pattern changes required for an implementation transformation between the two models; to study memory access patterns; to address scalability issues; and to investigate relative costs and advantages/disadvantages of using the two models for scientific computations. Our results indicate that the EM program tends to become computation‐intensive in the KSR‐1 shared‐memory system, and memory‐demanding in the CM‐5 data‐parallel system, when the systems and the problems are scaled. The EM program, a highly data‐parallel program, performed extremely well, while the linear system solver, a highly control‐structured program, suffered significantly in the data‐parallel model on the CM‐5. Our study provides further evidence that matching execution patterns of algorithms to parallel architectures would achieve better performance.
Chaussenot, Rémi; Edeline, Jean-Marc; Le Bec, Benoit; El Massioui, Nicole; Laroche, Serge; Vaillend, Cyrille
Duchenne muscular dystrophy (DMD) is associated with language disabilities and deficits in learning and memory, leading to intellectual disability in a patient subpopulation. Recent studies suggest the presence of broader deficits affecting information processing, short-term memory and executive functions. While the absence of the full-length dystrophin (Dp427) is a common feature in all patients, variable mutation profiles may additionally alter distinct dystrophin-gene products encoded by separate promoters. However, the nature of the cognitive dysfunctions specifically associated with the loss of distinct brain dystrophins is unclear. Here we show that the loss of the full-length brain dystrophin in mdx mice does not modify the perception and sensorimotor gating of auditory inputs, as assessed using auditory brainstem recordings and prepulse inhibition of startle reflex. In contrast, both acquisition and long-term retention of cued and trace fear memories were impaired in mdx mice, suggesting alteration in a functional circuit including the amygdala. Spatial learning in the water maze revealed reduced path efficiency, suggesting a qualitative alteration in the learning strategy of mdx mice. However, spatial working memory performance and cognitive flexibility, challenged in various behavioral paradigms in water and radial-arm mazes, were unimpaired. The full-length brain dystrophin therefore appears to play a role during acquisition of associative learning as well as in general processes involved in memory consolidation, but no overt involvement in working memory and/or executive functions could be demonstrated in spatial learning tasks. Copyright © 2015 Elsevier Inc. All rights reserved.
...This notice announces the testing of the Advance Payment Model for certain accountable care organizations participating in the Medicare Shared Savings Program scheduled to begin in 2012, and provides information about the model and application process.
Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.
Executable Architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap and identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral efforts and puts them into a common context of state-of-the-art systems engineering methods supporting more agility.
Loyalty programs have exploded in popularity in recent decades. In the United States alone, membership has reached 1.3 billion (Ferguson and Hlavinka, 2007). In spite of their continued popularity, the effectiveness of these programs has been long debated in the literature, with mostly mixed results. Verhoef (2003) finds that the effects are positive but very small, DeWulf et al. (2001) finds no support for positive effects of direct mail, Shugan (2005) finds that firms gain short term revenu...
Executive skills are the cognitive abilities that make it possible for people to set goals, regulate impulses, and complete the steps necessary to achieve their objectives. Examples of these skills include time management, emotional control, and organization. Richard Guare and Peggy Dawson have developed a coaching strategy based on executive…
The objectives of the study are to determine: (1) the conditions for learning creative writing among high school students in Makassar, (2) the requirements of a learning model for creative writing, (3) the program planning and design of an ideal creative writing model, (4) the feasibility of a creative writing learning model based on neurolinguistic programming, and (5) the effectiveness of the creative writing learning model based on neurolinguistic programming. The method of this research uses research development of L...
This report describes the methodology for using a genetic programming model to develop tracking behaviors for autonomous, microscale robotic vehicles. The use of such vehicles for surveillance and detection operations has become increasingly important in defense and humanitarian applications. Through an evolutionary process similar to that found in nature, the genetic programming model generates a computer program that when downloaded onto a robotic vehicle's on-board computer will guide the robot to successfully accomplish its task. Simulations of multiple robots engaged in problem-solving tasks have demonstrated cooperative behaviors. This report also discusses the behavior model produced by genetic programming and presents some results achieved during the study
Sheldon, Daniel S.; And Others
Provides geology departments and science educators with a leadership model for developing earth science inservice programs. Model emphasizes cooperation/coordination among departments, science educators, and curriculum specialists at local/intermediate/state levels. Includes rationale for inservice programs and geology department involvement in…
A description is presented of the SNOP program, developed at JINR and intended for numerical modeling of beam dynamics in cyclotron-type accelerating facilities. The main methods of working with the program components are given, along with the stages of numerical modeling of a cyclotron and the analysis of the main characteristics of the accelerated bunch by means of SNOP. Some algorithms and procedures used in the program are explained.
Full Text Available The model presented in this paper is based on the model developed by Billionnet for the hierarchical workforce problem. In Billionnet's model, workers' weekly working hours are not taken into consideration when determining their weekly costs. In our model, the weekly costs per worker are reduced in proportion to the working hours per week. Our model is illustrated using Billionnet's example. The models in question are compared and evaluated on the basis of the results obtained from the example problem. A reduction in the total cost is achieved by the proposed model.
Huizing, C.; Kuiper, R.; Luijten, C.A.A.M.; Vandalon, V.; Helfert, M.; Martins, M.J.; Cordeiro, J.
We provide an explicit, consistent execution model for OO programs, specifically Java, together with a tool that visualizes the model. This equips the student with a model to think and communicate about OO programs. Especially for an e-learning situation this is significant. Firstly, such a model
Vieira, Ágata; Melo, Cristina; Machado, Jorge; Gabriel, Joaquim
To analyse the effect of a six-month home-based phase III cardiac rehabilitation (CR) specific exercise program, performed in a virtual reality (Kinect) or conventional (booklet) environment, on executive function, quality of life and depression, anxiety and stress of subjects with coronary artery disease. A randomized controlled trial was conducted with subjects, who had completed phase II, randomly assigned to intervention group 1 (IG1), whose program encompassed the use of Kinect (n = 11); intervention group 2 (IG2), a paper booklet (n = 11); or a control group (CG), subjected only to the usual care (n = 11). The three groups received education on cardiovascular risk factors. The parameters assessed at baseline (M0), 3 months (M1) and 6 months (M2) were: executive function, i.e. the control and integration needed to implement adequate goal-directed behaviour, specifically the ability to switch information (Trail Making Test), working memory (Verbal Digit Span test), and selective attention and conflict resolution ability (Stroop test); quality of life (MacNew questionnaire); and depression, anxiety and stress (Depression, Anxiety and Stress Scale 21). Descriptive and inferential statistical measures were used; the significance level was set at .05. The IG1 revealed significant improvements in selective attention and conflict resolution ability in comparison with the CG in the difference M0 - M2 (p = .021) and in comparison with the IG2 in the differences M1 - M2 and M0 - M2 (p = .001 and p = .002, respectively). No significant differences were found in quality of life, or in depression, anxiety and stress. The virtual reality format improved selective attention and conflict resolution ability, revealing the potential of CR, specifically with virtual reality exercise, for executive function. Implications for Rehabilitation In cardiac rehabilitation, especially in phase III, it is
specified at each step. Since the user controls the interaction, the user may determine the order in which information flows into PMB. Information is received...until only ten years ago the term "automatic programming" referred to the development of the assemblers, macro expanders, and compilers for these
Hawkes, C.; Lee, M.
The computer code COMFORT, developed for the online control of machine functions at the SLC, has recently undergone several modifications to overcome some of its limitations. This note describes the reasons for these changes, the methods employed, some test results and the applications of the new version of the program
workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration. When application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.
Ivilina Popova; Joseph G. Haubrich
We use a version of the Grossman and Hart principal-agent model with 10 actions and 10 states to produce quantitative predictions for executive compensation. Performance incentives derived from the model are compared with the performance incentives of 350 firms chosen from a survey by Michael Jensen and Kevin Murphy. The results suggest both that the model does a reasonable job of explaining the data and that actual incentives are close to the optimal incentives predicted by theory.
Bender, Alex C; Austin, Andrea M; Grodstein, Francine; Bynum, Julie P W
We examined the relationship between health care expenditures and cognition, focusing on differences across cognitive systems defined by global cognition, executive function, or episodic memory. We used linear regression models to compare annual health expenditures by cognitive status in 8125 Nurses' Health Study participants who completed a cognitive battery and were enrolled in Medicare parts A and B. Adjusting for demographics and comorbidity, executive impairment was associated with higher total annual expenditures of $1488 per person; no comparable association with episodic memory impairment was found. Expenditures exhibited a linear relationship with executive function, but not episodic memory ($584 higher for every 1 standard deviation decrement in executive function; P < .01). Impairment in executive function is specifically and linearly associated with higher health care expenditures. Focusing on management strategies that address early losses in executive function may be effective in reducing costly services. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
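The reported linear relationship can be illustrated with a minimal least-squares sketch on synthetic data. The slope is borrowed from the abstract's effect size, but the data and the unadjusted two-column design are invented, unlike the study's covariate-adjusted models.

```python
# Toy regression: annual expenditure on an executive-function z-score.
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=200)                  # executive-function z-scores
# Synthetic spending: $584 higher per 1 SD decrement, plus noise.
spend = 5000.0 - 584.0 * z + rng.normal(scale=300.0, size=200)

X = np.column_stack([np.ones_like(z), z])  # intercept + slope design
beta, *_ = np.linalg.lstsq(X, spend, rcond=None)
print(beta)   # slope near -584: lower executive function, higher spending
```

The study's actual models additionally adjust for demographics and comorbidity, which amounts to appending those covariates as further columns of the design matrix.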
Cardoso, Goncalo; Stadler, Michael; Siddiqui, Afzal; Marnay, Chris; DeForest, Nicholas; Barbosa-Povoa, Ana; Ferrao, Paulo
This paper describes the introduction of stochastic linear programming into Operations DER-CAM, a tool used to obtain optimal operating schedules for a given microgrid under local economic and environmental conditions. This application follows previous work on optimal scheduling of a lithium-iron-phosphate battery given the output uncertainty of a 1 MW molten carbonate fuel cell. Both are in the Santa Rita Jail microgrid, located in Dublin, California. This fuel cell has proven unreliable, partially justifying the consideration of storage options. Several stochastic DER-CAM runs are executed to compare different scenarios to values obtained by a deterministic approach. Results indicate that using a stochastic approach provides a conservative yet more lucrative battery schedule. The result is lower expected energy bills, with potential savings exceeding 6 percent given fuel cell outages.
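The core idea of the scenario-based stochastic formulation can be sketched as a small linear program: a battery discharge level is committed before the fuel-cell outcome is known, and grid purchases absorb the shortfall in each scenario. This is a minimal illustration only; the demand, scenario probabilities, prices, and battery limit below are invented, and the actual DER-CAM model is far richer.

```python
# Two-scenario stochastic LP sketch (assumed numbers, not DER-CAM data):
# commit battery discharge b now, buy grid power g_s per scenario later.
from scipy.optimize import linprog

demand = 10.0
scenarios = [(0.7, 8.0),   # (probability, fuel-cell output): normal operation
             (0.3, 0.0)]   # fuel-cell outage

battery_cost, grid_price = 0.4, 1.0
# Variables: [b, g1, g2]; minimize battery cost + expected grid cost.
c = [battery_cost] + [p * grid_price for p, _ in scenarios]

# Per scenario: b + g_s >= demand - fuelcell_s  ->  -b - g_s <= fc_s - demand
A_ub = [[-1.0, -1.0, 0.0],
        [-1.0, 0.0, -1.0]]
b_ub = [fc - demand for _, fc in scenarios]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, 5.0), (0, None), (0, None)], method="highs")
print("battery discharge:", res.x[0])  # hedged against the outage scenario
print("expected cost:", res.fun)
```

The stochastic solution discharges just enough battery to cover the normal-operation shortfall while limiting exposure to the outage scenario, which is the "conservative yet more lucrative" behavior the abstract describes.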
The CARB Executive Order Exemption Process for a Hydrogen-fueled Internal Combustion Engine Vehicle was undertaken to define the requirements to achieve a California Air Resources Board Executive Order for a hydrogen-fueled vehicle retrofit kit. A 2005 to 2006 General Motors Company Sierra/Chevrolet Silverado 1500HD pickup was assumed to be the build-from vehicle for the retrofit kit. The emissions demonstration was determined not to pose a significant hurdle due to the non-hydrocarbon-based fuel and lean-burn operation. However, significant work was determined to be necessary for Onboard Diagnostics Level II compliance. Therefore, it is recommended that an Experimental Permit be obtained from the California Air Resources Board to license and operate the vehicles for the duration of the demonstration in support of preparing a fully compliant and certifiable package that can be submitted.
This thesis details a longitudinal study on factors that influence introductory programming success and on the development of machine learning models to predict incoming student performance. Although numerous studies have developed models to predict programming success, the models struggled to achieve high accuracy in predicting the likely performance of incoming students. Our approach overcomes this by providing a machine learning technique, using a set of three significant...
Jiao, Jiao; Kan, Shuanglong; Lin, Shang-Wei; Sanan, David; Liu, Yang; Sun, Jun
Bitcoin has attracted everyone's attention and interest recently. Ethereum (ETH), a second-generation cryptocurrency, extends Bitcoin's design by offering a Turing-complete programming language called Solidity to develop smart contracts. Smart contracts allow creditable execution of contracts on the EVM (Ethereum Virtual Machine) without third parties. Developing correct smart contracts is challenging due to the decentralized nature of their computation. Buggy smart contracts may lead to huge financial losses...
Hartwig, Laura; Smith, Matt
In 2006, Honeywell Federal Manufacturing & Technologies (FM&T) announced an updated vision statement for the organization. The vision is “To be the most admired team within the NNSA [National Nuclear Security Administration] for our relentless drive to convert ideas into the highest quality products and services for National Security by applying the right technology, outstanding program management and best commercial practices.” The challenge to provide outstanding program management was taken up by the Program Management division and the Program Integration Office (PIO) of the company. This article describes how Honeywell developed and deployed a program management maturity model to drive toward excellence.
Cavallari, Larisa H; Lee, Craig R; Duarte, Julio D; Nutescu, Edith A; Weitzel, Kristin W; Stouffer, George A; Johnson, Julie A
The operational elements essential for establishing an inpatient pharmacogenetic service are reviewed, and the role of the pharmacist in the provision of genotype-guided drug therapy in pharmacogenetics programs at three institutions is highlighted. Pharmacists are well positioned to assume important roles in facilitating the clinical use of genetic information to optimize drug therapy given their expertise in clinical pharmacology and therapeutics. Pharmacists have assumed important roles in implementing inpatient pharmacogenetics programs. This includes programs designed to incorporate genetic test results to optimize antiplatelet drug selection after percutaneous coronary intervention and personalize warfarin dosing. Pharmacist involvement occurs on many levels, including championing and leading pharmacogenetics implementation efforts, establishing clinical processes to support genotype-guided therapy, assisting the clinical staff with interpreting genetic test results and applying them to prescribing decisions, and educating other healthcare providers and patients on genomic medicine. The three inpatient pharmacogenetics programs described use reactive versus preemptive genotyping, the most feasible approach under the current third-party payment structure. All three sites also follow Clinical Pharmacogenetics Implementation Consortium guidelines for drug therapy recommendations based on genetic test results. With the clinical emergence of pharmacogenetics into the inpatient setting, it is important that pharmacists caring for hospitalized patients are well prepared to serve as experts in interpreting and applying genetic test results to guide drug therapy decisions. Since genetic test results may not be available until after patient discharge, pharmacists practicing in the ambulatory care setting should also be prepared to assist with genotype-guided drug therapy as part of transitions in care. Copyright © 2016 by the American Society of Health
Cardoso, Jorge; Poels, Geert
This SpringerBrief explores the internal workings of service systems. The authors propose a lightweight semantic model for an effective representation to capture the essence of service systems. Key topics include modeling frameworks, service descriptions and linked data, creating service instances, tool support, and applications in enterprises. Previous books on service system modeling and various streams of scientific developments used an external perspective to describe how systems can be integrated. This brief introduces the concept of white-box service system modeling as an approach to modeling
Kridasakti, S. W.; Sudirah; Siregar, H.
These studies addressed whether the UT social-aid management had been executed under CO-CD principles (Ife, 1995) and what CO-CD-based community service management model could be built. The goals of these evaluation studies were a UT social-aid managerial performance profile (2011-2013) and the development of a CO-CD management model. The methods used were survey and FGD. Data collection involved the UT officers, the counterparts, and the documents. The analysis combined Performance Analysis (Irawan, 2003) with the CIPP model (Stufflebeam & Shinkfield, 1985). The findings showed that the quantitative targeting in program completion was credible in achievement (85%). However, the qualitative targeting of the management goals was far from a good stage (≤5.0 on a 1-10 interval scale). The gap was due to the absence of socialization, needs-analysis, maintenance, and release factors in the UT social-service grand policy. The trial of the CO-CD-based management model imposed on the community turned out to be very effective for self-help, and the ensuing SOP was successfully defined. In conclusion, CO-CD principles were not designed into the management of UT community service programs; however, if efficiency and effectiveness are to be achieved, the SOP of the CO-CD-based management model has to be adopted.
Mahajan, S.M.; Ramesh, K.; Rajesh, K.; Somani, A.; Goel, M.
This paper presents a simplified approach in the form of a tree-structured computational model for parallel application programs. An attempt is made to provide a standard user interface to execute programs on the BARC Parallel Processing System (BPPS), a scalable distributed memory multiprocessor. The interface package, called PSHED, provides a basic framework for representing and executing parallel programs on different parallel architectures. The PSHED package incorporates concepts from a broad range of previous research in programming environments and parallel computations. (author). 6 refs
Jørgensen, Jens Bæk; Bossen, Claus
Many software experts argue that when we design a new system, we should create an explicit description of the environment in which the proposed system is to be used. The argument becomes crucial for pervasive computing, which aims to tightly integrate systems into their environments and into the work processes they're to support. However, prototypes typically provide an explicit representation only of the system itself. Executable use cases (EUCs), on the other hand, can also describe the environment. EUCs are designed to narrow the gap between informal ideas about requirements and the formalization ... modeling. This article describes a case study in which developers used EUCs to prototype an electronic patient record system for hospitals in Aarhus, Denmark.
Nielsen, Mogens; Palamidessi, Catuscia; Valencia, Frank Dan
The ntcc calculus is a model of non-deterministic temporal concurrent constraint programming. In this paper we study behavioral notions for this calculus. In the underlying computational model, concurrent constraint processes are executed in discrete time intervals. The behavioral notions studied...
Bignoux, Stephane; Sund, Kristian J.
Studies of learning and student satisfaction in the context of online university programmes have largely neglected programmes catering specifically to business executives. Such executives have typically been away from higher education for a number of years, and have collected substantial practical experience in the subject matters they are taught. Their expectations in terms of both content and delivery may therefore be different from non-executive students. We explore perceptions of the quality of tutoring in the context of an online executive MBA programme through participant interviews. We find that in addition to some of the tutor behaviours already discussed in the literature, executive students look specifically for practical industry knowledge and experience in tutors, when judging how effective a tutor is. This has implications for both the recruitment and training of online executive MBA tutors.
A model surveillance program is presented based on regulatory experience. The program consists of three phases: Program Delineation, Data Acquisition and Data Analysis. Each phase is described in terms of key quality assurance elements and some current philosophies in the United States Licensing Program. Other topics include the application of these ideas to test equipment used in the surveillance program and audits of the established program. Program Delineation discusses the establishment of administrative controls for organization and the description of responsibilities using the 'Program Coordinator' concept, with assistance from Data Acquisition and Analysis Teams. Ideas regarding frequency of surveillance testing are also presented. The Data Acquisition Phase discusses various methods for acquiring data, including operator observations, test procedures, operator logs, and computer output, for trending equipment performance. The Data Analysis Phase discusses the process for drawing conclusions regarding component/equipment service life, proper application, and generic problems through the use of trend analysis and failure rate data. (orig.)
This paper presents an optimum workforce-size model which determines the minimum number of excess workers (overstaffing) as well as the minimum total recruitment cost during a specified planning horizon. The model is an extension of other existing dynamic programming models for manpower planning in the sense ...
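The kind of dynamic program the abstract describes can be illustrated with a toy recursion: in each period, choose how many workers to recruit so that demand is met, trading recruitment cost against the cost of carrying excess staff. The demands, costs, and staffing cap below are invented for the sketch; the paper's actual formulation is not reproduced here.

```python
# Toy manpower-planning DP (assumed data): minimize recruitment cost
# plus overstaffing cost over a finite planning horizon.
from functools import lru_cache

demand = [5, 7, 6, 8]        # workers required in each period
recruit_cost = 3.0           # cost per worker recruited
overstaff_cost = 1.0         # cost per excess worker per period
max_staff = 12               # staffing cap (no attrition in this sketch)

@lru_cache(maxsize=None)
def best(period: int, staff: int) -> float:
    """Minimum cost from `period` onward given current workforce size."""
    if period == len(demand):
        return 0.0
    cheapest = float("inf")
    for hire in range(max_staff - staff + 1):
        new_staff = staff + hire
        if new_staff < demand[period]:
            continue                      # demand must always be covered
        cost = (hire * recruit_cost
                + (new_staff - demand[period]) * overstaff_cost
                + best(period + 1, new_staff))
        cheapest = min(cheapest, cost)
    return cheapest

print("minimum total cost:", best(0, 0))  # -> 25.0 for this toy data
```

With these numbers the optimal policy hires just in time (5, then 2, then 1 more worker), accepting one period of overstaffing when demand dips, which mirrors the overstaffing-versus-recruitment trade-off the model captures.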
It provides fuzzy programming approach to solve real-life decision problems in fuzzy environment. Within the framework of credibility theory, it provides a self-contained, comprehensive and up-to-date presentation of fuzzy programming models, algorithms and applications in portfolio analysis.
S.-S.T.Q. Jongmans (Sung-Shik); K.V. Hindriks; M.B. van Riemsdijk; L. Dennis; O. Boissier; R.H. Bordini (Rafael)
State space reduction techniques have been developed to increase the efficiency of model checking in the context of imperative programming languages. Unfortunately, these techniques cannot straightforwardly be applied to agents: the nature of states in the two programming paradigms
This work discussed how the simplex method of linear programming can be used to maximize the profit of a business firm, using Saclux Paint Company as a case study. It also elucidated the effect that variation in the optimal result obtained from the linear programming model will have on a given firm. It was demonstrated ...
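A profit-maximization LP of the kind solved here with the simplex method can be sketched in a few lines. The products, profits, and resource limits below are invented for illustration (the abstract does not give the Saclux Paint data), and the solver's "highs" method is used in place of hand-run simplex tableaux.

```python
# Illustrative profit-maximization LP (assumed coefficients, not Saclux data):
# maximize 5*x1 + 4*x2 subject to resource constraints.
from scipy.optimize import linprog

# linprog minimizes, so negate the profit coefficients.
c = [-5.0, -4.0]

# Resource limits: 6*x1 + 4*x2 <= 24 (pigment), x1 + 2*x2 <= 6 (labor hours).
A_ub = [[6.0, 4.0], [1.0, 2.0]]
b_ub = [24.0, 6.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)],
              method="highs")
print("optimal product mix:", res.x)    # x1 = 3, x2 = 1.5
print("maximum profit:", -res.fun)      # 21.0
```

Varying a coefficient in `c` or `b_ub` and re-solving shows directly how the optimal result shifts, which is the sensitivity effect the work elucidates.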
Akhmatov, Vladislav; Knudsen, Hans
For AC networks with large amounts of induction generators (in the case of e.g. windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing results obtained with dynamic stability programs and transient programs, respectively, with and without a model of the mechanical shaft. The reasons for the discrepancies are explained, and it is shown that the phenomenon is due partly to the presence of DC offset currents in the induction machine stator, and partly to the mechanical shaft system of the wind turbine and the generator rotor. It is shown that it is possible to include a transient model in dynamic stability programs and thus obtain correct results in dynamic stability programs as well. A mechanical model of the shaft system has also been included in the generator model...
Kim, B.T.; Kyum, M.C.; Hong, S.W.; Park, M.H.; Udagawa, T.
A FORTRAN program NLOM for nonlocal optical model calculations is described. It is based on a method recently developed by Kim and Udagawa, which utilizes the Lanczos technique for solving integral equations derived from the nonlocal Schroedinger equation. (orig.)
Andreyev, N I; Kurtyka, T; Oberli, L R; Perini, D; Russenschuck, Stephan; Siegel, N; Siemko, A; Tommasini, D; Vanenkov, I; Walckiers, L
Superconducting single and twin aperture 1-m long dipole magnets are currently being fabricated at CERN at a rate of about one per month in the framework of the short dipole model program for the LHC. The program allows the study of performance improvements coming from refinements in design, components and assembly options, and the accumulation of statistics based on a small-scale production. The experience thus gained provides in turn feedback into the long magnet program in industry. In recent models initial quenching fields above 9 T have been obtained and after a short training the conductor limit at 2 K is reached, resulting in a central bore field exceeding 10 T. The paper describes the features of recent single aperture models, the results obtained during cold tests and the plans to ensure the continuation of a vigorous model program providing input for the fabrication of the main LHC dipoles.
Federal Laboratory Consortium — Colorado State University has received funding from the U.S. Environmental Protection Agency (EPA) for its Space-Time Aquatic Resources Modeling and Analysis Program...
AUTHOR|(CDS)2079190; Darvas, Daniel; Blanco Vinuela, Enrique; Tournier, Jean-Charles; Bliudze, Simon; Blech, Jan Olaf; Gonzalez Suarez, Victor M
Programmable logic controllers (PLCs) are embedded computers widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software but is still underused in industry due to the complexity of building and managing formal models of real applications. In this paper, we propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g., CTL, LTL) on PLC programs. This methodology is based on an intermediate model (IM), meant to transform PLC programs written in various standard languages (ST, SFC, etc.) to different modeling languages of verification tools. We present the syntax and semantics of the IM and the transformation rules of the ST and SFC languages to the nuXmv model checker passing through the intermediate model. Finally, two real case studies of CERN PLC programs, written mainly in th...
Response of a veterinary college to career development needs identified in the KPMG LLP study and the executive summary of the Brakke study: a combined MBA/DVM program, business certificate program, and curricular modifications.
Kogan, Lori R; McConnell, Sherry L; Schoenfeld-Tacher, Regina
In the present market, veterinarians with a strong background in career development, practice management, and business skills have a clear advantage in achieving financial success. Although there is ample evidence that the scientific and clinical skills of veterinary college graduates are high, there are also data that suggest that additional capabilities in the business realm may promote greater economic success. As noted in the KPMG executive summary, the field of veterinary medicine must make changes in its "current business practices and attitudes" to be successful in the future. Furthermore, the KPMG study found that 36% of industry employers reported that some jobs within their companies had specific job requirements that were not met by a veterinarian with only a veterinary medical degree. The areas of additional training most often cited included business, administration, personnel management, sales and marketing, and financial skills. Yet, Lewis and Klausner found that veterinarians reported challenges in the business realm, such as "how business works and how business goals are translated into action. This challenge held true for veterinarians in industry, academia, government, and private practice." The present gender trends in the field of veterinary medicine provide additional impetus to make career development and business skills training more prevalent. Presently, women comprise >65% of the veterinary student population and approximately 45% of all practicing veterinarians. In some areas of practice, the rate is much higher. For example, in 2002, women comprised 48.2% of all small animal exclusive private practitioners. Unfortunately, the KPMG study found that female veterinarians in private practice report lower self-evaluation of business management and financial skills, compared with their male cohorts. Female veterinarians in nonprivate practice report lower self-evaluation in communication, personnel management, business management, and
CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple-pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous substances release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts, soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population.
This project contains three major parts. In the first part a digital computer simulation model was developed with the aim to model the traffic through a freeway work zone situation. The model was based on the Arena simulation software and used cumula...
Garen, John E
The empirical literature on executive compensation generally fails to specify a model of executive pay on which to base hypotheses regarding its determinants. In contrast, this paper analyzes a simple principal-agent model to determine how well it explains variations in CEO incentive pay and salaries. Many findings are consistent with the basic intuition of principal-agent models that compensation is structured to trade off incentives with insurance. However, statistical significance for some...
Hansen, Ronald E.
A model for technology teacher education curriculum has three facets: (1) purpose (experiential learning, personal development, technological enlightenment, economic well-being); (2) content (professional knowledge, curriculum development competence, pedagogical knowledge and skill, technological foundations); and (3) process (planned reflection,…
Complete documentation of the marketing and distribution (M and D) computer model is provided. The purpose is to estimate the costs of selling and transporting photovoltaic solar energy products from the manufacturer to the final customer. The model adjusts for the inflation and regional differences in marketing and distribution costs. The model consists of three major components: the marketing submodel, the distribution submodel, and the financial submodel. The computer program is explained including the input requirements, output reports, subprograms and operating environment. The program specifications discuss maintaining the validity of the data and potential improvements. An example for a photovoltaic concentrator collector demonstrates the application of the model.
Raghuraman, M K; Brewer, Bonita J
Eukaryotes have long been reported to show temporal programs of replication, different portions of the genome being replicated at different times in S phase, with the added possibility of developmentally regulated changes in this pattern depending on species and cell type. Unicellular model organisms, primarily the budding yeast Saccharomyces cerevisiae, have been central to our current understanding of the mechanisms underlying the regulation of replication origins and the temporal program of replication in particular. But what exactly is a temporal program of replication, and how might it arise? In this article, we explore this question, drawing again on the wealth of experimental information in unicellular model organisms.
Trinckes, John J
Supplying a complete overview of the concepts executives need to know, this book provides the tools needed to ensure your organization has an effective information security management program in place. It also includes a ready-to-use security framework for developing workable programs and supplies proven tips for avoiding common pitfalls.
Brøgger, Morten; Wittchen, Kim Bjarne
Many building stock models employ archetype-buildings in order to capture the essential characteristics of a diverse building stock. However, these models often require multiple archetypes, which makes them inflexible. This paper proposes an array-programming based model, which calculates the heat ... tend to overestimate potential energy-savings if we do not consider these discrepancies. The proposed model makes it possible to compute and visualize potential energy-savings in a flexible and transparent way.
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
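The statistical core of the approach, Monte Carlo sampling combined with Bayesian estimation of an event probability, can be sketched in miniature. The toy "program" and uniform input model below are assumptions for illustration; the real technique samples symbolic paths inside Symbolic PathFinder rather than concrete inputs.

```python
# Minimal sketch: estimate the probability of reaching a target event by
# sampling, then report the mean of a Beta(1, 1) posterior over that
# probability. The target program here is a stand-in, not SPF.
import random

def program_hits_target(x: int) -> bool:
    # Toy program under analysis: the "assert violation" fires when
    # x falls in [0, 25) out of [0, 100), so the true probability is 0.25.
    return x < 25

random.seed(0)
n, hits = 10_000, 0
for _ in range(n):
    if program_hits_target(random.randrange(100)):
        hits += 1

# Uniform Beta(1, 1) prior updated with the samples; posterior mean:
posterior_mean = (1 + hits) / (2 + n)
print(f"estimated P(target) ~ {posterior_mean:.3f}")  # close to 0.25
```

The posterior also supports the hypothesis tests mentioned in the abstract (e.g., "is P(target) below a safety threshold?"); informed sampling improves on this baseline by exactly analyzing high-probability paths and sampling only the remainder.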
Ofelia Ema Aleca
This article proposes and tests a set of performance indicators for the assessment of Bachelor and Master studies, from two perspectives: the study programs and the disciplines. The academic performance at the level of a study program is calculated based on success and efficiency rates, and at discipline level on the basis of rates of efficiency, success and absenteeism. This research proposes a model of classification of the study programs within a Bachelor and Master cycle based on education performance and efficiency. What recommends this model as a best-practice model in academic management is the possibility of grouping a study program or a discipline into a particular category of efficiency.
Lee, Deok Ki; Park, Sang Yong; Park, Soo Uk
The goal of this study is the development of an assessment model for demand-side management investment programs (DSMIPs) in the areas of natural gas and district heating. Demand-side management (DSM) is the process of managing the consumption of energy to optimize available and planned generation resources, and DSMIPs are the actions conducted by energy suppliers to promote investment in DSM. In this research, the analytic hierarchy process (AHP) method was used to develop a scientific and rational assessment model for DSMIPs. To apply the AHP method, assessment indicators were identified by using the concept of 'plan, do, see' and the decision-making hierarchy was established. Then an AHP model was developed to set up the priorities of the assessment indicators, and a survey of experts from government and energy suppliers was carried out. Finally, the priorities of the assessment indicators were calculated from the survey results using the AHP method. The assessment model developed from this research will actually be used to assess the results of DSMIPs being carried out by Korea Gas Corporation (KOGAS) and Korea District Heating Corporation (KDHC). The use of the assessment model is expected to contribute to enhancing efficiency in the planning, execution, and assessment of DSMIPs.
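The central AHP computation, deriving priority weights from a pairwise-comparison matrix via its principal eigenvector, can be sketched as follows. The 3x3 comparison matrix is invented for illustration; the study's actual DSMIP indicators and expert judgments are not given in the abstract.

```python
# AHP priority-weight sketch (assumed judgments, not the study's data).
import numpy as np

# A[i][j] = relative importance of indicator i over indicator j
# on Saaty's 1-9 scale; reciprocals fill the lower triangle.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal (largest) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized priority weights

# Consistency index CI = (lambda_max - n) / (n - 1); small CI means
# the expert judgments are nearly consistent.
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print("priority weights:", np.round(w, 3))
print("consistency index:", round(ci, 3))
```

In practice the survey responses from each expert are aggregated (commonly by geometric mean) into one comparison matrix per hierarchy level before this eigenvector step is applied.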
Spadoni, D. J.
The development of the planetary program cost model is discussed. The Model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base is comprised of the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities, with an expected accuracy of 20%. The Model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the Model is discussed and two sample applications of the cost model are presented.
Moore, Berrien, III; Sahagian, Dork
The Goal of the GAIM is: To advance the study of the coupled dynamics of the Earth system using as tools both data and models; to develop a strategy for the rapid development, evaluation, and application of comprehensive prognostic models of the Global Biogeochemical Subsystem which could eventually be linked with models of the Physical-Climate Subsystem; to propose, promote, and facilitate experiments with existing models or by linking subcomponent models, especially those associated with IGBP Core Projects and with WCRP efforts. Such experiments would be focused upon resolving interface issues and questions associated with developing an understanding of the prognostic behavior of key processes; to clarify key scientific issues facing the development of Global Biogeochemical Models and the coupling of these models to General Circulation Models; to assist the Intergovernmental Panel on Climate Change (IPCC) process by conducting timely studies that focus upon elucidating important unresolved scientific issues associated with the changing biogeochemical cycles of the planet and upon the role of the biosphere in the physical-climate subsystem, particularly its role in the global hydrological cycle; and to advise the SC-IGBP on progress in developing comprehensive Global Biogeochemical Models and to maintain scientific liaison with the WCRP Steering Group on Global Climate Modelling.
Schuller, B. (JUELICH); Smirnova, O (Lund University); Konstantinov, A. (Oslo University); Skou Andersen, M. (University of Copenhagen); Riedel, M. (JUELICH); Memon, A.S. (JUELICH); Memon, M.S. (JUELICH); Zangrando, L. (INFN); Sgaravatto, M. (INFN); Frizziero, E. (INFN)
This document provides the interface specification, including related data models such as state model, activity description, resource and activity information, of an execution service, matching the needs of the EMI production middleware stack composed of ARC, gLite and UNICORE components. This service therefore is referred to as the EMI Execution Service (or “ES” for short). This document is a continuation of the work previously known as the GENEVA, then AGU (“ARC, gLite UNICORE”), then PGI execution service.
Charnes, A.; and others
A goal programming model for selecting media is presented which alters the objective and extends previous media models by accounting for cumulative duplicating audiences over a variety of time periods. This permits detailed control of the distribution of message frequencies directed at each of numerous marketing targets over a sequence of…
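The goal-programming formulation can be sketched with deviational variables: each goal becomes an equality with under- and over-achievement terms, and the objective minimizes a weighted sum of the unwanted deviations. All numbers below (audiences, budget, weights) are invented for the sketch; the model in the record is substantially richer, covering duplication and multiple time periods.

```python
# Goal-programming sketch for media selection (assumed data):
# variables = [x1, x2, d_aud_under, d_aud_over, d_bud_under, d_bud_over],
# where x1, x2 are insertions in two media.
from scipy.optimize import linprog

# Goal 1 (audience): 40*x1 + 30*x2 + d_aud_under - d_aud_over = 320
# Goal 2 (budget):    4*x1 +  3*x2 + d_bud_under - d_bud_over =  30
A_eq = [[40, 30, 1, -1, 0, 0],
        [ 4,  3, 0,  0, 1, -1]]
b_eq = [320, 30]

# Penalize audience shortfall (weight 1) and budget overrun (weight 2);
# the other deviations are harmless and get zero weight.
c = [0, 0, 1, 0, 0, 2]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6,
              method="highs")
print("insertions:", res.x[:2])
print("weighted deviation cost:", res.fun)  # -> 4.0 (overspend 2 units)
```

With these weights the solver prefers a small budget overrun to a large audience shortfall, which is exactly the kind of trade-off among competing marketing targets that goal programming is built to control.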
Executive functions (EF) concern a range of abilities including problem-solving, planning, initiation, self-monitoring, conscious attention, coping with new situations, and the ability to modify plans if necessary. EF is a high-level cognitive function that is crucial for a person to engage in and maintain daily activities whilst keeping a good quality of life. Problems in the EF were formerly known as Dysexecutive Syndrome (DS). There are many models concerning DS, although the literature on the subject still remains unclear. Several works point to the effects brought by elderly life, as well as abuse of drugs and some psychopathologies. These factors are known to increase the distress of the frontal circuits, which could be associated with executive deficits. The effects of DS compromise individuals in day-to-day routine, academic, social, and labor fields. There is a growing body of studies trying to determine the causes, implications, associations, and the best way to take care of these effects. This work intends to review DS, focusing on the most important fields related to this area, such as psychopathology associations, cognitive reserve, assessment, and cognitive rehabilitation programs.
Institutional Effectiveness Assessment Process, 1992-93. Executive Summary. Hospitality and Service Occupations Division, Food Sciences Department, Food Production Program, Food Production Management Program, Pastry and Specialty Baking Program.
South Seattle Community Coll., Washington.
In the 1992-93 academic year, the Hospitality and Food Sciences Department at South Seattle Community College conducted surveys of current and former students and local foodservice employers to determine the level of satisfaction with Department programs. Specifically, the surveys focused on four key outcomes: determining the extent to which…
Iversen, T. K.; Kristoffersen, K. J.; Larsen, Kim Guldstrand
In this paper, we present a method for automatic verification of real-time control programs running on LEGO(R) RCX(TM) bricks using the verification tool UPPAAL. The control programs, consisting of a number of tasks running concurrently, are automatically translated into the timed automata model of UPPAAL. The fixed scheduling algorithm used by the LEGO(R) RCX(TM) processor is modeled in UPPAAL, and supplying similar (sufficient) timed automata models for the environment allows analysis of the overall real-time system using the UPPAAL tools. We illustrate our technique on a program for sorting LEGO(R) bricks…
A Science Definition Team was established in December 1990 by the Space Physics Division, NASA, to develop a satellite program to conduct research on the energetics, dynamics, and chemistry of the mesosphere and lower thermosphere/ionosphere. This two-volume publication describes the TIMED (Thermosphere-Ionosphere-Mesosphere, Energetics and Dynamics) mission and associated science program. The report outlines the scientific objectives of the mission, the program requirements, and the approach towards meeting these requirements.
Simmons, C.S.; Cole, C.R.
This document was written to provide guidance to managers and site operators on how ground-water transport codes should be selected for assessing burial site performance. There is a need for a formal approach to selecting appropriate codes from the multitude of potentially useful ground-water transport codes that are currently available. Code selection is a problem that requires more than merely considering mathematical equation-solving methods. These guidelines are very general and flexible and are also meant for developing systems simulation models to be used to assess the environmental safety of low-level waste burial facilities. Code selection is only a single aspect of the overall objective of developing a systems simulation model for a burial site. The guidance given here is mainly directed toward applications-oriented users, but managers and site operators need to be familiar with this information to direct the development of scientifically credible and defensible transport assessment models. Some specific advice for managers and site operators on how to direct a modeling exercise is based on the following five steps: identify specific questions and study objectives; establish costs and schedules for achieving answers; enlist the aid of a professional model applications group; decide on an approach with the applications group and guide code selection; and facilitate the availability of site-specific data. These five steps for managers/site operators are discussed in detail following an explanation of the nine systems model development steps, which are presented first to clarify what code selection entails.
Dudley, William N; Benuzillo, Jose G; Carrico, Mineh S
Mediation modeling can explain the nature of the relation among three or more variables. In addition, it can be used to show how a variable mediates the relation between levels of intervention and outcome. The Sobel test, developed in 1990, provides a statistical method for determining the influence of a mediator on an intervention or outcome. Although interactive Web-based and stand-alone methods exist for computing the Sobel test, SPSS and SAS programs that automatically run the required regression analyses and computations increase the accessibility of mediation modeling to nursing researchers. To illustrate the utility of the Sobel test and to make this programming available to the Nursing Research audience in both SAS and SPSS. The history, logic, and technical aspects of mediation testing are introduced. The syntax files sobel.sps and sobel.sas, created to automate the computation of the regression analysis and test statistic, are available from the corresponding author. The reported programming allows the user to complete mediation testing with the user's own data in a single-step fashion. A technical manual included with the programming provides instruction on program use and interpretation of the output. Mediation modeling is a useful tool for describing the relation between three or more variables. Programming and manuals for using this model are made available.
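The Sobel statistic that the sobel.sps and sobel.sas files automate can be sketched in a few lines. The Python below is an illustrative re-implementation of the standard Sobel formula, not the authors' code, and the path estimates in the example are hypothetical:

```python
from math import sqrt

def sobel_test(a, se_a, b, se_b):
    """Sobel z statistic for the indirect (mediated) effect a*b.

    a, se_a: coefficient and standard error for the path X -> M
    b, se_b: coefficient and standard error for the path M -> Y
             (controlling for X), each from a separate regression
    """
    indirect = a * b
    # First-order delta-method standard error of the product a*b
    se_ab = sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    z = indirect / se_ab
    return indirect, se_ab, z

# Hypothetical path estimates for illustration
ab, se, z = sobel_test(a=0.50, se_a=0.10, b=0.40, se_b=0.08)
```

In practice the two regressions feeding `a`, `b` and their standard errors are what the SPSS/SAS syntax files run automatically; the resulting z is referred to the standard normal distribution.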
Resolution 188/013. It authorizes the direct contracting of the petroleum firm Geophysical AS by ANCAP for a multi-client contract covering the execution, marketing and revenue participation in a non-exclusive program for the acquisition and processing of three-dimensional seismic data, ''Costa Afuera del Uruguay''.
This Resolution authorizes the direct contracting of the petroleum firm Geophysical AS by ANCAP. The contract has the purpose of execution, marketing and revenue participation in a non-exclusive program for the acquisition and processing of three-dimensional seismic data, 'Costa Afuera del Uruguay'. ANCAP is the only organization that authorizes the execution of the activities, businesses and operations of the oil industry, according to regulations.
Overgeneral autobiographical memory in healthy young and older adults: Differential age effects on components of the capture and rumination, functional avoidance, and impaired executive control (CaRFAX) model.
Ros, Laura; Latorre, Jose M; Serrano, Juan P; Ricarte, Jorge J
The CaRFAX model (Williams et al., 2007) has been used to explain the causes of overgeneral autobiographical memory (OGM; the difficulty to retrieve specific autobiographical memories), a cognitive phenomenon generally related with different psychopathologies. This model proposes 3 different mechanisms to explain OGM: capture and rumination (CaR), functional avoidance (FA) and impaired executive functions (X). However, the complete CaRFAX model has not been tested in nonclinical populations. This study aims to assess the usefulness of the CaRFAX model to explain OGM in 2 healthy samples: a young sample and an older sample, to test for possible age-related differences in the underlying causes of OGM. A total of 175 young (age range: 19-36 years) and 175 older (age range: 53-88 years) participants completed measures of brooding rumination (CaR), functional avoidance (FA), and executive tasks (X). Using structural equation modeling, we found that memory specificity is mainly associated with lower functional avoidance and higher executive functions in the older group, but only with executive functions in young participants. We discuss the different roles of emotional regulation strategies used by young and older people and their relation to the CaRFAX model to explain OGM in healthy people. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Cajueiro, Daniel O.; Maldonado, Wilfredo L.
In order to explain the empirical evidence that the dynamics of human activity may not be well modeled by Poisson processes, a model based on queuing processes was built in the literature [A. L. Barabasi, Nature (London) 435, 207 (2005)]. The main assumption behind that model is that people execute their tasks based on a protocol that first executes the high priority item. In this context, the purpose of this paper is to analyze the validity of that hypothesis assuming that people are rational agents that make their decisions in order to minimize the cost of keeping nonexecuted tasks on the list. Therefore, we build and analytically solve a dynamic programming model with two priority types of tasks and show that the validity of this hypothesis depends strongly on the structure of the instantaneous costs that a person has to face if a given task is kept on the list for more than one period. Moreover, one interesting finding is that in one of the situations the protocol used to execute the tasks generates complex one-dimensional dynamics.
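The highest-priority-first protocol analyzed above can be illustrated with a toy cost comparison against first-in-first-out execution. The task list, cost rates and linear waiting-cost structure below are illustrative assumptions, not the paper's dynamic programming model:

```python
def total_cost(tasks, order, cost_rates):
    """Sum of waiting costs when tasks are executed in the given order.

    tasks: list of priority labels ('hi' or 'lo')
    order: permutation of task indices giving the execution sequence
    cost_rates: per-period holding cost for each priority class
    A task executed at position t has waited t periods on the list.
    """
    return sum(cost_rates[tasks[i]] * t for t, i in enumerate(order))

tasks = ['lo', 'hi', 'lo', 'hi']
rates = {'hi': 3.0, 'lo': 1.0}

# Highest-priority-first protocol: all 'hi' tasks before any 'lo' task
# (sorted is stable, so ties keep their arrival order)
hpf = sorted(range(len(tasks)), key=lambda i: tasks[i] != 'hi')
fifo = list(range(len(tasks)))
```

With these linear holding costs, executing high-priority items first is cheaper than FIFO; the paper's point is precisely that whether this protocol is the rational choice depends on the structure of those instantaneous costs.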
Schultz, Caroline; Seith, David
The Work Advancement and Support Center (WASC) program in Fort Worth was part of a demonstration that is testing innovative strategies to help increase the income of low-wage workers, who make up a large segment of the U.S. workforce. The program offered services to help workers stabilize their employment, improve their skills, and increase their…
Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue
…by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new "hardware" is not needed to solve new problems; and (last but not least) it is Turing complete in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level.
Aalami, H.A.; Moghaddam, M. Parsa; Yousefi, G.R.
One of the responsibilities of power market regulator is setting rules for selecting and prioritizing demand response (DR) programs. There are many different alternatives of DR programs for improving load profile characteristics and achieving customers' satisfaction. Regulator should find the optimal solution which reflects the perspectives of each DR stakeholder. Multi Attribute Decision Making (MADM) is a proper method for handling such optimization problems. In this paper, an extended responsive load economic model is developed. The model is based on price elasticity and customer benefit function. Prioritizing of DR programs can be realized by means of Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method. Considerations of ISO/utility/customer regarding the weighting of attributes are encountered by entropy method. An Analytical Hierarchy Process (AHP) is used for selecting the most effective DR program. Numerical studies are conducted on the load curve of the Iranian power grid in 2007. (author)
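The TOPSIS ranking step described above follows a standard recipe: normalize the decision matrix, apply attribute weights, then score each alternative by its relative closeness to the ideal solution. The sketch below is a generic TOPSIS implementation with illustrative inputs, not the paper's DR-program data or its entropy-derived weights:

```python
from math import sqrt

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.

    matrix: rows = alternatives (e.g. DR programs), cols = attributes
    weights: attribute weights summing to 1 (e.g. from the entropy method)
    benefit: per-attribute flag, True if larger values are better
    Returns one closeness score in [0, 1] per alternative (higher = better).
    """
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then weight it
    norms = [sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sqrt(sum((x - p) ** 2 for x, p in zip(row, ideal)))
        d_neg = sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

In the paper's pipeline this scoring step would sit between the entropy weighting of ISO/utility/customer attributes and the final AHP selection of a program.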
Abdelwahed, Mohamed F.
One of the seismological programs for manipulating seismic data is the SGRAPH program. It consists of integrated tools to perform advanced seismological techniques. SGRAPH is a new system for maintaining and analyzing seismic waveform data in a stand-alone Windows-based application that handles a wide range of data formats. SGRAPH was described in detail in the first part of this paper. In this part, I discuss the advanced techniques included in the program and its applications in seismology. Because of the numerous tools included in the program, SGRAPH alone is sufficient to perform basic waveform analysis and to solve advanced seismological problems. In the first part of this paper, the application to source parameter estimation and hypocentral location was given. Here, I discuss SGRAPH's waveform modeling tools. This paper exhibits examples of how to apply the SGRAPH tools to perform waveform modeling for estimating the focal mechanism and crustal structure of local earthquakes.
Because of the high development costs of IC (Integrated Circuit) test programs, recycling existing test programs from one kind of ATE (Automatic Test Equipment) to another, or generating them directly from CAD simulation modules for ATE, is increasingly valuable. In this paper, a new approach to migrating test programs is presented. A virtual ATE model based on the object-oriented paradigm is developed; it runs Test C++ (an intermediate test control language) programs and TeIF (Test Intermediate Format, an intermediate pattern format), migrates test programs among three kinds of ATE (Ando DIC8032, Schlumberger S15 and GenRad 1732), and generates test patterns automatically from two kinds of CAD (Daisy and Panda).
Jacob, P.; Paretzke, H.G.; Roth, P.
The Association Contract covers a range of research domains that are important to the Radiation Protection Research Action, especially in the areas 'Evaluation of Radiation Risks' and 'Understanding Radiation Mechanisms and Epidemiology'. Three research projects concentrate on radiation dosimetry research and two projects on the modelling of radiation carcinogenesis. The following list gives an overview on the topics and responsible scientific project leaders of the Association Contract: Study of radiation fields and dosimetry at aviation altitudes. Biokinetics and dosimetry of incorporated radionuclides. Dose reconstruction. Biophysical models for the induction of cancer by radiation. Experimental data for the induction of cancer by radiation of different qualities. (orig.)
Raghuraman, M. K.; Brewer, Bonita J.
Eukaryotes have long been reported to show temporal programs of replication, different portions of the genome being replicated at different times in S phase, with the added possibility of developmentally regulated changes in this pattern depending on species and cell type. Unicellular model organisms, primarily the budding yeast Saccharomyces cerevisiae, have been central to our current understanding of the mechanisms underlying the regulation of replication origins and the temporal program o...
Fung, L.S.N.; Fung, Nick Lik San; Widya, I.A.; Broens, T.H.F.; Larburu Rubio, Nekane; Bults, Richard G.A.; Shalom, Erez; Jones, Valerie M.; Hermens, Hermanus J.
We present a conceptual framework for modelling clinical guidelines as networks of concurrent processes. This enables the guideline to be partitioned and distributed at run-time across a knowledge-based telemedicine system, which is distributed by definition but whose exact physical configuration
Martel, Michelle M; Pan, Pedro M; Hoffmann, Maurício S; Gadelha, Ary; do Rosário, Maria C; Mari, Jair J; Manfro, Gisele G; Miguel, Eurípedes C; Paus, Tomás; Bressan, Rodrigo A; Rohde, Luis A; Salum, Giovanni A
High rates of comorbidities and poor validity of disorder diagnostic criteria for mental disorders hamper advances in mental health research. Recent work has suggested the utility of continuous cross-cutting dimensions, including general psychopathology and specific factors of externalizing and internalizing (e.g., distress and fear) syndromes. The current study evaluated the reliability of competing structural models of psychopathology and examined external validity of the best fitting model on the basis of family risk and child global executive function (EF). A community sample of 8,012 families from Brazil with children ages 6-12 years completed structured interviews about the child and parental psychiatric syndromes, and a subsample of 2,395 children completed tasks assessing EF (i.e., working memory, inhibitory control, and time processing). Confirmatory factor analyses tested a series of structural models of psychopathology in both parents and children. The model with a general psychopathology factor ("P factor") with 3 specific factors (fear, distress, and externalizing) exhibited the best fit. The general P factor accounted for most of the variance in all models, with little residual variance explained by each of the 3 specific factors. In addition, associations between child and parental factors were mainly significant for the P factors and nonsignificant for the specific factors from the respective models. Likewise, the child P factor-but not the specific factors-was significantly associated with global child EF. Overall, our results provide support for a latent overarching P factor characterizing child psychopathology, supported by familial associations and child EF. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Schopf, W.; Rexer, G.; Ruehle, R.
This report documents a generator program by which econometric simulation models formulated in an application-oriented language can be transformed automatically into a Fortran program. The model designer is thus able to build up, test and modify models without the need for a Fortran programmer, so the development of a computer model is simplified and shortened appreciably. Chapters 1-3 of this report present all the rules for applying the generator to model design. Algebraic models including exogenous and endogenous time series variables and lead and lag functions can be generated. In addition to these language elements, Fortran sequences can be used in the formulation of models in the case of complex model interrelations. The generated model is automatically a module of the program system RSYST III and is therefore able to exchange input and output data with the central data bank of the system; in connection with the method library modules, it can be used to handle planning problems. (orig.) [de
Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K
Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
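The "models are programs" idea can be illustrated with a toy macro layer in plain Python. PySB's actual API (its `Monomer`, `Rule`, and `Parameter` classes and its macros module) differs from this sketch; the rule strings, function names, and rate constants below are hypothetical:

```python
# Toy illustration of PySB's core idea: a model is a program, and
# high-level macros expand into elementary reaction rules.
# (PySB's real API differs; all names here are hypothetical.)

rules = []

def bind(a, b, kf, kr):
    """Macro: expand a reversible binding action into two rules."""
    rules.append((f"{a} + {b} -> {a}:{b}", kf))
    rules.append((f"{a}:{b} -> {a} + {b}", kr))

def catalyze(enz, sub, prod, kf, kr, kc):
    """Macro: enzyme binds substrate, then converts it to product.

    Built by composing the bind macro, exactly the kind of code
    reuse that equation-based model descriptions make difficult.
    """
    bind(enz, sub, kf, kr)
    rules.append((f"{enz}:{sub} -> {enz} + {prod}", kc))

# One high-level, action-oriented statement expands to three rules
# (illustrative apoptosis step: caspase-8 cleaving Bid to tBid)
catalyze("C8", "Bid", "tBid", 1e-7, 1e-3, 1.0)
```

Because the model is ordinary code, version control, testing, and composition of previously developed models come for free, which is the point the abstract makes about transparent, extensible, reusable models.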
Report on achievements in fiscal 1999. Development of technology to put photovoltaic power generation system into practical use by international cooperation project (Executive Committee meetings for treaty of executing IEA photovoltaic power generation system research cooperation program - IEA/PVPS); 1999 nendo taiyoko hatsuden system jitsuyoka gijutsu kaihatsu. Kokusai kyoryoku jigyo (IEA taiyoko hatsuden system kenkyu kyoryoku program jisshi kyotei (shikko iinkai))
The 13th and 14th Executive Committee meetings were held. The resolutions adopted at the 13th Executive Committee meeting (Lausanne, Switzerland, May 3 through 5, 1999) were the approval of the Task VIII as a new IEA/PVPS task, the feasibility study on large scale photovoltaic power generation utilizing unused land such as desert (the operating agent country being Japan headed by Mr. Kando, chief researcher at NEDO and Professor Kurokawa) and the Task IX, proliferation of photovoltaic power generation under coordination with developing countries (the operating agent country being England headed by Mr. Bernard McNelis (IT Power) and that the task I operating agent country is substituted jointly by the three countries of Australia, Holland and Switzerland for six months. The resolutions adopted at the 14th Executive Committee meeting (Oslo, Norway, October 18 through 20, 1999) were appointing Australia as the Task I operating agent country headed by Mr. G. Watt, setting a web site of IEA/PVPS, issuing the annual report for fiscal 1998 in early 2000, and holding the Executive Committee meetings for fiscal 2000 in Canada (Quebec, April 17 through 19, 2000), and Italy (October 16 through 18, 2000). (NEDO)
Burke, G C; Bice, M O
Health care executives must consider renewal and change within their own lives if they are to breathe life into their own institutions. Yet numerous barriers to executive renewal exist, including time pressures, fatigue, cultural factors, and trustee attitudes. This essay discusses such barriers and suggests approaches that health care executives may consider for programming renewal into their careers. These include self-assessment for professional and personal goals, career or job change, process vs. outcome considerations, solitude, networking, lifelong education, surrounding oneself with change agents, business travel and sabbaticals, reading outside the field, physical exercise, mentoring, learning from failures, a sense of humor, spiritual reflection, and family and friends. Renewal is a continuous, lifelong process requiring constant learning. Individual executives would do well to develop a framework for renewal in their careers and organizations.
Reynolds, Clare M; Vickers, Mark H
Any effective strategy to tackle the global obesity and rising noncommunicable disease epidemic requires an in-depth understanding of the mechanisms that underlie these conditions that manifest as a consequence of complex gene-environment interactions. In this context, it is now well established that alterations in the early life environment, including suboptimal nutrition, can result in an increased risk for a range of metabolic, cardiovascular, and behavioral disorders in later life, a process preferentially termed developmental programming. To date, most of the mechanistic knowledge around the processes underpinning development programming has been derived from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. This review will cover the utility of small animal models in developmental programming, the limitations of such models, and potential future directions that are required to fully maximize information derived from preclinical models in order to effectively translate to clinical use.
…authors of this thesis to opt for on-site administration of the questionnaire when testing the model at actual commands. At this point a broad general…interface with the respondents. This is what will be done when the questionnaire is administered on site at actual USN activities.
Lorenza Antonia Reyes de Duran
This proposal for analysis, interpretation, deconstruction, self-criticism and guidance is born of and derives from work experience in planned mass sports and social organizations, contrasted (not in the conventional comparative sense) with private business and sport models, both state and managerial. The contribution made by sports management experience from positions of power, whether state or business, is undeniable, and its impact is difficult to express in numbers because of its humanistic value, which is incalculable. Howe…
Cohn, Hans M.
Argues that although the executive has many tasks, he or she must view internal organizational integration as a primary task, making use of organizational charts, job descriptions, statements of goals and objectives, evaluations, and feedback devices. (RH)
In August 1992, the Energy Research Center (ERC) at the University of Kansas was awarded a contract by the US Department of Energy (DOE) to develop a technology transfer regional model. This report describes the development and testing of the Kansas Technology Transfer Model (KTTM) which is to be utilized as a regional model for the development of other technology transfer programs for independent operators throughout oil-producing regions in the US. It describes the linkage of the regional model with a proposed national technology transfer plan, an evaluation technique for improving and assessing the model, and the methodology which makes it adaptable on a regional basis. The report also describes management concepts helpful in managing a technology transfer program.
This article discusses some features of benchmarking and the possibility of its practical use in the professional education system during the implementation of the state program "Stolichnoe obrazovanie" ("Capital Education"), with the aim of improving the efficiency of the educational organization.
NASA's Upper Atmosphere Research Program UARP and Atmospheric Chemistry Modeling and Analysis Program (ACMAP): Research Summaries 1994 - 1996. Report to Congress and the Environmental Protection Agency
Kendall, Rose (Compiler); Wolfe, Kathy (Compiler)
An Assessment Report, 1996. It consists primarily of the Executive Summary and Chapter Summaries of the World Meteorological Organization Global Ozone Research and Monitoring Project Report No. 37, Scientific Assessment of Ozone Depletion: 1994, sponsored by NASA, the National Oceanic and Atmospheric Administration (NOAA), the UK Department of the Environment, the United Nations Environment Programme, and the World Meteorological Organization. Other sections of Part II include summaries of the following: an Atmospheric Ozone Research Plan from NASA's Office of Mission to Planet Earth; summaries from a series of Space Shuttle-based missions and two recent airborne measurement campaigns; the Executive Summary of the 1995 Scientific Assessment of the Atmospheric Effects of Stratospheric Aircraft; and the most recent evaluation of photochemical and chemical kinetics data (Evaluation No. 12 of the NASA Panel for Data Evaluation) used as input parameters for atmospheric models.
Spiegel, Samuel Albert
Researchers and practioners alike recognize that "the national goal that every child in the United States has access to high-quality school education in science and mathematics cannot be realized without the availability of effective professional development of teachers" (Hewson, 1997, p. 16). Further, there is a plethora of reports calling for the improvement of professional development efforts (Guskey & Huberman, 1995; Kyle, 1995; Loucks-Horsley, Hewson, Love, & Stiles, 1997). In this study I analyze a successful 3-year teacher enhancement program, one form of professional development, to: (1) identify essential components of an effective teacher enhancement program; and (2) create a model to identify and articulate the critical issues in designing, implementing, and evaluating teacher enhancement programs. Five primary sources of information were converted into data: (1) exit questionnaires, (2) exit surveys, (3) exit interview transcripts, (4) focus group transcripts, and (5) other artifacts. Additionally, a focus group was used to conduct member checks. Data were analyzed in an iterative process which led to the development of the list of essential components. The Components are categorized by three organizers: Structure (e.g., science research experience, a mediator throughout the program), Context (e.g., intensity, collaboration), and Participant Interpretation (e.g., perceived to be "safe" to examine personal beliefs and practices, actively engaged). The model is based on: (1) a 4-year study of a successful teacher enhancement program; (2) an analysis of professional development efforts reported in the literature; and (3) reflective discussions with implementors, evaluators, and participants of professional development programs. The model consists of three perspectives, cognitive, symbolic interaction, and organizational, representing different viewpoints from which to consider issues relevant to the success of a teacher enhancement program. These
Arán Filippetti, Vanessa; Richaud, María Cristina
Though the relationship between executive functions (EFs) and mathematical skills has been well documented, little is known about how both EFs and IQ differentially support diverse math domains in primary students. Inconsistency of results may be due to the statistical techniques employed, specifically, whether the analysis is conducted with observed variables, i.e., regression analysis, or at the latent level, i.e., structural equation modeling (SEM). The current study explores the contribution of both EFs and IQ to mathematics through an SEM approach. A total of 118 8- to 12-year-olds were administered measures of EFs, crystallized (Gc) and fluid (Gf) intelligence, and math abilities (i.e., number production, mental calculus and arithmetical problem-solving). Confirmatory factor analysis (CFA) offered support for the three-factor solution of EFs: (1) working memory (WM), (2) shifting, and (3) inhibition. Regarding the relationship among EFs, IQ and math abilities, the results of the SEM analysis showed that (i) WM and age predict number production and mental calculus, and (ii) shifting and sex predict arithmetical problem-solving. In all of the SEM models, EFs partially or totally mediated the relationship between IQ, age and math achievement. These results suggest that EFs differentially support math abilities in primary-school children and are a more significant predictor of math achievement than IQ level.
Lord, Robert K; Mayhew, Christopher R; Korupolu, Radha; Mantheiy, Earl C; Friedman, Michael A; Palmer, Jeffrey B; Needham, Dale M
To evaluate the potential annual net cost savings of implementing an ICU early rehabilitation program. Using data from existing publications and actual experience with an early rehabilitation program in the Johns Hopkins Hospital Medical ICU, we developed a model of net financial savings/costs and presented results for ICUs with 200, 600, 900, and 2,000 annual admissions, accounting for both conservative- and best-case scenarios. Our example scenario provided a projected financial analysis of the Johns Hopkins Medical ICU early rehabilitation program, with 900 admissions per year, using actual reductions in length of stay achieved by this program. U.S.-based adult ICUs. Financial modeling of the introduction of an ICU early rehabilitation program. Net cost savings generated in our example scenario, with 900 annual admissions and actual length of stay reductions of 22% and 19% for the ICU and floor, respectively, were $817,836. Sensitivity analyses, which used conservative- and best-case scenarios for length of stay reductions and varied the per-day ICU and floor costs, across ICUs with 200-2,000 annual admissions, yielded financial projections ranging from -$87,611 (net cost) to $3,763,149 (net savings). Of the 24 scenarios included in these sensitivity analyses, 20 (83%) demonstrated net savings, with a relatively small net cost occurring in the remaining four scenarios, mostly when simultaneously combining the most conservative assumptions. A financial model, based on actual experience and published data, projects that investment in an ICU early rehabilitation program can generate net financial savings for U.S. hospitals. Even under the most conservative assumptions, the projected net cost of implementing such a program is modest relative to the substantial improvements in patient outcomes demonstrated by ICU early rehabilitation programs.
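The net-savings arithmetic described above (avoided bed-days times per-day cost, minus program cost) can be sketched as follows. All input figures below are hypothetical placeholders and do not reproduce the paper's $817,836 projection, which rests on Johns Hopkins cost data:

```python
def net_savings(admissions, icu_los, floor_los, icu_cost_day, floor_cost_day,
                icu_reduction, floor_reduction, program_cost):
    """Projected annual net savings from an ICU early rehabilitation program.

    Simplified sketch of the model's logic:
    savings = (ICU bed-days avoided * ICU cost/day)
            + (floor bed-days avoided * floor cost/day)
            - annual program cost
    """
    icu_days_saved = admissions * icu_los * icu_reduction
    floor_days_saved = admissions * floor_los * floor_reduction
    savings = icu_days_saved * icu_cost_day + floor_days_saved * floor_cost_day
    return savings - program_cost

# Hypothetical inputs: 900 admissions, 4 ICU / 6 floor days baseline LOS,
# the paper's observed 22% / 19% LOS reductions, made-up per-day costs
projected = net_savings(900, 4, 6, 1000, 400, 0.22, 0.19, 500_000)
```

Sensitivity analysis of the kind the paper reports amounts to re-running this function over ranges of the LOS reductions and per-day costs.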
Darvas, D; Blanco, E
Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to transform automatically PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking which is a precise, mathematical-based method to check formalized requirements automatically against the system.
Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S
Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers, as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. The paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.
Kraus, Thomas; Gube, Monika; Lang, Jessica; Esser, Andre; Sturm, Walter; Fimm, Bruno; Willmes, Klaus; Neulen, Joseph; Baron, Jens Malte; Merk, Hans; Schettgen, Thomas; Konrad, Kerstin; Deisz, Sabine; Rink, Lothar; Hagmann, Michael; Fillies, Birgit; Zschiesche, Wolfgang; Wittsiepe, Jürgen; Wilhelm, Michael
In a German company, polychlorinated biphenyl (PCB)-containing transformers and capacitors were recycled on a large scale. Human biomonitoring revealed a high PCB body burden in workers of the recycling company, at locations surrounding this plant, in companies in the neighborhood of this plant, and in family members of these employees. In order to clarify whether possible adverse health effects occurred or may occur in the future, a prospective surveillance program was initiated. After an extensive literature search, an interdisciplinary group of experts developed a surveillance program based on current knowledge with respect to possible adverse health effects that might occur in the recycling of transformers and capacitors. Exposure to various hazardous substances (PCB, polychlorinated dibenzo-p-dioxins and dibenzofurans [PCDD/F], metals, solvents) was considered. Criteria derived from human biomonitoring results for PCB were used for admission to the program. Participants in the surveillance program are first informed about the risks and aims of the program. Subsequently, physicians began detailed documentation of participants' general and occupational history, their complaints, diseases, and nutritional habits, as well as information regarding their living areas, by means of a standardized questionnaire. In addition, separate examinations were performed to detect possible neurological, immunological, (neuro)psychological, hormonal, and skin effects. Moreover, DNA damage, as assessed by the comet assay, and antioxidative status were determined. The program will be offered at yearly intervals for 3 years, and then at 5 and 10 years after program onset. Until now the program has proved to be feasible, and acceptance among workers and their families has been high. Based on the results, criteria will be developed to define adverse health effects that might be attributable to hazardous substance exposure.
The ongoing penetration of building automation by information technology is far from saturated. Today's systems must not only be reliable and fault tolerant; they also have to regard energy efficiency and flexibility in overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing for energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy-optimal systems. A recently developed model for environment recognition and decision-making processes, based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting goals.
Roques, Fabien; Perekhodtsev, Dmitri; Verhaeghe, Charles
One of the 10 key priorities of the new European Commission President Jean-Claude Juncker consists of 'reform(ing) and reorganis(ing) Europe's energy policy into a new European Energy Union'. The Energy Union work programme released on 25 February 2015 suggests that a new electricity market design is needed in order to tackle Europe's chosen policy objectives of de-carbonisation whilst maintaining security of supply. The current regulatory and market framework does not provide a sound basis for the investments needed to maintain security of supply and de-carbonise the power sector at an affordable cost. As policy priorities in favour of de-carbonisation and maintaining security of supply have taken centre stage on the policy agenda in the past decade, the design of liberalised electricity markets has failed to evolve and be reconciled with these new priorities. In addition, the issues of competitiveness and affordability of electricity in Europe remain central in the discussions about the market framework. The objective of this study is to assess the deficiencies and gaps in the current European Target Model and the wider regulatory framework for power generation and to propose a number of policy recommendations for improvement. Recognising the need for 'fresh thinking' on the issue, this study looks outside Europe to learn the lessons from experiences with a range of alternative market designs that exist around the globe. The study provides a comprehensive assessment of the issues with current European electricity markets. The study investigates the lessons from market reforms in North America and in Latin America in the past decade to identify 'out of the box' thinking to fill the gaps in the current European Target model. This report presents some of the research findings and concludes with a set of alternative potential directions for reform of European power markets models in the long term, as well as a number of
Jensen, Rasmus Lund; Sørensen, Karl Grau; Heiselberg, Per
An existing computer model for dynamic hygrothermal analysis of buildings has been extended with a multizone airflow model based on loop equations to account for the coupled thermal and airflow behaviour in naturally and hybrid ventilated buildings. In water distribution networks and related fields loop...... a methodology adopted from water distribution networks that automatically sets up the independent loops and is easy to implement into a computer program. Finally, an example of verification of the model is given which demonstrates the ability of the models to accurately predict the airflow of a simple multizone...
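For a single airflow loop, the loop-equation approach adopted from water distribution networks reduces to a Newton iteration on the loop pressure balance. A minimal sketch, with hypothetical resistances and driving pressure (not the paper's multizone formulation):

```python
def solve_loop_flow(delta_p, resistances, q0=1.0, tol=1e-10, max_iter=100):
    """Newton iteration on the single-loop pressure balance
    sum(r * Q * |Q|) = delta_p, the quadratic flow-resistance law."""
    q = q0
    for _ in range(max_iter):
        f = sum(r * q * abs(q) for r in resistances) - delta_p
        df = sum(2 * r * abs(q) for r in resistances)  # d f / d Q
        step = f / df
        q -= step
        if abs(step) < tol:
            break
    return q

# Example: an 8 Pa driving pressure across two openings in series,
# each with unit resistance: 2 * Q^2 = 8, so Q = 2.
print(solve_loop_flow(8.0, [1.0, 1.0]))  # ~2.0
```

A multizone model would set up one such balance per independent loop and solve the coupled system.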
Sudiajeng, L.; Parwita, I. G. L.; Wiraga, I. W.; Mudhina, M.
The previous research showed that there were indicators of a water crisis in the northern and eastern parts of Denpasar city, and most of the coastal area experienced seawater intrusion. The recommended water conservation programs were rainwater harvesting and educating the community to develop a water-saving and environmentally conscious culture. This research was conducted to build a community-based educational model for the water conservation program through the ergonomics SHIP approach, which places the human aspect as the first consideration, alongside the economic and technical aspects. The stakeholders were involved in the program from the problem analysis through to implementation and maintenance. The model was built through three main steps: determination of an accepted design; building the recharge wells by involving local communities; and guidance and assistance in developing a water-saving and environmentally conscious culture for early childhood, elementary and junior high school students, the community and industry. The program was implemented based on the “TRIHITA KARANA” concept, which refers to the relationships between humans and God, among humans, and between humans and the environment. Through the development of the model, it is expected that a sense of belonging and awareness will grow in the community to maintain the sustainability of the program.
The computer program FLOW finds the nonrelativistic, self-consistent set of two-dimensional ion trajectories and electric fields (including space charges from ions and electrons) for a given set of initial and boundary conditions for the particles and fields. The combination of FLOW with the optimization code PISA gives the program WOLF, which finds the shape of the emitter consistent with the plasma forming it, and in addition varies physical characteristics such as electrode position, shapes, and potentials so that selected performance characteristics are optimized. The motivation for developing these programs was the desire to design optimum ion source extractor/accelerator systems in a systematic fashion. The purpose of this report is to explain and derive the mathematical models and algorithms which approximate the real physical processes. It serves primarily to document the computer programs. 10 figures.
This paper formulates a bilevel compromise programming model for allocating resources between pavement and bridge deck maintenance. The first level of the model aims to solve the resource allocation problems for pavement management and bridge deck maintenance without considering resource sharing between them. At the second level, the model uses the results from the first step as an input and generates the final solution to the resource-sharing problem. To solve the model, the paper applies genetic algorithms to search for the optimal solution. We use a combination of two digits to represent different maintenance types. Results of numerical examples show that the conditions of both pavements and bridge decks are improved significantly by applying compromise programming rather than conventional methods. Resources are also utilized more efficiently when the proposed method is applied.
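Compromise programming chooses the allocation minimizing a weighted distance to the ideal point, where each objective's ideal is what it would achieve with the whole budget. A minimal single-level sketch (not the paper's bilevel model or its genetic algorithm; the diminishing-return gain functions are hypothetical):

```python
import math

def compromise_allocation(budget, pavement_gain, bridge_gain,
                          w=(0.5, 0.5), p=2):
    """Pick the pavement/bridge budget split minimizing the weighted
    L_p distance to the ideal point of the two condition objectives."""
    ideal = (pavement_gain(budget), bridge_gain(budget))
    best = None
    for x in range(0, budget + 1):       # spend x on pavement, rest on bridges
        f = (pavement_gain(x), bridge_gain(budget - x))
        dist = (w[0] * abs(ideal[0] - f[0]) ** p
                + w[1] * abs(ideal[1] - f[1]) ** p) ** (1 / p)
        if best is None or dist < best[0]:
            best = (dist, x)
    return best[1]

# Symmetric, diminishing-return gains: the compromise splits the budget evenly.
x = compromise_allocation(100, lambda s: math.sqrt(s), lambda s: math.sqrt(s))
print(x)  # 50
```

The paper's genetic algorithm replaces this brute-force scan when the decision space (maintenance types per asset) is too large to enumerate.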
Lorenza Antonia Reyes de Duran
This proposal of analysis, interpretation, deconstruction, self-criticism and guidance is born of work experience in planned mass sport and in social organizations, set against private business and state models of sport management (not in the conventional comparative sense). The contributions made to sports management from positions of power, whether state or business, are undeniable, and their impact is difficult to express in numbers given their humanistic value, which is incalculable. However, it is urgent to emphasize the products and results achieved by certain social organizations related to sport, such as the reference cases in Higuerón and the Independencia municipality of Yaracuy state. From a dialectical analysis of reality, we try to understand the complex system that high-level sports management and performance involve, and we introduce praxis-based notions of sporting activity and the associated approach applied to social and community work. Opening and closing action-research processes, we make a proposal to accompany the management of mass sport by and from social organizations. This proposal is constructed from an innovative, flexible, open and inclusive curriculum with social, community and organizational relevance.
Dhoruri, Atmini; Lestari, Dwi; Ratnasari, Eminugroho
Diabetes mellitus (DM) is a chronic metabolic disease characterized by a higher than normal blood glucose level (normal blood glucose level: 80–120 mg/dl). In this study, type 2 DM, which is mostly caused by unhealthy eating habits, is investigated. Related to eating habits, DM patients need dietary menu planning with extra care regarding their nutrient intake (energy, protein, fat and carbohydrate). Therefore, nutritious dietary menus for diabetes mellitus patients were organized. A dietary menu with appropriate amounts of nutrients was composed by considering the amounts of calories, proteins, fats and carbohydrates. In this study, a Goal Programming model is employed to determine optimal dietary menu variations for diabetes mellitus patients while paying attention to optimal expenses. Using data obtained from hospitals in Yogyakarta, optimal menu variations are analyzed with the Goal Programming model and solved using the LINGO computer program.
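A goal programming diet model of this kind introduces deviation variables around each nutrient goal and minimizes the total deviation. A minimal sketch with `scipy.optimize.linprog` standing in for LINGO; the two foods and their nutrient values are hypothetical, not data from the Yogyakarta hospitals:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: two foods, goals for energy (kcal) and protein (g).
nutrients = np.array([[200.0, 100.0],    # energy per serving of food 1, 2
                      [5.0,   10.0]])    # protein per serving of food 1, 2
goals = np.array([1800.0, 60.0])

n_foods, n_goals = 2, 2
# Decision vector: [x1, x2, d-_energy, d-_protein, d+_energy, d+_protein]
# Goal constraints: nutrients @ x + d_minus - d_plus = goals
A_eq = np.hstack([nutrients,
                  np.eye(n_goals),       # under-achievement d_minus
                  -np.eye(n_goals)])     # over-achievement d_plus
# Objective: minimize the total deviation from the nutrient goals.
c = np.concatenate([np.zeros(n_foods), np.ones(2 * n_goals)])
res = linprog(c, A_eq=A_eq, b_eq=goals, bounds=[(0, None)] * 6)
print(res.fun, res.x[:n_foods])  # deviation 0 when goals are exactly met
```

A cost term on the food variables, as in the paper, would be added to `c` with suitable priority weights.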
Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)
This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer-aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model underlying the instructional program, which simulates the effects of pilots' entries and the behavior of the avionics, is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.
O'Keefe, Christine M.; Head, Richard J.
It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…
Henriksen, Anders Starcke; Hvitved, Tom; Filinski, Andrzej
We present an extension of the programming-by-contract (PBC) paradigm to a concurrent and distributed environment. Classical PBC is characterized by absolute conformance of code to its specification, assigning blame in case of failures, and a hierarchical, cooperative decomposition model – none...
In the present paper, we consider the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and discuss two different models. In the first model, the reliabilities of the subsystems are considered as different objectives. In the second model, the cost and the time spent on repairing the components are considered as two different objectives. These two models are formulated as multi-objective Nonlinear Programming Problems (MONLPP), and a fuzzy goal programming method is used to work out the compromise allocation in the multi-objective selective maintenance reliability model. We define the membership functions of each objective function, transform them into equivalent linear membership functions by first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.
Until recently, teen pregnancy and birth rates had declined in the United States. Despite these declines, U.S. teen birth and sexually transmitted infection (STI) rates remain among the highest in the industrialized world. Given the need to focus limited prevention resources on effective programs, Advocates for Youth undertook exhaustive reviews…
Executive functions are thinking skills that assist with reasoning, planning, problem solving, and managing one’s life. The brain areas that underlie these skills are interconnected with and influenced by activity in many different brain areas, some of which are associated with emotion and stress. One consequence of the stress-specific connections is that executive functions, which help us to organize our thinking, tend to be disrupted when stimulation is too high and we are stressed out, or too low when we are bored and lethargic. Given their central role in reasoning and also in managing stress and emotion, scientists have conducted studies, primarily with adults, to determine whether executive functions can be improved by training. By and large, results have shown that they can be, in part through computer-based videogame-like activities. Evidence of wider, more general benefits from such computer-based training, however, is mixed. Accordingly, scientists have reasoned that training will have wider benefits if it is implemented early, with very young children as the neural circuitry of executive functions is developing, and that it will be most effective if embedded in children’s everyday activities. Evidence produced by this research, however, is also mixed. In sum, much remains to be learned about executive function training. Without question, however, continued research on this important topic will yield valuable information about cognitive development. PMID:27906522
Cui, Xiaohui; Mueller, Frank; Potok, Thomas E.; Zhang, Yongpeng
Accelerating processors can often be more cost- and energy-effective for a wide range of data-parallel computing problems than general-purpose processors. For graphics processing units (GPUs), this is particularly the case when program development is aided by environments such as NVIDIA's Compute Unified Device Architecture (CUDA), which dramatically reduces the gap between domain-specific architectures and general-purpose programming. Nonetheless, general-purpose GPU (GPGPU) programming remains subject to several restrictions. Most significantly, the separation of host (CPU) and accelerator (GPU) address spaces requires explicit management of GPU memory resources, especially for massive data parallelism that well exceeds the memory capacity of GPUs. One solution to this problem is to transfer data between the GPU and host memories frequently. In this work, we investigate another approach: we run massively data-parallel applications on GPU clusters. We further propose a programming model for massive data parallelism with data dependencies for this scenario. Experience from micro-benchmarks and real-world applications shows that our model provides not only ease of programming but also significant performance gains.
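The frequent host-device transfer approach mentioned above (streaming a dataset larger than device memory through in chunks) can be illustrated schematically. This is a plain-Python stand-in with no actual GPU; the copies merely mark where host-to-device and device-to-host transfers would occur:

```python
import numpy as np

def process_in_chunks(data, chunk_elems, kernel):
    """Stream a dataset that exceeds (simulated) device memory through an
    accelerator: copy a chunk in, run the kernel, copy the result back."""
    out = np.empty_like(data)
    for start in range(0, data.size, chunk_elems):
        chunk = data[start:start + chunk_elems].copy()   # host -> "device"
        out[start:start + chunk_elems] = kernel(chunk)   # kernel on chunk
    return out                                           # "device" -> host

x = np.arange(10, dtype=np.float64)
y = process_in_chunks(x, 4, lambda c: c * 2.0)  # doubled values, chunk by chunk
```

The per-chunk transfer cost this pattern incurs is exactly what motivates the cluster-based alternative investigated in the paper.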
The purpose of the Advanced Fuels Campaign (AFC) Execution Plan is to communicate the structure and management of research, development, and demonstration (RD&D) activities within the Fuel Cycle Research and Development (FCRD) program. Included in this document is an overview of the FCRD program, a description of the difference between revolutionary and evolutionary approaches to nuclear fuel development, the meaning of science-based development of nuclear fuels, and the “Grand Challenge” for the AFC that would, if achieved, provide a transformational technology to the nuclear industry in the form of a high performance, high reliability nuclear fuel system. The activities that will be conducted by the AFC to achieve success towards this grand challenge are described and the goals and milestones over the next 20 to 40 year period of research and development are established.
Pebesma, Edzer J.; Wesseling, Cees G.
Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ASCII and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
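The classical (Matheron) semivariance estimator underlying variogram modelling can be sketched as follows. This is an illustrative re-implementation in Python, not gstat's own code:

```python
import numpy as np

def empirical_variogram(coords, values, bins):
    """Classical semivariance estimator: for each lag bin, average
    0.5 * (z_i - z_j)^2 over all point pairs whose separation falls in it."""
    n = len(values)
    dists, semivars = [], []
    for i in range(n):
        for j in range(i + 1, n):
            dists.append(np.linalg.norm(coords[i] - coords[j]))
            semivars.append(0.5 * (values[i] - values[j]) ** 2)
    dists, semivars = np.array(dists), np.array(semivars)
    idx = np.digitize(dists, bins)   # assign each pair to a lag bin
    return np.array([semivars[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(1, len(bins))])

# Three points on a line with a linear trend in the values.
coords = np.array([[0.0], [1.0], [2.0]])
gamma = empirical_variogram(coords, [0.0, 1.0, 2.0], [0.0, 1.5, 2.5])
print(gamma)  # semivariance for the two lag bins
```

Fitting a parametric model (spherical, exponential, ...) to such estimates is the interactive step gstat supports through its gnuplot interface.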
Burton, T L; Williamson, D L
Why do the vast majority of those who suffer harm from drinking fail to obtain treatment? Based on a review of research literature and educational and treatment program materials, a model of nonparticipation in treatment is proposed whereby particular population groups are separated out according to whether or not they exhibit specified characteristics related to both harm from drinking and attitudes towards treatment. Eleven groups have been identified in the model, each of which has different reasons for failing to seek and/or obtain treatment. It is suggested that differing educational program messages should be sent to each group. While the model does not purport to be wholly inclusive of all nonparticipation, it offers a basis for addressing the variety of disparate groups that suffer harm from drinking but do not obtain treatment.
Minakova, N.; Petrov, I.
The development of biometric systems is a labor-intensive process. Therefore, the creation and analysis of supporting approaches and techniques is currently an urgent task. This article presents a technique for modeling and prototyping biometric systems based on dataflow programming. The technique includes three main stages: the development of functional blocks, the creation of a dataflow graph, and the generation of a prototype. A specially developed software modeling environment that implements this technique is described. As an example of the technique's use, the implementation of an iris localization subsystem is demonstrated. A modification of dataflow programming is suggested to solve the problem of the undefined order of block activation. The main advantages of the presented technique are the ability to visually display and design the model of the biometric system, the rapid creation of a working prototype, and the reuse of previously developed functional blocks.
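One possible way to resolve the undefined block-activation order is to evaluate the dataflow graph in topological order, so a block fires only after all of its inputs are available. A minimal dataflow interpreter sketching this idea (the iris-pipeline blocks are hypothetical stand-ins, not the article's actual subsystem):

```python
def run_dataflow(blocks, edges, inputs):
    """Evaluate a dataflow graph of functional blocks. Activation order is
    fixed by topological sort (Kahn's algorithm), so firing order is
    never ambiguous."""
    preds = {b: [] for b in blocks}
    succs = {b: [] for b in blocks}
    indeg = {b: 0 for b in blocks}
    for src, dst in edges:
        preds[dst].append(src)
        succs[src].append(dst)
        indeg[dst] += 1
    order, queue = [], [b for b in blocks if indeg[b] == 0]
    while queue:                      # Kahn's algorithm
        b = queue.pop()
        order.append(b)
        for s in succs[b]:
            indeg[s] -= 1
            if indeg[s] == 0:
                queue.append(s)
    values = dict(inputs)             # pre-supplied source-block outputs
    for b in order:
        if b not in values:
            values[b] = blocks[b](*(values[p] for p in preds[b]))
    return values

# Hypothetical pipeline: load -> threshold -> locate (e.g. iris localization).
blocks = {"load": None,
          "threshold": lambda img: [v > 2 for v in img],
          "locate": lambda mask: mask.index(True)}
out = run_dataflow(blocks,
                   [("load", "threshold"), ("threshold", "locate")],
                   {"load": [1, 2, 3, 4]})
print(out["locate"])  # 2
```

A visual modeling environment, as described in the article, would generate the `blocks`/`edges` description from the designed graph.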
Kaushik, Dinesh; Keyes, David E.; Balay, Satish; Smith, Barry F.
The complexity of programming modern multicore-processor-based clusters is rapidly rising, with GPUs adding further demand for fine-grained parallelism. This paper analyzes the performance of the hybrid (MPI+OpenMP) programming model in the context of an implicit unstructured mesh CFD code. At the implementation level, the effects of cache locality, update management, work division, and synchronization frequency are studied. The hybrid model also presents interesting algorithmic opportunities: the linear system solver converges more quickly than in the pure MPI case, since the parallel preconditioner stays stronger when the hybrid model is used. This implies significant savings in the cost of communication and synchronization (explicit and implicit). Even though OpenMP-based parallelism is easier to implement (within a subdomain assigned to one MPI process, for simplicity), getting good performance requires attention to data partitioning issues similar to those in the message-passing case. © 2011 Springer-Verlag.
Kulhánek, Petr; Břeň, David
A fully three-dimensional Particle-in-Cell model of the plasma fiber has been developed. The code is written in Fortran 95, using the CVF (Compaq Visual Fortran) implementation under the Microsoft Visual Studio user interface. Five particle solvers and two field solvers are included in the model; the solvers have relativistic and non-relativistic variants. The model can deal with both periodic and non-periodic boundary conditions. The mechanism of surface turbulence generation in the plasma fiber was successfully simulated with the PIC program package.
Kircher, E.; Dezordi, W.L.
A methodology, applied to Angra-1, is presented for implementing a program to monitor radiation exposure levels in the vicinity of the installation. The method considers two kinds of radioactive effluents in the environment: gaseous (in the atmosphere) and liquid (in the marine aquatic environment). It is based on the generation and ordering of the key relation: radiation exposure pathway/radionuclide group.
Koelle, H H; Stephenson, D G
This report is an initial review of plans for an extensive program to survey and develop the Moon and to explore the planet Mars during the 21st century. It presents current typical plans for separate, associated and fully integrated programs of lunar and Martian research, exploration and development, and concludes that detailed integrated plans must be prepared and be subject to formal criticism. Before responsible politicians approve a new thrust into space, they will demand attractive, defensible, and detailed proposals that explain the WHEN, HOW and WHY of each stage of an expanded program of 21st-century space research, development and exploration. In particular, the claims of daring, innovative, but untried systems must be compared with the known performance of existing technologies. The time has come to supersede the present haphazard approach to strategic space studies with a formal international structure to plan for future advanced space missions under the aegis of the world's national space agencies, and supported by governments and the corporate sector. © 2002 Elsevier Science Ltd. All rights reserved.
Lamoureux, Kim; Campbell, Michael; Smith, Roland
Most companies have an opportunity to improve their succession management programs. The number one challenge for succession management (as identified by both HR leaders and executives) is developing a succession planning strategy. This comprehensive industry study sets out to determine how succession management (when done well) helps improve…
Ambrose, David M.; Pol, Louis G.
The University of Nebraska's Executive Master's in Business Administration (MBA) program has integrated international research activities into the curriculum. The university contracted with domestic corporations to conduct studies on prospects for international business. Research assignments include assessment of competitors, economic evaluations,…
Ravazzotti, Mariolina T.; Jørgensen, John Leif; Neefs, Marc
Under the ESA contract #11453/95/NL/JG(SC), aiming at assessing the feasibility of rendez-vous and docking of unmanned spacecraft, a reference mission scenario was defined. This report gives an executive summary of the achievements and results from the project.
Conclusion: This study revealed that although executive functions may be improved by protracted abstinence, executive dysfunctions are not completely relieved, and specific attention to the planning and implementation of intervention programs is necessary.
Xu, Zeshui S; Chen, Jian
Group decision making with preference information on alternatives is an interesting and important research topic which has been receiving more and more attention in recent years. The purpose of this paper is to investigate multiple-attribute group decision-making (MAGDM) problems with distinct uncertain preference structures. We develop some linear-programming models for dealing with the MAGDM problems, where the information about attribute weights is incomplete, and the decision makers have their preferences on alternatives. The provided preference information can be represented in the following three distinct uncertain preference structures: 1) interval utility values; 2) interval fuzzy preference relations; and 3) interval multiplicative preference relations. We first establish some linear-programming models based on the decision matrix and each of the distinct uncertain preference structures, and then develop some linear-programming models to integrate all three structures of subjective uncertain preference information provided by the decision makers and the objective information depicted in the decision matrix. Furthermore, we propose a simple and straightforward approach to ranking and selecting the given alternatives. It is worth pointing out that the developed models can also be used to deal with situations where the three distinct uncertain preference structures are reduced to the traditional ones, i.e., utility values, fuzzy preference relations, and multiplicative preference relations. Finally, we use a practical example to illustrate in detail the calculation process of the developed approach.
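Under incomplete weight information, a single alternative's attainable score range can be computed with two small linear programs over the feasible weight set. A hedged sketch using `scipy.optimize.linprog` (not the paper's exact models; the weight intervals and ratings are hypothetical):

```python
import numpy as np
from scipy.optimize import linprog

def score_range(ratings, w_bounds):
    """Min and max weighted score of one alternative over all weight
    vectors w with sum(w) = 1 and w_i in the given intervals."""
    n = len(ratings)
    A_eq, b_eq = [np.ones(n)], [1.0]          # weights sum to 1
    lo = linprog(np.array(ratings), A_eq=A_eq, b_eq=b_eq, bounds=w_bounds)
    hi = linprog(-np.array(ratings), A_eq=A_eq, b_eq=b_eq, bounds=w_bounds)
    return lo.fun, -hi.fun

# Two attributes, each weight constrained to the interval [0.3, 0.7].
lo_s, hi_s = score_range([0.8, 0.4], [(0.3, 0.7), (0.3, 0.7)])
print(lo_s, hi_s)
```

Comparing such score intervals across alternatives is one simple way to rank them; the paper's integrated models additionally fold in the decision makers' interval preference relations.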
Zhang, Bo; Peng, Jin; Li, Shengguo
In an indeterminate economic environment, experts' knowledge about the returns of securities involves much uncertainty rather than randomness. This paper discusses the portfolio selection problem in an uncertain environment in which security returns cannot be well reflected by historical data, but can be evaluated by experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in an uncertain environment is formulated as an expected-variance-chance model and a chance-expected-variance model using uncertain programming. Within the framework of uncertainty theory, for the convenience of solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed in the paper to provide a general method for solving the new models in general cases. Finally, two numerical examples are provided to show the performance and applications of the models and algorithm.
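As a crisp, illustrative stand-in for the expected-variance criterion (replacing the paper's uncertain variables with ordinary expectations and covariances, which is not its uncertainty-theoretic treatment), a two-security trade-off can be solved by a simple grid search:

```python
import numpy as np

def expected_variance_portfolio(mu, cov, risk_aversion, step=0.001):
    """Maximize E[return] - lambda * Var over long-only weights on two
    securities; a crisp analogue of the expected-variance criterion."""
    best_w, best_obj = None, -np.inf
    for w1 in np.arange(0.0, 1.0 + step, step):
        w = np.array([w1, 1.0 - w1])
        obj = w @ mu - risk_aversion * (w @ cov @ w)
        if obj > best_obj:
            best_w, best_obj = w, obj
    return best_w

# Hypothetical expected returns and (uncorrelated) variances.
mu = np.array([0.10, 0.06])
cov = np.array([[0.04, 0.00], [0.00, 0.01]])
w = expected_variance_portfolio(mu, cov, risk_aversion=1.0)
print(w)  # optimal weights balance extra return against extra variance
```

The paper's hybrid intelligent algorithm plays the role of this search when the crisp equivalents are unavailable and the uncertain objectives must be evaluated numerically.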
Baskaran, L.; Cook, R. B.; Thornton, P. E.; Post, W. M.; Wilson, B. E.; Dadi, U.
The Modeling and Synthesis Thematic Data Center (MAST-DC) supports the North American Carbon Program by providing data products and data management services needed for modeling and synthesis activities. The overall objective of MAST-DC is to provide advanced data management support to NACP investigators doing modeling and synthesis, thereby freeing those investigators from having to perform data management functions. MAST-DC has compiled a number of data products for North America, including sub-pixel land-water content, daily meteorological data, and soil, land cover, and elevation data. In addition, we have developed an internet-based WebGIS system that enables users to browse, query, display, subset, and download spatial data using a standard web browser. For the mid-continent intensive, MAST-DC is working with a group of data assimilation modelers to generate a consistent set of meteorological data to drive bottom-up models.
Seo, Hyung-beom; Kim, Sung-min; Park, Joong-woo; Kim, Kwang-su; Ko, Dae-hack; Han, Bong-seob
NUCIRC is a steady-state thermal-hydraulic code used for design and performance analyses of the CANDU Heat Transport System. The code is used at Wolsong NPP to build the primary heat transport (PHT) model and to calculate the channel flow distribution. After every outage, Wolsong NPP must calculate the channel flow distribution and the coolant quality at the ROH header, as required by the Operating Policy and Principles (OPP). PHT modeling is time-consuming work that requires extensive operating experience and expertise, and it is very difficult for plant operators to build a PHT model within the two weeks mandated before returning to operation after an outage. For this reason, Wolsong NPP developed NUMODEL (NUcirc MODELing), drawing on many years of experience and know-how in using the NUCIRC code. NUMODEL is a computer program used to create PHT models based on the NUCIRC code
The Integrated Data Base (IDB) is the official US Department of Energy (DOE) data base for spent fuel and radioactive waste inventories and projections. As such, it should be as convenient to utilize as is practical. Examples of summary-level tables and figures are presented, as well as more-detailed graphics describing waste-form distribution by site and line charts illustrating historical and projected volume (or mass) changes. This information is readily accessible through the annual IDB publication. Other presentation formats are also available to the DOE community through a simple request to the IDB Program
O. V. VASYLENKO
Purpose. To improve the simulation and design of automatic control systems (ACS) in SPICE-compatible programs and to obtain separate, economical and universal macromodels of PWM controllers: specifically, to develop an economical PWM-controller macromodel for the study of ACS in electronic computer-aided design (ECAD) programs that, unlike existing PWM models, does not generate algorithmic failures. Findings. Analysis of the mathematical basis of the SPICE family of applications allowed the existing PWM-controller models to be classified and their suitability for ACS simulation to be determined. Criteria for the synthesis of new models were defined. For the SPICE 3G algorithms, switch and averaged models based on behavioral elements were developed, together with a universal and economical PWM-controller macromodel based on a simple algorithm for determining the output signal from a minimum number of input parameters. For an automated magnetic-susceptibility measuring system, a macromodel of a quasi-PWM signal generator used in the compensation subsystem was designed. This model differs from existing ones in that it synthesizes a staircase output signal instead of a pulsed one, giving direct control of the amplitude of the output signal, which is taken as averaged. The adequacy of the models is confirmed by comparing simulation results with those of the model already available in the SPICE program, as well as with experiments on a real ACS. The PWM controller was modeled using behavioral elements from the ECAD library; simulation (the solution of systems of algebraic-differential equations with programming elements) is based on SPICE algorithms. The object of the study was the simulation of ACS with pulse-width regulation of the output value; the subject of the research is models of PWM controllers. Originality. The new macromodel of PWM
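The switch-level versus averaged distinction can be shown in a minimal numerical sketch (hypothetical parameters, plain Python rather than the paper's SPICE macromodel): the averaged "economical" model replaces the pulse train with its duty-cycle-weighted mean, which matches the pulse train's average over whole switching periods.

```python
import numpy as np

# Behavioral sketch (invented parameters) contrasting a switch-level PWM model,
# which produces the actual pulse train, with an averaged "economical" model
# that replaces it by duty_cycle * v_in.
v_in = 12.0     # supply voltage, V
duty = 0.25     # commanded duty cycle
f_sw = 10e3     # switching frequency, Hz

t = np.linspace(0.0, 1e-3, 100_000, endpoint=False)  # 1 ms = 10 periods
phase = (t * f_sw) % 1.0
switch_output = np.where(phase < duty, v_in, 0.0)     # pulse train
averaged_output = duty * v_in                         # averaged model

# Over whole switching periods the two agree:
# mean(switch_output) ~= averaged_output
```

This is why an averaged macromodel avoids the tiny time steps (and associated convergence failures) that resolving every switching edge forces on a transient simulator.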
Jackson, M A
The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problems, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, thi...
Background: Patients receiving cancer treatment mostly begin lifestyle changes at the end of treatment, during the rehabilitation period. Most often the first steps are a dietary change and physical exercise built into the daily routine. Patients who do this in groups led by qualified therapists, on the basis of professional counseling, can make these changes more effective and more permanent. The aim was to develop a complex rehabilitation program which, in the short term, familiarizes patients with a lifestyle that harmonizes the physical, mental, spiritual and social spheres of life and, in the long term, builds it into their everyday life in order to improve their physical and mental state and reduce psychological symptoms and isolation. The physical component focuses on diet and exercise; the psycho-social-spiritual support focuses on discovering inner sources of strength, developing active coping mechanisms and helping to achieve more open communication. Participants and procedure: In February and March 2011, 8 patients treated for malignant tumors participated in the model program. Its components were psychotherapy, physiotherapy, cancer consultation, nutrition counseling, creative activities and walking. Results: During the program the patients' isolation decreased, and their social support and ability to cope with the illness improved. They reported eased anxiety and depression in their everyday activities. According to feedback, their communication with each other, with the staff and with their relatives became more open. Altogether this had beneficial effects on the functioning of the ward and the mood of the staff. Conclusions: The rehabilitation program confirmed that, beside individual psycho-social support, beneficial and economical psycho-social support can be provided for patients in group form, along with the most effective assignment of the
Chim, Justine M Y
An effective office ergonomics program can produce positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution for managing the potential risk of musculoskeletal disorders among computer users in an office setting. The FITS Model Office Ergonomics Program has been developed, drawing on the legislative requirements for promoting the health and safety of workers who use computers for extended periods as well as on previous research findings. The Model is built on practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, whose elements are (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; and (4) Stretching Exercises and Rest Breaks. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health and safety and employees' wellness enhanced.
Lofquist, William Steele
The stories of those who have been executed in the Bahamas are heretofore untold. In telling these stories and linking them to the changing course of Bahamian history, the present research adds an important dimension to our understanding of Bahamian history and politics. The major theme of this effort is that the changing practice of the death penalty is much more than a consequence of changes in crime. The use of the death penalty parallels the changing interests of colonial rulers, the changing practice of slavery, and the changing role of the Bahamas in colonial and regional affairs. Four distinctive eras of death penalty practice can be identified: (1) the slave era, when executions and commutations were used liberally and with a clear racial patterning; (2) a long era of stable colonialism, a period of marginalization and few executions; (3) an era of unstable colonialism characterized by intensive and efficient use of the death penalty; and (4) the current independence era of high murder rates and equally high impediments to the use of executions.
Rouw, Romke; van Driel, Joram; Knip, Koen; Richard Ridderinkhof, K.
In grapheme-color synesthesia, a number or letter can evoke two different and possibly conflicting (real and synesthetic) color sensations at the same time. In this study, we investigate the relationship between synesthesia and executive control functions. First, no general skill differences were
The School Executive Website will be a one-stop, online site for officials who are looking for educational data, best practices, product reviews, school documents, professional opinions, and/or job-related networking. The format of the website is designed in certain sections similar to other current and popular websites, such as Angie's List.com,…
Carnevale, Anthony P.; Smith, Nicole; Gulish, Artem; Beach, Bennett H.
This executive summary highlights several findings about healthcare. These are: (1) Healthcare is 18 percent of the U.S. economy, twice as high as in other countries; (2) There are two labor markets in healthcare: high-skill, high-wage professional and technical jobs and low-skill, low-wage support jobs; (3) Demand for postsecondary education in…
McAuley, Edward; Mullen, Sean P; Szabo, Amanda N; White, Siobhan M; Wójcicki, Thomas R; Mailey, Emily L; Gothe, Neha P; Olson, Erin A; Voss, Michelle; Erickson, Kirk; Prakash, Ruchika; Kramer, Arthur F
Self-efficacy and the use of self-regulatory strategies are consistently associated with physical activity behavior. Similarly, behavioral inhibition and cognitive resource allocation-indices of executive control function-have also been associated with this health behavior. The purpose of this study was to examine the hypothesis that self-efficacy mediates the relationship between self-regulatory processes, such as executive function, and sustained exercise behavior. Older adults (N=177, mean age=66.44 years) completed measures of executive function, self-reported use of self-regulatory strategies, and self-efficacy prior to and during the first month of a 12-month exercise intervention. Percentage of exercise classes attended over the following 11 months was used to represent adherence. Data were collected from 2007 to 2010 and analyzed in 2010-2011. Structural equation models were tested examining the effect of executive function and strategy use on adherence via efficacy. As hypothesized, results showed significant direct effects of two elements of executive function and of strategy use on self-efficacy and of efficacy on adherence. In addition, there were significant indirect effects of strategy use and executive function on adherence via self-efficacy. Higher levels of executive function and use of self-regulatory strategies at the start of an exercise program enhance beliefs in exercise capabilities, which in turn leads to greater adherence. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Mellor-Crummey, John [William Marsh Rice University]
As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.
Kelly, Suzanne Marie; Pedretti, Kevin Thomas Tauke; Levenhagen, Michael J.
This report summarizes our investigations into multi-core processors and programming models for parallel scientific applications. The motivation for this study was to better understand the landscape of multi-core hardware, future trends, and the implications on system software for capability supercomputers. The results of this study are being used as input into the design of a new open-source light-weight kernel operating system being targeted at future capability supercomputers made up of multi-core processors. A goal of this effort is to create an agile system that is able to adapt to and efficiently support whatever multi-core hardware and programming models gain acceptance by the community.
THE EVALUATION OF SCIENCE TEACHING IN JUNIOR HIGH SCHOOL USING STAKE'S COUNTENANCE MODEL. Abstract: The purpose of the study was to describe the science learning program in junior high schools in Bone Bolango district based on the Regulation of the Minister of Education and Culture of the Republic of Indonesia, Number 65 of 2013, on Process Standards for Primary and Secondary Education. This study used Stake's Countenance evaluation model. The data were collected using observation, interview and documentation techniques. The conclusions were: (1) the planning of science learning was categorized as fair (68%); lesson plans were found not to be in accordance with the learning process standard; (2) the implementation of science learning was categorized as fair (57%), not conforming to the standard for implementing the learning process; (3) student learning outcomes had not met the minimum completeness criteria (KKM) and were categorized as sufficient (65%); and (4) there was contingency between the planning of the learning process and its outcomes. Keywords: Program Evaluation, Stake's Countenance, Science Learning
Zhu, Dandan [Tsinghua Univ., Beijing (China)]; Hong, Tianzhen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Yan, Da [Tsinghua Univ., Beijing (China)]; Wang, Chuang [Tsinghua Univ., Beijing (China)]
This technical report presented the methodologies, processes, and results of comparing three Building Energy Modeling Programs (BEMPs) for load calculations: EnergyPlus, DeST and DOE-2.1E. This joint effort, between Lawrence Berkeley National Laboratory, USA and Tsinghua University, China, was part of research projects under the US-China Clean Energy Research Center on Building Energy Efficiency (CERC-BEE). Energy Foundation, an industrial partner of CERC-BEE, was the co-sponsor of this study work. It is widely known that large discrepancies in simulation results can exist between different BEMPs. The result is a lack of confidence in building simulation amongst many users and stakeholders. In the fields of building energy code development and energy labeling programs where building simulation plays a key role, there are also confusing and misleading claims that some BEMPs are better than others. In order to address these problems, it is essential to identify and understand differences between widely-used BEMPs, and the impact of these differences on load simulation results, by detailed comparisons of these BEMPs from source code to results. The primary goal of this work was to research methods and processes that would allow a thorough scientific comparison of the BEMPs. The secondary goal was to provide a list of strengths and weaknesses for each BEMP, based on in-depth understandings of their modeling capabilities, mathematical algorithms, advantages and limitations. This is to guide the use of BEMPs in the design and retrofit of buildings, especially to support China’s building energy standard development and energy labeling program. The research findings could also serve as a good reference to improve the modeling capabilities and applications of the three BEMPs. The methodologies, processes, and analyses employed in the comparison work could also be used to compare other programs. The load calculation method of each program was analyzed and compared to
Borrás, Susana; Peters, B. Guy
This paper studies the effects of the Lisbon Strategy on the way in which national executives co-ordinate EU policy at the domestic level. Comparing seven countries (Denmark, the United Kingdom [UK], Austria, Slovenia, Spain, France and Poland), it finds evidence that the Lisbon Strategy has been advancing (further) centralization and politicization in national patterns of EU policy co-ordination, empowering core executives. The Lisbon Strategy's ideational elements ('grand' goals and politically visible targets) as well as organizational requirements (Spring Council, national programming and annual...
The executive function (EF) is a set of abilities that allows us to exert voluntary control over our behavioral responses. These functions enable human beings to develop and carry out plans, make analogies, obey social rules, solve problems, adapt to unexpected circumstances, perform many tasks simultaneously, and locate episodes in time and place. EF includes divided and sustained attention, working memory, set-shifting, flexibility, planning and the regulation of goal-directed behavior, and can be defined as a brain function underlying the human faculty to act or think not only in reaction to external events but also in relation to internal goals and states. EF is mostly associated with the dorsolateral prefrontal cortex (PFC). Besides EF, the PFC is involved in self-regulation of behavior, i.e. the ability to regulate behavior according to internal goals and constraints, particularly in less structured situations; self-regulation of behavior is subtended by the ventromedial/orbital PFC. Impairment of EF is one of the most commonly observed deficits in schizophrenia across the various disease stages. Impairments occur in tasks measuring conceptualization, planning, cognitive flexibility, verbal fluency, the ability to solve complex problems, and working memory. Disorders detected by executive tests are consistent with evidence from functional neuroimaging, which has shown PFC dysfunction in patients while performing these kinds of tasks. Patients with schizophrenia also exhibit deficits in odor identification, decision making and self-regulation of behavior, suggesting dysfunction of the orbital PFC. However, impairment on executive tests is mainly explained by dysfunction of prefronto-striato-thalamic, prefronto-parietal and prefronto-temporal neural networks. Disorders of executive functions may be considered central to schizophrenia, and it has been suggested that negative symptoms may be explained by this executive dysfunction.
Executive functions are thinking skills that assist with reasoning, planning, problem solving, and managing one's life. The brain areas that underlie these skills are interconnected with and influenced by activity in many different brain areas, some of which are associated with emotion and stress. One consequence of the stress-specific connections is that executive functions, which help us to organize our thinking, tend to be disrupted when stimulation is too high and we are stressed out, or too low when we are bored and lethargic. Given their central role in reasoning and also in managing stress and emotion, scientists have conducted studies, primarily with adults, to determine whether executive functions can be improved by training. By and large, results have shown that they can be, in part through computer-based videogame-like activities. Evidence of wider, more general benefits from such computer-based training, however, is mixed. Accordingly, scientists have reasoned that training will have wider benefits if it is implemented early, with very young children as the neural circuitry of executive functions is developing, and that it will be most effective if embedded in children's everyday activities. Evidence produced by this research, however, is also mixed. In sum, much remains to be learned about executive function training. Without question, however, continued research on this important topic will yield valuable information about cognitive development. WIREs Cogn Sci 2017, 8:e1403. doi: 10.1002/wcs.1403 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
This book systematically introduces the development of simulation models as well as the implementation and evaluation of simulation experiments with Tecnomatix Plant Simulation. It addresses all users of Plant Simulation who have more complex tasks to handle, while also offering an easy entry into the program. Particular attention has been paid to introducing the simulation flow language SimTalk and its use in various areas of simulation. The author demonstrates with over 200 examples how to combine the blocks for simulation models and how to use SimTalk for complex control and analysis
Gao Rui; Yang Yanhua; Lin Meng
Based on the power conversion system of the nuclear and conventional islands of the Daya Bay Power Station, this paper models the thermal-hydraulic systems of the primary and secondary loops of the PWR using the best-estimate program RELAP5. To simulate the full-scope power conversion system, not only the traditional basic system models of the nuclear island but also the major system models of the conventional island are considered and modeled. A comparison between the calculated results and actual plant data shows a good match for the Daya Bay Nuclear Power Station and demonstrates the feasibility of simulating the full-scope power conversion system of a PWR with RELAP5. (authors)
Blankenhorn, James A.
A national program for the management of low-level waste is essential to the success of environmental clean-up, decontamination and decommissioning, current operations and future missions. The value of a national program is realized through procedural consistency and a shared set of resources. A national program requires a clear waste definition and an understanding of waste characteristics matched against available and proposed disposal options. It requires the development and implementation of standards and procedures for implementing the waste hierarchy, with a specific emphasis on waste avoidance, minimization and recycling. It requires a common set of objectives for waste characterization based on the disposal facility's waste acceptance criteria, regulatory and license requirements, and performance assessments. Finally, a national waste certification program is required to ensure compliance. To facilitate and enhance the national program, a centralized generator-services organization, tasked with providing technical services to the generators on behalf of the national program, is necessary. These subject matter experts are the interface between the generating sites and the disposal facility(s). They provide an invaluable service to the generating organizations through their involvement in waste planning prior to waste generation and through championing implementation of the waste hierarchy. Through their interface, national treatment and transportation services are optimized and new business opportunities are identified. This national model is based on extensive experience in the development and ongoing management of a national transuranic waste program and management of the national repository, the Waste Isolation Pilot Plant. The Low Level Program at the Savannah River Site also successfully developed and implemented the waste hierarchy, waste certification and waste generator services concepts presented below. The Savannah River Site
Baker, Matthew B [ORNL]; Gorentla Venkata, Manjunath [ORNL]; Aderholdt, William Ferrol [ORNL]; Shamis, Pavel [ARM Research]
The OpenSHMEM reference implementation was developed toward the goal of an open-source, high-performing OpenSHMEM implementation. To achieve portability and performance across various networks, the OpenSHMEM reference implementation uses GASNet and UCCS for network operations. Recently, new network layers have emerged with the promise of providing high performance, scalability, and portability for HPC applications. In this paper, we implement the OpenSHMEM reference implementation to use the UCX framework for network operations. Then, we evaluate its performance and scalability on Cray XK systems to understand UCX's suitability for developing the OpenSHMEM programming model. Further, we develop a benchmark called SHOMS for evaluating the OpenSHMEM implementation. Our experimental results show that OpenSHMEM-UCX outperforms the vendor-supplied OpenSHMEM implementation in most cases on the Cray XK system, by up to 40% with respect to message rate and up to 70% for the execution of application kernels.
Syeh Hawib Hamzah
A model of learning is a vital element in education: a good, appropriate learning model can achieve the goals of learning efficiently and effectively. The lecturers of the education and teacher training program of STAIN Samarinda implement various teaching and learning models in their teaching, such as contextual teaching, social interaction, information processing, personal-based learning, behaviorism, cooperative learning, and problem-based learning.
Vatsavai, Ranga Raju; Graesser, Jordan B.; Bhaduri, Budhendra L.
A programmable medium includes a graphical processing unit in communication with a memory element. The graphical processing unit is configured to detect one or more settlement regions in a high-resolution remote-sensed image based on the execution of programming code. Through that code, the graphical processing unit identifies one or more settlements by executing a multi-instance learning algorithm that models portions of the high-resolution remote-sensed image. The identification is based on spectral bands transmitted by a satellite and on selected designations of the image patches.
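The multi-instance learning (MIL) formulation mentioned above can be sketched in a toy form. Everything here is hypothetical: synthetic "patch" features and a fixed linear scorer stand in for real spectral features and a trained model; the sketch only illustrates the MIL convention that a bag of patches is labeled positive when its best-scoring instance is.

```python
import numpy as np

# Toy multi-instance learning (MIL) setup with synthetic data: each "bag" is a
# set of image-patch feature vectors, and a bag is positive (contains a
# settlement) if at least one of its patches does.
rng = np.random.default_rng(0)

def make_bag(has_settlement):
    # background patches cluster near 0; a settlement patch is shifted by +5
    patches = rng.normal(0.0, 1.0, size=(8, 4))
    if has_settlement:
        patches[0] += 5.0
    return patches

bags = [make_bag(flag) for flag in (True, False, True, False)]
labels = [1, 0, 1, 0]

# Fixed linear instance scorer (a real system would learn this from bag labels)
w = np.ones(4)

def predict_bag(bag, threshold=10.0):
    # MIL decision rule: the bag score is the maximum over its instance scores
    return int((bag @ w).max() > threshold)

preds = [predict_bag(bag) for bag in bags]
```

The max-over-instances rule is what lets the method train on image-level designations without patch-level labels, which is the practical appeal of MIL for settlement mapping.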
The study examined a national sample of 396 randomly selected hospital nurse executives to explore transformational leadership, stage of power, and organizational climate. Results from a few nurse executive studies have found nurse executives were transformational leaders. As executives were more transformational, they achieved better staff satisfaction and higher work group effectiveness. This study integrates Bass' transformational leadership model with Hagberg's power stage theory and Likert's organizational climate theory. Nurse executives (396) and staff reporting to them (1,115) rated the nurse executives' leadership style, staff extra effort, staff satisfaction, and work group effectiveness using Bass and Avolio's Multifactor Leadership Questionnaire. Executives' bosses (360) rated executive work group effectiveness. Executives completed Hagberg's Personal Power Profile and ranked their organizational climate using Likert's Profile of Organizational Characteristics. Nurse executives used transformational leadership fairly often; achieved fairly satisfied staff levels; were very effective according to bosses; were most likely at stage 3 (power by achievement) or stage 4 (power by reflection); and rated their hospital as a Likert System 3 Consultative Organization. Staff satisfaction and work group effectiveness decreased as nurse executives were more transactional. Higher transformational scores tended to occur with higher educational degrees and within more participative organizations. Transformational qualities can be enhanced by further education, by achieving higher power stages, and by being within more participative organizations.
Existing health information systems largely only support the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other results include the significant impact of perceived environmental uncertainty on scan searches; information retrieval behaviour and focused searches on mental models and perceived efficiency; scan searches on mental model building; learning styles and model building on perceived efficiency; and finally the impact of mental model maintenance on perceived efficiency and effectiveness.
Rezaee, Rita; Shokrpour, Nasrin; Boroumand, Maryam
In e-learning, people get involved in a process, create the content (product), and make it available for virtual learners. The present study was carried out to evaluate the first virtual master's program in medical education at Shiraz University of Medical Sciences according to the P3 Model. This is an evaluation research study with a post-test single-group design used to determine how effective the program was. All 60 students who had participated in this virtual program for more than one year, and 21 experts including teachers and directors, took part in the evaluation project. Based on the P3 e-learning model, an evaluation tool with a 5-point Likert rating scale was designed and applied to collect the descriptive data. Students reported storyboard and course design as the most desirable element of the learning environment (2.30±0.76), but they rated technical support as the least desirable part (1.17±1.23). The presence of such a framework, and its use in appropriate evaluation tools for e-learning in the country's universities and higher education institutes that offer e-learning curricula, may contribute to the efficient implementation of present and future e-learning curricula and help guarantee their appropriate implementation.
Heroux, Michael Allen; Teranishi, Keita
Recovery from process loss during the execution of a distributed-memory parallel application is presently achieved by restarting the program, typically from a checkpoint file. Future computer system trends indicate that the size of data to checkpoint, the lack of improvement in parallel file system performance, and the increase in process failure rates will lead to situations where checkpoint restart becomes infeasible. In this report we describe and prototype the use of a new application-level resilient computing model, Local Failure Local Recovery (LFLR), that manages persistent storage of local state for each process such that, if a process fails, recovery can be performed locally without requiring access to a global checkpoint file. LFLR provides application developers with the ability to recover locally and continue application execution when a process is lost. This report discusses what features are required from the hardware, OS and runtime layers, and what approaches application developers might use in the design of future codes, including a demonstration of LFLR-enabled MiniFE code from the Mantevo mini-application suite.
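The local-recovery idea can be sketched in a few lines. The toy below is purely illustrative and is not the LFLR interface: the `store`, `step`, and `recover` names are invented, and a dictionary stands in for persistent process-local storage. Each "rank" checkpoints only its own state after every step, so losing one rank never forces a global restart.

```python
# Toy sketch of LFLR-style local recovery (illustrative names, not the real API).
store = {}  # stands in for persistent, process-local storage

def step(rank, state):
    """Advance one iteration and persist this rank's state locally."""
    state["iter"] += 1
    state["partial_sum"] += rank
    store[rank] = dict(state)  # local checkpoint: copy, don't alias
    return state

def recover(rank):
    """Rebuild a lost rank's in-memory state from its local copy alone."""
    return dict(store[rank])

# run three ranks for two steps
states = {r: {"iter": 0, "partial_sum": 0} for r in range(3)}
for _ in range(2):
    for r in states:
        states[r] = step(r, states[r])

# simulate losing rank 2 and recovering it without touching ranks 0 and 1
del states[2]
states[2] = recover(2)
```

The key property mirrored here is that recovery reads only rank 2's own saved state; the other ranks' data is never consulted, which is what removes the global checkpoint file from the critical path.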
Nielsen, Mogens; Valencia Posso, Frank Dan
The ntcc calculus is a model of non-deterministic temporal concurrent constraint programming. In this paper we study behavioral notions for this calculus. In the underlying computational model, concurrent constraint processes are executed in discrete time intervals. The behavioral notions studied reflect the reactive interactions between concurrent constraint processes and their environment, as well as internal interactions between individual processes. Relationships between the suggested notions are studied, and they are all proved to be decidable for a substantial fragment of the calculus.
Maimunah; Aldila, Dipo
In this article, using a deterministic approach in a seven-dimensional nonlinear ordinary differential equation model, we establish a mathematical model for the spread of HIV with an ART treatment intervention. In a simplified model, when no ART treatment is implemented, the disease-free and endemic equilibrium points were established analytically along with the basic reproduction number. The local stability criteria of the disease-free equilibrium and the existence criteria of the endemic equilibrium were analyzed. We find that the endemic equilibrium exists when the basic reproduction number is larger than one. From the sensitivity analysis of the basic reproduction number of the complete model (with ART treatment), we find that an increased number of infected humans following the ART treatment program will reduce the basic reproduction number. We also verify this result in numerical experiments on the autonomous system, showing how the treatment intervention reduces the infected population during the intervention period.
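The qualitative claim — higher ART uptake lowers the basic reproduction number and the infected population — can be reproduced with a much smaller stand-in model. The three-compartment system and every parameter value below are assumptions for illustration, not the paper's seven-dimensional model.

```python
# Reduced S-I-T (susceptible / infected / on-ART) sketch, Euler-integrated.
# beta: transmission rate, alpha: ART uptake rate, mu: demographic turnover.
# All parameter values are illustrative, not taken from the paper.

def r0(beta, alpha, mu):
    # infected individuals leave I via treatment (alpha) or turnover (mu)
    return beta / (alpha + mu)

def simulate(beta=0.3, alpha=0.1, mu=0.02, dt=0.1, steps=20000):
    S, I, T = 0.99, 0.01, 0.0
    for _ in range(steps):
        new_inf = beta * S * I
        dS = mu - new_inf - mu * S          # births replace deaths
        dI = new_inf - (alpha + mu) * I     # infection in, treatment/turnover out
        dT = alpha * I - mu * T             # treated compartment
        S, I, T = S + dt * dS, I + dt * dI, T + dt * dT
    return I  # infected fraction at the end of the horizon
```

With these placeholder numbers, alpha = 0.1 gives R0 > 1 and the infection persists, while alpha = 0.5 pushes R0 below 1 and the infected fraction decays toward zero, matching the sensitivity-analysis conclusion in spirit.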
Full Text Available The paper provides a scientific approach to the problem of selecting a pension fund, taking into account some specific characteristics of the Lithuanian Republic (LR) pension accumulation system. A decision-making model is presented that can be used to plan the long-term pension accrual of LR citizens in an optimal way. This model focuses on factors that influence the sustainability of the pension system selection under macroeconomic, social and demographic uncertainty. The model is formalized as a single-stage stochastic optimization problem where the long-term optimal strategy can be obtained based on the possible scenarios generated for a particular participant. Stochastic programming methods allow including the pension fund rebalancing moment and direction of investment, and taking into account possible changes of personal income, changes of society and the global financial market. The collection of methods used to generate scenario trees was found useful for solving strategic planning problems.
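The single-stage, scenario-based structure described can be sketched compactly: generate return scenarios, then pick the decision that maximizes expected utility over them. The two-asset setup, return distributions, and CARA utility below are placeholders, not the paper's model.

```python
import math
import random

def best_allocation(n_scenarios=5000, grid=21, risk_aversion=5.0, seed=1):
    """Pick the stock/bond split maximizing expected CARA utility over
    generated return scenarios (all distributions/parameters illustrative)."""
    rng = random.Random(seed)
    # scenario generation: (stock return, bond return) pairs
    scenarios = [(rng.gauss(0.07, 0.20), rng.gauss(0.03, 0.03))
                 for _ in range(n_scenarios)]
    best_w, best_u = 0.0, float("-inf")
    for i in range(grid):
        w = i / (grid - 1)  # fraction allocated to stocks
        u = sum(-math.exp(-risk_aversion * (1 + w * rs + (1 - w) * rb))
                for rs, rb in scenarios) / n_scenarios
        if u > best_u:
            best_w, best_u = w, u
    return best_w
```

The grid search over a single decision variable stands in for the stochastic program's optimization step; as expected, raising the risk-aversion parameter shifts the optimum toward the low-variance asset.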
Vipavetz, Kevin G.; Murphy, Douglas G.; Infeld, Samatha I.
NASA Langley Research Center conducted a pilot program to evaluate the benefits of using a Model-Based Systems Engineering (MBSE) approach during the early phase of the Materials International Space Station Experiment-X (MISSE-X) project. The goal of the pilot was to leverage MBSE tools and methods, including the Systems Modeling Language (SysML), to understand the net gain of utilizing this approach on a moderate-size flight project. The System Requirements Review (SRR) success criteria were used to guide the work products desired from the pilot. This paper discusses the pilot project implementation, provides SysML model examples, identifies lessons learned, and describes plans for further use of MBSE on MISSE-X.
Full Text Available An application of Genetic Programming (an evolutionary computational tool), with and without data standardization, is presented with the aim of modeling the behavior of the water temperature in a river in terms of meteorological variables that are easily measured, to explore their explanatory power and to emphasize the utility of standardizing variables in order to reduce the effect of those with large variance. Recorded data on the water temperature of the Ebro River, Spain, are used as the analysis case, showing a performance improvement in the developed model when data are standardized. This improvement is reflected in a reduction of the mean square error. Finally, the models obtained in this paper were applied to estimate the water temperature in 2004, in order to provide evidence of their applicability for forecasting purposes.
Drucker, Peter F
In more than 30 essays for Harvard Business Review, Peter Drucker (1909-2005) urged readers to take on the hard work of thinking--always combined, he insisted, with decisive action. He closely analyzed the phenomenon of knowledge work--the growing call for employees who use their minds rather than their hands--and explained how it challenged the conventional wisdom about the way organizations should be run. He was intrigued by employees who knew more about certain subjects than their bosses or colleagues but who still had to cooperate with others in a large organization. As the business world matured in the second half of the twentieth century, executives came to think that they knew how to run companies--and Drucker took it upon himself to poke holes in their assumptions, lest organizations become stale. But he did so sympathetically, operating from the premise that his readers were intelligent, hardworking people of goodwill. Well suited to HBR's format of practical, idea-based essays for executives, his clear-eyed, humanistic writing enriched the magazine time and again. This article is a compilation of the savviest management advice Drucker offered HBR readers over the years--in short, his greatest hits. It revisits the following insightful, influential contributions: "The Theory of the Business" (September-October 1994), "Managing for Business Effectiveness" (May-June 1963), "What Business Can Learn from Nonprofits" (July-August 1989), "The New Society of Organizations" (September-October 1992), "The Information Executives Truly Need" (January-February 1995), "Managing Oneself" (March-April 1999 republished January 2005), "They're Not Employees, They're People" (February 2002), "What Makes an Effective Executive" (June 2004).
Over the last two decades, Chinese nationals have increasingly been employed by multinational companies (MNCs) operating in China taking positions previously occupied by foreign expatriates from investor countries. The development of local managers has therefore become crucial in the field of human resource management because the success of these companies depends greatly upon the ability and competence of their executive management class. The present paper addresses the issue of how to devel...
The overall objective of this project was to develop an updated model Energy Conservation training program for stationary engineers. This revision to the IUOE National Training Fund's existing Energy Conservation training curriculum is designed to enable stationary engineers to incorporate essential energy management into routine building operation and maintenance tasks. The curriculum uses a blended learning approach that includes classroom, hands-on, computer simulation and web-based training, in addition to a portfolio requirement for a workplace-based learning application. The Energy Conservation training program's goal is the development of a workforce that can maintain new and existing commercial buildings at optimum energy performance levels. The grant start date was July 6, 2010, and the project continued through September 30, 2012, including a three-month non-funded extension.
Contemporary astronomy is characterized by increasingly complex instruments and observational techniques, higher data collection rates, and large data archives, placing severe stress on software analysis systems. The object-oriented paradigm represents a significant new approach to software design and implementation that holds great promise for dealing with this increased complexity. The basic concepts of this approach will be characterized in contrast to more traditional procedure-oriented approaches. The fundamental features of object-oriented programming will be discussed from a C++ programming language perspective, using examples familiar to astronomers. This discussion will focus on objects, classes and their relevance to the data type system; the principle of information hiding; and the use of inheritance to implement generalization/specialization relationships. Drawing on the object-oriented approach, features of a new database model to support astronomical data analysis will be presented.
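The generalization/specialization relationship and information hiding mentioned above can be shown in miniature. The abstract discusses C++; the sketch below uses Python for brevity, and the `Observation`/`Spectrum` classes are invented examples, not part of the system described.

```python
class Observation:
    """Base class: behavior common to all astronomical data products."""
    def __init__(self, target, data):
        self._data = list(data)  # leading underscore: information hiding by convention
        self.target = target

    def mean(self):
        return sum(self._data) / len(self._data)

class Spectrum(Observation):
    """Specialization: a Spectrum is-an Observation plus wavelength calibration."""
    def __init__(self, target, data, start_nm, step_nm):
        super().__init__(target, data)
        self.start_nm, self.step_nm = start_nm, step_nm

    def wavelength(self, i):
        return self.start_nm + i * self.step_nm
```

Code written against `Observation` (e.g. calling `mean()`) works unchanged on any specialization, which is the reuse argument the abstract makes for inheritance.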
This manual is a guide for law enforcement agencies and community organizations in creating and implementing a citizens DWI reporting program in their communities modeling the Operation Extra Eyes program. Extra Eyes is a program that engages volu...
Keren, Baruch; Pliskin, Joseph S
The optimal timing for performing radical medical procedures such as joint (e.g., hip) replacement must be seriously considered. In this paper we show that under deterministic assumptions the optimal timing for joint replacement is the solution of a mathematical programming problem, and under stochastic assumptions the optimal timing can be formulated as a stochastic programming problem. We formulate deterministic and stochastic models that can serve as decision support tools. The results show that the benefit from joint replacement surgery is heavily dependent on timing. Moreover, for a special case where the patient's remaining life is normally distributed along with a normally distributed survival of the new joint, the expected benefit function from surgery is completely solved. This enables practitioners to draw the expected benefit graph, find the optimal timing, evaluate the benefit for each patient, set priorities among patients, and decide if and when joint replacement should be performed.
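The shape of the benefit-versus-timing curve in the normally distributed special case can be explored by simple Monte Carlo. The benefit definition, parameter values, and quality-gain factor below are illustrative stand-ins, not the paper's formulation (in particular, this sketch omits costs that could make an interior delay optimal).

```python
import random

def expected_benefit(delay_years, mu_life=15.0, sd_life=4.0,
                     mu_joint=12.0, sd_joint=3.0, gain=0.4,
                     n=40000, seed=0):
    """Expected quality-adjusted years gained if surgery is delayed by
    `delay_years`; patient remaining life and joint survival both normal."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        life = rng.gauss(mu_life, sd_life)     # remaining life at decision time
        joint = rng.gauss(mu_joint, sd_joint)  # lifetime of the new joint
        years = max(0.0, min(life - delay_years, joint))
        total += gain * years
    return total / n
```

Evaluating `expected_benefit` over a range of delays gives the expected-benefit graph the authors mention; under these placeholder assumptions the benefit shrinks monotonically as surgery is postponed.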
Law, E.; Day, B. H.; Kim, R. M.; Bui, B.; Malhotra, S.; Chang, G.; Sadaqathullah, S.; Arevalo, E.; Vu, Q. A.
NASA's Lunar and Planetary Mapping and Modeling Program produces a suite of online visualization and analysis tools. Originally designed for mission planning and science, these portals offer great benefits for education and public outreach (EPO), providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's Science EPO Infrastructure, they are available as resources for NASA STEM EPO programs, and to the greater EPO community. As new missions are planned to a variety of planetary bodies, these tools are facilitating the public's understanding of the missions and engaging the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: the Lunar Mapping and Modeling Portal or LMMP (http://lmmp.nasa.gov), Vesta Trek (http://vestatrek.jpl.nasa.gov), and Mars Trek (http://marstrek.jpl.nasa.gov). Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL files that can be directed to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms - especially for the visually impaired. Along with the web portals, the program supports additional clients, web services, and APIs that facilitate dissemination of planetary data to a range of external applications and venues. NASA challenges and hackathons are also providing members of the software development community opportunities to participate in tool development and leverage data from the portals.
… 2. The Basics of Program Writing: Programs from Conception to Execution; Creating a Real Program; Getting Help in Unix; Getting Help in an IDE; Programming …
DeAngelis, D.L.; Van Winkle, W.; Christensen, S.W.; Blum, S.R.; Kirk, B.L.; Rust, B.W.; Ross, C.
A generalized fish life-cycle population model and computer program have been prepared to evaluate the long-term effect of changes in mortality in age class 0. The general question concerns what happens to a fishery when density-independent sources of mortality are introduced that act on age class 0, particularly entrainment and impingement at power plants. This paper discusses the model formulation and computer program, including sample results. The population model consists of a system of difference equations involving age-dependent fecundity and survival. The fecundity for each age class is assumed to be a function of both the fraction of females sexually mature and the weight of females as they enter each age class. Natural mortality for age classes 1 and older is assumed to be independent of population size. Fishing mortality is assumed to vary with the number and weight of fish available to the fishery. Age class 0 is divided into six life stages. The probability of survival for age class 0 is estimated considering both density-independent mortality (natural and power plant) and density-dependent mortality for each life stage. Two types of density-dependent mortality are included. These are cannibalism of each life stage by older age classes and intra-life-stage competition
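The structure described — age-dependent fecundity, density-independent survival for ages 1 and older, and a density-dependent age-0 stage with an added density-independent (power plant) mortality term — can be sketched as a difference-equation iteration. All parameter values below are invented for illustration, and the six age-0 life stages are collapsed into a single Beverton-Holt survival term.

```python
def project(extra_age0_mort=0.0, years=300):
    # All parameter values are illustrative, not from the paper.
    fecundity = [0.0, 0.0, 40.0, 80.0, 120.0]  # eggs per female entering each age class
    s_adult = 0.6                              # density-independent survival, ages 1+
    alpha, beta = 0.1, 1e-5                    # Beverton-Holt age-0 survival parameters
    N = [100.0] * 5                            # abundance in age classes 0..4
    for _ in range(years):
        eggs = 0.5 * sum(f * n for f, n in zip(fecundity, N))  # half are female
        s0 = alpha / (1.0 + beta * eggs)       # density-dependent age-0 survival
        recruits = eggs * s0 * (1.0 - extra_age0_mort)  # extra density-independent loss
        N = [recruits] + [n * s_adult for n in N[:-1]]  # everyone ages one year
    return sum(N)  # total population at the end of the horizon
```

Running the projection with and without the extra age-0 mortality term shows the long-term population consequence the model was built to evaluate: density dependence partially compensates, but the equilibrium population is still depressed.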
The report describes a fiber-optics system model and its computer implementation. This implementation can calculate the bit error ratio (BER) versus time for optical fibers that have been exposed to gamma radiation. The program is designed so that the user may arbitrarily change any or all of the system input variables and produce separate outputs. The primary output of the program is a table of the BER as a function of time. This table may be stored on magnetic media and later incorporated into computer graphic programs. The program was written in FORTRAN 77 for the IBM PC/AT/XT computers. Flow charts and program listings are included in the report
D. V. Vatlitsov
Full Text Available Technological evolution creates the prerequisites for the emergence of new informational concepts and approaches to the formation of fundamentally new principles for understanding biological objects. The aim was to study the activators of programmed cell death in an isolated system model. Cell culture aging parameters were measured on a flow cytometer. A theory was formed that changes in the concentrations of metal ions, and an increase in their extracellular concentration, create a negative gradient into the cells that regulates cell death. It was shown that the metal ion concentrations …
Stafford, J. M.
Receivers operating on a space vehicle or an aircraft having many on-board transmitters are subject to intermodulation interference from mixing in the transmitting antenna systems, the external environment, or the receiver front-ends. This paper presents the techniques utilized in RFI Math Model computer programs that were developed to aid in the prevention of interference by predicting problem areas prior to occurrence. Frequencies and amplitudes of possible intermodulation products generated in the external environment are calculated and compared to receiver sensitivities. Intermodulation products generated in receivers are evaluated to determine the adequacy of preselector rejection.
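Enumerating candidate intermodulation frequencies is straightforward combinatorics over the transmitter fundamentals. The fragment below, for two transmitters, lists harmonic and intermodulation product frequencies up to a given order and flags those falling inside a receiver passband; it is an illustrative sketch, not the RFI Math Model program itself, and it ignores amplitudes entirely.

```python
def intermod_products(f1, f2, max_order=3):
    """Frequencies |m*f1 + n*f2| (same units as f1, f2) for all harmonic
    and mixing products of order 2..max_order, as sorted (order, freq)."""
    products = set()
    for m in range(-max_order, max_order + 1):
        for n in range(-max_order, max_order + 1):
            order = abs(m) + abs(n)
            if 2 <= order <= max_order:
                f = abs(m * f1 + n * f2)
                if f > 0:
                    products.add((order, f))
    return sorted(products)

def in_band(products, lo, hi):
    """Products that land inside a receiver passband [lo, hi]."""
    return [(o, f) for o, f in products if lo <= f <= hi]
```

For example, with transmitters at 150 and 100 MHz, the classic third-order product 2·f1 − f2 lands at 200 MHz, so a receiver tuned near 200 MHz would be flagged for closer analysis.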
This article presents a model program for managing problem employees that includes a description of the basic types of problem employees and employee problems, as well as practical recommendations for (1) selection and screening, (2) education and training, (3) coaching and counseling, (4) discipline, (5) psychological fitness-for-duty evaluations, (6) mental health services, (7) termination, and (8) leadership and administrative strategies. Throughout, the emphasis is on balancing the need for order and productivity in the workplace with fairness and concern for employee health and well-being.
Eichhorn, G.; Piercey, R.B.
The adverse effects of ionizing radiation on microelectronic systems include cumulative dose effects, single-event upsets (SEUs) and latch-up. SEUs are the most frequent, especially when the radiation environment includes heavy ions. Unfortunately, SEUs are difficult to detect since they can be read (in RAM or ROM) as valid addresses. They can, however, be handled in software by proper techniques. The authors refer to their method as MRS, Maximally Redundant Software. The MRS programming model which the authors are developing uses multiply redundant boot blocks, majority voting, periodic refresh, and error recovery techniques to minimize the deleterious effects of SEUs.
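Majority voting over redundant copies, one of the MRS techniques named, can be illustrated in a few lines. The bytewise voter below is a generic sketch of the idea, not the authors' implementation.

```python
from collections import Counter

def majority_vote(copies):
    """Bytewise majority vote across equal-length redundant copies,
    masking a single-event upset in any one copy."""
    assert len(copies) >= 3 and len(set(map(len, copies))) == 1
    return bytes(Counter(column).most_common(1)[0][0]
                 for column in zip(*copies))

# one copy suffers a single bit flip; the vote recovers the original
clean = b"BOOT"
flipped = bytes([clean[0] ^ 0x04]) + clean[1:]  # SEU in the first byte
recovered = majority_vote([clean, flipped, clean])
```

With three copies, any single corrupted copy is outvoted byte by byte; this is why redundant boot blocks are kept in odd-numbered multiples.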
Haxthausen, Anne Elisabeth; Peleska, Jan
In this article, the feasibility of a unified modelling and programming paradigm is discussed from the perspective of large-scale system development and verification in collaborative development environments. We motivate the necessity to utilise multiple formalisms for development and verification. It is illustrated by means of a case study from the railway domain how this can be achieved, using concepts from the theory of institutions. This also enables the utilisation of verification tools in different formalisms, despite the fact that these tools are usually developed for one specific formal method.