WorldWideScience

Sample records for integration analysis executive

  1. The Executive as Integrator.

    Science.gov (United States)

    Cohn, Hans M.

    1983-01-01

    Argues that although the executive has many tasks, he or she must view internal organizational integration as a primary task, making use of organizational charts, job descriptions, statements of goals and objectives, evaluations, and feedback devices. (RH)

  2. Integrated communication, navigation, and identification avionics: Impact analysis. Executive summary

    Science.gov (United States)

    Veatch, M. H.; McManus, J. C.

    1985-10-01

    This paper summarizes the approach and findings of research into reliability, supportability, and survivability prediction techniques for fault-tolerant avionics systems. Since no technique existed to analyze the fault tolerance of reconfigurable systems, a new method was developed and implemented in the Mission Reliability Model (MIREM). The supportability analysis was completed by using the Simulation of Operational Availability/Readiness (SOAR) model. Both the Computation of Vulnerable Area and Repair Time (COVART) model and FASTGEN, a survivability model, proved valuable for the survivability research. Sample results are presented and several recommendations are also given for each of the three areas investigated under this study: reliability, supportability, and survivability.

  3. Administrative Challenges to the Integration of Oral Health With Primary Care: A SWOT Analysis of Health Care Executives at Federally Qualified Health Centers.

    Science.gov (United States)

    Norwood, Connor W; Maxey, Hannah L; Randolph, Courtney; Gano, Laura; Kochhar, Komal

    Inadequate access to preventive oral health services contributes to oral health disparities and is a major public health concern in the United States. Federally Qualified Health Centers play a critical role in improving access to care for populations affected by oral health disparities but face a number of administrative challenges associated with implementation of oral health integration models. We conducted a SWOT (strengths, weaknesses, opportunities, and threats) analysis with health care executives to identify strengths, weaknesses, opportunities, and threats of successful oral health integration in Federally Qualified Health Centers. Four themes were identified: (1) culture of health care organizations; (2) operations and administration; (3) finance; and (4) workforce.

  4. Senior executives' perspectives of integrated reporting regulatory ...

    African Journals Online (AJOL)

    2014-12-09

    Dec 9, 2014 ... that integrated thinking and integrated reporting principles will have ... in response to financial, governance and other crises; heightened expectations of ... The Prince's Accounting for Sustainability Project suggests a concept.

  5. Executive functions as predictors of visual-motor integration in children with intellectual disability.

    Science.gov (United States)

    Memisevic, Haris; Sinanovic, Osman

    2013-12-01

    The goal of this study was to assess the relationship between visual-motor integration and executive functions, and in particular, the extent to which executive functions can predict visual-motor integration skills in children with intellectual disability. The sample consisted of 90 children (54 boys, 36 girls; M age = 11.3 yr., SD = 2.7, range 7-15) with intellectual disabilities of various etiologies. The measures of executive function were the 8 subscales of the Behavioral Rating Inventory of Executive Function (BRIEF): Inhibition, Shifting, Emotional Control, Initiating, Working Memory, Planning, Organization of Materials, and Monitoring. Visual-motor integration was measured with the Acadia test of visual-motor integration (VMI). Regression analysis revealed that the BRIEF subscales explained 38% of the variance in VMI scores. Of all the BRIEF subscales, only two were statistically significant predictors of visual-motor integration: Working Memory and Monitoring. Possible implications of this finding are further elaborated.
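
    As a rough illustration of the regression analysis reported above, the sketch below fits an ordinary-least-squares model of synthetic "VMI" scores on eight synthetic predictors and reports the variance explained. All data and effect sizes are invented stand-ins for the study's variables, with two predictors made influential to mirror the Working Memory and Monitoring finding.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for the 8 BRIEF subscale scores (n = 90 children).
        n, k = 90, 8
        X = rng.normal(size=(n, k))
        # Synthetic VMI scores driven mainly by two of the predictors.
        vmi = 0.6 * X[:, 4] + 0.4 * X[:, 7] + rng.normal(size=n)

        # Ordinary least squares with an intercept column.
        A = np.column_stack([np.ones(n), X])
        coef, *_ = np.linalg.lstsq(A, vmi, rcond=None)

        pred = A @ coef
        r2 = 1 - ((vmi - pred) ** 2).sum() / ((vmi - vmi.mean()) ** 2).sum()
        print(f"R^2 = {r2:.2f}")  # share of VMI variance explained by the subscales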

  6. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Background: We present Pegasys – a flexible, modular and customizable software system that facilitates the execution of heterogeneous biological sequence analysis tools and the integration of their data. Results: The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, and masking repetitive sequences in genomic DNA, as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serially dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow results of the heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions: The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation are available for download at http://bioinformatics.ubc.ca/pegasys/.
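
    The parallel-scheduling idea described above (running every analysis whose dependencies are satisfied concurrently) can be sketched with Python's standard library. The workflow steps below are hypothetical placeholders, not Pegasys code.

        import graphlib
        from concurrent.futures import ThreadPoolExecutor

        # Hypothetical workflow: each step maps to the steps it depends on.
        workflow = {
            "mask_repeats": set(),
            "gene_prediction": {"mask_repeats"},
            "similarity_search": {"mask_repeats"},
            "merge_to_gff": {"gene_prediction", "similarity_search"},
        }

        def run(step):
            print(f"running {step}")  # a real system would invoke the tool here

        sorter = graphlib.TopologicalSorter(workflow)
        sorter.prepare()
        with ThreadPoolExecutor() as pool:
            while sorter.is_active():
                ready = list(sorter.get_ready())
                # Steps with no unmet dependencies execute in parallel.
                list(pool.map(run, ready))
                sorter.done(*ready)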

  7. Integrating Symbolic Execution with Sensornet Simulation for Efficient Bug Finding

    OpenAIRE

    Österlind, Fredrik; Sasnauskas, Raimondas; Dustmann, Oscar Soria; Dunkels, Adam; Wehrle, Klaus

    2010-01-01

    High-coverage testing of sensornet applications is vital for pre-deployment bug cleansing, but has previously been difficult due to the limited set of available tools. We integrate the KleeNet symbolic execution engine with the COOJA network simulator to allow for straightforward and intuitive high-coverage testing initiated from a simulation environment. A tight coupling of simulation and testing helps detect, narrow down, and fix complex interaction bugs in an early ...

  8. The role of the supply chain executive in supply chain integration: a behavioral approach

    OpenAIRE

    ELENA REVILLA; LUIS GOMEZ - MEJIA

    2008-01-01

    Applying a behavioural approach to agency theory, this paper aimed to identify the most appropriate employment and compensation system (ECS) for supply chain executives in order to foster supply chain integration. We attempted to develop a novel account of how managerial incentives encourage supply chain integration, an enabler that has not been analyzed in the literature. The paper presents the analysis of three sources of risk bearing - compensation risk, employmen...

  9. Relationship between grey matter integrity and executive abilities in aging.

    Science.gov (United States)

    Manard, Marine; Bahri, Mohamed Ali; Salmon, Eric; Collette, Fabienne

    2016-07-01

    This cross-sectional study was designed to investigate grey matter changes that occur in healthy aging and the relationship between grey matter characteristics and executive functioning. Thirty-six young adults (18-30 years old) and 43 seniors (60-75 years old) were included. A general executive score was derived from a large battery of neuropsychological tests assessing three major aspects of executive functioning (inhibition, updating and shifting). Age-related grey matter changes were investigated by comparing young and older adults using voxel-based morphometry and voxel-based cortical thickness methods. A widespread difference in grey matter volume was found across many brain regions, whereas cortical thinning was mainly restricted to central areas. Multivariate analyses showed age-related changes in relatively similar brain regions to the respective univariate analyses but appeared more limited. Finally, in the older adult sample, a significant relationship between global executive performance and decreased grey matter volume in anterior (i.e. frontal, insular and cingulate cortex) but also some posterior brain areas (i.e. temporal and parietal cortices) as well as subcortical structures was observed. Results of this study highlight the distribution of age-related effects on grey matter volume and show that cortical atrophy does not appear primarily in "frontal" brain regions. From a cognitive viewpoint, age-related executive functioning seems to be related to grey matter volume but not to cortical thickness. Therefore, our results also highlight the influence of methodological aspects (from preprocessing to statistical analysis) on the pattern of results, which could explain the lack of consensus in literature.

  10. Informing the Structure of Executive Function in Children: A Meta-Analysis of Functional Neuroimaging Data

    Science.gov (United States)

    McKenna, Róisín; Rushe, T.; Woodcock, Kate A.

    2017-01-01

    The structure of executive function (EF) has been the focus of much debate for decades. What is more, the complexity and diversity provided by the developmental period only adds to this contention. The development of executive function plays an integral part in the expression of children's behavioral, cognitive, social, and emotional capabilities. Understanding how these processes are constructed during development allows for effective measurement of EF in this population. This meta-analysis aims to contribute to a better understanding of the structure of executive function in children. A coordinate-based meta-analysis was conducted (using BrainMap GingerALE 2.3), which incorporated studies administering functional magnetic resonance imaging (fMRI) during inhibition, switching, and working memory updating tasks in typical children (aged 6–18 years). The neural activation common across all executive tasks was compared to that shared by tasks pertaining only to inhibition, switching or updating, which are commonly considered to be fundamental executive processes. Results support the existence of partially separable but partially overlapping inhibition, switching, and updating executive processes at a neural level, in children over 6 years. Further, the shared neural activation across all tasks (associated with a proposed “unitary” component of executive function) overlapped to different degrees with the activation associated with each individual executive process. These findings provide evidence to support the suggestion that one of the most influential structural models of executive functioning in adults can also be applied to children of this age. However, the findings also call for careful consideration and measurement of both specific executive processes, and unitary executive function in this population. Furthermore, a need is highlighted for a new systematic developmental model, which captures the integrative nature of executive function in children.

  11. WCET Analysis of Java Bytecode Featuring Common Execution Environments

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Thomsen, Bent; Frost, Christian

    2011-01-01

    We present a novel tool for statically determining the Worst Case Execution Time (WCET) of Java Bytecode-based programs called Tool for Execution Time Analysis of Java bytecode (TetaJ). This tool differentiates itself from existing tools by separating the individual constituents of the execution environment into independent components. The prime benefit is that it can be used for execution environments featuring common embedded processors and software implementations of the JVM. TetaJ employs a model checking approach for statically determining WCET where the Java program, the JVM, and the hardware...

  12. “The Relationship between Executive Functioning, Processing Speed and White Matter Integrity in Multiple Sclerosis”

    Science.gov (United States)

    Genova, Helen M.; DeLuca, John; Chiaravalloti, Nancy; Wylie, Glenn

    2014-01-01

    The primary purpose of the current study was to examine the relationship between performance on executive tasks and white matter integrity, assessed by diffusion tensor imaging (DTI), in Multiple Sclerosis (MS). A second aim was to examine how processing speed affects the relationship between executive functioning and fractional anisotropy (FA). This relationship was examined in two executive tasks that rely heavily on processing speed: the Color-Word Interference Test and Trail-Making Test (Delis-Kaplan Executive Function System). It was hypothesized that reduced FA is related to poor performance on executive tasks in MS, but that this relationship would be affected by the statistical correction of processing speed from the executive tasks. Fifteen healthy controls and 25 persons with MS participated. Regression analyses were used to examine the relationship between executive functioning and FA, both before and after processing speed was removed from the executive scores. Before processing speed was removed from the executive scores, reduced FA was associated with poor performance on the Color-Word Interference Test and Trail-Making Test in a diffuse network including the corpus callosum and superior longitudinal fasciculus. However, once processing speed was removed, the relationship between executive functions and FA was no longer significant on the Trail-Making Test, and significantly reduced and more localized on the Color-Word Interference Test. PMID:23777468

  13. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use those patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns were not intended to stand for artefacts that can immediately be executed. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. To this end, we introduce a continuous tool chain beginning at the design phase and ending with executing an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.
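
    A minimal sketch of the model-driven idea: turning a declarative pattern description into an executable artefact. The pattern model, field names, and queue names are invented for illustration and do not reflect the GENIUS tool's actual format; a content-based router stands in for the EAI patterns discussed.

        # Hypothetical declarative model of one EAI pattern (content-based router).
        pattern = {
            "type": "content_based_router",
            "routes": {"order": "erp_queue", "invoice": "billing_queue"},
            "default": "dead_letter_queue",
        }

        def generate_router(model):
            """Turn the abstract pattern description into executable code."""
            def route(message: dict) -> str:
                return model["routes"].get(message.get("kind"), model["default"])
            return route

        router = generate_router(pattern)
        print(router({"kind": "order"}))   # erp_queue
        print(router({"kind": "refund"}))  # dead_letter_queue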

  14. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    Science.gov (United States)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In Volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.

  15. Issues Related to a Reasonableness of Executive Compensation Analysis

    Directory of Open Access Journals (Sweden)

    Angela Eliza MICU

    2006-01-01

    In most companies, there is ongoing conflict between managers in charge of covering costs (finance and accounting) and managers in charge of satisfying customers (marketing and sales). Accounting journals warn against prices that fail to cover full costs, while marketing journals argue that customer willingness-to-pay must be the sole driver of prices. This article explains the reasons to conduct an independent reasonableness of executive/professional practitioner compensation analysis. In addition, it discusses many of the typical factors that the independent analyst will consider in assessing the reasonableness of executive compensation for controversy, taxation, corporate planning, and corporate governance purposes.

  16. A meta-analysis of executive components of working memory.

    Science.gov (United States)

    Nee, Derek Evan; Brown, Joshua W; Askren, Mary K; Berman, Marc G; Demiralp, Emre; Krawitz, Adam; Jonides, John

    2013-02-01

    Working memory (WM) enables the online maintenance and manipulation of information and is central to intelligent cognitive functioning. Much research has investigated executive processes of WM in order to understand the operations that make WM "work." However, there is yet little consensus regarding how executive processes of WM are organized. Here, we used quantitative meta-analysis to summarize data from 36 experiments that examined executive processes of WM. Experiments were categorized into 4 component functions central to WM: protecting WM from external distraction (distractor resistance), preventing irrelevant memories from intruding into WM (intrusion resistance), shifting attention within WM (shifting), and updating the contents of WM (updating). Data were also sorted by content (verbal, spatial, object). Meta-analytic results suggested that rather than dissociating into distinct functions, 2 separate frontal regions were recruited across diverse executive demands. One region was located dorsally in the caudal superior frontal sulcus and was especially sensitive to spatial content. The other was located laterally in the midlateral prefrontal cortex and showed sensitivity to nonspatial content. We propose that dorsal-"where"/ventral-"what" frameworks that have been applied to WM maintenance also apply to executive processes of WM. Hence, WM can largely be simplified to a dual selection model.

  17. Development of an Academic Information Dashboard Executive (A-INDEX) with Pentaho Data Integration and QlikView

    Directory of Open Access Journals (Sweden)

    Herry Sofyan

    2016-01-01

    Information Dashboard Executive (INDEX) is a visual representation of data in the form of dashboards, used to get a snapshot of performance in every business process so as to help executives respond quickly. Pentaho is a free and open-source (FOSS) BI application that runs on top of the Java platform. QlikView is focused on simplifying decision making for business users across the organization. To optimize the data analysis functions of PDPT, an interactive data-visualization dashboard was developed. The dashboard is built using Pentaho Data Integration as the gateway connecting the application database with the PDPT data, while the data visualizations are developed with QlikView. The software development methodology used in this work is the incremental method, a combination of linear and iterative methods in which modifications run in parallel with the iterative process so that the project is completed faster. The result of this study is a data representation whose constructed query model is able to describe student activity/profiles in a given semester. The representations constructed include the distribution of active students per class, the distribution of graduates per cohort, the distribution of student status, the distribution of students' provinces of origin per class, the distribution of the number of class participants, the distribution of lecturer credits, and the distribution of subjects.
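
    A toy sketch of the kind of aggregation behind such dashboard distributions, assuming a tabular per-semester student dataset. The column names and values below are invented, not the PDPT schema.

        import pandas as pd

        # Invented sample of per-semester student records.
        df = pd.DataFrame({
            "semester": ["2015-1", "2015-1", "2015-1", "2015-2"],
            "class":    ["A", "A", "B", "A"],
            "status":   ["active", "leave", "active", "graduated"],
            "province": ["Yogyakarta", "Jakarta", "Yogyakarta", "Bali"],
        })

        # Distributions of the kind shown on the dashboard.
        active_per_class = (
            df[df["status"] == "active"].groupby(["semester", "class"]).size()
        )
        status_distribution = df.groupby(["semester", "status"]).size()
        print(active_per_class, status_distribution, sep="\n\n")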

  18. Applying an Integrative Framework of Executive Function to Preschoolers With Specific Language Impairment.

    Science.gov (United States)

    Kapa, Leah L; Plante, Elena; Doubleday, Kevin

    2017-08-16

    The first goal of this research was to compare verbal and nonverbal executive function abilities between preschoolers with and without specific language impairment (SLI). The second goal was to assess the group differences on 4 executive function components in order to determine if the components may be hierarchically related as suggested within a developmental integrative framework of executive function. This study included 26 4- and 5-year-olds diagnosed with SLI and 26 typically developing age- and sex-matched peers. Participants were tested on verbal and nonverbal measures of sustained selective attention, working memory, inhibition, and shifting. The SLI group performed worse compared with typically developing children on both verbal and nonverbal measures of sustained selective attention and working memory, the verbal inhibition task, and the nonverbal shifting task. Comparisons of standardized group differences between executive function measures revealed a linear increase with the following order: working memory, inhibition, shifting, and sustained selective attention. The pattern of results suggests that preschoolers with SLI have deficits in executive functioning compared with typical peers, and deficits are not limited to verbal tasks. A significant linear relationship between group differences across executive function components supports the possibility of a hierarchical relationship between executive function skills.

  19. Applying an Integrative Framework of Executive Function to Preschoolers With Specific Language Impairment

    Science.gov (United States)

    Plante, Elena; Doubleday, Kevin

    2017-01-01

    Purpose: The first goal of this research was to compare verbal and nonverbal executive function abilities between preschoolers with and without specific language impairment (SLI). The second goal was to assess the group differences on 4 executive function components in order to determine if the components may be hierarchically related as suggested within a developmental integrative framework of executive function. Method: This study included 26 4- and 5-year-olds diagnosed with SLI and 26 typically developing age- and sex-matched peers. Participants were tested on verbal and nonverbal measures of sustained selective attention, working memory, inhibition, and shifting. Results: The SLI group performed worse compared with typically developing children on both verbal and nonverbal measures of sustained selective attention and working memory, the verbal inhibition task, and the nonverbal shifting task. Comparisons of standardized group differences between executive function measures revealed a linear increase with the following order: working memory, inhibition, shifting, and sustained selective attention. Conclusion: The pattern of results suggests that preschoolers with SLI have deficits in executive functioning compared with typical peers, and deficits are not limited to verbal tasks. A significant linear relationship between group differences across executive function components supports the possibility of a hierarchical relationship between executive function skills. PMID:28724132

  20. An integrative architecture for general intelligence and executive function revealed by lesion mapping

    Science.gov (United States)

    Colom, Roberto; Solomon, Jeffrey; Krueger, Frank; Forbes, Chad; Grafman, Jordan

    2012-01-01

    Although cognitive neuroscience has made remarkable progress in understanding the involvement of the prefrontal cortex in executive control, the broader functional networks that support high-level cognition and give rise to general intelligence remain to be well characterized. Here, we investigated the neural substrates of the general factor of intelligence (g) and executive function in 182 patients with focal brain damage using voxel-based lesion–symptom mapping. The Wechsler Adult Intelligence Scale and Delis–Kaplan Executive Function System were used to derive measures of g and executive function, respectively. Impaired performance on these measures was associated with damage to a distributed network of left lateralized brain areas, including regions of frontal and parietal cortex and white matter association tracts, which bind these areas into a coordinated system. The observed findings support an integrative framework for understanding the architecture of general intelligence and executive function, supporting their reliance upon a shared fronto-parietal network for the integration and control of cognitive representations and making specific recommendations for the application of the Wechsler Adult Intelligence Scale and Delis–Kaplan Executive Function System to the study of high-level cognition in health and disease. PMID:22396393

  1. Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective

    Science.gov (United States)

    Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.

    1997-01-01

    Recent developments have made it possible to interoperate complex business applications at much lower costs. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that utilizes information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people to carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions. One can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.

  2. Age-Related Differences and Heterogeneity in Executive Functions: Analysis of NAB Executive Functions Module Scores.

    Science.gov (United States)

    Buczylowska, Dorota; Petermann, Franz

    2016-05-01

    Normative data from the German adaptation of the Neuropsychological Assessment Battery were used to examine age-related differences in 6 executive function tasks. A multivariate analysis of variance was employed to investigate the differences in performance in 484 participants aged 18-99 years. The coefficient of variation was calculated to compare the heterogeneity of scores between 10 age groups. Analyses showed an increase in the dispersion of scores with age, varying from 7% to 289%, in all subtests. Furthermore, age-dependent heterogeneity appeared to be associated with age-dependent decline because the subtests with the greatest increase in dispersion (i.e., Mazes, Planning, and Categories) also exhibited the greatest decrease in mean scores. In contrast, scores for the subtests Letter Fluency, Word Generation, and Judgment had the lowest increase in dispersion with the lowest decrease in mean scores. Consequently, the results presented here show a pattern of age-related differences in executive functioning that is consistent with the concept of crystallized and fluid intelligence.
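
    The coefficient of variation used here is simply the standard deviation expressed as a percentage of the mean. A minimal sketch with invented subtest scores:

        import numpy as np

        def coefficient_of_variation(scores):
            """CV = 100 * SD / mean, i.e. dispersion relative to the mean."""
            scores = np.asarray(scores, dtype=float)
            return 100.0 * scores.std(ddof=1) / scores.mean()

        # Invented scores for a younger and an older age group.
        young = [52, 55, 50, 58, 54]
        old = [48, 30, 61, 25, 55]
        print(f"young CV = {coefficient_of_variation(young):.0f}%")  # small dispersion
        print(f"old CV = {coefficient_of_variation(old):.0f}%")      # larger dispersion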

  3. Integrating deductive verification and symbolic execution for abstract object creation in dynamic logic

    NARCIS (Netherlands)

    C.P.T. de Gouw (Stijn); F.S. de Boer (Frank); W. Ahrendt (Wolfgang); R. Bubel (Richard)

    2016-01-01

    We present a fully abstract weakest precondition calculus and its integration with symbolic execution. Our assertion language allows both specifying and verifying properties of objects at the abstraction level of the programming language, abstracting from a specific implementation of
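
    A toy illustration of one ingredient of such a calculus: the weakest precondition of an assignment is the postcondition with the assigned expression substituted for the variable. This sketches only the textbook rule, not the paper's calculus for abstract object creation.

        from sympy import symbols

        x, y = symbols("x y")

        def wp_assign(var, expr, post):
            """wp(var := expr, Q) = Q[expr / var] for a simple assignment."""
            return post.subs(var, expr)

        # Postcondition Q: x > y. Weakest precondition of "x := x + 1" w.r.t. Q:
        print(wp_assign(x, x + 1, x > y))  # prints: x + 1 > y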

  4. International Space Station Configuration Analysis and Integration

    Science.gov (United States)

    Anchondo, Rebekah

    2016-01-01

    Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.

  5. Packer Detection for Multi-Layer Executables Using Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Munkhbayar Bat-Erdene

    2017-03-01

    Packing algorithms are broadly used to avoid anti-malware systems, and the proportion of packed malware has been growing rapidly. However, only a few studies have been conducted on detecting the various types of packing algorithms in a systematic way. Following this understanding, we elaborate a method to classify the packing algorithm of a given executable into three categories: single-layer packing, re-packing, or multi-layer packing. We convert the entropy values of the executable file loaded into memory into symbolic representations, for which we used SAX (Symbolic Aggregate Approximation). Based on experiments with 2,196 programs and 19 packing algorithms, we find that the precision (97.7%), accuracy (97.5%), and recall (96.8%) of our method are high enough to confirm that entropy analysis is applicable to identifying packing algorithms.
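
    A minimal sketch of the entropy-to-symbols idea: compute Shannon entropy over byte windows, then map each value to a letter. The window size, breakpoints, and alphabet are invented, and the symbolization is a crude stand-in for full SAX, which normalizes the series and averages segments first.

        import math
        from collections import Counter

        def shannon_entropy(chunk: bytes) -> float:
            """Byte-level Shannon entropy in bits (0 to 8)."""
            counts = Counter(chunk)
            total = len(chunk)
            return -sum(c / total * math.log2(c / total) for c in counts.values())

        def entropy_series(data: bytes, window: int = 256):
            return [shannon_entropy(data[i:i + window])
                    for i in range(0, len(data) - window + 1, window)]

        def symbolize(series, breakpoints=(2.0, 4.0, 6.0), alphabet="abcd"):
            """Map each entropy value to a symbol by its breakpoint bin."""
            return "".join(alphabet[sum(v > b for b in breakpoints)] for v in series)

        data = bytes(range(256)) * 8  # stand-in for an executable image in memory
        print(symbolize(entropy_series(data)))  # high entropy -> "dddddddd"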

  6. Visuomotor Integration and Executive Functioning Are Uniquely Linked to Chinese Word Reading and Writing in Kindergarten Children

    Science.gov (United States)

    Chung, Kevin Kien Hoa; Lam, Chun Bun; Cheung, Ka Chun

    2018-01-01

    This cross-sectional study examined the associations of visuomotor integration and executive functioning with Chinese word reading and writing in kindergarten children. A total of 369 Chinese children (mean age = 57.99 months; 55% of them were girls) from Hong Kong, China, completed tasks on visuomotor integration, executive functioning, and…

  7. 48 CFR 970.5223-1 - Integration of environment, safety, and health into work planning and execution.

    Science.gov (United States)

    2010-10-01

    ..., safety, and health into work planning and execution. 970.5223-1 Section 970.5223-1 Federal Acquisition... Integration of environment, safety, and health into work planning and execution. As prescribed in 970.2303-3(b), insert the following clause: Integration of Environment, Safety, and Health Into Work Planning and...

  8. Rotational error in path integration: encoding and execution errors in angle reproduction.

    Science.gov (United States)

    Chrastil, Elizabeth R; Warren, William H

    2017-06-01

    Path integration is fundamental to human navigation. When a navigator leaves home on a complex outbound path, they are able to keep track of their approximate position and orientation and return to their starting location on a direct homebound path. However, there are several sources of error during path integration. Previous research has focused almost exclusively on encoding error, the error in registering the outbound path in memory. Here, we also consider execution error, the error in the response, such as turning and walking a homebound trajectory. In two experiments conducted in ambulatory virtual environments, we examined the contribution of execution error to the rotational component of path integration using angle reproduction tasks. In the reproduction tasks, participants rotated once and then rotated again to face the original direction, either reproducing the initial turn or turning through the supplementary angle. One outstanding difficulty in disentangling encoding and execution error during a typical angle reproduction task is that as the encoding angle increases, so does the required response angle. In Experiment 1, we dissociated these two variables by asking participants to report each encoding angle using two different responses: by turning to walk on a path parallel to the initial facing direction in the same (reproduction) or opposite (supplementary angle) direction. In Experiment 2, participants reported the encoding angle by turning both rightward and leftward onto a path parallel to the initial facing direction, over a larger range of angles. The results suggest that execution error, not encoding error, is the predominant source of error in angular path integration. These findings also imply that the path integrator uses an intrinsic (action-scaled) rather than an extrinsic (objective) metric.
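
    Under the simple assumption that encoding and execution contribute independent additive noise to a produced turn, the variance of the final response is the sum of the two variances. A toy Monte Carlo check (all magnitudes invented, not the study's estimates):

        import numpy as np

        rng = np.random.default_rng(1)
        n, angle = 100_000, 90.0            # degrees; illustrative values only
        sd_encode, sd_execute = 5.0, 12.0   # assumed noise magnitudes

        encoded = angle + rng.normal(0, sd_encode, n)      # error registering the turn
        response = encoded + rng.normal(0, sd_execute, n)  # error producing the turn

        # Independent additive noise: total variance = 5**2 + 12**2 = 169.
        print(np.var(response - angle))  # approximately 169
        print(sd_encode**2 + sd_execute**2)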

  9. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  10. An empirical analysis of executive behaviour with hospital executive information systems in Taiwan.

    Science.gov (United States)

    Huang, Wei-Min

    2013-01-01

    Existing health information systems largely only support the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other results include the significant impact of perceived environmental uncertainty on scan searches; information retrieval behaviour and focused searches on mental models and perceived efficiency; scan searches on mental model building; learning styles and model building on perceived efficiency; and finally the impact of mental model maintenance on perceived efficiency and effectiveness.

  11. Correlation between videogame mechanics and executive functions through EEG analysis.

    Science.gov (United States)

    Mondéjar, Tania; Hervás, Ramón; Johnson, Esperanza; Gutierrez, Carlos; Latorre, José Miguel

    2016-10-01

    This paper addresses a different point of view on videogames, specifically serious games for health. It contributes to that area with a multidisciplinary perspective focused on neuroscience and computation. The experimental population consisted of pre-adolescents between the ages of 8 and 12 without any cognitive issues. The experiment consisted of users playing videogames as well as performing traditional psychological assessments; during these tasks, frontal brain activity was evaluated. The main goal was to analyse how the frontal lobe of the brain (executive function) works, in terms of prominent cognitive skills, during five types of game mechanics widely used in commercial videogames. The analysis was made by collecting brain signals with an electroencephalogram neuroheadset during the two phases of the experiment. The hypotheses tested were whether videogames can develop executive functioning and whether it is possible to identify which kinds of cognitive skills are developed during each kind of typical videogame mechanic. The results contribute to the design of serious games for health purposes on a conceptual level, particularly in support of the diagnosis and treatment of cognitive-related pathologies.

  12. The Importance of Information Management for the Professional Performance of the Executive Secretary - an Integrative National Literature Review.

    Directory of Open Access Journals (Sweden)

    Nuriane Santos Montezano

    2015-08-01

    The article presents the reality of the new Executive Secretariat professional and its relation to strategic information systems. All concepts were developed from an integrative review of the national literature. The aim was to determine the importance of information management and its applicability in the professional context of the Executive Secretariat. The discussion and theoretical reflection showed that today's Executive Secretary professional is prepared for the new organizational dynamics, incorporating technology into the management of information in the execution context. This is another task that confirms the profession's multifunctional character, with the secretary standing as an important information-manager figure in decision making within organizations.

  13. Strategies and Decision Support Systems for Integrating Variable Energy Resources in Control Centers for Reliable Grid Operations. Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Lawrence E. [Alstom Grid, Inc., Washington, DC (United States)

    2011-11-01

    This is the executive summary for a report that provides findings from the field regarding the best ways in which to guide operational strategies, business processes and control room tools to support the integration of renewable energy into electrical grids.

  14. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are further processed with the Principal Component Analysis (PCA) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
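
    A minimal sketch of a recurrence plot over a one-dimensional stand-in trace, plus two simple features of the kind that could feed PCA. Real traces of executed assembly instructions, and the coefficients the paper derives, are not reproduced here.

        import numpy as np

        def recurrence_matrix(trace, eps):
            """R[i, j] = 1 where trace points i and j are within eps of each other."""
            t = np.asarray(trace, dtype=float)
            return (np.abs(t[:, None] - t[None, :]) <= eps).astype(int)

        # Invented stand-in for a trace of executed instruction addresses.
        trace = [10, 12, 10, 12, 40, 41, 42, 10, 12]
        R = recurrence_matrix(trace, eps=1)

        # Simple recurrence features that could be fed to PCA for classification.
        recurrence_rate = R.mean()                                 # fraction of recurrent pairs
        diagonal_proxy = np.trace(R, offset=1) / (len(trace) - 1)  # repeated transitions
        print(recurrence_rate, diagonal_proxy)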

  15. White matter integrity in dyskinetic cerebral palsy: Relationship with intelligence quotient and executive function

    Directory of Open Access Journals (Sweden)

    Olga Laporta-Hoyos

    2017-01-01

    Conclusion: The widespread loss in the integrity of WM tissue is mainly located in the parietal lobe and related to IQ in dyskinetic CP. Unexpectedly, executive functions are related only to WM microstructure in regions containing fronto-cortical and posterior cortico-subcortical pathways, rather than being specifically related to the state of fronto-striatal pathways, which might be due to brain reorganization. Further studies of this nature may improve our understanding of the neurobiological bases of cognitive impairments after early brain insult.

  16. Information-theoretic analysis of the dynamics of an executable biological model.

    Directory of Open Access Journals (Sweden)

    Avital Sadot

    To facilitate analysis and understanding of biological systems, large-scale data are often integrated into models using a variety of mathematical and computational approaches. Such models describe the dynamics of the biological system and can be used to study the changes in the state of the system over time. For many model classes, such as discrete or continuous dynamical systems, there exist appropriate frameworks and tools for analyzing system dynamics. However, the heterogeneous information that encodes and bridges molecular and cellular dynamics, inherent to fine-grained molecular simulation models, presents significant challenges to the study of system dynamics. In this paper, we present an approach based on algorithmic information theory for the analysis and interpretation of the dynamics of such executable models of biological systems. We apply a normalized compression distance (NCD) analysis to the state representations of a model that simulates immune decision making and immune cell behavior. We show that this analysis successfully captures the essential information in the dynamics of the system, which results from a variety of events including proliferation, differentiation, or perturbations such as gene knock-outs. We demonstrate that this approach can be used for the analysis of executable models, regardless of the modeling framework, and for making experimentally quantifiable predictions.
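
    The normalized compression distance at the heart of this analysis is easy to sketch with an off-the-shelf compressor: NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(.) is compressed size. The "states" below are invented byte strings, not the model's actual state representations.

        import zlib

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized compression distance via zlib compressed sizes."""
            cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
            cxy = len(zlib.compress(x + y))
            return (cxy - min(cx, cy)) / max(cx, cy)

        # Invented stand-ins for two serialized simulation states.
        state_a = b"Tcell:naive;count=100;" * 50
        state_b = b"Tcell:activated;count=97;" * 50
        print(f"NCD = {ncd(state_a, state_b):.2f}")  # smaller = more similar dynamics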

  17. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  18. Integrating hydrologic modeling web services with online data sharing to prepare, store, and execute models in hydrology

    Science.gov (United States)

    Gan, T.; Tarboton, D. G.; Dash, P. K.; Gichamo, T.; Horsburgh, J. S.

    2017-12-01

    Web based apps, web services and online data and model sharing technology are becoming increasingly available to support research. This promises benefits in terms of collaboration, platform independence, transparency and reproducibility of modeling workflows and results. However, challenges still exist in real application of these capabilities and the programming skills researchers need to use them. In this research we combined hydrologic modeling web services with an online data and model sharing system to develop functionality to support reproducible hydrologic modeling work. We used HydroDS, a system that provides web services for input data preparation and execution of a snowmelt model, and HydroShare, a hydrologic information system that supports the sharing of hydrologic data, model and analysis tools. To make the web services easy to use, we developed a HydroShare app (based on the Tethys platform) to serve as a browser based user interface for HydroDS. In this integration, HydroDS receives web requests from the HydroShare app to process the data and execute the model. HydroShare supports storage and sharing of the results generated by HydroDS web services. The snowmelt modeling example served as a use case to test and evaluate this approach. We show that, after the integration, users can prepare model inputs or execute the model through the web user interface of the HydroShare app without writing program code. The model input/output files and metadata describing the model instance are stored and shared in HydroShare. These files include a Python script that is automatically generated by the HydroShare app to document and reproduce the model input preparation workflow. Once stored in HydroShare, inputs and results can be shared with other users, or published so that other users can directly discover, repeat or modify the modeling work. This approach provides a collaborative environment that integrates hydrologic web services with a data and model sharing
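
    A hedged sketch of the submit-and-poll pattern such a browser-based app might use against a processing service. The base URL, endpoints, and payload fields below are hypothetical illustrations, not the actual HydroDS or HydroShare APIs.

        import time
        import requests

        BASE = "https://hydro-service.example.org/api"  # hypothetical endpoint

        def run_remote_step(service: str, params: dict, token: str) -> dict:
            """Submit a processing request, then poll until the job finishes."""
            headers = {"Authorization": f"Token {token}"}
            job = requests.post(f"{BASE}/{service}", json=params, headers=headers).json()
            while True:
                status = requests.get(f"{BASE}/jobs/{job['id']}", headers=headers).json()
                if status["state"] in ("finished", "failed"):
                    return status
                time.sleep(5)

        # e.g. run_remote_step("delineate-watershed", {"outlet": [42.1, -111.8]}, "TOKEN")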

  19. Parents' Executive Functioning and Involvement in Their Child's Education: An Integrated Literature Review.

    Science.gov (United States)

    Wilson, Damali M; Gross, Deborah

    2018-04-01

    Parents' involvement in their children's education is integral to academic success. Several education-based organizations have identified recommendations for how parents can best support their children's learning. However, executive functioning (EF), a high-ordered cognitive skill set, contributes to the extent to which parents can follow through with these recommendations. This integrative review of the literature describes how executive function can affect parents' ability to facilitate and actively participate in their child's education and provides strategies for all school staff to strengthen parent-school partnerships when parents have limitations in EF. EF skills are fluid and influenced by several factors, including parental age, sleep, stress, and mood/affect. Despite possible limitations in parental EF, there are strategies school personnel can employ to strengthen partnership with parents to support their children's academic success. As reforms in education call for increased customization and collaboration with families, parental EF is an important consideration for school personnel. Awareness and understanding of how parents' EF affects children's learning will help schools better support parents in supporting their children's academic success.

  20. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    International Nuclear Information System (INIS)

    Smith, L.M.; Hochstedler, R.D.

    1997-01-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code)

  1. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    Science.gov (United States)

    Smith, L. M.; Hochstedler, R. D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
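
    One of the acceleration techniques named above, replacing linear searches with binary versions, is easy to illustrate. The lookup table below is invented, and Python's bisect module stands in for the modified FORTRAN routines.

        import bisect

        table = list(range(0, 1_000_000, 3))  # sorted lookup table (invented data)

        def linear_search(xs, target):
            for i, v in enumerate(xs):  # O(n): may scan every element
                if v >= target:
                    return i
            return len(xs)

        def binary_search(xs, target):
            return bisect.bisect_left(xs, target)  # O(log n) on sorted data

        # Same answer, far fewer comparisons on large tables.
        assert linear_search(table, 299_999) == binary_search(table, 299_999)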

  2. Integrated Project Teams - An Essential Element of Project Management during Project Planning and Execution - 12155

    Energy Technology Data Exchange (ETDEWEB)

    Burritt, James G.; Berkey, Edgar [Longenecker and Associates, Las Vegas, NV 89135 (United States)

    2012-07-01

    Managing complex projects requires a capable, effective project manager to be in place, assisted by a team of competent assistants in various relevant disciplines. This team of assistants is known as the Integrated Project Team (IPT). The IPT is composed of a multidisciplinary group of people who are collectively responsible for delivering a defined project outcome and who plan, execute, and implement over the entire life-cycle of a project, which can be a facility being constructed or a system being acquired. An ideal IPT includes empowered representatives from all functional areas involved with a project, such as engineering design, technology, manufacturing, test and evaluation, contracts, legal, logistics, and especially, the customer. Effective IPTs are an essential element of scope, cost, and schedule control for any complex, large construction project, whether funded by DOE or another organization. A recent assessment of a number of major, on-going DOE waste management projects has defined the characteristics of high-performing IPTs, as well as the reasons for potential IPT failure. Project managers should use IPTs to plan and execute projects, but the IPTs must be properly constituted and the members capable and empowered. To be effective, the project manager must select the right team and provide it with the necessary training and guidance. IPT members must treat their IPT assignment as a primary duty, not some ancillary function. All team members must have an understanding of the factors associated with successful IPTs, and the reasons that some IPTs fail. Integrated Project Teams should be used by both government and industry. (authors)

  3. Management development: a needs analysis for nurse executives and managers.

    Science.gov (United States)

    Kirk, R

    1987-04-01

    Clearly, both the nurse executive and nurse manager roles are becoming more complex. I sense an enthusiasm among the professionals in those positions to pursue development activities that will help them do their jobs better and with less discomfort. Nurse executives obviously know the power of combining knowledge with experience. How do the different leadership and management development needs identified by these NEs fit with your organization's needs? What is the content of your leadership and management development programs? Are your programs meeting the real needs of your executive and management-level staff? One way to find out is to do a simple survey. Today, nurse executives are responsible and accountable for challenges we never considered possible, even a few years ago. But along with the new challenges came the flexibility and positive attitudes of NEs to respond to changes and acquire new skills such as cost accounting, computers, or marketing. It's this type of proactive thinking that helps nurse leaders turn problems into opportunities and their situations into success stories.

  4. Analysis Of The Executive Components Of The Farmer Field School ...

    African Journals Online (AJOL)

    The purpose of this study was to investigate the executive components of the Farmer Field School (FFS) project in Uromieh county of West Azerbaijan Province, Iran. All the members and non-members (as control group) of FFS pilots in Uromieh county (N= 98) were included in the study. Data were collected by use of ...

  5. The Effects of Acute Stress on Core Executive Functions: A Meta-Analysis and Comparison with Cortisol

    Science.gov (United States)

    Shields, Grant S.; Sazma, Matthew A.; Yonelinas, Andrew P.

    2016-01-01

    Core executive functions such as working memory, inhibition, and cognitive flexibility are integral to daily life. A growing body of research has suggested that acute stress may impair core executive functions. However, there are a number of inconsistencies in the literature, leading to uncertainty about how or even if acute stress influences core executive functions. We addressed this by conducting a meta-analysis of acute stress effects on working memory, inhibition, and cognitive flexibility. We found that stress impaired working memory and cognitive flexibility, whereas it had nuanced effects on inhibition. Many of these effects were moderated by other variables, such as sex. In addition, we compared effects of acute stress on core executive functions to effects of cortisol administration and found some striking differences. Our findings indicate that stress works through mechanisms aside from or in addition to cortisol to produce a state characterized by more reactive processing of salient stimuli but greater control over actions. We conclude by highlighting some important future directions for stress and executive function research. PMID:27371161

  6. Parents' Executive Functioning and Involvement in Their Child's Education: An Integrated Literature Review

    Science.gov (United States)

    Wilson, Damali M.; Gross, Deborah

    2018-01-01

    Background: Parents' involvement in their children's education is integral to academic success. Several education-based organizations have identified recommendations for how parents can best support their children's learning. However, executive functioning (EF), a high-ordered cognitive skill set, contributes to the extent to which parents can…

  7. 48 CFR 952.223-71 - Integration of environment, safety, and health into work planning and execution.

    Science.gov (United States)

    2010-10-01

    This Federal Acquisition Regulation clause (48 CFR 952.223-71, Provisions and Clauses), titled "Integration of environment, safety, and health into work planning and execution," addresses the safety and health standards applicable to the work conditions of contractor and subcontractor employees…

  8. Strategy Execution: An integrative perspective and method for the knowledge-based economy

    OpenAIRE

    Strikwerda, J.

    2017-01-01

    Within the field of business administration, in research and in the practice of business, the issue of strategy execution lacks a generally accepted paradigm. Strategy execution so far has not received the attention it should be given in view of its critical role in the performance of the firm, especially with the growth of complexity in organizations. The attention that is usually given to strategy execution in the strategy literature and especially in popular management books is, with a few...

  9. Strategy Execution : An integrative perspective and method for the knowledge-based economy

    NARCIS (Netherlands)

    Strikwerda, J.

    2017-01-01

    Within the field of business administration, in research and in the practice of business, the issue of strategy execution lacks a generally accepted paradigm. Strategy execution so far has not received the attention it should be given in view of its critical role in the performance of the firm,

  10. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Executive summary: Volume 1

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software used in the safety systems of nuclear power plants. The framework for the work consisted of the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of software life-cycle activities; the assessment of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; Volume 2 is the main report

  11. An Analysis of a Hard Real-Time Execution Environment Extension for FreeRTOS

    Directory of Open Access Journals (Sweden)

    STANGACIU, C.

    2015-08-01

    Full Text Available FreeRTOS is a popular real-time operating system, which has received significant attention in recent years due to its main advantages: it is open source, portable, well documented and implemented on more than 30 architectures. The FreeRTOS execution environment is dynamic, preemptive and priority based, but it is not suitable for hard real-time tasks, because it provides task execution determinism only to a certain degree and cannot guarantee the absence of task execution jitter. As a solution to this problem, we propose a hard real-time execution extension to FreeRTOS in order to support a particular model of HRT tasks, called ModXs, which are executed with no jitter. This article presents a detailed analysis, in terms of scheduling, task execution and memory usage, of this hard real-time execution environment extension. The article concludes with the advantages this extension brings to the system, weighed against the small memory and timing overhead introduced.
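
    To make the jitter notion above concrete, the sketch below quantifies release jitter from a hypothetical trace of task release timestamps; the function, the trace values and the 10 ms period are illustrative assumptions, not material from the paper.

    ```python
    # Minimal illustration of "task execution jitter": the maximum deviation
    # of observed inter-release gaps from the nominal task period.
    # The trace below is hypothetical, not measured from FreeRTOS or ModX.

    def release_jitter(timestamps_us, period_us):
        """Worst-case deviation (us) of inter-release gaps from the period."""
        gaps = [b - a for a, b in zip(timestamps_us, timestamps_us[1:])]
        return max(abs(g - period_us) for g in gaps)

    releases = [0, 10_020, 19_990, 30_005, 40_000]     # microseconds
    print(release_jitter(releases, period_us=10_000))  # -> 30; a jitter-free ModX trace would give 0
    ```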

  12. Integrating and differentiating aspects of self-regulation: effortful control, executive functioning, and links to negative affectivity.

    Science.gov (United States)

    Bridgett, David J; Oddi, Kate B; Laake, Lauren M; Murdock, Kyle W; Bachmann, Melissa N

    2013-02-01

    Subdisciplines within psychology frequently examine self-regulation from different frameworks despite conceptually similar definitions of constructs. In the current study, similarities and differences between effortful control, based on the psychobiological model of temperament (Rothbart, Derryberry, & Posner, 1994), and executive functioning are examined and empirically tested in three studies (n = 509). Structural equation modeling indicated that effortful control and executive functioning are strongly associated and overlapping constructs (Study 1). Additionally, results indicated that effortful control is related to the executive function of updating/monitoring information in working memory, but not inhibition (Studies 2 and 3). Study 3 also demonstrates that better updating/monitoring information in working memory and better effortful control were uniquely linked to lower dispositional negative affect, whereas the executive function of low/poor inhibition was uniquely associated with an increased tendency to express negative affect. Furthermore, dispositional negative affect mediated the links between effortful control and, separately, the executive function of updating/monitoring information in working memory and the tendency to express negative affect. The theoretical implications of these findings are discussed, and a potential framework for guiding future work directed at integrating and differentiating aspects of self-regulation is suggested. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  13. Stakeholder Analysis of an Executable Achitecture Systems Engineering (EASE) Tool

    Science.gov (United States)

    2013-06-21

    The FCR tables and stakeholder feedback are then used as the foundation of a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. Finally, the SWOT analysis and stakeholder feedback are translated into an EASE future development strategy: a series of recommendations regarding...

  14. Technique of the biomechanical analysis of execution of upward jump piked

    Directory of Open Access Journals (Sweden)

    Nataliya Batieieva

    2016-12-01

    Full Text Available Purpose: the biomechanical analysis of execution of the upward jump piked. Material & Methods: the following research methods were used: theoretical analysis and synthesis of data of special scientific and methodical literature; photographing, video filming, biomechanical computer analysis, pedagogical observation. Students (n=8) of the chair of national choreography of the department of choreographic art of Kiev national university of culture and art took part in carrying out the biomechanical analysis of execution of the upward jump piked. Results: the biomechanical analysis of execution of the upward jump piked is carried out; the kinematic characteristics (path, speed, acceleration, effort) of the general center of weight (GCW) and of the centers of weight (CW) of the body biolinks of the executor (feet, shins, hips, shoulders, forearms, hands) are obtained. Biokinematic models (phases) are constructed. Power characteristics are defined – mechanical work and kinetic energy of the links of the legs and hands at execution of the upward jump piked. Conclusions: it is established that the technique of execution of the upward jump piked considerably influences the level of technical training of qualified sportsmen in gymnastics (sports), in aerobic gymnastics (aerobics), diving and dancing sports.
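
    As a rough illustration of how such kinematic characteristics are obtained from digitized video, the following sketch derives speed, acceleration and path length from hypothetical marker coordinates by finite differences; the 50 Hz frame rate and the coordinates are assumed for the example, not taken from the study.

    ```python
    import numpy as np

    dt = 1 / 50.0                               # assumed video frame interval, s
    xy = np.array([[0.00, 1.00], [0.02, 1.08],  # hypothetical CW marker positions, m
                   [0.05, 1.14], [0.09, 1.18]])

    velocity = np.gradient(xy, dt, axis=0)      # central differences -> m/s
    speed = np.linalg.norm(velocity, axis=1)    # scalar speed per frame
    acceleration = np.gradient(velocity, dt, axis=0)
    path_length = np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1))

    print(speed.round(2), acceleration.round(1), path_length.round(3))
    ```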

  15. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)]

    1997-11-01

    This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures in both normal and abnormal operating environments.

  16. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  17. Scientific Integrity and Executive National Security Proclamations: A Conflict of the Modern Age

    Science.gov (United States)

    Nelson, R.; Banerdt, B.; Bell, J. L.; Byrnes, D. V.; Carlisle, G. L.; D'Addario, L. R.; Weissman, P. R.; Eisenhardt, P. R.; Foster, S. D.; Golombek, M. P.; Gorjian, V.; Gorjian, Z.; Hale, A. S.; Kulleck, J. G.; Laubach, S. L.; McElrath, T. P.; Penanen, K. I.; Satter, C.; Walker, W. J.

    2010-12-01

    In 2004, in response to the events of September 11, 2001, President George W. Bush issued Homeland Security Presidential Directive #12, an executive order requiring a uniform means of identification (i.e. identification badge) for all employees and contractors at federal facilities. To comply with this directive NASA ordered that its contract employees at the Jet Propulsion Laboratory 'voluntarily' agree to an open ended, unrestricted, background investigation into the intimate details of their private lives. These employees do not have security clearances and do not work with classified material. Caltech, which employs the JPL personnel under a NASA management contract, informed the employees that if they did not 'voluntarily' consent to the background investigation, they would be assumed to have voluntarily resigned and therefore be denied access to JPL (i.e. they would be functionally terminated). In October 2007, twenty-eight JPL employees filed suit in Federal District Court. After an initial dismissal by the lowest federal court, the Ninth Circuit Court of Appeals issued an injunction against Caltech and NASA, stopping the background investigations. The Appeals Court found that the investigations were not narrowly tailored to meet the specific needs of NASA and therefore violated the employees' legitimate expectation of informational privacy. This injunction has been reviewed and upheld several times by various panels of the Ninth Circuit Court of Appeals. In November 2009, the United States Department of Justice petitioned the U.S. Supreme Court requesting that it overturn this injunction. The Supreme Court accepted the case for oral arguments and scheduled them for October 5, 2010. A decision is imminent (if it has not been made already). The case has opened the following questions regarding all research workers under government contract: 1. What impact would such intrusive investigations have on open scientific inquiry and scientific integrity? 2

  18. Exploring the role of the posterior middle temporal gyrus in semantic cognition: Integration of anterior temporal lobe with executive processes.

    Science.gov (United States)

    Davey, James; Thompson, Hannah E; Hallam, Glyn; Karapanagiotidis, Theodoros; Murphy, Charlotte; De Caso, Irene; Krieger-Redwood, Katya; Bernhardt, Boris C; Smallwood, Jonathan; Jefferies, Elizabeth

    2016-08-15

    Making sense of the world around us depends upon selectively retrieving information relevant to our current goal or context. However, it is unclear whether selective semantic retrieval relies exclusively on general control mechanisms recruited in demanding non-semantic tasks, or instead on systems specialised for the control of meaning. One hypothesis is that the left posterior middle temporal gyrus (pMTG) is important in the controlled retrieval of semantic (not non-semantic) information; however this view remains controversial since a parallel literature links this site to event and relational semantics. In a functional neuroimaging study, we demonstrated that an area of pMTG implicated in semantic control by a recent meta-analysis was activated in a conjunction of (i) semantic association over size judgements and (ii) action over colour feature matching. Under these circumstances the same region showed functional coupling with the inferior frontal gyrus - another crucial site for semantic control. Structural and functional connectivity analyses demonstrated that this site is at the nexus of networks recruited in automatic semantic processing (the default mode network) and executively demanding tasks (the multiple-demand network). Moreover, in both task and task-free contexts, pMTG exhibited functional properties that were more similar to ventral parts of inferior frontal cortex, implicated in controlled semantic retrieval, than more dorsal inferior frontal sulcus, implicated in domain-general control. Finally, the pMTG region was functionally correlated at rest with other regions implicated in control-demanding semantic tasks, including inferior frontal gyrus and intraparietal sulcus. We suggest that pMTG may play a crucial role within a large-scale network that allows the integration of automatic retrieval in the default mode network with executively-demanding goal-oriented cognition, and that this could support our ability to understand actions and non

  19. Physical exercise and executive functions in preadolescent children, adolescents and young adults: a meta-analysis.

    Science.gov (United States)

    Verburgh, Lot; Königs, Marsh; Scherder, Erik J A; Oosterlaan, Jaap

    2014-06-01

    The goal of this meta-analysis was to aggregate available empirical studies on the effects of physical exercise on executive functions in preadolescent children (6-12 years of age), adolescents (13-17 years of age) and young adults (18-35 years of age). The electronic databases PubMed, EMBASE and SPORTDiscus were searched for relevant studies reporting on the effects of physical exercise on executive functions. Nineteen studies were selected. There was a significant overall effect of acute physical exercise on executive functions (d=0.52, 95% CI 0.29 to 0.76, p<0.001), whereas there was no significant effect of chronic physical exercise (d=0.14, 95% CI -0.04 to 0.32, p=0.19) on executive functions, a difference that was itself significant (Q(1)=5.08, p<0.05). Domain-specific analyses showed a significant effect of acute physical exercise on inhibition/interference control (d=0.46, 95% CI 0.33 to 0.60, p<0.001), but no significant effect of acute physical exercise on planning (d=0.16, 95% CI 0.18 to 0.89, p=0.18). Results suggest that acute physical exercise enhances executive functioning. The number of studies on chronic physical exercise is limited and it should be investigated whether chronic physical exercise shows effects on executive functions comparable to acute physical exercise. This is highly relevant in preadolescent children and adolescents, given the importance of well-developed executive functions for daily life functioning and the current increase in sedentary behaviour in these age groups. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
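
    The pooled d values reported above come from standard meta-analytic aggregation of study-level effect sizes. As a hedged illustration, the sketch below implements DerSimonian-Laird random-effects pooling on invented effect sizes and variances; whether the authors used this exact estimator is an assumption.

    ```python
    import numpy as np

    def dersimonian_laird(d, var):
        """Random-effects pooled effect with a 95% CI (DerSimonian-Laird)."""
        d, var = np.asarray(d, float), np.asarray(var, float)
        w = 1.0 / var                               # fixed-effect weights
        d_fixed = np.sum(w * d) / np.sum(w)
        q = np.sum(w * (d - d_fixed) ** 2)          # Cochran's Q heterogeneity
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(d) - 1)) / c)     # between-study variance
        w_star = 1.0 / (var + tau2)                 # random-effects weights
        pooled = np.sum(w_star * d) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, pooled - 1.96 * se, pooled + 1.96 * se

    # Invented per-study effect sizes and sampling variances.
    print(dersimonian_laird([0.40, 0.70, 0.35], [0.02, 0.05, 0.03]))
    ```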

  20. Executive Function in Preschool-Age Children: Integrating Measurement, Neurodevelopment, and Translational Research

    Science.gov (United States)

    Griffin, James A., Ed.; McCardle, Peggy, Ed.; Freund, Lisa, Ed.

    2016-01-01

    A primary aim of the neuropsychological revolution has been the mapping of what has come to be known as executive function (EF). This term encompasses a range of mental processes such as working memory, inhibitory control, and cognitive flexibility that, together, regulate our social behavior, and our emotional and cognitive well-being. In this…

  1. Workers in an Integrating World. World Development Report, 1995. Executive Summary.

    Science.gov (United States)

    World Bank, Washington, DC.

    This executive summary examines the rapid changes occurring in economic markets and employment around the world. The report concludes that problems of low incomes, poor working conditions, and insecurity affecting many of the world's workers can be tackled effectively in ways that reduce poverty and regional inequality. Sound domestic policy and a…

  2. The Phenomenon of "Being-In-Management" in Executive Education Programmes: An Integrative View

    Science.gov (United States)

    Sewchurran, Kosheek; McDonogh, Jennifer

    2015-01-01

    Currently, we experience a situation in society in general as well as business school education where leaders and executives prefer to remain ambivalent and inauthentic about humanity's worsening socio-economic challenges. As a result of this, we continue with regimes of common sense that have lost their legitimacy and perpetuate an unsustainable…

  3. Stethoscope: A platform for interactive visual analysis of query execution plans

    NARCIS (Netherlands)

    M.M. Gawade (Mrunal); M.L. Kersten (Martin)

    2012-01-01

    Searching for the performance bottleneck in an execution trace is an error-prone and time-consuming activity. Existing tools offer some comfort by providing a visual representation of the trace for analysis. In this paper we present the Stethoscope, an interactive visual tool to inspect and

  4. Stethoscope: a platform for interactive visual analysis of query execution plans

    NARCIS (Netherlands)

    Gawade, M.; Kersten, M.

    2012-01-01

    Searching for the performance bottleneck in an execution trace is an error-prone and time-consuming activity. Existing tools offer some comfort by providing a visual representation of the trace for analysis. In this paper we present the Stethoscope, an interactive visual tool to inspect and analyze

  5. Physical exercise and executive functions in preadolescent children, adolescents and young adults: a meta-analysis

    NARCIS (Netherlands)

    Verburgh, L.; Konigs, M.; Scherder, E.J.A.; Oosterlaan, J.

    2014-01-01

    Purpose: The goal of this meta-analysis was to aggregate available empirical studies on the effects of physical exercise on executive functions in preadolescent children (6-12 years of age), adolescents (13-17 years of age) and young adults (18-35 years of age). Method: The electronic databases

  6. Physical exercise and executive functions in preadolescent children, adolescents and young adults: a meta-analysis

    NARCIS (Netherlands)

    Verburgh, L.; Konigs, M.; Scherder, E.J.A.; Oosterlaan, J.

    2013-01-01

    Purpose: The goal of this meta-analysis was to aggregate available empirical studies on the effects of physical exercise on executive functions in preadolescent children (6-12 years of age), adolescents (13-17 years of age) and young adults (18-35 years of age). Method: The electronic databases

  7. Some Supplementary Methods for the Analysis of the Delis-Kaplan Executive Function System

    Science.gov (United States)

    Crawford, John R.; Garthwaite, Paul H.; Sutherland, David; Borland, Nicola

    2011-01-01

    Supplementary methods for the analysis of the Delis-Kaplan Executive Function System (Delis, Kaplan, & Kramer, 2001) are made available, including (a) quantifying the number of abnormally low achievement scores exhibited by an individual and accompanying this with an estimate of the percentage of the normative population expected to exhibit at…

  8. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Full Text Available Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI) technology. The analysis was performed based on a questionnaire enquiring as to whether the information needs of executives were met during the process. A theoretical framework was applied consisting of information architecture and BI technology, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP) tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.
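
    As a toy illustration of the OLAP-style multidimensional view such an architecture is meant to give executives, the snippet below pivots a few invented transaction rows into a region-by-quarter revenue summary; all names and figures are hypothetical.

    ```python
    import pandas as pd

    # Invented transactional rows; a warehouse/OLAP layer serves such
    # aggregated views interactively instead of recomputing them per query.
    sales = pd.DataFrame({
        "region":  ["North", "North", "South", "South"],
        "quarter": ["Q1", "Q2", "Q1", "Q2"],
        "revenue": [120, 135, 90, 110],
    })

    # Region-by-quarter slice: the kind of view an executive dashboard shows.
    view = sales.pivot_table(values="revenue", index="region",
                             columns="quarter", aggfunc="sum")
    print(view)
    ```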

  9. Towards an Improved Integration of Sustainability in Finance - The French (eco)system. Executive summary

    International Nuclear Information System (INIS)

    Morel, Romain; Cochran, Ian; Robins, Nick

    2015-11-01

    In the run-up to COP21, much international attention is focused on France. While mainly related to climate change negotiations, this creates an opportunity to take a broader look at French domestic policies and practices on sustainability. This report presents the French financial system and draws lessons from France's ongoing experience in improving the integration of sustainability issues, lessons that could be shared with other countries. The present report summarizes and analyses the key initiatives and dynamics at stake in France. It focuses on both the climate-related issues that have recently received significant attention and the development of broader Environmental, Social and Governance (ESG) issues over the past twenty years. The dynamics that have shaped the last two decades have both led to and been influenced by the emergence of an 'ecosystem' of commercial, public and non-profit actors and experts involved in the appropriation and integration of sustainability issues across the sector. Using the framework of analysis presented in the UNEP Inquiry global report, this case study examines the landscape of actors, private initiatives and public policy that has driven the emergence of this ecosystem and helped foster capacity building and the acquisition of expertise among sectoral actors. (authors)

  10. Analysis of Conflict Centers in Projects Procured with Traditional and Integrated Methods in Nigeria

    Directory of Open Access Journals (Sweden)

    Martin O. Dada

    2012-07-01

    Full Text Available Conflicts in any organization can be either functional or dysfunctional and can contribute to or detract from the achievement of organizational or project objectives. This study investigated the frequency and intensity of conflicts, using five conflict centers, on projects executed with either the integrated or the traditional method in Nigeria. Questionnaires were administered through purposive and snowballing techniques on 274 projects located in twelve states of Nigeria and Abuja; 94 usable responses were obtained. The collected data were subjected to both descriptive and inferential statistical analysis. In projects procured with traditional methods, conflicts relating to resources for project execution had the greatest frequency, while conflicts around project/client goals had the least frequency. For projects executed with integrated methods, conflicts due to administrative procedures were ranked highest while conflicts due to project/client goals were ranked least. Regarding seriousness of conflict, conflicts due to administrative procedures and resources for project execution were ranked highest, respectively, for projects procured with traditional and integrated methods. Additionally, in terms of seriousness, personality issues and project/client goals were ranked lowest as sources of conflict in projects executed with traditional and integrated methods, respectively. There were no significant differences in the incidence of conflicts, using the selected conflict centers, between the traditional and integrated procurement methods. There was, however, a significant difference in the intensity or seriousness of conflicts between projects executed with the traditional method and those executed with integrated methods in the following areas: technical issues, administrative matters and personality issues. The study recommends that conscious efforts should be made at team building on projects executed with integrated methods.

  11. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

    Structural analysis of the piping system for nuclear power plants has become larger in scale and in quantity. In addition, higher quality analysis is regarded as of major importance nowadays from the point of view of nuclear plant safety. In order to fulfill the above requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply a data base system: all information is concentrated. 2. To minimize the manual process in analysis, evaluation and documentation, especially by applying the graphic system as much as possible. On the basis of the above philosophy four subsystems were made: 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The function of the data control subsystem is to control all information of the data base. Piping structural analysis can be performed by using the analysis subsystem. Isometric piping drawings and mode shapes, etc. can be plotted by using the plotting subsystem. A total analysis report can be made without the manual process through the report subsystem. (author)

  12. Meta-analysis of executive functioning in ecstasy/polydrug users.

    Science.gov (United States)

    Roberts, C A; Jones, A; Montgomery, C

    2016-06-01

    Ecstasy/3,4-methylenedioxymethamphetamine (MDMA) use is proposed to cause damage to serotonergic (5-HT) axons in humans. Therefore, users should show deficits in cognitive processes that rely on serotonin-rich, prefrontal areas of the brain. However, there is inconsistency in findings to support this hypothesis. The aim of the current study was to examine deficits in executive functioning in ecstasy users compared with controls using meta-analysis. We identified k = 39 studies, contributing 89 effect sizes, investigating executive functioning in ecstasy users and polydrug-using controls. We compared function-specific task performance in 1221 current ecstasy users and 1242 drug-using controls, from tasks tapping the executive functions - updating, switching, inhibition and access to long-term memory. The significant main effect demonstrated overall executive dysfunction in ecstasy users [standardized mean difference (SMD) = -0.18, 95% confidence interval (CI) -0.26 to -0.11, Z = 5.05, p < 0.001]. Ecstasy users showed significant performance deficits in access to long-term memory (SMD = -0.33, 95% CI -0.46 to -0.19, Z = 4.72, p < 0.001). This is the largest meta-analysis of executive functioning in ecstasy users to date and provides a behavioural correlate of potential serotonergic neurotoxicity.

  13. The Execution and Evaluation of an Integrated Business Common Core Curriculum.

    Science.gov (United States)

    Pharr, Steven W.; Morris, John S.; Stover, Dana; Byers, C. Randall; Reyes, Mario G.

    1998-01-01

    Describes the rationale, process, and organization of an integrated, cross-disciplinary undergraduate program known as the Integrated Business Common Core (IBC) at the University of Idaho. Indicates that IBC's goal is to provide students with an understanding of key business issues, with emphasis on processes. (2 tables and 11 references) (JDI)

  14. Integrated sequence analysis. Final report

    International Nuclear Information System (INIS)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of different case studies. These lessons cover areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally follows a discussion about the project, and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  15. Longitudinal Analysis of Music Education on Executive Functions in Primary School Children

    Science.gov (United States)

    Jaschke, Artur C.; Honing, Henkjan; Scherder, Erik J. A.

    2018-01-01

    Background: Research on the effects of music education on cognitive abilities has generated increasing interest across the scientific community. Nonetheless, longitudinal studies investigating the effects of structured music education on cognitive sub-functions are still rare. Prime candidates for investigating a relationship between academic achievement and music education appear to be executive functions such as planning, working memory, and inhibition. Methods: One hundred and forty-seven primary school children, M age = 6.4 years, SD = 0.65, were followed for 2.5 years. Participants were randomized into four groups: two music intervention groups, one active visual arts group, and a no arts control group. Neuropsychological tests assessed verbal intelligence and executive functions. Additionally, a national pupil monitor provided data on academic performance. Results: Children in the visual arts group performed better on visuospatial memory tasks as compared to the three other conditions. However, the test scores on inhibition, planning and verbal intelligence increased significantly in the two music groups over time as compared to the visual arts and no arts controls. Mediation analysis with executive functions and verbal IQ as mediators for academic performance has shown a possible far transfer effect from executive sub-functions to academic performance scores. Discussion: The present results indicate a positive influence of long-term music education on cognitive abilities such as inhibition and planning. Of note, following a two-and-a-half-year-long visual arts program significantly improves scores on a visuospatial memory task. All results combined, this study supports a far transfer effect from music education to academic achievement mediated by executive sub-functions. PMID:29541017

  16. Longitudinal Analysis of Music Education on Executive Functions in Primary School Children

    Directory of Open Access Journals (Sweden)

    Artur C. Jaschke

    2018-02-01

    Full Text Available Background: Research on the effects of music education on cognitive abilities has generated increasing interest across the scientific community. Nonetheless, longitudinal studies investigating the effects of structured music education on cognitive sub-functions are still rare. Prime candidates for investigating a relationship between academic achievement and music education appear to be executive functions such as planning, working memory, and inhibition. Methods: One hundred and forty-seven primary school children, M age = 6.4 years, SD = 0.65, were followed for 2.5 years. Participants were randomized into four groups: two music intervention groups, one active visual arts group, and a no arts control group. Neuropsychological tests assessed verbal intelligence and executive functions. Additionally, a national pupil monitor provided data on academic performance. Results: Children in the visual arts group performed better on visuospatial memory tasks as compared to the three other conditions. However, the test scores on inhibition, planning and verbal intelligence increased significantly in the two music groups over time as compared to the visual arts and no arts controls. Mediation analysis with executive functions and verbal IQ as mediators for academic performance has shown a possible far transfer effect from executive sub-functions to academic performance scores. Discussion: The present results indicate a positive influence of long-term music education on cognitive abilities such as inhibition and planning. Of note, following a two-and-a-half-year-long visual arts program significantly improves scores on a visuospatial memory task. All results combined, this study supports a far transfer effect from music education to academic achievement mediated by executive sub-functions.

  17. Longitudinal Analysis of Music Education on Executive Functions in Primary School Children.

    Science.gov (United States)

    Jaschke, Artur C; Honing, Henkjan; Scherder, Erik J A

    2018-01-01

    Background: Research on the effects of music education on cognitive abilities has generated increasing interest across the scientific community. Nonetheless, longitudinal studies investigating the effects of structured music education on cognitive sub-functions are still rare. Prime candidates for investigating a relationship between academic achievement and music education appear to be executive functions such as planning, working memory, and inhibition. Methods: One hundred and forty-seven primary school children, M age = 6.4 years, SD = 0.65, were followed for 2.5 years. Participants were randomized into four groups: two music intervention groups, one active visual arts group, and a no arts control group. Neuropsychological tests assessed verbal intelligence and executive functions. Additionally, a national pupil monitor provided data on academic performance. Results: Children in the visual arts group performed better on visuospatial memory tasks as compared to the three other conditions. However, the test scores on inhibition, planning and verbal intelligence increased significantly in the two music groups over time as compared to the visual arts and no arts controls. Mediation analysis with executive functions and verbal IQ as mediators for academic performance has shown a possible far transfer effect from executive sub-functions to academic performance scores. Discussion: The present results indicate a positive influence of long-term music education on cognitive abilities such as inhibition and planning. Of note, following a two-and-a-half-year-long visual arts program significantly improves scores on a visuospatial memory task. All results combined, this study supports a far transfer effect from music education to academic achievement mediated by executive sub-functions.
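
    The mediation analysis mentioned in these reports asks whether the effect of music training on academic scores travels through executive function. A minimal sketch of that logic, on simulated data with a bootstrapped confidence interval for the indirect effect, is shown below; the variable roles and coefficients are assumptions, not the study's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 147                                     # sample size echoing the study
    x = rng.integers(0, 2, n).astype(float)     # music intervention indicator
    m = 0.5 * x + rng.normal(size=n)            # executive-function score
    y = 0.6 * m + 0.1 * x + rng.normal(size=n)  # academic performance

    def indirect_effect(x, m, y):
        a = np.polyfit(x, m, 1)[0]              # path a: M ~ X
        design = np.column_stack([np.ones_like(x), x, m])
        b = np.linalg.lstsq(design, y, rcond=None)[0][2]  # path b: Y ~ M given X
        return a * b

    # Percentile bootstrap of the indirect (a*b) effect.
    boot = [indirect_effect(x[i], m[i], y[i])
            for i in (rng.integers(0, n, n) for _ in range(2000))]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    ```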

  18. Executing on Integration: The Key to Success in Mergers and Acquisitions.

    Science.gov (United States)

    Bradley, Carol

    2016-01-01

    Health care mergers and acquisitions require a clearly stated vision and exquisite planning of integration activities to provide the best possible conditions for a successful transaction. During the due diligence process, key steps can be taken to create a shared vision and a plan to inspire confidence and build enthusiasm for all stakeholders. Integration planning should include a defined structure, roles and responsibilities, as well as a method for evaluation.

  19. Anorexia nervosa and bulimia nervosa: A meta-analysis of executive functioning.

    Science.gov (United States)

    Hirst, Rayna B; Beard, Charlotte L; Colby, Katrina A; Quittner, Zoe; Mills, Brent M; Lavender, Jason M

    2017-12-01

    Research investigating the link between eating disorder (ED) diagnosis and executive dysfunction has had conflicting results, yet no meta-analyses have examined the overall association of ED pathology with executive functioning (EF). Effect sizes were extracted from 32 studies comparing ED groups (27 of anorexia nervosa, 9 of bulimia nervosa) with controls to determine the grand mean effect on EF. Analyses included effects for individual EF measures, as well as an age-based subgroup analysis. There was a medium effect of ED diagnosis on executive functioning, with bulimia nervosa demonstrating a larger effect (Hedges's g=-0.70) than anorexia nervosa (g=-0.41). Within anorexia nervosa studies, subgroup analyses were conducted for age and diagnostic subtype. The effect of anorexia nervosa on EF was largest in adults; however, subgroup differences for age were not significant. Anorexia and bulimia nervosa are associated with EF deficits, which are particularly notable for individuals with bulimia nervosa. The present analysis includes recommendations for future studies regarding study design and EF measurement. Copyright © 2017 Elsevier Ltd. All rights reserved.
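
    For readers unfamiliar with the effect-size metric used here, the sketch below computes Hedges's g, a small-sample-corrected standardized mean difference, from invented group statistics; the numbers are purely illustrative, not drawn from the included studies.

    ```python
    import numpy as np

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Hedges's g: Cohen's d with a small-sample bias correction."""
        sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp                      # Cohen's d using the pooled SD
        j = 1 - 3 / (4 * (n1 + n2) - 9)         # Hedges correction factor
        return j * d

    # An ED group scoring below controls yields a negative g, as reported.
    print(round(hedges_g(45.0, 10.0, 30, 52.0, 9.0, 32), 2))  # -> -0.73
    ```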

  20. A Thematic Analysis of Self-described Authentic Leadership Behaviors Among Experienced Nurse Executives.

    Science.gov (United States)

    Alexander, Catherine; Lopez, Ruth Palan

    2018-01-01

    The aim of this study is to understand the behaviors experienced nurse executives use to create healthy work environments (HWEs). The constructs of authentic leadership formed the conceptual framework for the study. The American Association of Critical-Care Nurses recommends authentic leadership as the preferred style of leadership for creating and sustaining HWEs. Behaviors associated with authentic leadership in nursing are not well understood. A purposive sample of 17 experienced nurse executives were recruited from across the United States for this qualitative study. Thematic analysis was used to analyze the in-depth, semistructured interviews. Four constructs of authentic leaders were supported and suggest unique applications of each including self-awareness (a private and professional self), balanced processing (open hearted), transparency (limiting exposure), and moral leadership (nursing compass). Authentic leadership may provide a sound foundation to support nursing leadership practices; however, its application to the discipline requires additional investigation.

  1. Self-image and Missions of Universities: An Empirical Analysis of Japanese University Executives

    Directory of Open Access Journals (Sweden)

    Masataka Murasawa

    2014-05-01

    Full Text Available As universities in Japan gain institutional autonomy in managing internal organizations, independent of governmental control as a result of deregulation and decentralizing reforms, it is becoming increasingly important that the executives and administrators of each institution demonstrate clear and strategic vision and ideas to external stakeholders, in order to maintain financially robust operations and the attractiveness of their institutions. This paper considers whether and how the self-image, mission, and vision of universities are perceived and internalized by the management of Japanese universities and empirically examines the determinants shaping such individual perceptions. The result of our descriptive analysis indicates that the recent government policy to internationalize domestic universities has not shown much progress in the view of university executives in Japan. Instead, these universities increasingly emphasize serving local needs in research and teaching. Individual perceptions among Japanese university executives with regard to the missions and functional roles to be played by their institutions are influenced by managerial rank as well as the field of their academic training. A multiple regression analysis reveals that the economy of scale brought about by expanded undergraduate student enrollment gradually slows and decelerates executive perceptions with regard to establishing a globally recognized status in research and teaching. Moreover, Japanese universities with a small proportion of graduate student enrollment likely opted out of competing for greater respect in the global community of higher education between 2005 and 2012. Finally, the management of universities granted the same amount of external research funds in both studied years responded more passively in 2012 than in 2005 on the self-assessment of whether they had established a status as a global

  2. A primer for biomedical scientists on how to execute model II linear regression analysis.

    Science.gov (United States)

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
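
    As a concrete companion to point 5, here is a sketch of ordinary least products (geometric mean) regression with bootstrapped 95% confidence intervals in the spirit of smatr; it is not the smatr implementation itself, and the data are simulated.

    ```python
    import numpy as np

    def olp(x, y):
        """Ordinary least products (geometric mean) slope and intercept."""
        r = np.corrcoef(x, y)[0, 1]
        slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
        return slope, np.mean(y) - slope * np.mean(x)

    rng = np.random.default_rng(1)
    x = rng.normal(10, 2, 50)
    y = 1.5 * x + rng.normal(0, 2, 50)          # both variables carry error (Model II)

    slope, intercept = olp(x, y)
    # Percentile bootstrap for the slope CI, as smatr does internally.
    boots = np.array([olp(x[i], y[i]) for i in
                      (rng.integers(0, 50, 50) for _ in range(2000))])
    s_lo, s_hi = np.percentile(boots[:, 0], [2.5, 97.5])
    print(f"slope {slope:.2f} (95% CI {s_lo:.2f} to {s_hi:.2f}), intercept {intercept:.2f}")
    ```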

  3. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of different case studies. These lessons cover areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally follows a discussion about the project, and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  4. Continuous executive function disruption interferes with application of an information integration categorization strategy.

    Science.gov (United States)

    Miles, Sarah J; Matsuki, Kazunaga; Minda, John Paul

    2014-07-01

    Category learning is often characterized as being supported by two separate learning systems. A verbal system learns rule-defined (RD) categories that can be described using a verbal rule and relies on executive functions (EFs) to learn via hypothesis testing. A nonverbal system learns non-rule-defined (NRD) categories that cannot be described by a verbal rule and uses automatic, procedural learning. The verbal system is dominant in that adults tend to use it during initial learning but may switch to the nonverbal system when the verbal system is unsuccessful. The nonverbal system has traditionally been thought to operate independently of EFs, but recent studies suggest that EFs may play a role in the nonverbal system-specifically, to facilitate the transition away from the verbal system. Accordingly, continuously interfering with EFs during the categorization process, so that EFs are never fully available to facilitate the transition, may be more detrimental to the nonverbal system than is temporary EF interference. Participants learned an NRD or an RD category while EFs were untaxed, taxed temporarily, or taxed continuously. When EFs were continuously taxed during NRD categorization, participants were less likely to use a nonverbal categorization strategy than when EFs were temporarily taxed, suggesting that when EFs were unavailable, the transition to the nonverbal system was hindered. For the verbal system, temporary and continuous interference had similar effects on categorization performance and on strategy use, illustrating that EFs play an important but different role in each of the category-learning systems.

  5. White matter integrity in veterans with mild traumatic brain injury: associations with executive function and loss of consciousness.

    Science.gov (United States)

    Sorg, Scott F; Delano-Wood, Lisa; Luc, Norman; Schiehser, Dawn M; Hanson, Karen L; Nation, Daniel A; Lanni, Elisa; Jak, Amy J; Lu, Kun; Meloy, M J; Frank, Lawrence R; Lohr, James B; Bondi, Mark W

    2014-01-01

    We investigated, using diffusion tensor imaging (DTI), the association between white matter integrity and executive function (EF) performance in postacute mild traumatic brain injury (mTBI). In addition, we examined whether injury severity, as measured by loss of consciousness (LOC) versus alterations in consciousness (AOC), is related to white matter microstructural alterations and neuropsychological outcome. Participants were thirty Iraq and Afghanistan War era veterans with a history of mTBI and 15 healthy veteran control participants. There were no significant overall group differences between control and mTBI participants on DTI measures. However, a subgroup of mTBI participants with EF decrements (n = 13) demonstrated significantly decreased fractional anisotropy of prefrontal white matter, corpus callosum, and cingulum bundle structures compared with mTBI participants without EF decrements (n = 17) and control participants. Participants having mTBI with LOC were more likely to evidence reduced EF performances and disrupted ventral prefrontal white matter integrity when compared with either mTBI participants without LOC or control participants. Findings suggest that altered white matter integrity contributes to reduced EF in subgroups of veterans with a history of mTBI and that LOC may be a risk factor for reduced EF as well as associated changes to ventral prefrontal white matter.

  6. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios Ladoukakis

    2014-11-01

    Full Text Available The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e., Sanger). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering the management, and even the storage, of these data critical bottlenecks with respect to the overall analytical endeavor. The enormous complexity is further aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential for each analytical task in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or for the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires grand computational resources, imposing the utilization of cloud computing infrastructures as the sole realistic solution. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, embracing various major sequencing technologies and applications.
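
    The chained quality-control, assembly and annotation steps described above are what integrative pipelines automate. The sketch below shows the bare orchestration pattern only; the three command lines are placeholders, not real tool invocations.

    ```python
    import subprocess

    # Toy orchestration of a QC -> assembly -> annotation chain.
    # Every command here is a hypothetical placeholder, not a real tool.
    PIPELINE = [
        ["qc_tool", "reads.fastq", "-o", "clean.fastq"],        # hypothetical
        ["assembler_tool", "clean.fastq", "-o", "contigs.fa"],  # hypothetical
        ["annotator_tool", "contigs.fa", "-o", "annot.tsv"],    # hypothetical
    ]

    def run_pipeline(steps):
        for cmd in steps:
            print("running:", " ".join(cmd))
            # check=True aborts at the first failing stage, the fail-fast
            # behaviour integrated pipelines typically enforce.
            subprocess.run(cmd, check=True)

    # run_pipeline(PIPELINE)  # would execute the placeholder stages in order
    ```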

  7. Single-trial EEG-informed fMRI analysis of emotional decision problems in hot executive function.

    Science.gov (United States)

    Guo, Qian; Zhou, Tiantong; Li, Wenjie; Dong, Li; Wang, Suhong; Zou, Ling

    2017-07-01

    Executive function refers to the conscious control of psychological processes related to thinking and action. Emotional decision making is part of hot executive function and combines emotional and logical elements. As an important social adaptation ability, it has attracted growing attention in recent years. Gambling tasks are well suited to the study of emotional decision making. Because fMRI studies of gambling tasks have reported inconsistent brain activation regions, this study adopted EEG-fMRI fusion technology to reveal the brain's neural activity related to feedback stimuli. In this study, an EEG-informed fMRI analysis was applied to process simultaneous EEG-fMRI data. First, relative power-spectrum analysis and the K-means clustering method were performed separately to extract EEG-fMRI features. Then, general linear models (GLMs) were constructed from the fMRI data using different EEG features as regressors. The results showed that for win versus loss stimuli, the activated regions almost covered the caudate, the ventral striatum (VS), the orbital frontal cortex (OFC), and the cingulate. Wider activation areas associated with reward and punishment were revealed by the EEG-fMRI integration analysis than by conventional fMRI analysis, such as the posterior cingulate and the OFC. The VS and the medial prefrontal cortex (mPFC) were found when EEG power features were used as GLM regressors, compared with results entering the amplitudes of the feedback-related negativity (FRN) as regressors. Furthermore, brain region activation intensity was strongest when theta-band power was used as a regressor, compared with the other two fusion results. The EEG-informed fMRI analysis can thus depict the whole-brain activation map more accurately and help analyze emotional decision problems.
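
    The core construction in an EEG-informed fMRI analysis is a parametric regressor built from single-trial EEG features. The sketch below builds one by weighting trial onsets with simulated theta power, convolving with a canonical double-gamma HRF and fitting a GLM; all data, the TR and the HRF form are assumptions for illustration, not the paper's exact model.

    ```python
    import numpy as np
    from math import factorial

    rng = np.random.default_rng(2)
    tr, n_scans = 2.0, 200                            # assumed TR and run length

    # Canonical double-gamma HRF sampled at the TR (standard form).
    t = np.arange(0, 30, tr)
    hrf = t**5 * np.exp(-t) / factorial(5) - t**15 * np.exp(-t) / (6 * factorial(15))

    onsets = rng.choice(n_scans - 20, size=30, replace=False)  # trial onsets (scans)
    theta = rng.normal(1.0, 0.3, size=30)             # single-trial EEG theta power

    stick = np.zeros(n_scans)
    stick[onsets] = theta                             # EEG-weighted event train
    regressor = np.convolve(stick, hrf)[:n_scans]     # parametric BOLD predictor

    bold = 0.8 * regressor + rng.normal(0, 0.5, n_scans)   # synthetic voxel signal
    design = np.column_stack([np.ones(n_scans), regressor])
    beta = np.linalg.lstsq(design, bold, rcond=None)[0]
    print(f"estimated EEG-modulation beta: {beta[1]:.2f}")  # recovers ~0.8
    ```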

  8. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  9. Analysis of physical training influence on the technical execution of the dismounts off the uneven bars

    Directory of Open Access Journals (Sweden)

    V. Potop

    2018-02-01

    Full Text Available Purpose: highlighting the dynamics of physical training and its influence on the biomechanical characteristics of the dismounts off the uneven bars executed by junior gymnasts aged 12 to 15 years. Material: eight gymnasts aged 12 to 15 years participated in this research. They performed 12 dismounts off the uneven bars during the Women's Artistic Gymnastics Junior National Championships in the all-around event and apparatus finals. The technical execution of the uneven bars dismounts was assessed by means of the Physics ToolKit and Kinovea programs in accordance with the method of movement postural orientation, monitoring the key elements of sports technique. Seven motor tests were used in this study: 3 tests for strength-speed of the lower limbs and arms, 3 tests for strength of the abdominal, back and complex muscles, and 1 test of specific endurance. Results: we established the level of specific physical training of junior gymnasts aged 12-15 years; the kinematic and dynamic analysis of the key elements of sports technique in terms of trajectories of body segments, angular speeds and moments of force in the dismounts off the uneven bars; and the dynamics of sports performances achieved in competitions. Conclusions: correlating the physical training indicators with the indicators of the kinematic and dynamic characteristics of the dismounts off the uneven bars, consistent with the results achieved in competition, revealed strong connections between indicators at P<0.05 and P<0.01, which confirms the influence of physical training on the technical execution of the dismounts off the uneven bars by junior gymnasts.

  10. How Do Executive Functions Fit with the Cattell-Horn-Carroll Model? Some Evidence from a Joint Factor Analysis of the Delis-Kaplan Executive Function System and the Woodcock-Johnson III Tests of Cognitive Abilities

    Science.gov (United States)

    Floyd, Randy G.; Bergeron, Renee; Hamilton, Gloria; Parra, Gilbert R.

    2010-01-01

    This study investigated the relations among executive functions and cognitive abilities through a joint exploratory factor analysis and joint confirmatory factor analysis of 25 test scores from the Delis-Kaplan Executive Function System and the Woodcock-Johnson III Tests of Cognitive Abilities. Participants were 100 children and adolescents…
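
    A minimal sketch of the joint-factor-analysis idea, run on a simulated battery of test scores rather than the D-KEFS/WJ III data, is shown below; the two-factor, varimax setup and the score matrix are assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Simulated battery: 100 examinees x 6 tests loading on one latent ability.
    rng = np.random.default_rng(3)
    ability = rng.normal(size=(100, 1))
    loadings = np.array([[0.8, 0.7, 0.6, 0.5, 0.4, 0.3]])
    scores = ability @ loadings + rng.normal(scale=0.5, size=(100, 6))

    # Extract two rotated factors, as joint exploratory factor analyses of
    # test batteries commonly do; rows of components_ are factor loadings.
    fa = FactorAnalysis(n_components=2, rotation="varimax").fit(scores)
    print(np.round(fa.components_, 2))
    ```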

  11. Integrated minicomputer alpha analysis system

    International Nuclear Information System (INIS)

    Vasilik, D.G.; Coy, D.E.; Seamons, M.; Henderson, R.W.; Romero, L.L.; Thomson, D.A.

    1978-01-01

    Approximately 1,000 stack and occupational air samples from plutonium and uranium facilities at LASL are analyzed daily. The concentrations of radionuclides in air are determined by measuring the absolute alpha activities of particulates collected on air sample filter media. The Integrated Minicomputer Pulse system (IMPULSE) is an interface between many detectors of extremely simple design and a Digital Equipment Corporation (DEC) PDP-11/04 minicomputer. The detectors are photomultiplier tubes faced with zinc sulfide (ZnS). The average detector background is approximately 0.07 cpm. The IMPULSE system includes two mainframes, each of which can hold up to 64 detectors. The current hardware configuration includes 64 detectors in one mainframe and 40 detectors in the other. Each mainframe contains a minicomputer with 28K words of Random Access Memory. One minicomputer controls the detectors in both mainframes. A second computer was added for fail-safe redundancy and to support other laboratory computer requirements. The main minicomputer includes a dual floppy disk system and a dual DEC RK05 disk system for mass storage. The RK05 facilitates report generation and trend analysis. The IMPULSE hardware provides for passage of data from the detectors to the computer, and for passage of status and control information from the computer to the detector stations.

  12. Worst-case execution time analysis-driven object cache design

    DEFF Research Database (Denmark)

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    Hard real‐time systems need a time‐predictable computing platform to enable static worst‐case execution time (WCET) analysis. All performance‐enhancing features need to be WCET analyzable. However, standard data caches containing heap‐allocated data are very hard to analyze statically. In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily result in a WCET analysis‐friendly design. Aiming for a time‐predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field...

  13. Fast and Safe Concrete Code Execution for Reinforcing Static Analysis and Verification

    Directory of Open Access Journals (Sweden)

    M. Belyaev

    2015-01-01

    Full Text Available The problem of improving the precision of static analysis and verification techniques for C is hard due to the simplifying assumptions these techniques make about the code model. We present a novel approach to improving precision by executing the code model in a controlled environment that captures program errors and contract violations in a memory- and time-efficient way. We implemented this approach as an executor module, Tassadar, as a part of the bounded model checker Borealis. We tested Tassadar on two test sets, showing that its impact on the performance of Borealis is minimal. The article is published in the authors’ wording.

  14. Automation of control and analysis of execution of official duties and instructions in the hierarchical organization

    Directory of Open Access Journals (Sweden)

    Demchenko A.I.

    2017-01-01

    Full Text Available The article considers the problem of monitoring the execution of the official duties of employees. This problem is characteristic of enterprises having a hierarchical management structure. The functions and modes of monitoring are defined, and the types of analysis of staff activities are described. A description is given of the program complex that distributes functions and instructions between the employees. The developed computer program allows tracking performance and creating reports. It provides demarcation of access rights and can be operated in both local and large-scale networks.

  15. Integrated propulsion for near-Earth space missions. Volume 1: Executive summary

    Science.gov (United States)

    Dailey, C. L.; Meissinger, H. F.; Lovberg, R. H.; Zafran, S.

    1981-01-01

    Tradeoffs between electric propulsion system mass ratio and transfer time from LEO to GEO were conducted parametrically for various values of thruster efficiency, specific impulse, and other propulsion parameters. A computer model was developed for performing orbit transfer calculations which included the effects of aerodynamic drag, radiation degradation, and occultation. The tradeoff results showed that thruster technology efforts for integrated propulsion should be directed towards improving primary thruster efficiency in the specific impulse range from 1500 to 2500 seconds, and continued towards reducing specific mass. Comparison of auxiliary propulsion systems showed large total propellant mass savings with integrated electric auxiliary propulsion. Stationkeeping is the most demanding on-orbit propulsion requirement. At area densities above 0.5 sq m/kg, East-West stationkeeping requirements from solar pressure exceed North-South stationkeeping requirements from gravitational forces. A solar array pointing strategy was developed to minimize the effects of atmospheric drag at low altitude, enabling electric propulsion to initiate orbit transfer at Shuttle's maximum cargo-carrying altitude. Gravity gradient torques are used during ascent to sustain the spacecraft roll motion required for optimum solar array illumination. A near-optimum cover glass thickness of 6 mils was established for LEO to GEO transfer.
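
    The mass-ratio side of this tradeoff follows directly from the ideal rocket equation. The sketch below evaluates the propellant fraction over the specific-impulse range the study highlights; the ~4.7 km/s figure is an assumed textbook delta-v for a continuous low-thrust LEO-to-GEO spiral, not a value from the report.

        # Hedged sketch: propellant mass fraction vs. specific impulse for a
        # low-thrust LEO-to-GEO transfer, via the ideal rocket equation.
        import math

        G0 = 9.80665        # standard gravity, m/s^2
        DELTA_V = 4700.0    # m/s, assumed low-thrust spiral delta-v (illustrative)

        for isp in (1500, 2000, 2500):  # s, the range highlighted in the study
            mass_ratio = math.exp(DELTA_V / (isp * G0))      # m0 / m1
            propellant_fraction = 1.0 - 1.0 / mass_ratio
            print(f"Isp={isp:4d} s  m0/m1={mass_ratio:.3f}  "
                  f"propellant={propellant_fraction:.1%}")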

  16. Systematic Review and Meta-Analysis of Neuropsychiatric Symptoms and Executive Functioning in Adults With Phenylketonuria

    Science.gov (United States)

    Bilder, Deborah A.; Noel, J. Kay; Baker, Erin R.; Irish, William; Chen, Yinpu; Merilainen, Markus J.; Prasad, Suyash; Winslow, Barbara J.

    2016-01-01

    This systematic review and meta-analysis (MA) investigates the impact of elevated blood phenylalanine (Phe) on neuropsychiatric symptoms in adults with phenylketonuria (PKU). The meta-analysis of PKU is challenging because high-quality evidence is lacking due to the limited number of affected individuals and few placebo-controlled, double-blind studies of adults with high and low blood Phe. Neuropsychiatric symptoms associated with PKU exceed general population estimates for inattention, hyperactivity, depression, and anxiety. High Phe is associated with an increased prevalence of neuropsychiatric symptoms and executive functioning deficits whereas low Phe is associated with improved neurological performance. Findings support lifelong maintenance of low blood Phe. PMID:27805419

  17. Winning at litigation through decision analysis creating and executing winning strategies in any litigation or dispute

    CERN Document Server

    Celona, John

    2016-01-01

    This book is the first in-depth guide to applying the philosophy, theory, and methods of decision analysis to creating and executing winning legal strategies. With explanations that progress from introductory to advanced and practice problems at the end of each chapter, this is a book the reader will want to use and refer to for years to come. Practicing decision analysts, operations research and management science students, attorneys and law students will find this book an invaluable addition to their knowledge and skills. John Celona has over three decades of experience in teaching and applying decision analysis. John lectures in the School of Engineering at Stanford University and is on faculty at The Stanford Center for Professional Development, the American Course on Drug Development and Regulatory Sciences, and the Academy of the American Society for Healthcare Risk Management.

  18. Problems in mathematical analysis III integration

    CERN Document Server

    Kaczor, W J

    2003-01-01

    We learn by doing. We learn mathematics by doing problems. This is the third volume of Problems in Mathematical Analysis. The topic here is integration for real functions of one real variable. The first chapter is devoted to the Riemann and the Riemann-Stieltjes integrals. Chapter 2 deals with Lebesgue measure and integration. The authors include some famous, and some not so famous, integral inequalities related to Riemann integration. Many of the problems for Lebesgue integration concern convergence theorems and the interchange of limits and integrals. The book closes with a section on Fourier series, with a concentration on Fourier coefficients of functions from particular classes and on basic theorems for convergence of Fourier series. The book is primarily geared toward students in analysis, as a study aid, for problem-solving seminars, or for tutorials. It is also an excellent resource for instructors who wish to incorporate problems into their lectures. Solutions for the problems are provided in the boo...
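
    Many of the convergence problems the volume treats turn on when a limit may pass through an integral. As one illustrative statement of the kind covered in the Lebesgue chapter (not quoted from the book), Lebesgue's dominated convergence theorem:

        % Dominated convergence: pointwise a.e. convergence plus an integrable
        % dominating function allows the interchange of limit and integral.
        \[
          f_n \to f \ \text{a.e.}, \qquad |f_n| \le g \in L^1(\mu)
          \;\Longrightarrow\;
          \lim_{n\to\infty} \int f_n \, d\mu \;=\; \int f \, d\mu .
        \]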

  19. Efficient Execution of Microscopy Image Analysis on CPU, GPU, and MIC Equipped Cluster Systems.

    Science.gov (United States)

    Andrade, G; Ferreira, R; Teodoro, George; Rocha, Leonardo; Saltz, Joel H; Kurc, Tahsin

    2014-10-01

    High performance computing is experiencing a major paradigm shift with the introduction of accelerators, such as graphics processing units (GPUs) and Intel Xeon Phi (MIC). These processors have made a tremendous amount of computing power available at low cost, and are transforming machines into hybrid systems equipped with CPUs and accelerators. Although these systems can deliver a very high peak performance, making full use of their resources in real-world applications is a complex problem. Most current applications deployed to these machines are still executed on a single processor, leaving the other devices underutilized. In this paper we explore a scenario in which applications are composed of hierarchical data flow tasks which are allocated to nodes of a distributed memory machine in coarse grain, but each of them may be composed of several finer-grain tasks which can be allocated to different devices within the node. We propose and implement novel performance-aware scheduling techniques that can be used to allocate tasks to devices. We evaluate our techniques using a pathology image analysis application used to investigate brain cancer morphology, and our experimental evaluation shows that the proposed scheduling strategies significantly outperform other efficient scheduling techniques, such as Heterogeneous Earliest Finish Time (HEFT), in cooperative executions using CPUs, GPUs, and MICs. We also show experimentally that our strategies are less sensitive to inaccuracy in the scheduling input data and that the performance gains are maintained as the application scales.
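
    A performance-aware scheduler of this kind reduces, at its simplest, to one rule: estimate each task's finish time on every device and commit the task to the earliest finisher. The sketch below shows that greedy core; the device names, relative-speed table, and task list are invented for illustration and are far simpler than the paper's HEFT-class schedulers.

        # Hedged sketch: greedy earliest-finish-time assignment of fine-grain
        # tasks to heterogeneous devices with assumed relative speeds.
        def schedule(tasks, device_speeds):
            """tasks: list of (name, cost); device_speeds: device -> speed."""
            free_at = {dev: 0.0 for dev in device_speeds}  # when each device frees up
            plan = []
            for name, cost in sorted(tasks, key=lambda t: -t[1]):  # largest first
                # Pick the device with the earliest finish time for this task.
                dev = min(free_at, key=lambda d: free_at[d] + cost / device_speeds[d])
                free_at[dev] += cost / device_speeds[dev]
                plan.append((name, dev, round(free_at[dev], 2)))
            return plan

        print(schedule([("segment", 10.0), ("features", 4.0), ("normalize", 2.0)],
                       {"cpu": 1.0, "gpu": 8.0, "mic": 4.0}))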

  20. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    Science.gov (United States)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  1. Integrated logistic support analysis system

    International Nuclear Information System (INIS)

    Carnicero Iniguez, E.J.; Garcia de la Sen, R.

    1993-01-01

    Integrating logistic support into a system results in a large volume of information that has to be managed, which can only be achieved with the help of computer applications. Both past experience and growing needs in such tasks have led Empresarios Agrupados to undertake an ambitious development project, which is described in this paper. (author)

  2. Analysis of integrated energy systems

    International Nuclear Information System (INIS)

    Matsuhashi, Takaharu; Kaya, Yoichi; Komiyama, Hiroshi; Hayashi, Taketo; Yasukawa, Shigeru.

    1988-01-01

    World attention is now attracted to the concept of the Novel Horizontally Integrated Energy System (NHIES). In NHIES, all fossil fuels are first converted into CO and H2. Potential environmental contaminants such as sulfur are removed during this process. CO turbines are mainly used to generate electric power. Combustion is performed in pure oxygen produced through air separation, making it possible to completely prevent the formation of thermal NOx. Thus, NHIES would release very little of the substances that contribute to acid rain. In this system, the intermediate energy sources CO, H2 and O2 are integrated horizontally. They are combined appropriately to produce a specific form of final energy source. The integration of intermediate energy sources can provide a wide variety of final energy sources, allowing any type of fossil fuel to serve as an alternative to other types of fossil fuel. Another feature of NHIES is the positive use of nuclear fuel to reduce the formation of CO2. Studies are under way in Japan to develop a new concept of integrated energy system; these studies especially address overall efficiency and the introduction of new liquid fuels that are high in conversion efficiency. Considerations are made on the final form of energy source, robust control, acid fallout, and CO2 reduction. (Nogami, K.)

  3. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    Science.gov (United States)

    2017-02-01

    Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) is a novel information framework developed ... prototyping. It supports dynamic plugin of analysis modules, for either research or analysis tasks. The framework integrates multiple image processing ... [report form fields and installation table-of-contents residue omitted]

  4. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    The diversity and sheer volume of omics data have taken biology and biomedicine research and application into a big data era, much as happened across wider society a decade ago. They pose a new challenge, moving from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of people with matched information). This requires integrative analysis in biology and biomedicine and calls for the development of data integration methods to address the shift from population-guided to individual-guided investigations. Data integration is an effective approach to solving complex problems and understanding complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches for biological data follow two modes: one is "bottom-up integration" with follow-up manual integration, and the other is "top-down integration" with follow-up in silico integration. This paper first summarizes combinatory analysis approaches to provide a candidate protocol for designing effective integrative studies in genomics, and then surveys data fusion approaches to give helpful instruction on developing computational models for detecting biological significance; these approaches have also provided new data resources and analysis tools to support precision medicine, which depends on big biomedical data. Finally, problems and future directions are highlighted for the integrative analysis of omics big data.

  5. Containment integrity analysis under accidents

    International Nuclear Information System (INIS)

    Lin Chengge; Zhao Ruichang; Liu Zhitao

    2010-01-01

    Containment integrity analyses for current nuclear power plants (NPPs) mainly focus on the internal pressure caused by design basis accidents (DBAs). For the AP1000 NPP, in addition to analyses of the containment pressure response caused by DBAs, the behavior of the containment during severe accidents (SAs) is also evaluated. Since conservatism remains in the assumptions, boundary conditions and codes, the margin in the results of containment integrity analyses may be overestimated. As knowledge of the phenomena and progression of the relevant accidents improves, the overrated margin can be appropriately reduced by using best-estimate codes combined with uncertainty methods, which could benefit the containment design and construction of large passive plants (LPP) in China. (authors)

  6. Integrative biological analysis for neuropsychopharmacology.

    Science.gov (United States)

    Emmett, Mark R; Kroes, Roger A; Moskal, Joseph R; Conrad, Charles A; Priebe, Waldemar; Laezza, Fernanda; Meyer-Baese, Anke; Nilsson, Carol L

    2014-01-01

    Although advances in psychotherapy have been made in recent years, drug discovery for brain diseases such as schizophrenia and mood disorders has stagnated. The need for new biomarkers and validated therapeutic targets in the field of neuropsychopharmacology is widely unmet. The brain is the most complex part of human anatomy from the standpoint of number and types of cells, their interconnections, and circuitry. To better meet patient needs, improved methods to approach brain studies by understanding functional networks that interact with the genome are being developed. The integrated biological approaches--proteomics, transcriptomics, metabolomics, and glycomics--have a strong record in several areas of biomedicine, including neurochemistry and neuro-oncology. Published applications of an integrated approach to projects of neurological, psychiatric, and pharmacological natures are still few but show promise to provide deep biological knowledge derived from cells, animal models, and clinical materials. Future studies that yield insights based on integrated analyses promise to deliver new therapeutic targets and biomarkers for personalized medicine.

  7. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentation of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in design analysis. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems.

  8. Study on a new framework of Human Reliability Analysis to evaluate soft control execution error in advanced MCRs of NPPs

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Jung, Wondea; Seong, Poong Hyun

    2016-01-01

    Highlights: • The operating environment of MCRs in NPPs has changed with the adoption of new HSIs. • Operating actions in NPP advanced MCRs are performed by soft control. • A new HRA framework should be considered in the HRA for advanced MCRs. • An HRA framework for evaluating soft control execution human error is suggested. • The suggested method will be helpful for analyzing human reliability in advanced MCRs. - Abstract: Since the Three Mile Island (TMI)-2 accident, human error has been recognized as one of the main causes of Nuclear Power Plant (NPP) accidents, and numerous studies related to Human Reliability Analysis (HRA) have been carried out. Most of these methods were developed considering the conventional type of Main Control Rooms (MCRs). However, the operating environment of MCRs in NPPs has changed with the adoption of new Human-System Interfaces (HSIs) that are based on computer-based technologies. The MCRs that include these digital technologies, such as large display panels, computerized procedures, and soft controls, are called advanced MCRs. Among the many features of advanced MCRs, soft controls are a particularly important feature because operating actions in NPP advanced MCRs are performed by soft control. Due to the differences in interfaces between soft control and hardwired conventional control, different Human Error Probabilities (HEPs) and a new HRA framework should be considered in the HRA for advanced MCRs. To this end, a new HRA framework for evaluating soft control execution human error is suggested, based on a soft control task analysis and a review of the literature on widely accepted human error taxonomies. Moreover, since most current HRA databases deal with operation in conventional MCRs and are not explicitly designed to deal with digital HSIs, an empirical analysis of human error and error recovery considering soft controls in an advanced MCR mockup was carried out to collect human error data, which is

  9. Effects of prefrontal tDCS on executive function: Methodological considerations revealed by meta-analysis.

    Science.gov (United States)

    Imburgio, Michael J; Orr, Joseph M

    2018-05-01

    A meta-analysis of studies using single-session transcranial direct current stimulation (tDCS) to target the dorsolateral prefrontal cortex (DLPFC) was undertaken to examine the effect of stimulation on executive function (EF) in healthy samples. 27 studies were included in analyses, yielding 71 effect sizes. The most relevant measure for each task was determined a priori and used to calculate Hedges' g. Methodological characteristics of each study were examined individually as potential moderators of effect size. Stimulation effects on three domains of EF (inhibition of prepotent responses, mental set shifting, and information updating and monitoring) were analyzed separately. In line with previous work, the current study found no significant effect of anodal unilateral tDCS, cathodal unilateral tDCS, or bilateral tDCS on EF. Further moderator and subgroup analyses were only carried out for anodal unilateral montages due to the small number of studies using other montages. Subgroup analyses revealed a significant effect of anodal unilateral tDCS on updating tasks, but not on inhibition or set-shifting tasks. Cathode location significantly moderated the effect of anodal unilateral tDCS. Extracranial cathodes yielded a significant effect on EF while cranial cathodes yielded no effect. Anode size also significantly moderated the effect of anodal unilateral tDCS, with smaller anodes being more effective than larger anodes. In summary, anodal DLPFC stimulation is more effective at improving updating ability than inhibition and set-shifting ability, but anodal stimulation can significantly improve general executive function when extracranial cathodes or small anodes are used. Future meta-analyses may examine how stimulation's effects on specific behavioral tasks, rather than broader domains, might be affected by methodological moderators. Copyright © 2018 Elsevier Ltd. All rights reserved.
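
    For readers unfamiliar with the effect-size measure used here, the sketch below computes Hedges' g, Cohen's d with the small-sample correction factor J; the summary statistics in the example are invented for illustration, not taken from the meta-analysis.

        # Hedged sketch: Hedges' g for two independent groups.
        import math

        def hedges_g(m1, sd1, n1, m2, sd2, n2):
            # Pooled standard deviation across the two groups.
            s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                                 / (n1 + n2 - 2))
            d = (m1 - m2) / s_pooled                 # Cohen's d
            j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)      # small-sample correction J
            return j * d

        # Example: hypothetical active-tDCS vs. sham accuracy on an updating task.
        print(hedges_g(m1=0.82, sd1=0.10, n1=20, m2=0.75, sd2=0.12, n2=20))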

  10. A time-series approach to random number generation: Using recurrence quantification analysis to capture executive behavior

    Directory of Open Access Journals (Sweden)

    Wouter Oomens

    2015-06-01

    Full Text Available The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA, a nonlinear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation.
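
    The central move in RQA is to keep the whole response series intact and ask how often its states recur. The sketch below computes a recurrence matrix and the simplest RQA measure, the recurrence rate; the digit sequence and radius are invented for illustration, and full RQA adds further measures (e.g., determinism, laminarity) on top of this matrix.

        # Hedged sketch: recurrence rate of a response series, the entry point
        # to recurrence quantification analysis (RQA).
        import numpy as np

        def recurrence_rate(series, radius=0.5):
            x = np.asarray(series, dtype=float)
            # Recurrence matrix: points i, j recur if closer than `radius`.
            dist = np.abs(x[:, None] - x[None, :])
            rp = (dist <= radius).astype(int)
            np.fill_diagonal(rp, 0)          # ignore trivial self-recurrences
            n = len(x)
            return rp.sum() / (n * (n - 1))  # fraction of recurrent pairs

        rng_responses = [3, 7, 1, 7, 9, 2, 7, 4, 1, 8]   # illustrative digits
        print(recurrence_rate(rng_responses, radius=0.0))  # exact repeats only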

  11. Integrative cluster analysis in bioinformatics

    CERN Document Server

    Abu-Jamous, Basel; Nandi, Asoke K

    2015-01-01

    Clustering techniques are increasingly being put to use in the analysis of high-throughput biological datasets. Novel computational techniques to analyse high throughput data in the form of sequences, gene and protein expressions, pathways, and images are becoming vital for understanding diseases and future drug discovery. This book details the complete pathway of cluster analysis, from the basics of molecular biology to the generation of biological knowledge. The book also presents the latest clustering methods and clustering validation, thereby offering the reader a comprehensive review o

  12. Executive Functioning in Children with ASD : An Analysis of the BRIEF

    NARCIS (Netherlands)

    Blijd-Hoogewys, E. M. A.; Bezemer, M. L.; van Geert, P. L. C.

    2014-01-01

    The Behavior Rating Inventory of Executive Functions (BRIEF) screens for executive function deficits in 5- to 18-year-olds. Data of three autism subgroups, according to DSM-IV-TR criteria (N = 35 Autistic Disorder, N = 27 Asperger's Disorder and N = 65 PDD-NOS), were analyzed. The total group has

  13. The Assessment of Executive Functioning in People with Intellectual Disabilities: An Exploratory Analysis

    Science.gov (United States)

    Bevins, Shelley; Hurse, Emily

    2016-01-01

    The following article details a piece of service development work undertaken as part of the Plymouth Down Syndrome Screening Programme. The work aimed to review the use of three measures assessing executive functioning skills used within the Programme as well as with people without Down syndrome. Three tasks assessing executive functioning (the…

  14. Integral data analysis for resonance parameters determination

    International Nuclear Information System (INIS)

    Larson, N.M.; Leal, L.C.; Derrien, H.

    1997-09-01

    Neutron time-of-flight experiments have long been used to determine resonance parameters. Those resonance parameters have then been used in calculations of integral quantities such as Maxwellian averages or resonance integrals, and the results of those calculations in turn have been used as a criterion for the acceptability of the resonance analysis. However, the calculations were inadequate because covariances on the parameter values were not included. In this report an effort to correct that deficiency is documented: the R-matrix analysis code SAMMY has been modified to include integral quantities of importance directly within the resonance parameter analysis, and to determine the best fit to both differential (microscopic) and integral (macroscopic) data simultaneously. This modification was implemented because it is expected to have an impact on the intermediate-energy range that is important for criticality safety applications

  15. Based on Weibull Information Fusion Analysis Semiconductors Quality the Key Technology of Manufacturing Execution Systems Reliability

    Science.gov (United States)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    Semiconductor material and product qualification rates are directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system reliability growth and thereby improve product quality. Referring to the classical Duane model assumptions and the TGP tracking growth forecast model, a Weibull distribution model was established from the failure data. Combining the median-rank and average-rank methods, Weibull information fusion reliability growth curves were fitted through linear regression and least-squares estimation. This model overcomes a weakness of the Duane model, namely that its MTBF point estimation accuracy is not high; analysis of the failure data shows that the method and the test-and-evaluation modeling process are basically identical in this instance. The median rank is used in statistics to determine the distribution function of a random variable, which is a good way to solve problems such as the limited sample sizes of complex systems. Therefore this method has great engineering application value.
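
    The fitting procedure the abstract describes, median ranks plus least squares on the linearized Weibull CDF, can be sketched in a few lines. The failure times below are invented for illustration, and Bernard's approximation stands in for exact median ranks.

        # Hedged sketch: two-parameter Weibull fit by median-rank regression.
        import numpy as np

        def weibull_median_rank_fit(failure_times):
            t = np.sort(np.asarray(failure_times, dtype=float))
            n = len(t)
            i = np.arange(1, n + 1)
            f = (i - 0.3) / (n + 0.4)        # Bernard's median-rank approximation
            # Linearized CDF: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta).
            y = np.log(-np.log(1.0 - f))
            x = np.log(t)
            beta, intercept = np.polyfit(x, y, 1)   # slope, intercept
            eta = np.exp(-intercept / beta)
            return beta, eta                        # shape, scale

        print(weibull_median_rank_fit(
            [55, 187, 216, 240, 244, 335, 361, 373, 375, 386]))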

  16. The Effect of Group Therapy With Transactional Analysis Approach on Emotional Intelligence, Executive Functions and Drug Dependency.

    Science.gov (United States)

    Forghani, Masoomeh; Ghanbari Hashem Abadi, Bahram Ali

    2016-06-01

    The aim of the present study was to evaluate the effect of group psychotherapy with a transactional analysis (TA) approach on emotional intelligence (EI), executive functions and substance dependency among drug addicts at rehabilitation centers in Mashhad city, Iran, in 2013. In this quasi-experimental study with a pretest-posttest, case-control design, 30 patients were selected from a rehabilitation center and randomly divided into two groups. The case group received 12 sessions of group psychotherapy with the transactional analysis approach. Then the effects of the independent variable (group psychotherapy with the TA approach) on EI, executive function and drug dependency were assessed. The Bar-On test was used for EI, the Stroop test for measuring executive function, and morphine, meth-amphetamine and B2 tests for evaluating drug dependency. Data were analyzed using multifactorial covariance analysis, Levene's analysis, MANCOVA, t-student and Pearson correlation coefficient tests with SPSS software. Our results showed that group psychotherapy with the TA approach was effective in improving EI and executive functions and decreasing drug dependency (P<0.05) among addicts, and prevents addiction recurrence by improving the coping capabilities and some mental functions of the subjects. However, there are some limitations regarding this study, including follow-up duration and sample size.

  17. Preliminary Integrated Safety Analysis Status Report

    International Nuclear Information System (INIS)

    Gwyn, D.

    2001-01-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001

  18. Execution and executability

    Science.gov (United States)

    Bradford, Robert W.; Harrison, Denise

    2015-09-01

    "We have a new strategy to grow our organization." Developing the plan is just the start. Implementing it in the organization is the real challenge. Many organizations don't fail due to lack of strategy; they struggle because it isn't effectively implemented. After working with hundreds of companies on strategy development, Denise and Robert have distilled the critical areas where organizations need to focus in order to enhance profitability through superior execution. If these questions are important to your organization, you'll find useful answers in the following articles: Do you find yourself overwhelmed by too many competing priorities? How do you limit how many strategic initiatives/projects your organization is working on at one time? How do you balance your resource requirements (time and money) with the availability of these resources? How do you balance your strategic initiative requirements with the day-to-day requirements of your organization?

  19. Integrability of dynamical systems algebra and analysis

    CERN Document Server

    Zhang, Xiang

    2017-01-01

    This is the first book to systematically state the fundamental theory of integrability of ordinary differential equations and its development, with emphasis on the Darboux theory of integrability and local integrability together with their applications. It summarizes the classical results of Darboux integrability and its modern development, together with the related Darboux polynomials and their applications in the reduction of Liouville and elementary integrability, in the center-focus problem, in the weakened Hilbert 16th problem on algebraic limit cycles, and in the global dynamical analysis of some realistic models in fields such as physics, mechanics and biology. Although it can be used as a textbook for graduate students in dynamical systems, it is intended as supplementary reading for graduate students from mathematics, physics, mechanics and engineering in courses related to the qualitative theory, bifurcation theory and the theory of integrability of dynamical systems.

  20. Strategic Analysis of Technology Integration at Allstream

    OpenAIRE

    Brown, Jeff

    2011-01-01

    Innovation has been defined as the combination of invention and commercialization. Invention without commercialization is rarely, if ever, profitable. For the purposes of this paper the definition of innovation will be further expanded into the concept of technology integration. Successful technology integration not only includes new technology introduction, but also the operationalization of the new technology within each business unit of the enterprise. This paper conducts an analysis of Al...

  1. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    International Nuclear Information System (INIS)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Won Dea

    2014-01-01

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks

  2. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    Energy Technology Data Exchange (ETDEWEB)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks.

  3. Analysis Method for Integrating Components of Product

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Ho [Inzest Co. Ltd, Seoul (Korea, Republic of); Lee, Kun Sang [Kookmin Univ., Seoul (Korea, Republic of)

    2017-04-15

    This paper presents methods for integrating the component parts that constitute a product. A new relation function concept and its structure are introduced to analyze the relationships among component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can then be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect character. The paper presents a design algorithm for component integration. This algorithm was applied to actual products, and the components inside each product were integrated; the proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product consistent with the relation function structure was actually created.

  4. Analysis Method for Integrating Components of Product

    International Nuclear Information System (INIS)

    Choi, Jun Ho; Lee, Kun Sang

    2017-01-01

    This paper presents methods for integrating the component parts that constitute a product. A new relation function concept and its structure are introduced to analyze the relationships among component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can then be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect character. The paper presents a design algorithm for component integration. This algorithm was applied to actual products, and the components inside each product were integrated; the proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product consistent with the relation function structure was actually created.

  5. An Effectiveness Analysis of the U.S. Federal Government Executive Branch Ethics Policy and Program

    National Research Council Canada - National Science Library

    Stewart, Chanet

    2003-01-01

    .... As public servants, whether elected or non-elected, Executive Branch employees are expected to make decisions and spend tax-payer dollars in ways that promote the overall interests of the American public...

  6. An integrated acquisition, display, and analysis system

    International Nuclear Information System (INIS)

    Ahmad, T.; Huckins, R.J.

    1987-01-01

    The design goal of the ND9900/Genuie was to integrate a high performance data acquisition and display subsystem with a state-of-the-art 32-bit supermicrocomputer. This was achieved by integrating a Digital Equipment Corporation MicroVAX II CPU board with acquisition and display controllers via the Q-bus. The result is a tightly coupled processing and analysis system for Pulse Height Analysis and other applications. The system architecture supports distributed processing, so that acquisition and display functions are semi-autonomous, making the VAX concurrently available for application programs.

  7. Executives' speech expressiveness: analysis of perceptive and acoustic aspects of vocal dynamics.

    Science.gov (United States)

    Marquezin, Daniela Maria Santos Serrano; Viola, Izabel; Ghirardi, Ana Carolina de Assis Moura; Madureira, Sandra; Ferreira, Léslie Piccolotto

    2015-01-01

    To analyze speech expressiveness in a group of executives based on perceptive and acoustic aspects of vocal dynamics. Four male subjects participated in the research study (S1, S2, S3, and S4). The assessments included the Kingdomality test, to obtain keywords describing communicative attitudes; a perceptive-auditory assessment of vocal quality and dynamics, performed by three judges who are speech-language pathologists; a perceptive-auditory judgment of the chosen keywords; acoustic analysis of prosodic elements (Praat software); and a statistical analysis. According to the perceptive-auditory analysis of vocal dynamics, S1, S2, S3, and S4 did not show vocal alterations, and all were considered to have a lowered habitual pitch. S1: judged insecure, non-objective, non-empathetic, and unconvincing, with inappropriate use of pauses mainly formed by hesitations and inadequate separation of prosodic groups that broke syntagmatic constituents. S2: regular use of pauses for respiratory reload, organization of sentences, and emphasis; considered secure, not very objective, empathetic, and convincing. S3: judged secure, objective, empathetic, and convincing, with regular use of pauses for respiratory reload, organization of sentences, and hesitations. S4: the most secure, objective, empathetic, and convincing, with proper use of pauses for respiratory reload, planning, and emphasis; prosodic groups agreed with the statements, without separating syntagmatic constituents. The speech characteristics and communicative attitudes stood out differently in two subjects, such that a slow speech rate and breaks in prosodic groups conveyed insecurity, little objectivity, and lack of persuasion.

  8. Abel integral equations analysis and applications

    CERN Document Server

    Gorenflo, Rudolf

    1991-01-01

    In many fields of application of mathematics, progress is crucially dependent on the good flow of information between (i) theoretical mathematicians looking for applications, (ii) mathematicians working in applications in need of theory, and (iii) scientists and engineers applying mathematical models and methods. The intention of this book is to stimulate this flow of information. In the first three chapters (accessible to third-year students of mathematics and physics and to mathematically interested engineers) applications of Abel integral equations are surveyed broadly, including determination of potentials, stereology, seismic travel times, spectroscopy, and optical fibres. In subsequent chapters (requiring some background in functional analysis) mapping properties of Abel integral operators and their relation to other integral transforms in various function spaces are investigated, questions of existence and uniqueness of solutions of linear and nonlinear Abel integral equations are treated, and for equatio...
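
    For orientation, the object named in the title is, in its classical generalized form, the following equation, shown here with its standard inversion formula for 0 < α < 1; the statement is illustrative, not quoted from the book.

        % The generalized Abel integral equation and its classical inversion.
        \[
          f(x) = \int_0^x \frac{\varphi(t)}{(x-t)^{\alpha}}\,dt
          \quad\Longrightarrow\quad
          \varphi(t) = \frac{\sin(\pi\alpha)}{\pi}\,
                       \frac{d}{dt}\int_0^t \frac{f(x)}{(t-x)^{1-\alpha}}\,dx ,
          \qquad 0 < \alpha < 1 .
        \]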

  9. The role of executive functioning in children's attentional pain control: an experimental analysis.

    Science.gov (United States)

    Verhoeven, Katrien; Dick, Bruce; Eccleston, Christopher; Goubert, Liesbet; Crombez, Geert

    2014-02-01

    Directing attention away from pain is often used in children's pain treatment programs to control pain. However, empirical evidence concerning its effectiveness is inconclusive. We therefore sought to understand other influencing factors, including executive function and its role in the pain experience. This study investigates the role of executive functioning in the effectiveness of distraction. School children (n=164) completed executive functioning tasks (inhibition, switching, and working memory) and performed a cold-pressor task. One half of the children simultaneously performed a distracting tone-detection task; the other half did not. Results showed that participants in the distraction group were engaged in the distraction task and were reported to pay significantly less attention to pain than controls. Executive functioning influenced distraction task engagement. More specifically, participants with good inhibition and working memory abilities performed the distraction task better; participants with good switching abilities reported having paid more attention to the distraction task. Furthermore, distraction was found to be ineffective in reducing pain intensity and affect. Executive functioning did not influence the effectiveness of distraction. However, a relationship was found between executive functioning and pain affect, indicating that participants with good inhibition and working memory abilities experienced the cold-pressor task as less stressful and unpleasant. Our findings suggest that distraction as a process for managing pain is complex. While it appears that executive function may play a role in adult distraction, in this study it did not direct attention away from pain. It may instead be involved in the overall pain experience. Copyright © 2013 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  10. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results on different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available at the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
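
    The co-occurrence idea can be illustrated with a toy merger: keep taxa that at least two tools agree on and average their abundances. This is a simplified sketch of the principle only; the tool names, profiles, and support threshold are invented, and MetaMeta's actual integration step is more elaborate.

        # Hedged sketch: merge per-tool taxonomic profiles by co-occurrence.
        from collections import defaultdict

        def merge_profiles(profiles, min_support=2):
            """profiles: dict tool_name -> dict taxon -> relative abundance."""
            support = defaultdict(list)
            for tool, profile in profiles.items():
                for taxon, abundance in profile.items():
                    support[taxon].append(abundance)
            # Keep taxa reported by >= min_support tools; average abundances.
            merged = {t: sum(a) / len(a)
                      for t, a in support.items() if len(a) >= min_support}
            total = sum(merged.values()) or 1.0
            return {t: a / total for t, a in merged.items()}   # renormalize

        profiles = {
            "toolA": {"E. coli": 0.6, "B. subtilis": 0.4},
            "toolB": {"E. coli": 0.5, "S. aureus": 0.5},
            "toolC": {"E. coli": 0.7, "B. subtilis": 0.3},
        }
        print(merge_profiles(profiles))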

  11. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    A new formulation termed the Integrated Force Method (IFM) was proposed by Patnaik ... designated ``Structure (n, m)'' where (n, m) are the force and displacement degrees of ... Patnaik S N, Yadagiri S 1976 Frequency analysis of structures.

  12. Executive Selection in Government Agencies: An Analysis of the Department of the Navy and Immigration and Naturalization Services Senior Executive Service Selection Processes

    National Research Council Canada - National Science Library

    Jordan, Mark

    2001-01-01

    .... The Senior Executive Service (SES) selection process for the Department of the Navy (DON) is analyzed and compared to the SES selection process used by the Immigration and Naturalization Service...

  13. DWI and complex brain network analysis predicts vascular cognitive impairment in spontaneous hypertensive rats undergoing executive function tests

    Directory of Open Access Journals (Sweden)

    Xavier López-Gil

    2014-07-01

    Full Text Available The identification of biomarkers of vascular cognitive impairment is urgent for its early diagnosis. The aim of this study was to detect and monitor changes in brain structure and connectivity, and to correlate them with the decline in executive function. We examined the feasibility of early diagnostic magnetic resonance imaging to predict cognitive impairment before onset in an animal model of chronic hypertension: spontaneously hypertensive rats. Cognitive performance was tested in an operant conditioning paradigm that evaluated learning, memory and behavioral flexibility skills. Behavioral tests were coupled with longitudinal diffusion weighted imaging acquired with 126 diffusion gradient directions and 0.3 mm³ isometric resolution at 10, 14, 18, 22, 26 and 40 weeks after birth. Diffusion weighted imaging was analyzed in two different ways: by regional characterization of diffusion tensor imaging indices, and by assessing changes in structural brain network organization based on Q-Ball tractography. Already at the earliest evaluated times, diffusion tensor imaging scalar maps revealed significant differences in many regions, suggesting loss of integrity in the white and grey matter of spontaneously hypertensive rats when compared to normotensive control rats. In addition, graph theory analysis of the structural brain network demonstrated a significant decrease of hierarchical modularity and of global and local efficiency, with predictive value as shown by a regional 3-fold cross-validation study. Moreover, these decreases were significantly correlated with the behavioral performance deficits observed at subsequent time points, suggesting that diffusion weighted imaging and connectivity studies can unravel neuroimaging alterations even before overt signs of cognitive impairment become apparent.
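
    The graph-theory measures named here (global and local efficiency) are standard and available in networkx; the sketch below computes them on an invented toy connectome, whereas a real analysis would build the graph from the tractography-derived connectivity matrix.

        # Hedged sketch: efficiency measures on an illustrative brain network.
        import networkx as nx

        # Toy structural network: nodes are regions, edges are tracts.
        edges = [("mPFC", "ACC"), ("ACC", "striatum"), ("striatum", "thalamus"),
                 ("thalamus", "mPFC"), ("mPFC", "OFC")]
        G = nx.Graph(edges)

        print("global efficiency:", nx.global_efficiency(G))
        print("local efficiency:", nx.local_efficiency(G))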

  14. Is there a relationship between language switching and executive functions in bilingualism? Introducing a within-group analysis approach

    Directory of Open Access Journals (Sweden)

    Anna Soveri

    2011-08-01

    Full Text Available Several studies have suggested a bilingual advantage in executive functions, presumably due to bilinguals' massive practice with language switching that requires executive resources, but the results are still somewhat controversial. Previous studies are also plagued by the inherent limitations of a natural-groups design, where the participant groups are bound to differ in many ways in addition to the variable used to classify them. In an attempt to introduce a complementary analysis approach, we employed multiple regression to study whether the performance of 30-75-year-old Finnish-Swedish bilinguals (n = 38) on tasks measuring different executive functions (inhibition, updating, and set shifting) could be predicted by the frequency of language switches in everyday life (as measured by a language switching questionnaire), by L2 age of acquisition, or by the self-estimated degree of use of both languages in everyday life. The most consistent effects were found for the set-shifting task, where a higher rate of everyday language switches was related to a smaller mixing cost in errors. Mixing cost is thought to reflect top-down management of competing task sets, thus resembling the bilingual situation where a decision about which language to use has to be made in each conversation. These findings provide additional support to the idea that some executive functions in bilinguals are affected by lifelong experience in language switching and, perhaps even more importantly, suggest a complementary approach to the study of this issue.
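
    The within-group approach amounts to an ordinary multiple regression across bilingual participants. The sketch below mirrors that setup on synthetic data; the predictor names follow the abstract, but every value, including the planted effect of switch frequency on mixing cost, is simulated for illustration.

        # Hedged sketch: regressing a set-shifting mixing cost on three
        # bilingualism predictors, on simulated data.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 38
        switch_freq = rng.uniform(0, 10, n)   # everyday language-switch score
        l2_aoa = rng.uniform(0, 12, n)        # L2 age of acquisition (years)
        dual_use = rng.uniform(0, 1, n)       # proportion of dual-language use
        mixing_cost = 20 - 1.2 * switch_freq + rng.normal(0, 3, n)  # planted effect

        X = np.column_stack([np.ones(n), switch_freq, l2_aoa, dual_use])
        beta, *_ = np.linalg.lstsq(X, mixing_cost, rcond=None)
        print(dict(zip(["intercept", "switch_freq", "l2_aoa", "dual_use"],
                       beta.round(2))))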

  15. Development and integration of a LabVIEW-based modular architecture for automated execution of electrochemical catalyst testing.

    Science.gov (United States)

    Topalov, Angel A; Katsounaros, Ioannis; Meier, Josef C; Klemm, Sebastian O; Mayrhofer, Karl J J

    2011-11-01

    This paper describes a system for performing electrochemical catalyst testing where all hardware components are controlled simultaneously using a single LabVIEW-based software application. The software that we developed can be operated in both manual mode for exploratory investigations and automatic mode for routine measurements, by using predefined execution procedures. The latter enables the execution of high-throughput or combinatorial investigations, which decrease substantially the time and cost for catalyst testing. The software was constructed using a modular architecture which simplifies the modification or extension of the system, depending on future needs. The system was tested by performing stability tests of commercial fuel cell electrocatalysts, and the advantages of the developed system are discussed. © 2011 American Institute of Physics

  16. Build and Execute Environment

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-21

    At exascale, the challenge becomes to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services, including repeatable execution, provenance, checkpointing, and future proofing. We will capture provenance about how containers were launched and how they interact to annotate

  17. Do Tasks Make a Difference? Accounting for Heterogeneity of Performance of Children with Reading Difficulties on Tasks of Executive Function: Findings from a Meta-Analysis

    Science.gov (United States)

    Booth, Josephine N.; Boyle, James M. E.; Kelly, Steve W.

    2010-01-01

    Research studies have implicated executive functions in reading difficulties (RD). But while some studies have found children with RD to be impaired on tasks of executive function other studies report unimpaired performance. A meta-analysis was carried out to determine whether these discrepant findings can be accounted for by differences in the…

  18. Integrating neural network technology and noise analysis

    International Nuclear Information System (INIS)

    Uhrig, R.E.; Oak Ridge National Lab., TN

    1995-01-01

    The integrated use of neural network and noise analysis technologies offers advantages not available through the use of either technology alone. Applying neural network technology to noise analysis creates an opportunity to expand the scope of problems where noise analysis is useful, and there are unique ways in which the integration of these technologies can be used productively. The two-sensor technique, in which the responses of two sensors to an unknown driving source are related, is used to demonstrate such integration. The relationship between the power spectral densities (PSDs) of the accelerometer signals is derived theoretically using noise analysis to demonstrate its uniqueness. This relationship is modeled from experimental data using a neural network while the system is working properly, and the actual PSD of one sensor is then compared with the PSD of that sensor predicted by the neural network from the PSD of the other sensor. A significant deviation between the actual and predicted PSDs indicates that the system is changing (i.e., failing). Experiments carried out on check valves and bearings illustrate the usefulness of the methodology developed. (Author)
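    An illustrative sketch of the two-sensor technique described above, under simplifying assumptions (a toy driving source, a small neural network, invented sampling rate): learn the mapping between the log-PSDs of two sensors from healthy data, then flag large prediction residuals as a sign the system is changing.

```python
# Sketch only: two synthetic accelerometer channels, PSDs via Welch's method,
# and an MLP that predicts sensor B's PSD from sensor A's PSD.
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
fs = 1000.0  # assumed sampling rate in Hz

def segment(fault=False):
    # Toy driving source filtered differently for each sensor; a "fault"
    # changes sensor B's response.
    src = rng.normal(size=4096)
    a = np.convolve(src, np.ones(5) / 5, mode="same")
    k = 15 if fault else 9
    b = np.convolve(src, np.ones(k) / k, mode="same")
    return a, b

def log_psd(x):
    _, pxx = welch(x, fs=fs, nperseg=256)
    return np.log10(pxx + 1e-12)

train = [segment() for _ in range(200)]
X = np.array([log_psd(a) for a, _ in train])
Y = np.array([log_psd(b) for _, b in train])
net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=3000, random_state=0).fit(X, Y)

# Monitoring: a large residual between predicted and actual PSDs of sensor B
# indicates that the A->B relationship (i.e., the system) is changing.
for fault in (False, True):
    a, b = segment(fault=fault)
    resid = np.mean((net.predict(log_psd(a)[None, :]) - log_psd(b)) ** 2)
    print(f"fault={fault}: residual={resid:.4f}")
```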

  19. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate scoping analysis and to improve fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool that runs on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module, which utilizes hybrid genetic algorithms; the PATMAKER interactive LP design module; the MCA multicycle analysis module; an integrated database; and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, whose work consists mostly of LP generation and multicycle analysis. (author)
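    A toy sketch (not INSIGHT's GALLOP module; the fitness function and core size are invented) of a genetic-algorithm search over loading patterns, the kind of technique the abstract describes. A pattern is a permutation of assembly indices, and the fitness crudely stands in for a power-peaking constraint.

```python
# Sketch of a permutation GA for a loading pattern; all numbers illustrative.
import random

random.seed(1)
N = 16                                                # toy core positions
worth = [random.uniform(0.8, 1.2) for _ in range(N)]  # per-assembly reactivity

def fitness(pattern):
    # Penalize placing highly reactive assemblies next to each other.
    peak = max(worth[pattern[i]] + worth[pattern[(i + 1) % N]] for i in range(N))
    return -peak

def crossover(p1, p2):
    # Order crossover: keeps each assembly exactly once.
    cut = random.randrange(1, N)
    head = p1[:cut]
    return head + [g for g in p2 if g not in head]

def mutate(p):
    i, j = random.sample(range(N), 2)
    p[i], p[j] = p[j], p[i]
    return p

pop = [random.sample(range(N), N) for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in range(30)]

print("best pattern:", pop[0], "fitness:", round(fitness(pop[0]), 3))
```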

  20. [Integrated health care organizations: guideline for analysis].

    Science.gov (United States)

    Vázquez Navarrete, M Luisa; Vargas Lorenzo, Ingrid; Farré Calpe, Joan; Terraza Núñez, Rebeca

    2005-01-01

    There has been a tendency recently to abandon competition and to introduce policies that promote collaboration between health providers as a means of improving the efficiency of the system and the continuity of care. A number of countries, most notably the United States, have experienced the integration of health care providers to cover the continuum of care of a defined population. Catalonia has witnessed the steady emergence of increasing numbers of integrated health organisations (IHO) but, unlike the United States, studies on health providers' integration are scarce. As part of a research project currently underway, a guide was developed to study Catalan IHOs, based on a classical literature review and the development of a theoretical framework. The guide proposes analysing the IHO's performance in relation to their final objectives of improving the efficiency and continuity of health care by an analysis of the integration type (based on key characteristics); external elements (existence of other suppliers, type of services' payment mechanisms); and internal elements (model of government, organization and management) that influence integration. Evaluation of the IHO's performance focuses on global strategies and results on coordination of care and efficiency. Two types of coordination are evaluated: information coordination and coordination of care management. Evaluation of the efficiency of the IHO refers to technical and allocative efficiency. This guide may have to be modified for use in the Catalan context.

  1. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control-system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular-value robustness test, singular-value loop-transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section covers time-response simulations, including the time response to a random white-noise disturbance. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
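    MATRIXx is a commercial package, so as a hedged stand-in the sketch below mimics two of the listed evaluations with SciPy: Bode frequency response and closed-loop eigenvalues for a toy plant G(s) = 10 / (s^2 + 2s + 10) under unity feedback. The plant is an assumption for illustration.

```python
# Sketch of two classic control-system evaluations using scipy.signal.
import numpy as np
from scipy import signal

num, den = [10.0], [1.0, 2.0, 10.0]
sys_ol = signal.TransferFunction(num, den)

w, mag, phase = signal.bode(sys_ol)      # open-loop Bode data (rad/s, dB, deg)
print("gain at lowest frequency (dB):", round(mag[0], 2))

# Unity feedback: closed-loop characteristic polynomial is den + num
# (numerator zero-padded on the left to the denominator's length).
num_p = np.pad(num, (len(den) - len(num), 0))
cl_poles = np.roots(np.asarray(den) + num_p)
print("closed-loop eigenvalues:", cl_poles)
```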

  2. An integrated system for genetic analysis

    Directory of Open Access Journals (Sweden)

    Duan Xiao

    2006-04-01

    Full Text Available Abstract Background Large-scale genetic mapping projects require data management systems that can handle complex phenotypes and detect and correct high-throughput genotyping errors, yet are easy to use. Description We have developed an Integrated Genotyping System (IGS) to meet this need. IGS securely stores, edits and analyses genotype and phenotype data. It stores information about DNA samples, plates, primers, markers and genotypes generated by a genotyping laboratory. Data are structured so that statistical genetic analysis of both case-control and pedigree data is straightforward. Conclusion IGS can model complex phenotypes and contain genotypes from whole genome association studies. The database makes it possible to integrate genetic analysis with data curation. The IGS web site http://bioinformatics.well.ox.ac.uk/project-igs.shtml contains further information.

  3. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance
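    A minimal sketch (not IRRAS code; probabilities are illustrative) of the kind of computation such a PRA tool performs after cut set generation: quantifying a top event from minimal cut sets using the rare-event approximation and the min-cut upper bound.

```python
# Sketch of fault-tree top-event quantification from minimal cut sets.
basic_event_prob = {"A": 1e-3, "B": 5e-4, "C": 2e-3}
minimal_cut_sets = [{"A", "B"}, {"C"}]   # top event occurs if any cut set occurs

def cut_set_prob(cs):
    p = 1.0
    for e in cs:
        p *= basic_event_prob[e]         # assumes independent basic events
    return p

# Rare-event approximation: sum of cut set probabilities.
rare_event = sum(cut_set_prob(cs) for cs in minimal_cut_sets)

# Min-cut upper bound: 1 - product of (1 - P(cut set)).
mcub = 1.0
for cs in minimal_cut_sets:
    mcub *= 1.0 - cut_set_prob(cs)
mcub = 1.0 - mcub

print(f"rare-event approx: {rare_event:.3e}, min-cut upper bound: {mcub:.3e}")
```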

  4. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience.

    Science.gov (United States)

    Dunstone, Kimberley; Brennan, Emily; Slater, Michael D; Dixon, Helen G; Durkin, Sarah J; Pettigrew, Simone; Wakefield, Melanie A

    2017-04-11

    Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of different characteristics of alcohol harm reduction ads. Given

  5. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience

    Directory of Open Access Journals (Sweden)

    Kimberley Dunstone

    2017-04-01

    Full Text Available Abstract Background Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Method Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. Results In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Conclusions Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of

  6. Integrity Analysis of Damaged Steam Generator Tubes

    International Nuclear Information System (INIS)

    Stanic, D.

    1998-01-01

    The variety of degradation mechanisms affecting steam generator tubes makes steam generators one of the critical components in nuclear power plants. Depending on their nature, degradation mechanisms cause different types of damage. This requires extensive integrity analysis in order to assess various conditions of crack behavior under operating and accident conditions. The development and application of advanced eddy current techniques for steam generator examination provide good characterization of the damage found. Damage characteristics (shape, orientation and dimensions) may be defined and used for further evaluation of the damage's influence on tube integrity. Alongside experimental and analytical methods, numerical methods are also efficient tools for integrity assessment. The application of finite element methods provides relatively simple modeling of different types of damage and simulation of various operating conditions. The stress and strain analysis may be performed for elastic and elasto-plastic states, with good ability for visual presentation of results. Furthermore, fracture mechanics parameters may be calculated. Results obtained by numerical analysis, supplemented with experimental results, form the basis for defining alternative plugging criteria, which may significantly reduce the number of plugged tubes. (author)

  7. Integrating Multi-Purpose Natural Language Understanding, Robot's Memory, and Symbolic Planning for Task Execution in Humanoid Robots

    DEFF Research Database (Denmark)

    Wächter, Mirko; Ovchinnikova, Ekaterina; Wittenbeck, Valerij

    2017-01-01

    We propose an approach for instructing a robot using natural language to solve complex tasks in a dynamic environment. In this study, we elaborate on a framework that allows a humanoid robot to understand natural language, derive symbolic representations of its sensorimotor experience, generate… The framework is implemented within the robot development environment ArmarX. We evaluate the framework on the humanoid robot ARMAR-III in the context of two experiments: a demonstration of the real execution of a complex task in the kitchen environment on ARMAR-III and an experiment with untrained users…

  8. The integrated microbial genome resource of analysis.

    Science.gov (United States)

    Checcucci, Alice; Mengoni, Alessio

    2015-01-01

    Integrated Microbial Genomes and Metagenomes (IMG) is a biocomputational system that provides information and support for the annotation and comparative analysis of microbial genomes and metagenomes. IMG has been developed by the US Department of Energy (DOE) Joint Genome Institute (JGI). The IMG platform contains both draft and complete genomes sequenced by the Joint Genome Institute, as well as other publicly available genomes. Genomes of strains belonging to the Archaea, Bacteria, and Eukarya domains are present, as well as those of viruses and plasmids. Here, we describe some essential features of the IMG system and a case study of pangenome analysis.

  9. Integrated analysis of genetic data with R

    Directory of Open Access Journals (Sweden)

    Zhao Jing

    2006-01-01

    Full Text Available Abstract Genetic data are now widely available. There is, however, an apparent lack of concerted effort to produce software systems for the statistical analysis of genetic data, compared with other fields of statistics. It is often a tremendous task for end-users to tailor existing systems to particular data, especially when genetic data are analysed in conjunction with a large number of covariates. Here, R http://www.r-project.org, a free, flexible and platform-independent environment for statistical modelling and graphics, is explored as an integrated system for genetic data analysis. An overview of some packages currently available for the analysis of genetic data is given. This is followed by examples of package development and practical applications. With clear advantages in data management, graphics, statistical analysis, programming, internet capability and use of available code, it is a feasible, although challenging, task to develop R into an integrated platform for genetic analysis; this will require the joint efforts of many researchers.

  10. Effects of physical activity on executive functions, attention and academic performance in preadolescent children: a meta-analysis.

    Science.gov (United States)

    de Greeff, Johannes W; Bosker, Roel J; Oosterlaan, Jaap; Visscher, Chris; Hartman, E

    2018-05-01

    The aim of this meta-analysis was to provide a systematic review of intervention studies that investigated the effects of physical activity on multiple domains of executive functions, attention and academic performance in preadolescent children (6-12 years of age). In addition, a systematic quantification of the effects of physical activity on these domains is provided. Systematic review and meta-analysis. Searches of electronic databases and examining relevant reviews between 2000 and April 2017 resulted in 31 intervention studies meeting the inclusion criteria. Four subdomains of executive functions (inhibition, working memory, cognitive flexibility and planning), three subdomains of attention (selective, divided and sustained) and three subdomains of academic performance (mathematics, spelling and reading) were distinguished. Effects for different study designs (acute physical activity or longitudinal physical activity programs), type of physical activity (aerobic or cognitively engaging) and duration of intervention were examined separately. Acute physical activity has a positive effect on attention (g=0.43; 95% CI=0.09, 0.77; 6 studies), while longitudinal physical activity programs have a positive effect on executive functions (g=0.24; 95% CI=0.09, 0.39; 12 studies), attention (g=0.90; 95% CI=0.56, 1.24; 1 study) and academic performance (g=0.26; 95% CI=0.02, 0.49; 3 studies). The effects depend on the subdomain. Positive effects were found for physical activity on executive functions, attention and academic performance in preadolescent children. The largest effects are expected for interventions that aim for continuous regular physical activity over several weeks. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
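    A short sketch of the effect-size arithmetic behind a meta-analysis like the one above: Hedges' g for a two-group comparison with a 95% confidence interval. The group means, SDs, and sizes below are illustrative numbers, not data from the study.

```python
# Hedges' g with small-sample correction and an approximate 95% CI.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    # Pooled standard deviation.
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                      # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)         # small-sample correction factor
    g = j * d
    se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)

g, ci = hedges_g(m1=25.0, sd1=5.0, n1=30, m2=22.8, sd2=5.2, n2=30)
print(f"g = {g:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```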

  11. Advancing Alternative Analysis: Integration of Decision Science

    DEFF Research Database (Denmark)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina

    2016-01-01

    Decision analysis -- a systematic approach to solving complex problems -- offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate…, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect… (1) engaging the systematic development and evaluation of decision approaches and tools; (2) using case studies to advance the integration of decision analysis into alternatives analysis; (3) supporting transdisciplinary research; and (4) supporting education and outreach efforts…

  12. Messing Up Texas?: A Re-Analysis of the Effects of Executions on Homicides.

    Directory of Open Access Journals (Sweden)

    Patrick T Brandt

    Full Text Available Executions in Texas from 1994-2005 do not deter homicides, contrary to the results of Land et al. (2009). We find that using different models--based on pre-tests for unit roots that correct for earlier model misspecifications--one cannot reject the null hypothesis that executions do not lead to a change in homicides in Texas over this period. Using additional control variables, we show that variables such as the number of prisoners in Texas may drive the main drop in homicides over this period. Such conclusions, however, are highly sensitive to model specification decisions, calling into question the assumptions about fixed parameters and constant structural relationships. This means that using dynamic regressions to account for policy changes that may affect homicides needs to be done with significant care and attention.

  13. Messing Up Texas?: A Re-Analysis of the Effects of Executions on Homicides.

    Science.gov (United States)

    Brandt, Patrick T; Kovandzic, Tomislav V

    2015-01-01

    Executions in Texas from 1994-2005 do not deter homicides, contrary to the results of Land et al. (2009). We find that using different models--based on pre-tests for unit roots that correct for earlier model misspecifications--one cannot reject the null hypothesis that executions do not lead to a change in homicides in Texas over this period. Using additional control variables, we show that variables such as the number of prisoners in Texas may drive the main drop in homicides over this period. Such conclusions, however, are highly sensitive to model specification decisions, calling into question the assumptions about fixed parameters and constant structural relationships. This means that using dynamic regressions to account for policy changes that may affect homicides needs to be done with significant care and attention.
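    An illustrative sketch (toy data, not the authors' replication code) of the kind of unit-root pre-test the re-analysis relies on before specifying a dynamic regression of homicides on executions: an augmented Dickey-Fuller test via statsmodels.

```python
# ADF unit-root pre-test on a toy monthly series with a built-in unit root.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
homicides = np.cumsum(rng.normal(size=144))   # random walk stands in for the data

stat, pvalue, *_ = adfuller(homicides, autolag="AIC")
print(f"ADF statistic = {stat:.2f}, p = {pvalue:.3f}")
if pvalue > 0.05:
    print("cannot reject a unit root -> difference the series before regression")
```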

  14. Integration and segregation in auditory scene analysis

    Science.gov (United States)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing that suggests that the auditory scene is rapidly organized into distinct streams and the integration of sequential elements to perceptual units takes place on the already formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  15. Longitudinal analysis of music education on executive functions in primary school children

    OpenAIRE

    Jaschke, A.C.; Honing, H.; Scherder, E.J.A.

    2018-01-01

    Background: Research on the effects of music education on cognitive abilities has generated increasing interest across the scientific community. Nonetheless, longitudinal studies investigating the effects of structured music education on cognitive sub-functions are still rare. Prime candidates for investigating a relationship between academic achievement and music education appear to be executive functions such as planning, working memory, and inhibition. Methods: One hundred and forty-seven ...

  16. Longitudinal Analysis of Music Education on Executive Functions in Primary School Children

    OpenAIRE

    Artur C. Jaschke; Artur C. Jaschke; Henkjan Honing; Erik J. A. Scherder

    2018-01-01

    Background: Research on the effects of music education on cognitive abilities has generated increasing interest across the scientific community. Nonetheless, longitudinal studies investigating the effects of structured music education on cognitive sub-functions are still rare. Prime candidates for investigating a relationship between academic achievement and music education appear to be executive functions such as planning, working memory, and inhibition.Methods: One hundred and forty-seven p...

  17. Analysis of doses reported to the Health and Safety Executive's Central Index of Dose Information

    International Nuclear Information System (INIS)

    1993-01-01

    This publication analyses the occupational exposure summary information reported to the UK Health and Safety Executive for the years 1986-91. In particular, it considers evidence pointing to the relative success of employers in restricting occupational exposure to ionising radiation over this period. Exposure in the nuclear industry, industrial radiography, underground non-coal mining, medicine, dentistry and the transport sector is discussed. (UK)

  18. Age-related commonalities and differences in the relationship between executive functions and intelligence: Analysis of the NAB executive functions module and WAIS-IV scores.

    Science.gov (United States)

    Buczylowska, Dorota; Petermann, Franz

    2017-01-01

    Data from five subtests of the Executive Functions Module of the German Neuropsychological Assessment Battery (NAB) and all ten core subtests of the German Wechsler Adult Intelligence Scale - Fourth Edition (WAIS-IV) were used to examine the relationship between executive functions and intelligence in a comparison of two age groups: individuals aged 18-59 years and individuals aged 60-88 years. The NAB subtests Categories and Word Generation demonstrated a consistent correlation pattern for both age groups. However, the NAB Judgment subtest correlated more strongly with three WAIS-IV indices, the Full Scale IQ (FSIQ), and the General Ability Index (GAI) in the older adult group than in the younger group. Additionally, in the 60-88 age group, the Executive Functions Index (EFI) was more strongly correlated with the Verbal Comprehension Index (VCI) than with the Perceptual Reasoning Index (PRI). Both age groups demonstrated a strong association of the EFI with the FSIQ and the Working Memory Index (WMI). The results imply the potential diagnostic utility of the Judgment subtest and a significant relationship between executive functioning and crystallized intelligence at older ages. Furthermore, it may be concluded that there is a considerable age-independent overlap between the EFI and general intelligence, as well as between the EFI and working memory.

  19. Executive Dysfunction

    Science.gov (United States)

    Rabinovici, Gil D.; Stephens, Melanie L.; Possin, Katherine L.

    2015-01-01

    Purpose of Review: Executive functions represent a constellation of cognitive abilities that drive goal-oriented behavior and are critical to the ability to adapt to an ever-changing world. This article provides a clinically oriented approach to classifying, localizing, diagnosing, and treating disorders of executive function, which are pervasive in clinical practice. Recent Findings: Executive functions can be split into four distinct components: working memory, inhibition, set shifting, and fluency. These components may be differentially affected in individual patients and act together to guide higher-order cognitive constructs such as planning and organization. Specific bedside and neuropsychological tests can be applied to evaluate components of executive function. While dysexecutive syndromes were first described in patients with frontal lesions, intact executive functioning relies on distributed neural networks that include not only the prefrontal cortex, but also the parietal cortex, basal ganglia, thalamus, and cerebellum. Executive dysfunction arises from injury to any of these regions, their white matter connections, or neurotransmitter systems. Dysexecutive symptoms therefore occur in most neurodegenerative diseases and in many other neurologic, psychiatric, and systemic illnesses. Management approaches are patient specific and should focus on treatment of the underlying cause in parallel with maximizing patient function and safety via occupational therapy and rehabilitation. Summary: Executive dysfunction is extremely common in patients with neurologic disorders. Diagnosis and treatment hinge on familiarity with the clinical components and neuroanatomic correlates of these complex, high-order cognitive processes. PMID:26039846

  20. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal-hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among the process, safety systems, and operator actions. As independent TH codes do not include models of operator actions and the full set of safety systems, they cannot literally simulate the integrated and dynamic interactions of the process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulation becomes a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with stochastic models in the dynamic event tree (DET) framework, which provides the flexibility to model the integrated response, since all the accident elements are in the same model. The advantages of this framework also include realistic modeling of dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis).

  1. Integrating health and environmental impact analysis

    DEFF Research Database (Denmark)

    Reis, S; Morris, G.; Fleming, L. E.

    2015-01-01

    which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive with, the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose… while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding… the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health, as well as interactions with social scientists, economists and the legal profession…

  2. Advancing Alternative Analysis: Integration of Decision Science.

    Science.gov (United States)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A

    2017-06-13

    Decision analysis -- a systematic approach to solving complex problems -- offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups' findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: a) engaging the systematic development and evaluation of decision approaches and tools; b) using case studies to advance the integration of decision analysis into alternatives analysis; c) supporting transdisciplinary research; and d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.

  3. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  4. Structural integrity analysis of a steam turbine

    International Nuclear Information System (INIS)

    Villagarcia, Maria P.

    1997-01-01

    One of the most critical components of a power plant is the rotor of the steam turbine. Catastrophic failures in recent decades have promoted the development of life assessment procedures for rotors. Such an assessment requires knowledge of the operating conditions, the component geometry, the material properties, the history of the component, and the size, location and nature of any existing flaws. The aim of the present work is to obtain a structural integrity analysis procedure for a steam turbine rotor, taking into account the above-mentioned parameters. In this procedure, a thermal stress analysis by finite elements is performed first, in order to obtain the temperature and stress distributions for a subsequent analysis by fracture mechanics. The risk of fast fracture due to flaws in the central zone of the rotor is analyzed. The procedure is applied to an operating turbine: the main steam turbine of the Atucha I nuclear power plant. (author)

  5. A taxonomy of integral reaction path analysis

    Energy Technology Data Exchange (ETDEWEB)

    Grcar, Joseph F.; Day, Marcus S.; Bell, John B.

    2004-12-23

    W. C. Gardiner observed that achieving understanding through combustion modeling is limited by the ability to recognize the implications of what has been computed and to draw conclusions about the elementary steps underlying the reaction mechanism. This difficulty can be overcome in part by making better use of reaction path analysis in the context of multidimensional flame simulations. Following a survey of current practice, an integral reaction flux is formulated in terms of conserved scalars that can be calculated in a fully automated way. Conditional analyses are then introduced, and a taxonomy for bidirectional path analysis is explored. Many examples illustrate the resulting path analysis and uncover some new results about nonpremixed methane-air laminar jets.

  6. The ASDEX integrated data analysis system AIDA

    International Nuclear Information System (INIS)

    Grassie, K.; Gruber, O.; Kardaun, O.; Kaufmann, M.; Lackner, K.; Martin, P.; Mast, K.F.; McCarthy, P.J.; Mertens, V.; Pohl, D.; Rang, U.; Wunderlich, R.

    1989-11-01

    For about two years, the ASDEX integrated data analysis system (AIDA), which combines the database (DABA) and the statistical analysis system (SAS), has been successfully in operation. Besides a considerable, but meaningful, reduction of the 'raw' shot data, it offers the advantage of carefully selected and precisely defined datasets, which are easily accessible for informative tabular data overviews (DABA) and multi-shot analysis (SAS). Even rather complicated statistical analyses can be performed efficiently within this system. In this report, we summarise AIDA's main features and give some details on its set-up and on the physical models which have been used for the derivation of the processed data. We also give a short introduction on how to use DABA and SAS. (orig.)

  7. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the 'beamline' and 'MXYZPTLK' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase-space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits and to explore the local phase-space structure.

  8. The Integral: A Crux for Analysis

    CERN Document Server

    Krantz, Steven G

    2011-01-01

    This book treats all of the most commonly used theories of the integral. After motivating the idea of the integral, we devote a full chapter to the Riemann integral and the next to the Lebesgue integral. Another chapter compares and contrasts the two theories. The concluding chapter offers brief introductions to the Henstock integral, the Daniell integral, the Stieltjes integral, and other commonly used integrals. The purpose of this book is to provide a quick but accurate (and detailed) introduction to all aspects of modern integration theory. It should be accessible to any student who has had calculus.

  9. Numerical Analysis of Diaphragm Wall Model Executed in Poznań Clay Formation Applying Selected FEM Codes

    Directory of Open Access Journals (Sweden)

    Superczyńska M.

    2016-09-01

    Full Text Available The paper presents the results of numerical calculations of a diaphragm wall model executed in the Poznań clay formation. Two selected FEM codes were applied, Plaxis and Abaqus. A geological description of the Poznań clay formation in Poland, as well as the geotechnical conditions at the construction site in the Warsaw city area, is presented. The constitutive models of clay implemented in both Plaxis and Abaqus are discussed. The parameters of the Poznań clay constitutive models were assumed based on the authors' experimental tests. The results of the numerical analyses were compared, taking into account the measured values of horizontal displacements.

  10. Making working memory work: A meta-analysis of executive control and working memory training in younger and older adults

    OpenAIRE

    Karbach, Julia; Verhaeghen, Paul

    2014-01-01

    This meta-analysis examined the effects of process-based cognitive training (49 studies) in the domains of executive function and working memory in older adults (>60 years). The interventions resulted in significant effects on the trained task (pre-to-posttest net gain: MSD = 0.5 compared to active control, MSD = 0.8 compared to passive control; net posttest effect: MSD = 1.2 compared to active control, MSD = 1.1 compared to passive control), significant near transfer (pre-post: MSD = 0.3, 0....

  11. Worst-case Throughput Analysis for Parametric Rate and Parametric Actor Execution Time Scenario-Aware Dataflow Graphs

    Directory of Open Access Journals (Sweden)

    Mladen Skelin

    2014-03-01

    Full Text Available Scenario-aware dataflow (SADF) is a prominent tool for modeling and analysis of dynamic embedded dataflow applications. In SADF the application is represented as a finite collection of synchronous dataflow (SDF) graphs, each of which represents one possible application behaviour or scenario. A finite state machine (FSM) specifies the possible orders of scenario occurrences. The SADF model renders the tightest possible performance guarantees, but is limited by its finiteness. This means that from a practical point of view, it can only handle dynamic dataflow applications that are characterized by a reasonably sized set of possible behaviours or scenarios. In this paper we remove this limitation for a class of SADF graphs by means of SADF model parametrization in terms of graph port rates and actor execution times. First, we formally define the semantics of the model relevant for throughput analysis based on (max,+) linear system theory and (max,+) automata. Second, by generalizing some of the existing results, we give the algorithms for worst-case throughput analysis of parametric rate and parametric actor execution time acyclic SADF graphs with a fully connected, possibly infinite state transition system. Third, we demonstrate our approach on a few realistic applications from the digital signal processing (DSP) domain mapped onto an embedded multi-processor architecture.
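    A sketch (under simplifying assumptions: a single scenario, a strongly connected two-actor max-plus matrix with invented delays) of the (max,+) machinery underlying this kind of throughput analysis: iterate x(k+1) = A ⊗ x(k); the asymptotic growth rate of x is the cycle time, and throughput is its reciprocal.

```python
# Max-plus power iteration on a toy matrix; not the paper's algorithms.
import numpy as np

NEG_INF = -np.inf
# A[i][j] is the delay on the edge from actor j to actor i (illustrative).
A = np.array([[2.0, 5.0],
              [3.0, NEG_INF]])

def maxplus_matvec(A, x):
    # (A ⊗ x)_i = max_j (A_ij + x_j)
    return np.max(A + x[None, :], axis=1)

x = np.zeros(A.shape[0])
x0 = x.copy()
K = 200
for _ in range(K):
    x = maxplus_matvec(A, x)

cycle_time = (x - x0).max() / K   # growth rate ≈ max-plus eigenvalue (here 4.0:
                                  # the cycle 0 -> 1 -> 0 has mean weight (5+3)/2)
print(f"cycle time ≈ {cycle_time:.3f}, throughput ≈ {1 / cycle_time:.3f}")
```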

  12. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system's capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category, such as codes used for harmonic analysis and mechanistic fuel performance codes, does not require parallelisation of individual modules. The second category, such as conventional FEM codes, requires parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes, such as the domain decomposition method (DDM), parallel active column solvers and substructuring methods, are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  13. Office of Integrated Assessment and Policy Analysis

    International Nuclear Information System (INIS)

    Parzyck, D.C.

    1980-01-01

    The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated on an evaluation of impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impact. It is planned that an integrated assessment team examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environment control cost/benefits, institutional structures, and facility siting. This examination would assess the methodologies used, determine the general applicability of such studies, and present in a logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined

  14. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi

    2015-01-02

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior to applying PCA. Such transformation is usually obtained from previous studies, prior knowledge, or trial-and-error. In this work, we develop a model-based method that integrates data transformation in PCA and finds an appropriate data transformation using the maximum profile likelihood. Extensions of the method to handle functional data and missing values are also developed. Several numerical algorithms are provided for efficient computation. The proposed method is illustrated using simulated and real-world data examples.
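    A hedged sketch of why transformation matters before PCA. This is a simple two-step stand-in (column-wise Box-Cox, then PCA), not the paper's integrated maximum-profile-likelihood estimation; the data are synthetic.

```python
# Column-wise Box-Cox (lambda chosen by maximum likelihood per column),
# followed by PCA on raw vs. transformed skewed data.
import numpy as np
from scipy.stats import boxcox
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 4))   # skewed, positive data

Xt = np.column_stack([boxcox(X[:, j])[0] for j in range(X.shape[1])])

for name, data in (("raw", X), ("transformed", Xt)):
    ratio = PCA().fit(data).explained_variance_ratio_
    print(name, np.round(ratio, 3))
```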

  15. Do tasks make a difference? Accounting for heterogeneity of performance of children with reading difficulties on tasks of executive function: findings from a meta-analysis.

    Science.gov (United States)

    Booth, Josephine N; Boyle, James M E; Kelly, Steve W

    2010-03-01

    Research studies have implicated executive functions in reading difficulties (RD). But while some studies have found children with RD to be impaired on tasks of executive function other studies report unimpaired performance. A meta-analysis was carried out to determine whether these discrepant findings can be accounted for by differences in the tasks of executive function that are utilized. A total of 48 studies comparing the performance on tasks of executive function of children with RD with their typically developing peers were included in the meta-analysis, yielding 180 effect sizes. An overall effect size of 0.57 (SE .03) was obtained, indicating that children with RD have impairments on tasks of executive function. However, effect sizes varied considerably suggesting that the impairment is not uniform. Moderator analysis revealed that task modality and IQ-achievement discrepancy definitions of RD influenced the magnitude of effect; however, the age and gender of participants and the nature of the RD did not have an influence. While the children's RD were associated with executive function impairments, variation in effect size is a product of the assessment task employed, underlying task demands, and definitional criteria.

  16. Executive summary

    NARCIS (Netherlands)

    van Nimwegen, N.; van Nimwegen, N.; van der Erf, R.

    2009-01-01

    The Demography Monitor 2008 gives a concise overview of current demographic trends and related developments in education, the labour market and retirement for the European Union and some other countries. This executive summary highlights the major findings of the Demography Monitor 2008 and further

  17. The Use of Canonical Correlation Analysis to Assess the Relationship Between Executive Functioning and Verbal Memory in Older Adults

    Directory of Open Access Journals (Sweden)

    Pedro Silva Moreira MSc

    2015-08-01

    Full Text Available Executive functioning (EF), which is considered to govern complex cognition, and verbal memory (VM) are constructs assumed to be related. However, the magnitude of the association between EF and VM is not known, nor is it clear how sociodemographic and psychological factors may affect this relationship, including in normal aging. In this study, we assessed different EF and VM parameters via a battery of neurocognitive/psychological tests and performed a Canonical Correlation Analysis (CCA) to explore the connection between these constructs in a sample of middle-aged and older healthy individuals without cognitive impairment (N = 563, 50+ years of age). The analysis revealed a positive and moderate association between EF and VM, independently of gender, age, education, global cognitive performance level, and mood. These results confirm that EF presents a significant association with VM performance.
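    A sketch (synthetic data sharing a latent factor, not the study's test battery) of canonical correlation analysis between an executive-functioning block and a verbal-memory block, as in the study above.

```python
# CCA between two blocks of variables that share one latent ability factor.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 563
latent = rng.normal(size=n)                            # shared ability factor
EF = np.column_stack([latent + rng.normal(scale=1.5, size=n) for _ in range(3)])
VM = np.column_stack([latent + rng.normal(scale=1.5, size=n) for _ in range(4)])

cca = CCA(n_components=1).fit(EF, VM)
u, v = cca.transform(EF, VM)
r = np.corrcoef(u.ravel(), v.ravel())[0, 1]            # first canonical correlation
print(f"first canonical correlation ≈ {r:.2f}")
```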

  18. Life sciences payload definition and integration study. Volume 1: Executive summary. [carry-on laboratory for Spacelab

    Science.gov (United States)

    1974-01-01

    The definition and integration tasks involved in the development of design concepts for a carry-on laboratory (COL), to be compatible with Spacelab operations, were divided into the following study areas: (1) identification of research and equipment requirements of the COL; (2) development of a number of conceptual layouts for the COL based on the defined research requirements; (3) selection of final conceptual designs; and (4) development of COL planning information for definition of COL/Spacelab interface data, cost data, and program cost schedules, including design drawings of a selected COL to permit fabrication of a functional breadboard.

  19. Making working memory work: a meta-analysis of executive-control and working memory training in older adults.

    Science.gov (United States)

    Karbach, Julia; Verhaeghen, Paul

    2014-11-01

    This meta-analysis examined the effects of process-based executive-function and working memory training (49 articles, 61 independent samples) in older adults (> 60 years). The interventions resulted in significant effects on performance on the trained task and near-transfer tasks; significant results were obtained for the net pretest-to-posttest gain relative to active and passive control groups and for the net effect at posttest relative to active and passive control groups. Far-transfer effects were smaller than near-transfer effects but were significant for the net pretest-to-posttest gain relative to passive control groups and for the net gain at posttest relative to both active and passive control groups. We detected marginally significant differences in training-induced improvements between working memory and executive-function training, but no differences between the training-induced improvements observed in older adults and younger adults, between the benefits associated with adaptive and nonadaptive training, or between the effects in active and passive control conditions. Gains did not vary with total training time. © The Author(s) 2014.

  20. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...

  1. PHIDIAS- Pathogen Host Interaction Data Integration and Analysis

    Indian Academy of Sciences (India)

    PHIDIAS - Pathogen Host Interaction Data Integration and Analysis - allows searching of integrated genome sequences, conserved domains and gene expression data related to pathogen-host interactions in high-priority agents for public health and security ...

  2. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to interact easily with each other and with ActiveMQ. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, which unifies and simplifies the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, exactly one dedicated zookeeper program is used to start its different functions or tasks, that is, the stompShell instances needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO).
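    A minimal sketch of the messaging pattern described above, using the stomp.py client (8.x API) against a local ActiveMQ broker. The framework's own programs (watchdog, zookeeper, stompShell) are not shown here, so the queue name, credentials, and message fields below are illustrative assumptions.

```python
# Sketch: publish a JSON workflow step to an ActiveMQ queue over STOMP and
# consume it with a listener. Assumes ActiveMQ's STOMP connector on
# localhost:61613 with default admin/admin credentials.
import json
import time
import stomp

class JobListener(stomp.ConnectionListener):
    def on_message(self, frame):
        job = json.loads(frame.body)          # JSON message structure
        print("executing task:", job["task"], "with args:", job["args"])

conn = stomp.Connection([("localhost", 61613)])
conn.set_listener("", JobListener())
conn.connect("admin", "admin", wait=True)
conn.subscribe(destination="/queue/workflow.machine1", id=1, ack="auto")

# A coordinator publishes one workflow step as a JSON message.
conn.send(destination="/queue/workflow.machine1",
          body=json.dumps({"task": "process_waveforms", "args": ["station01"]}))

time.sleep(2)                                 # give the listener time to receive
conn.disconnect()
```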

  3. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    Covers the IAT conceptual framework and the preliminary validation of IAT concepts; planned work for FY85, including more extensive validation, is also described. The approach comprises four steps: 1) identify needs and requirements for IAT; 2) develop the IAT conceptual framework; 3) validate IAT methods; 4) develop applications materials.

  4. Executive functioning deficits among adults with Bipolar Disorder (types I and II): A systematic review and meta-analysis.

    Science.gov (United States)

    Dickinson, Tania; Becerra, Rodrigo; Coombes, Jacqui

    2017-08-15

    Executive functioning (EF) deficits contribute to a significant proportion of the burden of disease associated with bipolar disorder (BD). Yet, there is still debate in the literature regarding the exact profile of executive functioning in BD. The purpose of the present project was to assess whether EF deficits exist among adults suffering from BD, and whether these deficits (if apparent) differ by BD subtype. A systematic search identified relevant literature. Randomised controlled trials that used neuropsychological assessment to investigate EF among adults (16-65 years) with a remitted DSM diagnosis of BD (type I or II) were included. Studies were published between 1994 and 2015. A systematic review and meta-analysis were undertaken. For individual studies, standardised mean differences (Cohen's d) and 95% confidence intervals were calculated and represented in forest plots to illustrate differences in executive performance between groups. Summary effects were produced and tests of heterogeneity employed to assess the dispersion and generalisability of results. Thirty-six studies met criteria for inclusion. Six domains of EF were identified: set-shifting (SS), inhibition (INH), planning (PLA), verbal fluency (VF), working memory (WM), and attention (ATT). BD1s performed worse than HCs in all domains. BD2s demonstrated impairment in VF, WM, SS, and ATT. The results were mixed for comparisons between BD1s and BD2s, but revealed that BD2s can experience similar (or sometimes greater) EF impairment. Only a limited number of studies that included BD2 samples were available for inclusion in the current study. Subgroup analysis to elucidate potential moderators of within-study variance was not undertaken. This is the first systematic review and meta-analysis to have compared the EF of remitted BD1s, BD2s, and HCs. The results provided useful insight into the EF profile of patients with BD, and offered commentary as to some of the contradictory results reported in the
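
    For reference, the per-study effect sizes behind such forest plots follow from standard formulas. A minimal sketch of Cohen's d with its large-sample 95% confidence interval; the group statistics below are hypothetical, not values from the review:

        import math

        def cohens_d_ci(m1, s1, n1, m2, s2, n2, z=1.96):
            # Pooled standard deviation across the two groups.
            s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
            d = (m1 - m2) / s_pooled
            # Large-sample variance approximation for d.
            se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
            return d, (d - z * se, d + z * se)

        # Hypothetical BD1 vs. healthy-control working-memory scores.
        print(cohens_d_ci(48.2, 9.1, 30, 53.6, 8.7, 32))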

  5. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
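
    The calibration step, estimating depth-distributed conductivities by minimizing squared differences subject to regularization, can be illustrated independently of MODFLOW and PEST. A toy sketch with scipy; the proportional forward model and all numbers are invented for illustration and merely stand in for the axisymmetric flow simulation:

        import numpy as np
        from scipy.optimize import least_squares

        thickness = np.array([5.0, 10.0, 5.0])   # layer thicknesses (m)
        measured = np.array([0.8, 2.9, 3.4])     # measured cumulative flows (L/s)

        def simulated_flow(log_k):
            # Crude stand-in forward model: flow accumulates in proportion
            # to each layer's conductivity times its thickness.
            return np.cumsum(10.0 ** log_k * thickness)

        def residuals(log_k, reg_weight=0.1):
            # Data misfit plus a smoothness penalty, echoing the regularization
            # used to limit conductivity variance within a lithology.
            return np.concatenate([simulated_flow(log_k) - measured,
                                   reg_weight * np.diff(log_k)])

        fit = least_squares(residuals, x0=np.zeros(3))
        print("estimated K per layer:", 10.0 ** fit.x)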

  6. Migration in Deltas: An Integrated Analysis

    Science.gov (United States)

    Nicholls, Robert J.; Hutton, Craig W.; Lazar, Attila; Adger, W. Neil; Allan, Andrew; Arto, Inaki; Vincent, Katharine; Rahman, Munsur; Salehin, Mashfiqus; Sugata, Hazra; Ghosh, Tuhin; Codjoe, Sam; Appeaning-Addo, Kwasi

    2017-04-01

    Deltas and low-lying coastal regions have long been perceived as vulnerable to global sea-level rise, with the potential for mass displacement of exposed populations. The assumption of mass displacement of populations in deltas requires a comprehensive reassessment in the light of present and future migration in deltas, including the potential role of adaptation to influence these decisions. At present, deltas are subject to multiple drivers of environmental change and often have high population densities as they are accessible and productive ecosystems. Climate change, catchment management, subsidence and land cover change drive environmental change across all deltas. Populations in deltas are also highly mobile, with significant urbanization trends and the growth of large cities and mega-cities within or adjacent to deltas across Asia and Africa. Such migration is driven primarily by economic opportunity, yet environmental change in general, and climate change in particular, are likely to play an increasing direct and indirect role in future migration trends. The policy challenges centre on the role of migration within regional adaptation strategies to climate change; the protection of vulnerable populations; and the future of urban settlements within deltas. This paper reviews current knowledge on migration and adaptation to environmental change to discern specific issues pertinent to delta regions. It develops a new integrated methodology to assess present and future migration in deltas using the Volta delta in Ghana, Mahanadi delta in India and Ganges-Brahmaputra-Meghna delta across India and Bangladesh. The integrated method focuses on: biophysical changes and spatial distribution of vulnerability; demographic changes and migration decision-making using multiple methods and data; macro-economic trends and scenarios in the deltas; and the policies and governance structures that constrain and enable adaptation. The analysis is facilitated by a range of

  7. Fast, Interactive Worst-Case Execution Time Analysis With Back-Annotation

    DEFF Research Database (Denmark)

    Harmon, Trevor; Schoeberl, Martin; Kirner, Raimund

    2012-01-01

    into the development cycle, requiring WCET analysis to be postponed until a final verification phase. In this paper, we propose interactive WCET analysis as a new method to provide near-instantaneous WCET feedback to the developer during software programming. We show that interactive WCET analysis is feasible using...

  8. Integration of risk analysis, land use planning, and cost analysis

    International Nuclear Information System (INIS)

    Rajen, G.; Sanchez, G.

    1994-01-01

    The Department of Energy (DOE) and the Pueblo of San Ildefonso (Pueblo), a sovereign Indian tribe, have often been involved in adversarial situations regarding the Los Alamos National Laboratory (LANL). The Pueblo shares a common boundary with the LANL. This paper describes an on-going project that could transform the DOE-Pueblo relationship into one of cooperation, uniting the DOE and the Pueblo in a joint Pollution Prevention/Waste Minimization and Integrated Risk Analysis and Land Use Planning effort

  9. IMP: Integrated method for power analysis

    Energy Technology Data Exchange (ETDEWEB)

    1989-03-01

    An integrated, easy-to-use, economical package of microcomputer programs has been developed which can be used by small hydro developers to evaluate potential sites for small scale hydroelectric plants in British Columbia. The programs enable evaluation of sites located far from the nearest stream gauging station, for which streamflow data are not available. For each of the province's 6 hydrologic regions, a streamflow record for one small watershed is provided in the data base. The program can then be used to generate synthetic streamflow records and to compare results obtained by the modelling procedure with the actual data. The program can also be used to explore the significance of modelling parameters and to develop a detailed appreciation for the accuracy which can be obtained under various circumstances. The components of the program are an atmospheric model of precipitation; a watershed model that will generate a continuous series of streamflow data, based on information from the atmospheric model; a flood frequency analysis system that uses site-specific topographic data plus information from the atmospheric model to generate a flood frequency curve; a hydroelectric power simulation program which determines daily energy output for a run-of-river or reservoir storage site based on selected generation facilities and the time series generated in the watershed model; and a graphic analysis package that provides direct visualization of data and modelling results. This report contains a description of the programs, a user guide, the theory behind the model, the modelling methodology, and results from a workshop that reviewed the program package. 32 refs., 16 figs., 18 tabs.
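
    The core of such a power simulation component is a simple energy balance. A minimal sketch of daily run-of-river energy output using the standard hydropower relation P = rho*g*Q*H*eta; the cap at design flow and all numbers are illustrative, not IMP's actual code:

        RHO, G = 1000.0, 9.81  # water density (kg/m^3), gravitational acceleration (m/s^2)

        def daily_energy_kwh(flow_m3s, design_flow_m3s, head_m, efficiency=0.85):
            # Usable flow is capped at the turbine's design flow; spill is lost.
            q = min(flow_m3s, design_flow_m3s)
            power_w = RHO * G * q * head_m * efficiency
            return power_w * 24 / 1000.0  # watts sustained over 24 h -> kWh

        # One synthetic day: 3.2 m^3/s available, 2.5 m^3/s design flow, 40 m head.
        print(round(daily_energy_kwh(3.2, 2.5, 40.0), 1), "kWh")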

  10. INTEGRATED MANAGEMENT SYSTEM: THE ROLE OF THE EXECUTIVE SECRETARY PROFESSIONALS SISTEMA DE GESTÃO INTEGRADO: A ATUAÇÃO DO SECRETÁRIO EXECUTIVO

    Directory of Open Access Journals (Sweden)

    Marcilia Helena de Sousa Mascarenhas

    2011-10-01

    Full Text Available

    Considering the advance of globalization and the intense competition between companies, continuous improvement is a market requirement. Implementing an Integrated Management System (SGI) qualifies the company and trains its participants to achieve higher productivity at lower cost, while preserving the health of its employees and the environment. The SGI applies methods to meet the requirements of quality, environmental management, occupational health and safety, and social responsibility systems, which are determined by Brazilian and/or international standards. In this context, the profile of the executive secretarial professional is changing, making him or her more qualified, acting directly in management activities and assisting in management processes to ensure satisfactory outcomes for the organization. Through bibliographical research, the present article covers the concepts of the various management systems and the importance of the executive secretarial professional's assistance to the manager in the implementation of an Integrated Management System.

  11. Identifying patterns of motor performance, executive functioning, and verbal ability in preschool children: A latent profile analysis.

    Science.gov (United States)

    Houwen, Suzanne; Kamphorst, Erica; van der Veer, Gerda; Cantell, Marja

    2018-04-30

    A relationship between motor performance and cognitive functioning is increasingly being recognized. Yet, little is known about the precise nature of the relationship between the two domains, especially in early childhood. The aims were to identify distinct constellations of motor performance, executive functioning (EF), and verbal ability in preschool-aged children, and to explore how individual and contextual variables are related to profile membership. The sample consisted of 119 3- to 4-year-old children (62 boys; 52%). The home-based assessments consisted of a standardized motor test (Movement Assessment Battery for Children - 2), five performance-based EF tasks measuring inhibition and working memory, and the Receptive Vocabulary subtest from the Wechsler Preschool and Primary Scale of Intelligence - Third Edition. Parents filled out the Behavior Rating Inventory of Executive Function - Preschool version. Latent profile analysis (LPA) was used to delineate profiles of motor performance, EF, and verbal ability. Chi-square statistics and multinomial logistic regression analysis were used to examine whether profile membership was predicted by age, gender, risk of motor coordination difficulties, ADHD symptomatology, language problems, and socioeconomic status (SES). LPA yielded three profiles with qualitatively distinct response patterns of motor performance, EF, and verbal ability. Quantitatively, the profiles showed the most pronounced differences with regard to parent ratings and performance-based tests of EF, as well as verbal ability. Risk of motor coordination difficulties and ADHD symptomatology were associated with profile membership, whereas age, gender, language problems, and SES were not. Our results indicate that there are distinct subpopulations of children who show differential relations with regard to motor performance, EF, and verbal ability. The fact that we found both quantitative as well as qualitative differences between the three patterns of profiles underscores
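
    Since latent profile analysis is a finite mixture model over continuous indicators, a Gaussian mixture with diagonal covariances is a close computational stand-in. A sketch with scikit-learn on synthetic z-scores, selecting the number of profiles by BIC as is conventional in LPA; this is not the study's actual estimation code:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Synthetic z-scores for three indicators (motor, EF, verbal),
        # drawn from three separated clusters for illustration.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(m, 1.0, size=(40, 3)) for m in (-1.0, 0.0, 1.0)])

        # Fit 1- to 5-profile solutions and keep the one with the lowest BIC.
        bics = {k: GaussianMixture(k, covariance_type="diag", random_state=0)
                       .fit(X).bic(X)
                for k in range(1, 6)}
        best_k = min(bics, key=bics.get)
        labels = GaussianMixture(best_k, covariance_type="diag",
                                 random_state=0).fit_predict(X)
        print("profiles:", best_k, "sizes:", np.bincount(labels))

    Membership predictors such as ADHD symptomatology would then enter a multinomial logistic regression on these profile labels.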

  12. K West integrated water treatment system subproject safety analysis document

    International Nuclear Information System (INIS)

    SEMMENS, L.S.

    1999-01-01

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System

  13. K West integrated water treatment system subproject safety analysis document

    Energy Technology Data Exchange (ETDEWEB)

    SEMMENS, L.S.

    1999-02-24

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System.

  14. Integrating health and environmental impact analysis.

    Science.gov (United States)

    Reis, S; Morris, G; Fleming, L E; Beck, S; Taylor, T; White, M; Depledge, M H; Steinle, S; Sabel, C E; Cowie, H; Hurley, F; Dick, J McP; Smith, R I; Austen, M

    2015-10-01

    Scientific investigations have progressively refined our understanding of the influence of the environment on human health, and the many adverse impacts that human activities exert on the environment, from the local to the planetary level. Nonetheless, throughout the modern public health era, health has been pursued as though our lives and lifestyles are disconnected from ecosystems and their component organisms. The inadequacy of the societal and public health response to obesity, health inequities, and especially global environmental and climate change now calls for an ecological approach which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive with, the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose a new conceptual model, the ecosystems-enriched Drivers, Pressures, State, Exposure, Effects, Actions or 'eDPSEEA' model, to address this shortcoming. The model recognizes convergence between the concept of ecosystems services, which provides a human health and well-being slant to the value of ecosystems while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to the global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession. It will require outreach to political and other stakeholders including a currently largely disengaged general public. The need for an effective and robust science-policy interface has

  15. III SBC Guidelines on the Analysis and Issuance of Electrocardiographic Reports - Executive Summary

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Pastore

    Full Text Available The third version of the guidelines covers recently described topics, such as ion channel diseases, acute ischemic changes, the electrocardiogram in athletes, and analysis of ventricular repolarization. It sought to revise the criteria for overloads, conduction disorders, and analysis of data for internet transmission.

  16. Noise analysis of switched integrator preamplifiers

    International Nuclear Information System (INIS)

    Sun Hongbo; Li Yulan; Zhu Weibin

    2004-01-01

    The main noise sources of switched integrator preamplifiers are discussed, and their noise performance is evaluated by combining PSpice simulations with experiments. Practical methods for reducing preamplifier noise in the two different integrator modes are then provided. (authors)

  17. Social Ecological Model Analysis for ICT Integration

    Science.gov (United States)

    Zagami, Jason

    2013-01-01

    ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

  18. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

    Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allows for the discovery of highly significant pathways from a set of disease vs. healthy samples in NGS and GWAS studies. Knowledge of the activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding of underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity together becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathways, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis within the context of existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of
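
    The enrichment measurement such a database supports is classically a one-sided hypergeometric test on the overlap between a gene list and a pathway's gene set. A minimal sketch; the counts are hypothetical, and IPAD's exact scoring may differ:

        from scipy.stats import hypergeom

        def enrichment_p(n_genome, n_pathway, n_list, n_overlap):
            # P(X >= n_overlap) when n_list genes are drawn without replacement
            # from a genome containing n_pathway pathway members.
            return hypergeom.sf(n_overlap - 1, n_genome, n_pathway, n_list)

        # Hypothetical: a 500-gene disease list hits 12 genes of a 120-gene
        # pathway in a 20,000-gene genome.
        print(enrichment_p(20000, 120, 500, 12))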

  19. Integrated High-Level Waste System Planning - Utilizing an Integrated Systems Planning Approach to Ensure End-State Definitions are Met and Executed - 13244

    Energy Technology Data Exchange (ETDEWEB)

    Ling, Lawrence T. [URS-Savannah River Remediation, Savannah River Site, Building 766-H Room 2205, Aiken, SC 29808 (United States); Chew, David P. [URS-Savannah River Remediation, Savannah River Site, Building 766-H Room 2426, Aiken, SC 29808 (United States)

    2013-07-01

    The Savannah River Site (SRS) is a Department of Energy site which has produced nuclear materials for national defense, research, space, and medical programs since the 1950's. As a by-product of this activity, approximately 37 million gallons of high-level liquid waste containing approximately 292 million curies of radioactivity is stored on an interim basis in 45 underground storage tanks. Originally, 51 tanks were constructed and utilized to support the mission. Four tanks have been closed and taken out of service and two are currently undergoing the closure process. The Liquid Waste System is a highly integrated operation involving safely storing liquid waste in underground storage tanks; removing, treating, and dispositioning the low-level waste fraction in grout; vitrifying the higher activity waste at the Defense Waste Processing Facility; and storing the vitrified waste in stainless steel canisters until permanent disposition. After waste removal and processing, the storage and processing facilities are decontaminated and closed. A Liquid Waste System Plan (hereinafter referred to as the Plan) was developed to integrate and document the activities required to disposition legacy and future High-Level Waste and to remove from service radioactive liquid waste tanks and facilities. It establishes and records a planning basis for waste processing in the liquid waste system through the end of the program mission. The integrated Plan which recognizes the challenges of constrained funding provides a path forward to complete the liquid waste mission within all regulatory and legal requirements. The overarching objective of the Plan is to meet all Federal Facility Agreement and Site Treatment Plan regulatory commitments on or ahead of schedule while preserving as much life cycle acceleration as possible through incorporation of numerous cost savings initiatives, elimination of non-essential scope, and deferral of other scope not on the critical path to compliance

  20. Integrated High-Level Waste System Planning - Utilizing an Integrated Systems Planning Approach to Ensure End-State Definitions are Met and Executed - 13244

    International Nuclear Information System (INIS)

    Ling, Lawrence T.; Chew, David P.

    2013-01-01

    The Savannah River Site (SRS) is a Department of Energy site which has produced nuclear materials for national defense, research, space, and medical programs since the 1950's. As a by-product of this activity, approximately 37 million gallons of high-level liquid waste containing approximately 292 million curies of radioactivity is stored on an interim basis in 45 underground storage tanks. Originally, 51 tanks were constructed and utilized to support the mission. Four tanks have been closed and taken out of service and two are currently undergoing the closure process. The Liquid Waste System is a highly integrated operation involving safely storing liquid waste in underground storage tanks; removing, treating, and dispositioning the low-level waste fraction in grout; vitrifying the higher activity waste at the Defense Waste Processing Facility; and storing the vitrified waste in stainless steel canisters until permanent disposition. After waste removal and processing, the storage and processing facilities are decontaminated and closed. A Liquid Waste System Plan (hereinafter referred to as the Plan) was developed to integrate and document the activities required to disposition legacy and future High-Level Waste and to remove from service radioactive liquid waste tanks and facilities. It establishes and records a planning basis for waste processing in the liquid waste system through the end of the program mission. The integrated Plan which recognizes the challenges of constrained funding provides a path forward to complete the liquid waste mission within all regulatory and legal requirements. The overarching objective of the Plan is to meet all Federal Facility Agreement and Site Treatment Plan regulatory commitments on or ahead of schedule while preserving as much life cycle acceleration as possible through incorporation of numerous cost savings initiatives, elimination of non-essential scope, and deferral of other scope not on the critical path to compliance

  1. The Effects of Reducing Preparation Time on the Execution of Intentionally Curved Trajectories: Optimization and Geometrical Analysis

    Directory of Open Access Journals (Sweden)

    Dovrat Kohen

    2017-06-01

    Full Text Available When subjects are intentionally preparing a curved trajectory, they are engaged in a time-consuming trajectory planning process that is separate from target selection. To investigate the construction of such a plan, we examined the effect of artificially shortening preparation time on the performance of intentionally curved trajectories using the Timed Response task, which enforces initiation of movements prematurely. Fifteen subjects performed obstacle avoidance movements toward one of four targets that were presented 25 or 350 ms before the “go” signal, imposing short and long preparation time conditions with mean values of 170 ms and 493 ms, respectively. While trajectories with short preparation times showed target specificity at their onset, they were significantly more variable and showed larger angular deviations from the lines connecting their initial position and the target, compared to the trajectories with long preparation times. Importantly, the trajectories of the short preparation time movements still reached their end-point targets accurately, with comparable movement durations. We hypothesize that success in the short preparation time condition is a result of an online control mechanism that allows further refinement of the plan during its execution, and we study this control mechanism with a novel trajectory analysis approach based on minimum-jerk optimization and geometrical modeling. Results show a later agreement of the short preparation time trajectories with the optimal minimum jerk trajectory, accompanied by a later initiation of a parabolic segment. Both observations are consistent with the existence of an online trajectory planning process. Our results suggest that when preparation time is not sufficiently long, subjects execute a more variable and less optimally prepared initial trajectory and exploit online control mechanisms to refine their actions on the fly.

  2. The Effects of Reducing Preparation Time on the Execution of Intentionally Curved Trajectories: Optimization and Geometrical Analysis

    Science.gov (United States)

    Kohen, Dovrat; Karklinsky, Matan; Meirovitch, Yaron; Flash, Tamar; Shmuelof, Lior

    2017-01-01

    When subjects are intentionally preparing a curved trajectory, they are engaged in a time-consuming trajectory planning process that is separate from target selection. To investigate the construction of such a plan, we examined the effect of artificially shortening preparation time on the performance of intentionally curved trajectories using the Timed Response task, which enforces initiation of movements prematurely. Fifteen subjects performed obstacle avoidance movements toward one of four targets that were presented 25 or 350 ms before the “go” signal, imposing short and long preparation time conditions with mean values of 170 ms and 493 ms, respectively. While trajectories with short preparation times showed target specificity at their onset, they were significantly more variable and showed larger angular deviations from the lines connecting their initial position and the target, compared to the trajectories with long preparation times. Importantly, the trajectories of the short preparation time movements still reached their end-point targets accurately, with comparable movement durations. We hypothesize that success in the short preparation time condition is a result of an online control mechanism that allows further refinement of the plan during its execution, and we study this control mechanism with a novel trajectory analysis approach based on minimum-jerk optimization and geometrical modeling. Results show a later agreement of the short preparation time trajectories with the optimal minimum jerk trajectory, accompanied by a later initiation of a parabolic segment. Both observations are consistent with the existence of an online trajectory planning process. Our results suggest that when preparation time is not sufficiently long, subjects execute a more variable and less optimally prepared initial trajectory and exploit online control mechanisms to refine their actions on the fly. PMID:28706478
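
    The minimum-jerk benchmark used in such trajectory analyses has a closed form for the basic point-to-point case: x(t) = x0 + (xf - x0) * (10*tau^3 - 15*tau^4 + 6*tau^5), with tau = t/T. A sketch of that unconstrained profile; the paper's obstacle-avoidance analysis works with curved, via-point variants, so this is only the baseline form:

        import numpy as np

        def minimum_jerk(x0, xf, duration, n=100):
            # Position profile of the straight-line minimum-jerk movement.
            t = np.linspace(0.0, duration, n)
            tau = t / duration
            s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
            pos = np.asarray(x0) + (np.asarray(xf) - np.asarray(x0)) * s[:, None]
            return t, pos

        t, traj = minimum_jerk([0.0, 0.0], [0.1, 0.2], duration=0.6)
        print(traj[0], traj[-1])  # starts at x0, ends exactly at xf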

  3. CFD Analysis for Advanced Integrated Head Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Won Ho; Kang, Tae Kyo; Cho, Yeon Ho; Kim, Hyun Min [KEPCO Engineering and Construction Co., Daejeon (Korea, Republic of)

    2016-10-15

    The Integrated Head Assembly (IHA) is permanently installed on the reactor vessel closure head during normal plant operation and refueling operation. It consists of a number of systems and components such as the head lifting system, seismic support system, Control Element Drive Mechanism (CEDM) cooling system, cable support system, and cooling shroud assemblies. With operating experience of the IHA, the need arose for design changes to the current APR1400 IHA to improve seismic resistance and to allow more convenient maintenance. In this paper, the effects of the design changes were rigorously studied for various sizes of the inlet openings to assure proper cooling of the CEDMs, and the system pressure differentials and required flow rate for the CEDM cooling fan were analyzed for various operating conditions in order to determine the capacity of the fan. As part of the design process of the AIHA, the number of air inlets and baffle regions is reduced by simplifying the design of the APR1400 IHA. The design change of the baffle regions has been made such that the maximum possible space is occupied inside the IHA cooling shroud shell while avoiding interference with the CEDMs. Thus, only the air inlet opening was studied for the design change to supply a sufficient cooling air flow for each CEDM. The size and location of the air inlets in the middle cooling shroud assembly were determined by CFD analyses of the AIHA, and case CFD analyses were performed depending on the ambient air temperature and fan operating conditions. The size of the air inlet openings is increased in comparison with the initial AIHA design, and it is confirmed that the cooling air flow rate for each CEDM meets the design requirement of 800 SCFM ± 10% with the increased air inlets. In the initial analysis, the fan outlet flow rate was assumed to be 48.3 lbm/s, but the results revealed that a lower outflow rate at the fan is enough to meet the design requirement

  4. Executive summary

    International Nuclear Information System (INIS)

    1981-02-01

    This paper is an 'executive summary' of work undertaken to review proposals for transport, handling and emplacement of high level radioactive wastes in an underground repository, appropriate to the U.K. context, with particular reference to: waste block size and configuration; self-shielded or partially-shielded block; stages of disposal; transportation within the repository; emplacement in vertical holes or horizontal tunnels; repository access by adit, incline or shaft; and costs. The paper contains a section on general conclusions and recommendations. (U.K.)

  5. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  6. Game analysis of product-service integration

    Directory of Open Access Journals (Sweden)

    Heping Zhong

    2014-10-01

    Full Text Available Purpose: This paper aims to define the value-creation mechanism and income-distribution strategies of product-service integration in order to promote product-service integration within a firm. Design/methodology/approach: The coordination mechanism of product-service integration is investigated quantitatively using game theory, and income-distribution strategies are further discussed using the Shapley value and equal-growth-rate methods. Findings: Product-service integration increases the total income of a firm; the added income decreases as the unit-price demand variation coefficient of products and services increases, and as the marginal cost of products or services increases. Moreover, the findings suggest that income-distribution strategies based on either the Shapley value method or the equal-growth-rate method can make the product department and the service department of a firm win-win and realize a Pareto improvement. The choice of distribution strategy for coordinating the departments depends on which department plays the dominant role in the firm. Generally speaking, for a firm at the center of a market, when the product department is the main contributor to firm income, the service department will choose the income-distribution strategy based on the Shapley value method; when the service department is the main contributor, it will choose the strategy based on the equal-growth-rate method. Research limitations/implications: This paper makes some strict assumptions, such as complete information, risk neutrality, and linear cost functions, and the discussion is limited to the simple relationship between the product department and the service department. Practical implications: Product
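
    The Shapley value part of these findings is easy to make concrete for the two-department case. A sketch that averages each department's marginal contribution over both join orders; the income figures are hypothetical:

        from itertools import permutations

        def shapley(v, players=("P", "S")):
            # Average each player's marginal contribution over all join orders.
            phi = {p: 0.0 for p in players}
            orders = list(permutations(players))
            for order in orders:
                coalition = set()
                for p in order:
                    before = v[frozenset(coalition)]
                    coalition.add(p)
                    phi[p] += (v[frozenset(coalition)] - before) / len(orders)
            return phi

        # Hypothetical incomes: product dept (P) alone 60, service dept (S)
        # alone 30, integrated 110 (a 20-unit integration surplus).
        v = {frozenset(): 0.0, frozenset({"P"}): 60.0,
             frozenset({"S"}): 30.0, frozenset({"P", "S"}): 110.0}
        print(shapley(v))  # {'P': 70.0, 'S': 40.0}

    The surplus splits evenly here because the Shapley value shares each ordering's marginal gain equally between the two departments.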

  7. An item response theory analysis of the Executive Interview and development of the EXIT8: A Project FRONTIER Study.

    Science.gov (United States)

    Jahn, Danielle R; Dressel, Jeffrey A; Gavett, Brandon E; O'Bryant, Sid E

    2015-01-01

    The Executive Interview (EXIT25) is an effective measure of executive dysfunction, but may be inefficient due to the time it takes to complete 25 interview-based items. The current study aimed to examine the psychometric properties of the EXIT25, with a specific focus on determining whether a briefer version of the measure could comprehensively assess executive dysfunction. The current study applied a graded response model (a type of item response theory model for polytomous categorical data) to identify items that were most closely related to the underlying construct of executive functioning and that best discriminated between varying levels of executive functioning. Participants were 660 adults ages 40 to 96 years living in West Texas, recruited through an ongoing epidemiological study of rural health and aging called Project FRONTIER. The EXIT25 was the primary measure examined. Participants also completed the Trail Making Test and the Controlled Oral Word Association Test, among other measures, to examine the convergent validity of a brief form of the EXIT25. Eight items were identified that provided the majority of the information about the underlying construct of executive functioning; total scores on these items were associated with total scores on other measures of executive functioning and were able to differentiate between cognitively healthy, mildly cognitively impaired, and demented participants. In addition, cutoff scores were recommended based on the sensitivity and specificity of scores. A brief, eight-item version of the EXIT25 may be an effective and efficient screening tool for executive dysfunction among older adults.
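
    Under Samejima's graded response model, each item's behaviour is captured by a discrimination parameter and ordered category thresholds. A sketch of the category probabilities for one polytomous item, noting that EXIT25 items are scored 0-2 and hence have two thresholds; the parameter values below are illustrative, not the study's fitted estimates:

        import numpy as np

        def grm_category_probs(theta, a, b):
            # Cumulative P(X >= k) = logistic(a * (theta - b_k)) for ordered
            # thresholds b_1 < ... < b_{K-1}; category probabilities are
            # differences of adjacent cumulatives.
            cum = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))
            cum = np.concatenate([[1.0], cum, [0.0]])
            return cum[:-1] - cum[1:]  # P(X = 0), ..., P(X = K-1)

        # Illustrative item: discrimination 1.8, thresholds -0.3 and 1.1.
        print(grm_category_probs(theta=0.5, a=1.8, b=[-0.3, 1.1]))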

  8. NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making : Executive Leaflet

    NARCIS (Netherlands)

    Wijnmalen, D.J.D.; et al

    2012-01-01

    Judgement plays an important role in all Operational Analysis (OA). NATO practitioners have determined that OA approaches based on human judgement are increasingly critical to defence decision making. The purpose of the NATO Guide for Judgement-Based OA in Defence Decision Making is to

  9. Executive summary: Mod-1 wind turbine generator analysis and design report

    Science.gov (United States)

    1979-01-01

    Activities leading to the detail design of a wind turbine generator having a nominal rating of 1.8 megawatts are reported. Topics covered include (1) system description; (2) structural dynamics; (3) stability analysis; (4) mechanical subassemblies design; (5) power generation subsystem; and (6) control and instrumentation subsystem.

  10. Processing and Integration of Contextual Information in Monkey Ventrolateral Prefrontal Neurons during Selection and Execution of Goal-Directed Manipulative Actions.

    Science.gov (United States)

    Bruni, Stefania; Giorgetti, Valentina; Bonini, Luca; Fogassi, Leonardo

    2015-08-26

    The prefrontal cortex (PFC) is deemed to underlie the complexity, flexibility, and goal-directedness of primates' behavior. Most neurophysiological studies performed so far investigated PFC functions with arm-reaching or oculomotor tasks, thus leaving unclear whether, and to what extent, PFC neurons also play a role in goal-directed manipulative actions, such as those commonly used by primates during most of their daily activities. Here we trained two macaques to perform or withhold grasp-to-eat and grasp-to-place actions, depending on the combination of two subsequently presented cues: an auditory go/no-go cue (high/low tone) and a visually presented target (food/object). By varying the order of presentation of the two cues, we could segment and independently evaluate the processing and integration of contextual information allowing the monkey to decide whether or not to act, and which action to perform. We recorded 403 task-related neurons from the ventrolateral prefrontal cortex (VLPFC): unimodal sensory-driven (37%), motor-related (21%), unimodal sensory-and-motor (23%), and multisensory (19%) neurons. Target and go/no-go selectivity characterized most of the recorded neurons, particularly those endowed with motor-related discharge. Interestingly, multisensory neurons appeared to encode a behavioral decision independently from the sensory modality of the stimulus allowing the monkey to make it: some of them reflected the decision to act or to refrain from acting (56%), whereas others (44%) encoded the decision to perform (or withhold) a specific action (e.g., grasp-to-eat). Our findings indicate that VLPFC neurons play a role in the processing of contextual information underlying motor decisions during goal-directed manipulative actions. We demonstrated that macaque ventrolateral prefrontal cortex (VLPFC) neurons show remarkable selectivity for different aspects of the contextual information allowing the monkey to select and execute goal

  11. Advanced Concept Architecture Design and Integrated Analysis (ACADIA)

    Science.gov (United States)

    2017-11-03

    Advanced Concept Architecture Design and Integrated Analysis (ACADIA). Research report for the period 2016-10-01 to 2016-10-30, submitted to the National Institute of Aerospace (NIA) under award W911NF-16-2-0229; authors include Cedric Justin and Youngjun ...

  12. Integrating fire management analysis into land management planning

    Science.gov (United States)

    Thomas J. Mills

    1983-01-01

    The analysis of alternative fire management programs should be integrated into the land and resource management planning process, but a single fire management analysis model cannot meet all planning needs. Therefore, a set of simulation models that are analytically separate from integrated land management planning models are required. The design of four levels of fire...

  13. Integration of Formal Job Hazard Analysis and ALARA Work Practice

    International Nuclear Information System (INIS)

    NELSEN, D.P.

    2002-01-01

    ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non-nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represent today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygiene with the tools necessary to implement an integrated safety program. The establishment of tools and processes capable of sustaining a comprehensive safety program is a key responsibility of industrial hygiene. Fluor Hanford has built integrated safety management around three programmatic attributes: (1) integration of radiological, chemical and ergonomic issues under a single program; (2) continuous improvement in routine communications among work planning/scheduling, job execution and management; and (3) rapid response to changing work conditions, formalized work planning and integrated worker involvement

  14. Analysis of the Apollo spacecraft operational data management system. Executive summary

    Science.gov (United States)

    1971-01-01

    A study was made of Apollo, Skylab, and several other data management systems to determine those techniques which could be applied to the management of operational data for future manned spacecraft programs. The results of the study are presented and include: (1) an analysis of present data management systems, (2) a list of requirements for future operational data management systems, (3) an evaluation of automated data management techniques, and (4) a plan for data management applicable to future space programs.

  15. Small-wind-systems application analysis. Technical report and executive summary

    Science.gov (United States)

    1981-06-01

    A small wind energy conversion systems (SWECS) analysis was conducted to estimate the potential market for SWECS (wind machines smaller than 100 kW) in five selected applications. The goals were to aid manufacturers in attaining financing by convincing venture-capital investors of the potential of SWECS, and to aid government planners in allocating R and D expenditures that will effectively advance SWECS commercialization. Based on these goals, the study: (1) provides a basis for assisting the DOE in planning R and D programs that will advance the state of the SWECS industry; (2) quantifies estimates of market size vs. installed system cost to enable industry to plan expansion of capacity and product lines; (3) identifies marketing strategies for industry to use in attaining financing from investors and in achieving sales goals; and (4) provides DOE with data that will assist in determining actions, incentives, and/or legislation required to achieve a commercially viable SWECS industry. The five applications were selected through an initial screening and priority-ranking analysis. The year of analysis was 1985, but all dollar amounts, such as fuel costs, are expressed in 1980 dollars. The five SWECS applications investigated were farm residences, non-farm residences, rural electric cooperatives, feed grinders, and remote communities.

  16. Reconstruction of methods of execution of the death penalty by shooting in the years 1949-1954 based on exhumation research of "prison fields" in Osobowicki Cemetery in Wroclaw. Part II--analysis of gunshot injuries and an attempt at reconstructing the course of execution.

    Science.gov (United States)

    Szleszkowski, Łukasz; Thannhäuser, Agata; Kawecki, Jerzy; Szwagrzyk, Krzysztof; Swiatek, Barbara

    2012-01-01

    The analysis of gunshot injuries in prisoners who were executed in Wroclaw penitentiary in the years 1949-1954 shows divergences from legal regulations describing the method of execution. This observation leads to the conclusion that the predominant method of execution of the death penalty was a gunshot or gunshots to the back of the head, which is analogous to the results of exhumation works on collective graves of war prisoners executed during World War II in the territory of the former Soviet Union.

  17. Position statement executive summary: guidelines and recommendations for laboratory analysis in the diagnosis and management of diabetes mellitus.

    Science.gov (United States)

    Sacks, David B; Arnold, Mark; Bakris, George L; Bruns, David E; Horvath, Andrea Rita; Kirkman, M Sue; Lernmark, Ake; Metzger, Boyd E; Nathan, David M

    2011-06-01

    Multiple laboratory tests are used in the diagnosis and management of patients with diabetes mellitus. The quality of the scientific evidence supporting the use of these assays varies substantially. An expert committee compiled evidence-based recommendations for the use of laboratory analysis in patients with diabetes. A new system was developed to grade the overall quality of the evidence and the strength of the recommendations. A draft of the guidelines was posted on the Internet, and the document was modified in response to comments. The guidelines were reviewed by the joint Evidence-Based Laboratory Medicine Committee of the AACC and the National Academy of Clinical Biochemistry and were accepted after revisions by the Professional Practice Committee and subsequent approval by the Executive Committee of the American Diabetes Association. In addition to the long-standing criteria based on measurement of venous plasma glucose, diabetes can be diagnosed by demonstrating increased hemoglobin A1c (HbA1c) concentrations in the blood. Monitoring of glycemic control is performed by patients measuring their own plasma or blood glucose with meters and by laboratory analysis of HbA1c. The potential roles of noninvasive glucose monitoring, genetic testing, and measurement of autoantibodies, urine albumin, insulin, proinsulin, C-peptide, and other analytes are addressed. The guidelines provide specific recommendations based on published data or derived from expert consensus. Several analytes are found to have minimal clinical value at the present time, and measurement of them is not recommended.

  18. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  19. Semantic web for integrated network analysis in biomedicine.

    Science.gov (United States)

    Chen, Huajun; Ding, Li; Wu, Zhaohui; Yu, Tong; Dhanapalan, Lavanya; Chen, Jake Y

    2009-03-01

    The Semantic Web technology enables integration of heterogeneous data on the World Wide Web by making the semantics of data explicit through formal ontologies. In this article, we survey the feasibility and state of the art of utilizing the Semantic Web technology to represent, integrate and analyze the knowledge in various biomedical networks. We introduce a new conceptual framework, semantic graph mining, to enable researchers to integrate graph mining with ontology reasoning in network data analysis. Through four case studies, we demonstrate how semantic graph mining can be applied to the analysis of disease-causal genes, Gene Ontology category cross-talks, drug efficacy analysis and herb-drug interactions analysis.
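
    The flavour of representing and querying such networks with Semantic Web tooling can be shown in a few lines. A sketch using the rdflib Python library with a made-up namespace and triples; real applications would parse published RDF/OWL ontologies instead:

        from rdflib import Graph, Namespace

        EX = Namespace("http://example.org/biomed#")  # hypothetical namespace
        g = Graph()
        g.add((EX.GeneA, EX.causes, EX.Disease1))
        g.add((EX.DrugX, EX.treats, EX.Disease1))
        g.add((EX.HerbY, EX.interactsWith, EX.DrugX))

        # SPARQL: which herbs interact with drugs that treat Disease1?
        q = """
        PREFIX ex: <http://example.org/biomed#>
        SELECT ?herb ?drug WHERE {
          ?drug ex:treats ex:Disease1 .
          ?herb ex:interactsWith ?drug .
        }"""
        for herb, drug in g.query(q):
            print(herb, "interacts with", drug)

    Semantic graph mining, as surveyed above, layers graph analysis and ontology reasoning on top of exactly this kind of triple store.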

  20. Greening the Grid: Pathways to Integrate 175 Gigawatts of Renewable Energy into India’s Electric Grid, Vol. I. National Study. Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    Palchak, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cochran, Jaquelin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Deshmukh, Ranjit [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ehlen, Ali [National Renewable Energy Lab. (NREL), Golden, CO (United States); Soonee, Sushil Kumar [Power System Operation Corporation Limited (POSOCO), New Delhi (India); Narasimhan, S. R. [Power System Operation Corporation Limited (POSOCO), New Delhi (India); Joshi, Mohit [Power System Operation Corporation Limited (POSOCO), New Delhi (India); McBennett, Brendan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Milligan, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sreedharan, Priya [US Agency for International Development (USAID), Washington, DC (United States); Chernyakhovskiy, Ilya [National Renewable Energy Lab. (NREL), Golden, CO (United States); Abhyankar, Nikit [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-06-01

    The use of renewable energy (RE) sources, primarily wind and solar generation, is poised to grow significantly within the Indian power system. The Government of India has established an installed capacity target of 175 gigawatts (GW) RE by 2022 that includes 60 GW of wind and 100 GW of solar, up from current capacities of 29 GW wind and 9 GW solar. India’s contribution to global efforts on climate mitigation extends this ambition to 40% non-fossil-based generation capacity by 2030. Global experience demonstrates that power systems can integrate wind and solar at this scale; however, evidence-based planning is important to achieve wind and solar integration at least cost. The purpose of this analysis is to evaluate the operation of India’s power grid with 175 GW of RE in order to identify potential cost and operational concerns and actions needed to efficiently integrate this level of wind and solar generation.

  1. Integrated risk analysis of global climate change

    International Nuclear Information System (INIS)

    Shlyakhter, Alexander; Wilson, Richard; Valverde A, L.J. Jr.

    1995-01-01

    This paper discusses several factors that should be considered in integrated risk analyses of global climate change. We begin by describing how the problem of global climate change can be subdivided into largely independent parts that can be linked together in an analytically tractable fashion. Uncertainty plays a central role in integrated risk analyses of global climate change. Accordingly, we consider various aspects of uncertainty as they relate to the climate change problem. We also consider the impacts of these uncertainties on various risk management issues, such as sequential decision strategies, value of information, and problems of interregional and intergenerational equity. (author)

  2. Executive summary

    International Nuclear Information System (INIS)

    2002-01-01

    On 18 May 2001, the Finnish Parliament ratified the Decision in Principle on the final disposal facility for spent nuclear fuel at Olkiluoto, within the municipality of Eurajoki. The Municipality Council and the Government had made positive decisions earlier, at the end of 2000, and in compliance with the Nuclear Energy Act, Parliament's ratification was then required. The decision is valid for the spent fuel generated by the existing Finnish nuclear power plants and means that the construction of the final disposal facility is considered to be in line with the overall good of society. Earlier steps included, amongst others, the approval of the technical project by the Safety Authority. Future steps include construction of an underground rock characterisation facility, ONKALO (2003-2004), and application for separate construction and operating licences for the final disposal facility (from about 2010). How did this political and societal decision come about? The FSC Workshop provided the opportunity to present the history leading up to the Decision in Principle (DiP), and to examine future perspectives with an emphasis on stakeholder involvement. This Executive Summary gives an overview of the presentations and discussions that took place at the workshop. It presents, for the most part, a factual account of the individual presentations and discussions, and draws substantially on notes taken at the meeting. Most materials are elaborated upon more fully in the texts that the various speakers and session moderators contributed for these proceedings. The structure of the Executive Summary follows the structure of the workshop itself. Also provided with this document, as a complement to this Summary, is a NEA Secretariat perspective that places the results of the discussions, the feedback, and the site visit in an international context. (authors)

  3. Receiving Basin for Offsite Fuels and the Resin Regeneration Facility Safety Analysis Report, Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    Shedrow, C.B.

    1999-11-29

    The Safety Analysis Report documents the safety authorization basis for the Receiving Basin for Offsite Fuels (RBOF) and the Resin Regeneration Facility (RRF) at the Savannah River Site (SRS). The present mission of the RBOF and RRF is to continue to provide a facility for the safe receipt, storage, handling, and shipping of spent nuclear fuel assemblies from power and research reactors in the United States, fuel from SRS and other Department of Energy (DOE) reactors, and foreign research reactor fuel, in support of the nonproliferation policy. The RBOF and RRF provide the capability to handle, separate, and transfer wastes generated from nuclear fuel element storage. The DOE and Westinghouse Savannah River Company, the prime operating contractor, are committed to managing these activities in such a manner that the health and safety of the offsite general public, the site worker, the facility worker, and the environment are protected.

  4. Receiving Basin for Offsite Fuels and the Resin Regeneration Facility Safety Analysis Report, Executive Summary

    International Nuclear Information System (INIS)

    Shedrow, C.B.

    1999-01-01

    The Safety Analysis Report documents the safety authorization basis for the Receiving Basin for Offsite Fuels (RBOF) and the Resin Regeneration Facility (RRF) at the Savannah River Site (SRS). The present mission of the RBOF and RRF is to continue providing a facility for the safe receipt, storage, handling, and shipping of spent nuclear fuel assemblies from power and research reactors in the United States, fuel from SRS and other Department of Energy (DOE) reactors, and foreign research reactor fuel, in support of the nonproliferation policy. The RBOF and RRF provide the capability to handle, separate, and transfer wastes generated from nuclear fuel element storage. The DOE and Westinghouse Savannah River Company, the prime operating contractor, are committed to managing these activities in such a manner that the health and safety of the offsite general public, the site worker, the facility worker, and the environment are protected.

  5. Trends in Planetary Data Analysis. Executive summary of the Planetary Data Workshop

    Science.gov (United States)

    Evans, N.

    1984-09-01

    Planetary data include non-imaging remote sensing data (spectrometric, radiometric, and polarimetric observations), in-situ measurements, radio/radar data, and Earth-based observations. The development of a planetary data system is also discussed. A catalog to identify observations will be the initial entry point for all levels of users into the data system. There are seven distinct data support services: encyclopedia, data index, data inventory, browse, search, sample, and acquire. Data systems for planetary science users must provide access to data and must process, store, and display data. Two standards will be incorporated into the planetary data system: a standard communications protocol and a standard format data unit. The data system configuration must combine the features of a distributed system with those of a centralized system. Fiscal constraints have made prioritization important. Activities include saving previous mission data, planning/cost analysis, and publishing of proceedings.

  6. Integrated watershed analysis: adapting to changing times

    Science.gov (United States)

    Gordon H. Reeves

    2013-01-01

    Resource managers are increasingly required to conduct integrated analyses of aquatic and terrestrial ecosystems before undertaking any activities. There are a number of research studies on the impacts of management actions on these ecosystems, as well as a growing body of knowledge about ecological processes that affect them, particularly aquatic ecosystems, which...

  7. Relay Feedback Analysis for Double Integral Plants

    Directory of Open Access Journals (Sweden)

    Zhen Ye

    2011-01-01

    Double integral plants under relay feedback are studied. Complete results on the uniqueness of solutions and on the existence and stability of the limit cycles are established using the point transformation method. Analytical expressions are also given for determining the amplitude and period of a limit cycle from the plant parameters.
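
    A quick numerical cross-check of this kind of analytical prediction is easy to sketch. In the idealised case y'' = -h*sign(y), a quarter period from (A, 0) to the first zero crossing takes sqrt(2A/h), so the period is T = 4*sqrt(2A/h). The illustrative Python simulation below (not the paper's point transformation method; all parameter values are placeholders) measures the period directly:

        import numpy as np

        # Simulate y'' = -h*sign(y), an ideal relay around a double integrator,
        # and measure the oscillation period from successive downward zero
        # crossings. Analytically, T = 4*sqrt(2A/h) for amplitude A.
        h, dt = 1.0, 1e-4
        y, v, t = 1.0, 0.0, 0.0        # start on the orbit with amplitude A = 1
        crossings = []

        while len(crossings) < 3:
            u = -h * np.sign(y)
            v += u * dt                # symplectic Euler keeps the orbit closed
            y_new = y + v * dt
            if y > 0 >= y_new:         # downward zero crossing
                crossings.append(t)
            y, t = y_new, t + dt

        period = crossings[2] - crossings[1]
        print(f"measured period {period:.3f} vs analytic {4*np.sqrt(2/h):.3f}")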

  8. The effect of obsessive-compulsive symptomatology on executive functions in schizophrenia: a systematic review and meta-analysis.

    Science.gov (United States)

    Cunill, Ruth; Huerta-Ramos, Elena; Castells, Xavier

    2013-11-30

    The presence of obsessive-compulsive symptoms (OCS) and obsessive-compulsive disorder (OCD) is frequent in patients with schizophrenia and has been associated with greater functional impairment. The impact of these features on cognitive function is unclear. In this article, we performed a systematic review and meta-analysis to assess the effect of OCS/OCD on executive functions in schizophrenia patients. Results indicate that schizophrenia patients with OCS/OCD were more impaired in abstract thinking than schizophrenia patients without OCS/OCD. This finding provides support for the double jeopardy hypothesis and may partially explain the greater functional impairment shown in schizo-obsessive patients compared to those with schizophrenia. Inconsistent results were found for set-shifting, cognitive flexibility, cognitive inhibition and verbal fluency, as indicated by the high statistical heterogeneity found. Potential sources of heterogeneity, such as definition of OCS/OCD, age of onset, severity of negative symptoms and premorbid intelligence, were planned to be explored, but too few studies were available to perform these analyses. Our findings highlight the complexity of the relationship between OCS/OCD and schizophrenia and warrant further investigation of the cognitive function of schizo-obsessive patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. Integration of video and radiation analysis data

    International Nuclear Information System (INIS)

    Menlove, H.O.; Howell, J.A.; Rodriguez, C.A.; Eccleston, G.W.; Beddingfield, D.; Smith, J.E.; Baumgart, C.W.

    1995-01-01

    For the past several years, the integration of containment and surveillance (C/S) with nondestructive assay (NDA) sensors for monitoring the movement of nuclear material has focused on the hardware and communications protocols in the transmission network. Little progress has been made in methods to utilize the combined C/S and NDA data for safeguards and to reduce the inspector time spent in nuclear facilities. One of the fundamental problems in integrating the combined data is that the two methods operate in different dimensions: the C/S video data are spatial in nature, whereas the NDA sensors provide radiation levels as a function of time. The authors have introduced a new method to integrate spatial (digital video) information with time-series (radiation monitoring) information. This technology, based on pattern recognition by neural networks, provides significant capability to analyze complex data and has the ability to learn and adapt to changing situations. This technique has the potential to significantly reduce the frequency of inspection visits to key facilities without a loss of safeguards effectiveness.
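
    As an illustration of the fusion idea (not the authors' implementation), the sketch below concatenates hypothetical per-window spatial features from video with temporal radiation-count features and trains a small neural network to flag anomalous intervals; all names, shapes, and data are placeholders:

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)

        # Hypothetical training data: per time window, a few spatial features
        # from the C/S video (e.g. motion energy per image region) and temporal
        # features from the NDA channel (e.g. mean and peak count rate).
        video_feats = rng.normal(size=(200, 4))   # spatial features
        rad_feats = rng.normal(size=(200, 2))     # time-domain features
        labels = rng.integers(0, 2, size=200)     # 1 = anomalous interval

        # Fuse the two modalities by simple concatenation so one classifier
        # sees the spatial and the time dimension at once.
        X = np.hstack([video_feats, rad_feats])

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(X, labels)
        print(clf.predict(X[:5]))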

  10. Integration of Design and Control through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2002-01-01

    A systematic computer aided analysis of the process model is proposed as a pre-solution step for integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models representing the constitutive equations identifies the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control. Furthermore, the analysis is able to identify a set of process (control) variables, and (structure selection) issues for the integrated problems are considered. (C) 2002 Elsevier Science Ltd. All rights reserved.

  11. Nationwide Buildings Energy Research enabled through an integrated Data Intensive Scientific Workflow and Advanced Analysis Environment

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elsethagen, Todd O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Guillen, Zoe C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dirks, James A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Skorski, Daniel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorton, Ian [Carnegie Mellon Univ., Pittsburgh, PA (United States); Liu, Yan [Concordia Univ., Montreal, QC (Canada)

    2014-01-28

    Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities, the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we describe the ongoing development work to create an integrated data intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is not only a question of ease of use; it supports fundamental functions in the correlated analysis of simulation input, execution details and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data intensive workflow system and RHIPE, the R for Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focuses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local-scale simulations to a nationwide level by utilizing data intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3 year study of building energy demands for the US Eastern

  12. Functional Analysis for an Integrated Capability of Arrival/Departure/Surface Management with Tactical Runway Management

    Science.gov (United States)

    Phojanamongkolkij, Nipa; Okuniek, Nikolai; Lohr, Gary W.; Schaper, Meilin; Christoffels, Lothar; Latorella, Kara A.

    2014-01-01

    develop the ConOps will include: developing scenarios to fully test environmental, procedural, and data availability assumptions; executing the analysis by a walk-through of the integrated system using these scenarios; defining the appropriate role of operators in terms of their monitoring requirements and decision authority; executing the analysis by a walk-through of the integrated system with operator involvement; characterizing the environmental, system data requirements, and operator role assumptions for the ConOps.

  13. The executive leader in the postcrisis era

    Directory of Open Access Journals (Sweden)

    Bazarov, Tahir Y.

    2013-06-01

    This article describes psychological challenges that executive leaders of companies face nowadays. The study of the social context is based on changes that took place with the development of information technologies. The analysis touches upon such phenomena as virtualization, involvement in the external sociocommunicative environment, and the emergence of multiple identity. It is suggested that in order to adapt to changing conditions one should follow the path of self-development, in particular by developing attention, imagination, and willpower. In connection with the traits generally attributed to executive leaders, the article emphasizes self-adjustment; common sense as an integral part of intuition, emotions, and imagination; and the readiness to make choices in fifty-fifty situations.

  14. Analysis of Optimal Operation of an Energy Integrated Distillation Plant

    DEFF Research Database (Denmark)

    Li, Hong Wen; Hansen, C.A.; Gani, Rafiqul

    2003-01-01

    The efficiency of manufacturing systems can be significantly increased through diligent application of control based on mathematical models, thereby enabling tighter integration of decision making with systems operation. In the present paper, analysis of optimal operation of an energy integrated...

  15. Integrated program of using of Probabilistic Safety Analysis in Spain

    International Nuclear Information System (INIS)

    1998-01-01

    Since 25 June 1986, when the CSN (Nuclear Safety Council) approved the Integrated Programme of Probabilistic Safety Analysis, this programme has articulated the main activities of the CSN. This document summarizes the activities developed during these years and reviews the Integrated Programme.

  16. Analysis of integrated video and radiation data

    International Nuclear Information System (INIS)

    Howell, J.A.; Menlove, H.O.; Rodriguez, C.A.; Beddingfield, D.; Vasil, A.

    1995-01-01

    We have developed prototype software for a facility-monitoring application that will detect anomalous activity in a nuclear facility. The software, which forms the basis of a simple model, automatically reviews and analyzes integrated safeguards data from continuous unattended monitoring systems. This technology, based on pattern recognition by neural networks, provides significant capability to analyze complex data and has the ability to learn and adapt to changing situations. It is well suited for large automated facilities, reactors, spent-fuel storage facilities, reprocessing plants, and nuclear material storage vaults.

  17. Signal integrity analysis on discontinuous microstrip line

    International Nuclear Information System (INIS)

    Qiao, Qingyang; Dai, Yawen; Chen, Zipeng

    2013-01-01

    In high-speed PCB design, microstrip lines are used to control impedance; however, discontinuous microstrip lines can cause signal integrity problems. In this paper, we use transmission line theory to study the characteristics of microstrip lines. The results indicate that discontinuities such as truncations, gaps and size changes result in problems such as radiation, reflection, delay and ground bounce. We model the discontinuities as distributed-parameter circuits and analyse the steady-state response, the transient response and the phase delay. The transient response causes radiation and voltage jumps.
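
    The reflection caused by such an impedance step follows the standard transmission-line relation gamma = (Z2 - Z1)/(Z2 + Z1). A minimal Python sketch with illustrative impedance values:

        # Reflection at an impedance step on a transmission line: the standard
        # formula gamma = (Z2 - Z1) / (Z2 + Z1). Impedance values are illustrative.
        def reflection_coefficient(z1_ohm: float, z2_ohm: float) -> float:
            return (z2_ohm - z1_ohm) / (z2_ohm + z1_ohm)

        # 50-ohm trace stepping to a narrower, higher-impedance section.
        gamma = reflection_coefficient(50.0, 65.0)
        print(f"reflected fraction of voltage: {gamma:.3f}")
        print(f"fraction of incident power reflected: {gamma**2:.4f}")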

  18. Advantages of Integrative Data Analysis for Developmental Research

    Science.gov (United States)

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  19. Integrated care: a comprehensive bibliometric analysis and literature review

    Directory of Open Access Journals (Sweden)

    Xiaowei Sun

    2014-06-01

    Introduction: Integrated care could not only fix fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how 'integrated care' has developed. There is a need for a systematic bibliometric analysis of the important features of the integrated care literature. Aim: To investigate the growth pattern, core journals and jurisdictions, and to identify the key research domains of integrated care. Methods: We searched Medline/PubMed using the search strategy '(delivery of health care, integrated [MeSH Terms]) OR integrated care [Title/Abstract]' without time and language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis with the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering in SPSS were used. Results: As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) the documents were published in 1646 journals, of which 28 are core journals; (3) the USA is the predominant publishing country; and (4) there are six key domains: the definition/models of integrated care, interdisciplinary patient care teams, disease management for chronically ill patients, types of health care organizations and policy, information system integration, and legislation/jurisprudence. Discussion and conclusion: Integrated care literature has been most evident in developed countries. The International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provide researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care.
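
    The clustering step described here can be sketched generically: build a keyword co-occurrence matrix, convert it to a dissimilarity, and cluster hierarchically. The sketch below uses SciPy rather than the authors' SPSS workflow, and the keywords and counts are invented for illustration:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        # Hypothetical keyword co-occurrence counts (symmetric, zero diagonal).
        keywords = ["integrated care", "care team", "disease management", "policy"]
        cooc = np.array([[0, 30, 12, 5],
                         [30, 0, 18, 2],
                         [12, 18, 0, 1],
                         [5, 2, 1, 0]], dtype=float)

        # Turn co-occurrence into a dissimilarity: frequent pairs end up close.
        dist = 1.0 / (1.0 + cooc)
        np.fill_diagonal(dist, 0.0)

        z = linkage(squareform(dist), method="average")
        clusters = fcluster(z, t=2, criterion="maxclust")
        for kw, c in zip(keywords, clusters):
            print(kw, "-> cluster", c)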

  20. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  1. Analysis of Price Variation and Market Integration of Prosopis ...

    African Journals Online (AJOL)

    Analysis of Price Variation and Market Integration of Prosopis Africana (guill. ... select five markets based on the presence of traders selling the commodity in the markets ... T-test results showed that Prosopis africana seed trade is profitable and ...

  2. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi; Huang, Jianhua Z.; Hu, Jianhua

    2015-01-01

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior
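
    A minimal sketch of the general idea, assuming synthetic log-normal data and a simple log transform (the paper's own approach integrates the transformation into PCA itself and is not reproduced here):

        import numpy as np

        rng = np.random.default_rng(1)

        # Skewed (log-normal) data: a log transform before PCA often makes the
        # principal components far more informative. Data here are synthetic.
        X = rng.lognormal(mean=0.0, sigma=1.0, size=(100, 5))
        X_t = np.log(X)                  # variance-stabilising transform
        X_c = X_t - X_t.mean(axis=0)     # centre columns

        # PCA via singular value decomposition.
        U, s, Vt = np.linalg.svd(X_c, full_matrices=False)
        explained = s**2 / np.sum(s**2)
        scores = X_c @ Vt.T              # projections onto the components
        print("variance explained per component:", np.round(explained, 3))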

  3. Effect of education on listening comprehension of sentences on healthy elderly: analysis of number of correct responses and task execution time.

    Science.gov (United States)

    Silagi, Marcela Lima; Rabelo, Camila Maia; Schochat, Eliane; Mansur, Letícia Lessa

    2017-11-13

    To analyze the effect of education on sentence listening comprehension in cognitively healthy elderly adults. A total of 111 healthy elderly adults aged 60-80 years, of both genders, were divided into two groups according to educational level: low education (0-8 years of formal education) and high education (≥9 years of formal education). The participants were assessed using the Revised Token Test, an instrument that supports the evaluation of auditory comprehension of commands with different working memory and syntactic complexity demands. The indicators used for performance analysis were the number of correct responses (accuracy analysis) and task execution time (temporal analysis) in the different blocks. The low-education group had a lower number of correct responses than the high-education group on all blocks of the test. In the temporal analysis, participants with low education had longer execution times for commands in the first four blocks, which are related to working memory. However, the two groups had similar execution times for the blocks more related to syntactic comprehension. Education influenced sentence listening comprehension in the elderly. Temporal analysis allowed us to draw inferences about the relationship between comprehension and other cognitive abilities, and to observe that the low-education elderly did not use effective compensation strategies to improve their performance on the task. Therefore, a low educational level, associated with aging, may increase the risk of language decline.

  4. Fluidic Logic Used in a Systems Approach to Enable Integrated Single-cell Functional Analysis

    Directory of Open Access Journals (Sweden)

    Naveen Ramalingam

    2016-09-01

    The study of single cells has evolved over the past several years to include expression and genomic analysis of an increasing number of single cells. Several studies have demonstrated widespread variation and heterogeneity within cell populations of similar phenotype. While the characterization of these populations will likely set the foundation for our understanding of genomic- and expression-based diversity, it will not be able to link the functional differences of a single cell to its underlying genomic structure and activity. Currently, it is difficult to perturb single cells in a controlled environment, monitor and measure the response due to perturbation, and link these response measurements to downstream genomic and transcriptomic analysis. In order to address this challenge, we developed a platform to integrate and miniaturize many of the experimental steps required to study single-cell function. The heart of this platform is an elastomer-based Integrated Fluidic Circuit (IFC) that uses fluidic logic to select and sequester specific single cells based on a phenotypic trait for downstream experimentation. Experiments with sequestered cells that have been performed include on-chip culture, exposure to a variety of stimulants, and post-exposure image-based response analysis, followed by preparation of the mRNA transcriptome for massively parallel sequencing analysis. The flexible system embodies experimental design and execution that enable routine functional studies of single cells.

  5. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

    A cost-benefit analysis of plant-wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost-benefit analysis is then a planning tool that helps the utility to develop a focused long-term implementation strategy that will yield significant near-term benefits. This paper presents a flexible cost-benefit analysis methodology that is both simple to use and yields accurate, verifiable results. Included in this paper are a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure when applied to a utility. A case study is presented involving a specific utility where this procedure was applied; the utility's uses of the cost-benefit analysis are also described.
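
    The core arithmetic of such an analysis is a discounted comparison of integration costs against recurring savings. A minimal sketch with placeholder cash flows (not the paper's parameter list):

        # Minimal discounted cost-benefit sketch for a data-integration project.
        # Cash flows and discount rate are illustrative placeholders.
        def npv(rate: float, cashflows: list[float]) -> float:
            """Net present value; cashflows[0] occurs now, cashflows[t] in year t."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        upfront_cost = -2_500_000.0        # integration and automation spend
        annual_saving = 600_000.0          # staffing, outage and O&M savings
        flows = [upfront_cost] + [annual_saving] * 10

        print(f"NPV at 8% over 10 years: ${npv(0.08, flows):,.0f}")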

  6. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm that it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is required. This paper presents the Data Extraction, Recording and Analysis Tool (DERAT) for analyzing the integrated performance of the combat system, covering its functional definition and architecture and demonstrating its effectiveness through test results.

  7. Empirical Analysis of the Integration Activity of Business Structures in the Regions of Russia

    Directory of Open Access Journals (Sweden)

    Maria Gennadyevna Karelina

    2015-12-01

    The article investigates the integration activity of business structures in the regions of Russia. The wide variety of approaches to the study of the problems and prospects of economic integration, and the current dispute on the role of integration processes in regional economic development, have made it necessary to define the concepts of "integration" and "integration activity" so that the integration activity of business structures in the Russian regions can be analysed objectively. A review of the current legal system of the Russian Federation in the area of statistics and statistical databases on mergers and acquisitions showed that no formal executive authority deals with compiling and collecting information on integration activity at the regional level; the data of Russian information and analytical agencies therefore form the information and analytical base. As research tools, methods of analysis of structural changes, methods of analysis of economic differentiation and concentration, and methods of non-parametric statistics are used. The article shows the close relationship between the social and economic development of the subjects of Russia and the integrated business structures functioning on their territory. An investigation of the structure and dynamics of integration activity in the subjects of the Russian Federation, based on statistical data for the period from 2003 to 2012, revealed increasing heterogeneity in the integration activity of business structures in the regions of Russia. The hypothesis of a substantial divergence of mergers and acquisitions of corporate structures in the Russian regions was confirmed by the high values of the Gini coefficient, the Herfindahl index, and the decile coefficient of differentiation. The research results are of practical importance since they can be used to improve the existing
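
    The concentration measures named above are straightforward to compute. A minimal sketch with invented regional deal values, where the decile measure is implemented as a P90/P10 ratio (one common convention; the authors' exact definition may differ):

        import numpy as np

        def gini(x):
            """Gini coefficient of a non-negative sample (0 = equal, 1 = concentrated)."""
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            ranks = np.arange(1, n + 1)
            return (2 * np.sum(ranks * x) / (n * np.sum(x))) - (n + 1) / n

        def herfindahl(x):
            """Herfindahl index: sum of squared shares."""
            shares = np.asarray(x, dtype=float) / np.sum(x)
            return float(np.sum(shares**2))

        def decile_ratio(x):
            """Ratio of the 90th to the 10th percentile."""
            return float(np.percentile(x, 90) / np.percentile(x, 10))

        # Hypothetical M&A deal values by region, in arbitrary units.
        deals = np.array([120.0, 80.0, 35.0, 20.0, 12.0, 8.0, 5.0, 3.0, 2.0, 1.0])
        print(f"Gini = {gini(deals):.3f}, HHI = {herfindahl(deals):.3f}, "
              f"P90/P10 = {decile_ratio(deals):.1f}")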

  8. Integrated systems analysis of the PIUS reactor

    Energy Technology Data Exchange (ETDEWEB)

    Fullwood, F.; Kroeger, P.; Higgins, J. [Brookhaven National Lab., Upton, NY (United States)] [and others]

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP) to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions.

  9. Integrated systems analysis of the PIUS reactor

    International Nuclear Information System (INIS)

    Fullwood, F.; Kroeger, P.; Higgins, J.

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP), to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions.

  10. Integrative data analysis of male reproductive disorders

    DEFF Research Database (Denmark)

    Edsgard, Stefan Daniel

    The aim of this thesis is the identification of the molecular basis of male reproductive disorders, with a special focus on testicular cancer. To this end, clinical samples were characterized by microarray-based transcription and genomic variation assays, and molecular entities were identified by computational analysis of such data in conjunction with data from publicly available repositories. This thesis presents an introduction to disease genetics and molecular systems biology, followed by four studies that each provide detailed clues to the etiology of male reproductive disorders. Finally, a fifth study illustrates ... genome-wide association data with respect to copy number variation and shows that the aggregated effect of rare variants can influence the risk for testicular cancer. Paper V provides an example of the application of RNA-Seq for expression analysis of a species with an unsequenced genome. We analysed the plant ...

  11. Occupational exposure to ionising radiation 1990-1996. Analysis of doses reported to the Health and Safety Executive's Central Index of Dose Information

    International Nuclear Information System (INIS)

    1998-01-01

    The Central Index of Dose Information (CIDI) is the Health and Safety Executive's (HSE's) national database of occupational exposure to ionising radiation. It is operated under contract by the National Radiological Protection Board (NRPB). CIDI receives annually, from Approved Dosimetry Services (ADS), summaries of the radiation doses recorded for employees designated as classified persons in the United Kingdom. This is the second analysis of dose summary information to be published. (author)

  12. An integrated platform for biomolecule interaction analysis

    Science.gov (United States)

    Jan, Chia-Ming; Tsai, Pei-I.; Chou, Shin-Ting; Lee, Shu-Sheng; Lee, Chih-Kung

    2013-02-01

    We developed a new metrology platform which can detect real-time changes in both a phase-interrogation mode and an intensity mode of SPR (surface plasmon resonance). We integrated an SPR sensor and an ellipsometer into a biosensor chip platform to create a new biomolecular interaction measurement mechanism. We adopted a conductive ITO (indium tin oxide) film on the biosensor platform chip to expand the dynamic range and improve measurement accuracy. The thickness of the conductive film and suitable voltage constants were found to enhance performance. A circularly polarized ellipsometry configuration was incorporated into the newly developed platform to measure the label-free interactions of recombinant human C-reactive protein (CRP) with an immobilized biomolecule target, monoclonal human CRP antibody, at various concentrations. CRP was chosen as it is a cardiovascular risk biomarker and an acute phase reactant, as well as a specific prognostic indicator for inflammation. We found that the sensitivity of phase-interrogation SPR is predominantly dependent on the optimization of the sample incidence angle. The effect of the ITO layer effective index under DC and AC drive, as well as an optimal modulation, were experimentally determined and discussed. Our experimental results showed that the modulated dynamic range for phase detection was 10E-2 RIU based on the current effect and 10E-4 RIU based on the potential effect, and a sensitivity of 0.55 °/RIU was found by angular interrogation. The newly developed metrology platform was characterized as having higher sensitivity but a smaller dynamic range when compared to a traditional full-field measurement system.

  13. An analysis of 3D particle path integration algorithms

    International Nuclear Information System (INIS)

    Darmofal, D.L.; Haimes, R.

    1996-01-01

    Several techniques for the numerical integration of particle paths in steady and unsteady vector (velocity) fields are analyzed. Most of the analysis applies to unsteady vector fields, however, some results apply to steady vector field integration. Multistep, multistage, and some hybrid schemes are considered. It is shown that due to initialization errors, many unsteady particle path integration schemes are limited to third-order accuracy in time. Multistage schemes require at least three times more internal data storage than multistep schemes of equal order. However, for timesteps within the stability bounds, multistage schemes are generally more accurate. A linearized analysis shows that the stability of these integration algorithms are determined by the eigenvalues of the local velocity tensor. Thus, the accuracy and stability of the methods are interpreted with concepts typically used in critical point theory. This paper shows how integration schemes can lead to erroneous classification of critical points when the timestep is finite and fixed. For steady velocity fields, we demonstrate that timesteps outside of the relative stability region can lead to similar integration errors. From this analysis, guidelines for accurate timestep sizing are suggested for both steady and unsteady flows. In particular, using simulation data for the unsteady flow around a tapered cylinder, we show that accurate particle path integration requires timesteps which are at most on the order of the physical timescale of the flow
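
    The flavour of such comparisons can be reproduced with a toy experiment: integrate a particle path in a steady solid-body rotation field, where the exact path is a circle, and compare a first-order and a fourth-order scheme. A minimal sketch (illustrative, not the paper's test cases):

        import numpy as np

        def velocity(t, p):
            """Steady solid-body rotation: particles should stay on circles."""
            x, y = p
            return np.array([-y, x])

        def euler_step(t, p, dt):
            return p + dt * velocity(t, p)

        def rk4_step(t, p, dt):
            k1 = velocity(t, p)
            k2 = velocity(t + dt / 2, p + dt / 2 * k1)
            k3 = velocity(t + dt / 2, p + dt / 2 * k2)
            k4 = velocity(t + dt, p + dt * k3)
            return p + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

        # Integrate one revolution and compare radius drift (exact radius is 1);
        # the timestep is a modest fraction of the physical timescale (2*pi).
        for step, name in [(euler_step, "Euler"), (rk4_step, "RK4 ")]:
            p, t, dt = np.array([1.0, 0.0]), 0.0, 0.05
            while t < 2 * np.pi:
                p, t = step(t, p, dt), t + dt
            print(f"{name}: radius error = {abs(np.hypot(*p) - 1.0):.2e}")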

  14. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. The approach is qualitative and is used in combination with other PHA methods. The method has the advantage that it does not treat operator error as the sole contributor to human failure within a system, but considers a combination of all underlying factors.

  15. Integrative Genomic Analysis of Complex traits

    DEFF Research Database (Denmark)

    Ehsani, Ali Reza

    In the last decade, rapid development in biotechnologies has made it possible to extract extensive information about practically all levels of biological organization. An ever-increasing number of studies are reporting multilayered datasets on the entire DNA sequence, transcription, protein expression, and metabolite abundance of more and more populations in a multitude of environments. However, a solid model for including all of this complex information in one analysis, to disentangle genetic variation and the underlying genetic architecture of complex traits and diseases, has not yet been...

  16. Stateless and stateful implementations of faithful execution

    Science.gov (United States)

    Pierson, Lyndon G; Witzke, Edward L; Tarman, Thomas D; Robertson, Perry J; Eldridge, John M; Campbell, Philip L

    2014-12-16

    A faithful execution system includes system memory, a target processor, and protection engine. The system memory stores a ciphertext including value fields and integrity fields. The value fields each include an encrypted executable instruction and the integrity fields each include an encrypted integrity value for determining whether a corresponding one of the value fields has been modified. The target processor executes plaintext instructions decoded from the ciphertext while the protection engine is coupled between the system memory and the target processor. The protection engine includes logic to retrieve the ciphertext from the system memory, decrypt the value fields into the plaintext instructions, perform an integrity check based on the integrity fields to determine whether any of the corresponding value fields have been modified, and provide the plaintext instructions to the target processor for execution.
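
    A toy software sketch of the fetch-decrypt-verify flow described above, using an XOR keystream and an HMAC tag as stand-ins for the patent's cipher and integrity value (all keys, field layouts, and sizes are illustrative):

        import hashlib
        import hmac

        KEY = b"illustrative-key"   # stand-in for the protection engine's secret

        def keystream(counter: int, n: int) -> bytes:
            # Toy counter-mode keystream from SHA-256; fields here are <= 32 bytes.
            return hashlib.sha256(KEY + counter.to_bytes(8, "big")).digest()[:n]

        def protect(instr: bytes, counter: int):
            # Build one (value field, integrity field) pair.
            ct = bytes(a ^ b for a, b in zip(instr, keystream(counter, len(instr))))
            tag = hmac.new(KEY, ct + counter.to_bytes(8, "big"), hashlib.sha256).digest()
            return ct, tag

        def fetch_decrypt_verify(ct: bytes, tag: bytes, counter: int) -> bytes:
            # Protection-engine side: check the integrity field, then decrypt.
            expect = hmac.new(KEY, ct + counter.to_bytes(8, "big"), hashlib.sha256).digest()
            if not hmac.compare_digest(tag, expect):
                raise RuntimeError("integrity check failed: value field was modified")
            return bytes(a ^ b for a, b in zip(ct, keystream(counter, len(ct))))

        ct, tag = protect(b"ADD R1,R2", counter=0)
        print(fetch_decrypt_verify(ct, tag, counter=0))  # plaintext goes to the CPU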

  17. Major Depressive Disorder Is Associated with Broad Impairments on Neuropsychological Measures of Executive Function: A Meta-Analysis and Review

    Science.gov (United States)

    Snyder, Hannah R.

    2013-01-01

    Cognitive impairments are now widely acknowledged as an important aspect of major depressive disorder (MDD), and it has been proposed that executive function (EF) may be particularly impaired in patients with MDD. However, the existence and nature of EF impairments associated with depression remain strongly debated. Although many studies have…

  18. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules, covering silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative input for complex decision-making in the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance for industry; and demonstrations of how to evaluate and understand the worth of research and development, both to JPL and to other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profit from its present-day sales of photovoltaic equipment.

  19. Integrative analysis of metabolomics and transcriptomics data

    DEFF Research Database (Denmark)

    Brink-Jensen, Kasper; Bak, Søren; Jørgensen, Kirsten

    2013-01-01

    The abundance of high-dimensional measurements in the form of gene expression and mass spectroscopy calls for models to elucidate the underlying biological system. For widely studied organisms like yeast, it is possible to incorporate prior knowledge from a variety of databases, an approach used ... (LC-MS) measurements from the same samples, to identify genes controlling the production of metabolites. Due to the high dimensionality of both LC-MS and DNA microarray data, dimension reduction and variable selection are key elements of the analysis. Our proposed approach starts by identifying the basis functions ("building blocks") that constitute the output from a mass spectrometry experiment. Subsequently, the weights of these basis functions are related to the observations from the corresponding gene expression data in order to identify which genes are associated with specific patterns seen in the metabolite data ...

  20. Vertically Integrated Seismological Analysis II : Inference

    Science.gov (United States)

    Arora, N. S.; Russell, S.; Sudderth, E.

    2009-12-01

    Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with an acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′) / (π(x)q(x′ | x))). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for
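
    The M-H acceptance rule itself is easy to demonstrate in isolation. The sketch below samples a standard normal with a symmetric random-walk proposal, in which case q cancels in the acceptance ratio; it illustrates the generic algorithm only, not the seismic world model or its moves:

        import numpy as np

        rng = np.random.default_rng(42)

        def log_pi(x):
            """Unnormalised log target: a standard normal, for illustration."""
            return -0.5 * x**2

        def metropolis_hastings(n_samples, step=1.0, x0=0.0):
            """Random-walk M-H: symmetric proposal, so q cancels in the ratio."""
            x, out = x0, []
            for _ in range(n_samples):
                x_prop = x + step * rng.normal()
                # alpha = min(1, pi(x')q(x|x') / (pi(x)q(x'|x))); q is symmetric here.
                if np.log(rng.uniform()) < log_pi(x_prop) - log_pi(x):
                    x = x_prop
                out.append(x)
            return np.array(out)

        samples = metropolis_hastings(20_000)
        print(f"mean ~ {samples.mean():.3f}, var ~ {samples.var():.3f}")  # ~0, ~1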

  1. Development of safety analysis technology for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Suk K.; Song, J. H.; Chung, Y. J. and others

    1999-03-01

    Inherent safety features and safety system characteristics of the SMART integral reactor are investigated in this study. The performance and safety of the SMART conceptual design have been evaluated and confirmed through performance and safety analyses using safety analysis system codes as well as a preliminary performance and safety analysis methodology. SMART design basis events and their acceptance criteria are identified in order to develop a preliminary PIRT for the SMART integral reactor. Using the preliminary PIRT, a set of experimental programs for thermal-hydraulic separate effect tests and integral effect tests was developed for thermal-hydraulic model development and system code validation. The safety characteristics as well as the safety issues of the integral reactor have been identified during the study, and will be used to resolve the safety issues and guide the regulatory criteria for the integral reactor. The results of the performance and safety analyses performed during the study were fed back into the SMART conceptual design. The performance and safety analysis code systems as well as the preliminary safety analysis methodology developed in this study will be validated as the SMART design evolves. The performance and safety analysis technology developed during the study will be utilized for the SMART basic design development. (author)

  2. Overcoming barriers to integrating economic analysis into risk assessment.

    Science.gov (United States)

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decision makers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, social, and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if economics and other social sciences were better integrated into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome. © 2011 Society for Risk Analysis.

  3. A network analysis of leadership theory : the infancy of integration.

    OpenAIRE

    Meuser, J. D.; Gardner, W. L.; Dinh, J. E.; Hu, J.; Liden, R. C.; Lord, R. G.

    2016-01-01

    We investigated the status of leadership theory integration by reviewing 14 years of published research (2000 through 2013) in 10 top journals (864 articles). The authors of these articles examined 49 leadership approaches/theories, and in 293 articles, 3 or more of these leadership approaches were included in their investigations. Focusing on these articles that reflected relatively extensive integration, we applied an inductive approach and used graphic network analysis as a guide for drawi...

  4. Integrated analysis of oxide nuclear fuel sintering

    International Nuclear Information System (INIS)

    Baranov, V.; Kuzmin, R.; Tenishev, A.; Timoshin, I.; Khlunov, A.; Ivanov, A.; Petrov, I.

    2011-01-01

    Dilatometric and thermal-gravimetric investigations have been carried out for the sintering process of oxide nuclear fuel in a gaseous Ar - 8% H2 atmosphere at temperatures up to 1600 °C. The pressed compacts were fabricated under the real production conditions of OAO MSZ using two different technologies, the so-called 'dry' and 'wet' technologies. Grain size growth after heating to different temperatures was observed. In order to investigate the effect of heating rate on the properties of sintered fuel pellets, the heating rates were varied from 1 to 8 °C per minute. The time of isothermal hold at the maximum temperature (1600 °C) was about 8 hours. Real production conditions were imitated. The results showed that the sintering processes of the fuel pellets produced by the two technologies differ. The samples sintered at different heating rates were studied by scanning electron microscopy to determine the mean grain size. A simulation of the heating profile for industrial furnaces was performed to reduce the beam cycles and to estimate the effects of varying the isothermal hold temperatures. Based on these data, the sintering conditions were optimized for the production operations of OAO MSZ. (authors)

  5. Executive summary

    International Nuclear Information System (INIS)

    Besmann, T.M.

    2015-01-01

    The State-of-the-Art Report on Multi-scale Modelling of Nuclear Fuels describes the state of fundamental materials models that can represent fuel behaviour, the methodologies for obtaining material properties, and modelling principles as they can be implemented in fuel performance codes. This report, while far from being a detailed assessment of nuclear fuel modelling, provides an overview of the approaches to the significant aspects of fuel modelling and examples of their application. Fuel behaviour phenomena are discussed that are applicable across the spectrum of fuel forms, from conventional LWR oxide pellets to MOX, carbide, and metal SFR fuel, to coated particle fuel for gas-cooled reactors. A key issue is microstructural evolution during burn-up, and the state of understanding of that phenomenon is considered at length. Covered in the discussions are the basic material properties of heat capacity, free energy, thermal conductivity and diffusion. Also included are the more functional effects of restructuring, bubble formation, constituent redistribution, fuel and clad oxidation, and fuel clad and environmental interactions. Fuel fabrication is considered, as are many material modelling challenges, such as representing the injection casting of metallic fuels. The last set of contributions covers the basic principles for modelling phenomena and determining fundamental materials properties, the state of fuel performance codes, and a final note about integrating across multiple scales. The state of the art of modelling phenomena related to nuclear fuel has advanced significantly in recent years. The representation of atomic level behaviour is becoming increasingly accurate as capabilities to utilise larger sets of atoms evolve and empirical potentials improve. At the mesoscale, models for transport and microstructure evolution have also progressed with new techniques that represent restructuring well. At

  6. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  7. Momentum integral network method for thermal-hydraulic transient analysis

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.

    1983-01-01

    A new momentum integral network method has been developed and tested in the MINET computer code. The method was developed in order to facilitate the transient analysis of complex fluid flow and heat transfer networks, such as those found in the balance of plant of power generating facilities. The method employed in the MINET code is a major extension of a momentum integral method reported by Meyer. Meyer integrated the momentum equation over several linked nodes, called a segment, and used a segment average pressure, evaluated from the pressures at both ends. Nodal mass and energy conservation determine the nodal flows and enthalpies, accounting for fluid compression and thermal expansion.
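
    The segment-level momentum update at the heart of such a method can be sketched for a single segment with fixed end pressures; a full network solver would couple many such segments through nodal mass and energy balances. All geometry and loss values below are illustrative:

        # One-segment sketch of a momentum-integral update, assuming illustrative
        # geometry: the segment mass flow W responds to the end pressures, while
        # nodal mass balance would set the pressures in a full network solver.
        A, L = 0.01, 10.0          # flow area (m^2), segment length (m)
        rho, K = 1000.0, 5.0e4     # density (kg/m^3), lumped loss coefficient

        def dW_dt(W, p_in, p_out):
            """Momentum integrated over the segment: (A/L) * (dP - friction loss)."""
            friction = K * W * abs(W) / (2.0 * rho * A**2)
            return (A / L) * (p_in - p_out - friction)

        W, dt = 0.0, 0.002
        for _ in range(5000):                  # march to steady state
            W += dt * dW_dt(W, p_in=2.0e5, p_out=1.0e5)
        print(f"steady segment mass flow: {W:.3f} kg/s")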

  8. Study on integrated design and analysis platform of NPP

    International Nuclear Information System (INIS)

    Lu Dongsen; Gao Zuying; Zhou Zhiwei

    2001-01-01

    Many software tools have been developed for nuclear system design and safety analysis, such as structural design software, fuel design and management software, thermal-hydraulic analysis software, and severe accident simulation software. This study integrates these tools into a single platform and develops visual modeling tools for Retran and NGFM90. The platform also provides a distributed calculation method for coupled calculations between different tools. The study will improve the design and analysis of NPPs.

  9. Executive summary

    International Nuclear Information System (INIS)

    Christy, Robert F.; Tajima, Eizo

    1987-01-01

    The Historical Review describes the events leading up to the present reassessment of the dosimetry of the atomic bombs at Hiroshima and Nagasaki. To make that reassessment, working groups were set up in Japan and in the United States. These groups organized their efforts into ten major areas: yields of the bombs, radiation leakage from the bombs, transport of radiation in air over ground, thermoluminescence measurements of gamma rays, measurements of neutrons, residual radioactivity, house and terrain shielding, organ dosimetry, preparation of a dosimetry system, and uncertainty analysis. In this report on the reassessment, one chapter is devoted to each of the first nine areas; a future report will deal with the last area, uncertainty analysis. The chapters were prepared by writing groups, listed as the authors of the chapters. The chapters are based on a large number of individual papers, some of which are included in this report as appendixes to the relevant chapters.

  10. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an overall framework for estimating the failure probability of pipelines based on: the results of a deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects, and failure data (incorporated into the model via the Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability combines several different analyses, allowing the authors to obtain: the critical crack length and depth, the failure probability as a function of defected zone thickness, and the dependence of the failure probability on the age of the natural gas transmission pipeline. Model uncertainty and uncertainty propagation analyses are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of a pipe with a crack. • Stress evaluation of a pipe with a critical crack. • Deterministic-probabilistic structural integrity analysis of a gas pipeline. • Integrated estimation of pipeline failure probability by the Bayesian method.
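
    The Bayesian step of combining an analysis-based prior with observed failure data can be illustrated with a conjugate Beta-Binomial sketch (a simplification of the paper's method; all numbers are invented):

        import numpy as np

        # The structural-integrity analysis supplies a prior belief about the
        # per-pipe-km annual failure probability; observed failure data then
        # update it via Bayes' rule. All values are illustrative.
        prior_alpha, prior_beta = 1.0, 999.0      # prior mean = 1e-3

        failures_observed = 2                     # hypothetical operating record
        exposure = 1500                           # pipe-km-years observed

        post_alpha = prior_alpha + failures_observed
        post_beta = prior_beta + exposure - failures_observed

        post_mean = post_alpha / (post_alpha + post_beta)
        print(f"posterior mean failure probability: {post_mean:.2e}")

        # Credible interval from the Beta posterior via Monte Carlo sampling.
        samples = np.random.default_rng(0).beta(post_alpha, post_beta, size=100_000)
        print("95% credible interval:", np.percentile(samples, [2.5, 97.5]))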

  11. Direct integration multiple collision integral transport analysis method for high energy fusion neutronics

    International Nuclear Information System (INIS)

    Koch, K.R.

    1985-01-01

    A new analysis method specially suited for the inherent difficulties of fusion neutronics was developed to provide detailed studies of the fusion neutron transport physics. These studies should provide a better understanding of the limitations and accuracies of typical fusion neutronics calculations. The new analysis method is based on the direct integration of the integral form of the neutron transport equation and employs a continuous energy formulation with the exact treatment of the energy angle kinematics of the scattering process. In addition, the overall solution is analyzed in terms of uncollided, once-collided, and multi-collided solution components based on a multiple collision treatment. Furthermore, the numerical evaluations of integrals use quadrature schemes that are based on the actual dependencies exhibited in the integrands. The new DITRAN computer code was developed on the Cyber 205 vector supercomputer to implement this direct integration multiple-collision fusion neutronics analysis. Three representative fusion reactor models were devised and the solutions to these problems were studied to provide suitable choices for the numerical quadrature orders as well as the discretized solution grid and to understand the limitations of the new analysis method. As further verification and as a first step in assessing the accuracy of existing fusion-neutronics calculations, solutions obtained using the new analysis method were compared to typical multigroup discrete ordinates calculations

  12. IMG: the integrated microbial genomes database and comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2012-01-01

    The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640

  13. Executive Summary

    International Nuclear Information System (INIS)

    2012-01-01

    The OECD Nuclear Energy Agency (NEA) Integration Group for the Safety Case (IGSC) organised a workshop to assess current understanding of the use of cementitious materials in radioactive waste disposal. The workshop was hosted by the Belgian Agency for Radioactive Waste and Enriched Fissile Materials (Ondraf/Niras) in Brussels, Belgium on 17-19 November 2009. It brought together a wide range of people involved in supporting safety case development and having an interest in cementitious materials: namely, cement and concrete experts, repository designers, scientists, safety assessors, disposal programme managers and regulators. The workshop was designed primarily to consider issues relevant to the post-closure safety of radioactive waste disposal, but also addressed some related operational issues, such as cementitious barrier emplacement. Where relevant, information on cementitious materials from analogous natural and anthropogenic systems was also considered. The workshop agenda is included as Appendix A. The workshop focused on:
    - The uses of different cementitious materials in various repository designs.
    - The evolution of cementitious materials over long time scales (1000s to 100000s of years).
    - The interaction of cementitious materials with surrounding components of the repository (e.g. waste, container, buffer, backfill, host rock).
    The workshop comprised:
    - Plenary sessions in which the state of the art on repository design and understanding of the phenomenology of cementitious materials and their interactions was presented and discussed.
    - Dedicated working group sessions, which were used to discuss key safety assessment and safety case questions in more detail. For example: How strong is the scientific basis for incorporating the various aspects of the behaviour and interactions of cementitious materials in safety assessments and safety cases? How can the behaviour and interactions of cementitious materials best be incorporated within the

  14. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    Science.gov (United States)

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) continue to be discovered, and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two ways, using Soaplab2 and Apache Axis2; SOAP and Java Web Service (JWS) WSDL endpoints are provided to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  15. A simple hypothesis of executive function

    Directory of Open Access Journals (Sweden)

    Bruno eKopp

    2012-06-01

    Full Text Available Executive function is traditionally conceptualized as a set of abilities required to guide behavior toward goals. Here, an integrated theoretical framework for executive function is developed which has its roots in the notion of hierarchical mental models. Further following Duncan (2010a,b), executive function is construed as a hierarchical recursive system of test-operate-test-exit units (Miller, Galanter, and Pribram, 1960). Importantly, it is shown that this framework can be used to model the main regional prefrontal syndromes, which are characterized by apathetic, disinhibited and dysexecutive cognition and behavior, respectively. Implications of these considerations for the neuropsychological assessment of executive function are discussed.

  16. Integrative sparse principal component analysis of gene expression data.

    Science.gov (United States)

    Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge

    2017-12-01

    In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis and other multidatasets techniques and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
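
    To make the sparse-loading idea concrete, here is a toy rank-one sparse PCA via soft-thresholded power iteration. This is not the iSPCA algorithm itself (which adds group and contrasted penalties across multiple datasets); the data and the penalty level lam are invented for illustration:

        import numpy as np

        def sparse_pc1(X, lam=1.0, n_iter=100):
            """First sparse principal component via thresholded power iteration."""
            X = X - X.mean(axis=0)                           # center columns
            v = np.linalg.svd(X, full_matrices=False)[2][0]  # dense PC1 warm start
            for _ in range(n_iter):
                u = X @ v
                u /= np.linalg.norm(u)
                w = X.T @ u
                w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)  # soft-threshold
                if not w.any():
                    break
                v = w / np.linalg.norm(w)
            return v

        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 200))              # "small n, large p" toy data
        X[:, :5] += 2.0 * rng.normal(size=(50, 1))  # five informative features
        v = sparse_pc1(X, lam=2.0)
        print("nonzero loadings:", np.flatnonzero(v).size)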

  17. Executive summary

    International Nuclear Information System (INIS)

    2002-01-01

    A new generation of reactor designs is being developed that is intended to meet the requirements of the 21st century. In the short term, the most important requirement is to overcome the relative non-competitiveness of current reactor designs in the de-regulated market. For this purpose, evolutionary light water reactor (LWR) designs have been maturing and are being actively promoted. These are specifically designed to be less expensive to build and operate than the previous generation of LWRs, genuinely competitive with alternative forms of generation, while at the same time establishing higher levels of safety. A new generation of modular, small-to-medium (100-300 MWe/module), integral design water-cooled reactors is under development. These are designed to be competitive with nuclear and non-nuclear power plants, to have significantly enhanced safety, to be proliferation-resistant and to reduce the amount of radioactive waste produced. A different approach to improve competitiveness is the re-emergence of high-temperature reactors (HTRs) using gas turbine technology to obtain higher thermal efficiencies, low construction and operating costs, inherent safety characteristics and low proliferation risk. In the longer term, assuming that the current stagnation in the market is successfully overcome, other requirements related to long-term sustainability will emerge. Important amongst these will be the need to minimise the environmental burden passed on to future generations (or at least to ensure that the cost to future generations is in balance with the benefits to the current generation), the need to establish sustainability of fuel and the need to minimise stocks of separated plutonium at the minimum possible working level and to minimise accessibility to plutonium. In this context, topics of interest to the workshop were: reactors consuming excess plutonium, advanced LWRs, HTRs, fast spectrum reactors, subcritical systems, minor-actinide systems and radical innovative

  18. Executive Summary

    International Nuclear Information System (INIS)

    Vari, Anna; Sakuma, Hideki

    2008-01-01

    KASAM (Sweden), NRC (USA), NWMO (Canada), and HSK (Switzerland), as well as representatives of NEA and its working parties the Integration Group for the Safety Case (IGSC) and the Regulators' Forum. Presentations, discussions, and lessons learned are summarised in this paper.

  19. Executive summary

    International Nuclear Information System (INIS)

    2014-01-01

    The workshop included an opening session, six sessions with participant presentations followed by short discussion, and two facilitated discussion sessions. The contributions presented were devoted to new methodological developments, projects with external hazards analysis activities, interesting aspects of external hazards analysis and expected challenges for future analyses. The contributions presented and the discussions organized during the workshop provided valuable input for strengthening the role of WGRISK in supporting the development and application of probabilistic safety assessment and risk-oriented decision making methods in the area of external hazards. The workshop supported the key general objectives of collecting and exchanging information from OECD member states on the methods and approaches used in probabilistic safety assessment in this area. The focus of the workshop was on external events PSA for nuclear power plants, including all modes of operation. The workshop scope was generally limited to external, natural hazards, including those hazards where the distinction between natural and man-made hazards is not sharp (e.g., external floods caused by dam failures). The participation was open to experts from regulatory authorities and their technical support organizations, research organizations, utilities, nuclear power plant (NPP) designers and vendors, industry associations and observers from OECD NEA member countries. The Background, Objectives, Organization, and Topics of the Workshop are presented in chapter 1. The WGRISK activities preceding the workshop and leading to the decision to organize it are described in Chapter 2. The detailed information about the presentations, discussions, and results of the workshop is presented in Chapter 3. Detailed information about the conclusions made during the workshop is presented in Chapter 4. The list of participants, the workshop agenda and the papers/presentations are attached in the appendixes

  20. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on the field data. Finally, we discuss how these risk factors influence the train integrity detection process.
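
    PRISM evaluates such properties over formal probabilistic models; for a discrete-time Markov chain the underlying computation is a simple matrix iteration, sketched below with an invented three-state detection-channel model (the states and transition probabilities are illustrative, not the paper's):

        import numpy as np

        # States: 0 = OK, 1 = degraded (communication loss), 2 = missed alarm
        P = np.array([
            [0.97, 0.02, 0.01],   # OK -> OK / degraded / missed
            [0.80, 0.15, 0.05],   # degraded mostly recovers
            [0.00, 0.00, 1.00],   # missed alarm is absorbing
        ])

        state = np.array([1.0, 0.0, 0.0])   # start in OK
        for _ in range(100):                # 100 detection cycles
            state = state @ P
        print(f"P(missed integrity alarm within 100 cycles) = {state[2]:.4f}")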

  1. Multigroup confirmatory factor analysis and structural invariance with age of the Behavior Rating Inventory of Executive Function (BRIEF)--French version.

    Science.gov (United States)

    Fournet, Nathalie; Roulin, Jean-Luc; Monnier, Catherine; Atzeni, Thierry; Cosnefroy, Olivier; Le Gall, Didier; Roy, Arnaud

    2015-01-01

    The parent and teacher forms of the French version of the Behavioral Rating Inventory of Executive Function (BRIEF) were used to evaluate executive function in everyday life in a large sample of healthy children (N = 951) aged between 5 and 18. Several psychometric methods were applied, with a view to providing clinicians with tools for score interpretation. The parent and teacher forms of the BRIEF were acceptably reliable. Demographic variables (such as age and gender) were found to influence the BRIEF scores. Confirmatory factor analysis was then used to test five competing models of the BRIEF's latent structure. Two of these models (a three-factor model and a two-factor model, both based on a nine-scale structure) had a good fit. However, structural invariance with age was only obtained with the two-factor model. The French version of the BRIEF provides a useful measure of everyday executive function and can be recommended for use in clinical research and practice.
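
    The invariance testing reported in such studies typically reduces to a chi-square difference test between nested models. A minimal sketch, with illustrative fit statistics rather than values from the paper:

        from scipy.stats import chi2

        # Hypothetical fits: a configural (unconstrained) model vs. a model
        # with loadings constrained equal across age groups
        chi2_config, df_config = 812.4, 300
        chi2_constr, df_constr = 845.1, 324

        d_chi2, d_df = chi2_constr - chi2_config, df_constr - df_config
        p = chi2.sf(d_chi2, d_df)
        print(f"delta chi2 = {d_chi2:.1f} on {d_df} df, p = {p:.3f}")
        # p > .05 would support invariance: the constraints do not
        # significantly worsen model fit.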

  2. Executive summary

    International Nuclear Information System (INIS)

    2010-01-01

    development of nitride fuel and a pyrochemical process for transmutation of minor actinides;
    - RIAR (Russian Federation) presented the RIAR DOVITA-1/2 P and T programme, discussing the results of 15 years of R and D activity;
    - JAEA presented an experimental evaluation of Am- and Np-bearing mixed-oxide fuel properties;
    - CEA discussed minor actinide recycling in sodium fast reactors and the 2008 status of the Phenix experimental programme.
    The technical session on Progress in partitioning, waste forms and management comprised two invited papers and eight oral presentations:
    - The first invited talk of the session, given by CEA, concerned future nuclear fuel cycles and the prospects and challenges lying therein;
    - CEA gave an overview of a new EC activity, Actinide Recycling by Separation and Transmutation (ACSEPT), which falls under the FP7 of EURATOM;
    - FZJ (Germany) presented developments regarding a new SANEX process for actinide(III)/lanthanide(III) separation;
    - Recent R and D activities at JAEA pertaining to innovative extractants and adsorbents for partitioning of minor actinides were reported on;
    - ANL (USA) gave the second invited talk of the session, on fission product partitioning and integrated waste management;
    - Studies on separation of actinides and lanthanides by extraction chromatography using 2,6-bis-triazinyl pyridine were presented by IGCAR (India);
    - CRIEPI (Japan) presented recent developments in pyrochemical processing and metal fuel cycle technology;
    - KAERI (Korea) reported on fission product partitioning and waste salt minimisation during pyro-processing;
    - Current progress in R and D on MSR fuel cycle technology in the Czech Republic was presented by NRI;
    - UC Berkeley (USA) presented the effects of repository conditions on environmental impact reduction through recycling.
    The technical session on Progress in materials, including spallation targets and coolants, comprised one invited paper and two oral presentations. The topic of the invited talk of the session, presented by

  3. Argentinean integrated small reactor design and scale economy analysis of integrated reactor

    International Nuclear Information System (INIS)

    Florido, P. C.; Bergallo, J. E.; Ishida, M. V.

    2000-01-01

    This paper describes the design of CAREM, the Argentinean integrated small reactor project, and the results of a scale economy analysis for integrated reactors. The CAREM project consists of the development, design and construction of a small nuclear power plant. CAREM is an advanced reactor conceived with new-generation design solutions and building on the extensive experience accumulated in the safe operation of Light Water Reactors. CAREM is an indirect-cycle reactor with some distinctive and characteristic features that greatly simplify the reactor and also contribute to a high level of safety: an integrated primary cooling system, self-pressurization, primary cooling by natural circulation, and safety systems relying on passive features. For a fully coupled economic evaluation of integrated reactors done by the IREP (Integrated Reactor Evaluation Program) code transferred to the IAEA, CAREM has been used as a reference point. The results show that integrated reactors become competitive at powers larger than 200 MWe with Argentina's cheapest electricity option. Due to the reactor pressure vessel construction limit, low-pressure-drop steam generators are used to reach a power output of 200 MWe with natural circulation. With forced circulation, 300 MWe can be achieved. (author)

  4. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
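
    The underlying idea, propagating manufacturing scatter into a component stress distribution, can be sketched with plain Monte Carlo. This is not the NESSUS implementation, and every number below is illustrative:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # Hypothetical manufacturing and service scatter
        thickness = rng.normal(5.0e-3, 0.2e-3, n)   # forging tolerance [m]
        width = rng.normal(2.0e-2, 0.05e-2, n)      # [m]
        load = rng.normal(1.2e4, 1.0e3, n)          # service load [N]

        stress = load / (thickness * width)         # simple uniaxial stress model [Pa]
        allowable = 1.6e8                           # illustrative allowable [Pa]
        p_exceed = np.mean(stress > allowable)
        print(f"mean stress = {stress.mean()/1e6:.1f} MPa, "
              f"P(stress > allowable) = {p_exceed:.2e}")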

  5. INS integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bazakos, Mike

    1991-01-01

    The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.

  6. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering analysis and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used as a repository for design data that are communicated between analysis programs, as a dictionary that describes these design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  7. Analysis and Modeling of Integrated Magnetics for LLC resonant Converters

    DEFF Research Database (Denmark)

    Li, Mingxiao; Ouyang, Ziwei; Zhao, Bin

    2017-01-01

    Shunt-inserted transformers are widely used to obtain high leakage inductance. This paper investigates this method in depth to make it applicable to integrating the resonant inductor for LLC resonant converters. The analysis and model of magnetizing inductance and leakage inductance for shunt...... transformers can provide a significant difference. The way to obtain the desirable magnetizing and leakage inductance values for LLC resonant converters is simplified by the creation of air gaps together with a magnetic shunt. The calculation and relation are validated by finite element analysis (FEA) simulations...

  8. Integrated dynamic modeling and management system mission analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, A.K.

    1994-12-28

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied.

  9. Integrated dynamic modeling and management system mission analysis

    International Nuclear Information System (INIS)

    Lee, A.K.

    1994-01-01

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied

  10. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    OpenAIRE

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using mill...

  11. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  12. Multi-criteria decision analysis integrated with GIS for radio ...

    African Journals Online (AJOL)

    Multi-criteria decision analysis integrated with GIS for radio astronomical observatory site selection in peninsular of Malaysia. R Umar, Z.Z. Abidin, Z.A. Ibrahim, M.K.A. Kamarudin, S.N. Hazmin, A Endut, H Juahir ...

  13. Integrated analysis for genotypic adaptation in rice | Das | African ...

    African Journals Online (AJOL)

    Integrated analysis for genotypic adaptation in rice. S Das, RC Misra, MC Pattnaik, SK Sinha. Abstract. Development of varieties with high yield potential coupled with wide adaptability is an important plant breeding objective. The presence of genotype by environment (GxE) interaction plays a crucial role in determining the ...

  14. Integration of Design and Control Through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2000-01-01

    of the phenomena models representing the process model identify the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control issues. The model analysis is highlighted through examples involving...... processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved....

  15. Executable Use Cases

    DEFF Research Database (Denmark)

    Jørgensen, Jens Bæk; Bossen, Claus

    2004-01-01

    Many software experts argue that when we design a new system, we should create an explicit description of the environment in which the proposed system is to be used. The argument becomes crucial for pervasive computing, which aims to tightly integrate systems into their environments and into the work processes they're to support. However, prototypes typically provide an explicit representation only of the system itself. Executable use cases, on the other hand, can also describe the environment. EUCs are designed to narrow the gap between informal ideas about requirements and the formalization...... modeling. This article describes a case study in which developers used EUCs to prototype an electronic patient record system for hospitals in Aarhus, Denmark.

  16. Executive summary

    International Nuclear Information System (INIS)

    2009-01-01

    Prior to the workshop two CSNI/WGHOF surveys were distributed - one addressing regulatory expectations and the other focussing on approaches the licensees use to justify organisational suitability. The regulatory survey requested a brief overview of the situation related to plant organisations in their country - both regulatory expectations and formal requirements. The licensee survey requested information on how they ensure effective organisational suitability, resources and competencies at their plants. The findings from these surveys were used in conjunction with other factors to identify the key issues for the workshop discussion sessions. Although there have been noticeable improvements in recent years in the documentation of the suitability of licensee organisations, almost all regulatory and licensee survey respondents stated that serious questions remain concerning the maturity of the processes to demonstrate the adequacy of the organisational resources and competencies. Other needed improvements identified by licensees include:
    - indicators for early detection of losses associated with organisational resources and knowledge;
    - measures for the preservation of specialised knowledge;
    - refinement of event (root cause) analysis processes to better take into account and identify organisational causes and contributing factors;
    - better means to evaluate the adequacy of resources.
    The survey revealed that methods licensees have found useful to identify and demonstrate the adequacy of the organisational structure, resources and competencies include business oversight processes, peer reviews and benchmarking, event and change analysis, and the establishment of a reference 'organisational baseline'. The workshop participants identified a number of important attributes that characterise a 'good' organisation. One attribute is that the organisation should be open to learning from its own experiences and from the experiences of others. This would include taking advantage

  17. Enhancing yeast transcription analysis through integration of heterogeneous data

    DEFF Research Database (Denmark)

    Grotkjær, Thomas; Nielsen, Jens

    2004-01-01

    DNA microarray technology enables the simultaneous measurement of the transcript level of thousands of genes. Primary analysis can be done with basic statistical tools and cluster analysis, but effective and in-depth analysis of the vast amount of transcription data requires integration with data from several heterogeneous data sources, such as upstream promoter sequences, genome-scale metabolic models, annotation databases and other experimental data. In this review, we discuss how experimental design, normalisation, heterogeneous data and mathematical modelling can enhance analysis of Saccharomyces cerevisiae whole genome transcription data. A special focus is on the quantitative aspects of normalisation and mathematical modelling approaches, since they are expected to play an increasing role in future DNA microarray analysis studies. Data analysis is exemplified with cluster analysis.

  18. Integration of End-User Cloud Storage for CMS Analysis

    CERN Document Server

    Riahi, Hassen; Álvarez Ayllón, Alejandro; Balcas, Justas; Ciangottini, Diego; Hernández, José M; Keeble, Oliver; Magini, Nicolò; Manzi, Andrea; Mascetti, Luca; Mascheroni, Marco; Tanasijczuk, Andres Jorge; Vaandering, Eric Wayne

    2018-01-01

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities, as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, implemented and commissioned over the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with...

  19. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    Full Text Available As a new formulation in structural analysis, the Integrated Force Method has been successfully applied to many structures in civil, mechanical, and aerospace engineering due to its accurate estimation of forces. It is now being further extended to the probabilistic domain. To assess the effect of uncertainty in system optimization and identification, the probabilistic sensitivity analysis of IFM was further investigated in this study. A set of stochastic sensitivity analysis formulations for the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of the stochastic finite element method and stochastic design sensitivity are almost identical.
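
    The contrast drawn here, perturbation-based sensitivity versus direct Monte Carlo, can be shown on a toy response (a single member force, not the paper's IFM formulation; all means and standard deviations are invented):

        import numpy as np

        def member_force(E, A, u, L=2.0):
            # toy structural response: axial force F = (E*A/L)*u
            return E * A * u / L

        mu = np.array([2.0e11, 3.0e-4, 1.0e-3])     # means: E [Pa], A [m^2], u [m]
        sigma = np.array([1.0e10, 1.5e-5, 5.0e-5])  # standard deviations

        # First-order perturbation: Var(F) ~ sum_i (dF/dx_i)^2 Var(x_i)
        eps = 1e-6
        grads = np.array([
            (member_force(*(mu + eps * mu[i] * np.eye(3)[i])) - member_force(*mu))
            / (eps * mu[i])
            for i in range(3)
        ])
        std_pert = np.sqrt(np.sum(grads**2 * sigma**2))

        # Direct Monte Carlo check
        rng = np.random.default_rng(2)
        X = rng.normal(mu, sigma, size=(200_000, 3))
        std_mc = member_force(X[:, 0], X[:, 1], X[:, 2]).std()
        print(f"std(F): perturbation = {std_pert:.0f} N, Monte Carlo = {std_mc:.0f} N")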

  20. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from the colored Petri net (CPN) models. Then the definitions of cycle event paths, sequence event paths, and key event paths are given. Next, based on statistical results from the simulation of the CPN models, key event paths are identified using a sensitivity analysis approach. This approach focuses on the logical structure of CPN models, is reliable, and could serve as the basis of structured analysis for discrete event systems. An example of a radar model is given to illustrate the application of this approach, and the results are credible.

  1. Integrative Analysis of Metabolic Models – from Structure to Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de [Leibniz Institute of Plant Genetics and Crop Plant Research (IPK), Gatersleben (Germany); Schreiber, Falk [Monash University, Melbourne, VIC (Australia); Martin-Luther-University Halle-Wittenberg, Halle (Germany)

    2015-01-26

    The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow the exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.
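
    The structure-plus-dynamics pairing described here can be illustrated on a toy network: conserved moieties from the stoichiometric matrix (structural view) and an ODE simulation of the same model (dynamic view). This is a sketch of the general idea, not the SBM² Framework itself, and the network and rate constants are invented:

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.linalg import null_space

        # Toy pathway A -> B -> C; rows of S are species A, B, C
        S = np.array([[-1.0,  0.0],
                      [ 1.0, -1.0],
                      [ 0.0,  1.0]])

        # Structural view: left null space of S gives conserved moieties (A+B+C)
        conserved = null_space(S.T)
        print("conservation relation:",
              np.round(conserved.ravel() / conserved.max(), 2))

        # Dynamic view: mass-action rates v1 = k1*A, v2 = k2*B
        def rhs(t, x, k1=1.0, k2=0.5):
            return S @ np.array([k1 * x[0], k2 * x[1]])

        sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0])
        print("final concentrations:", np.round(sol.y[:, -1], 3))  # mass ends in C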

  2. Construction of an integrated database to support genomic sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, W.; Overbeek, R.

    1994-11-01

    The central goal of this project is to develop an integrated database to support comparative analysis of genomes including DNA sequence data, protein sequence data, gene expression data and metabolism data. In developing the logic-based system GenoBase, a broader integration of available data was achieved due to assistance from collaborators. Current goals are to easily include new forms of data as they become available and to easily navigate through the ensemble of objects described within the database. This report comments on progress made in these areas.

  3. Containment integrity analysis with SAMPSON/DCRA module

    International Nuclear Information System (INIS)

    Hosoda, Seigo; Shirakawa, Noriyuki; Naitoh, Masanori

    2006-01-01

    The integrity of a PWR containment under a severe accident is analyzed using the debris-concrete reaction analysis code. If the core fuel melts through the pressure vessel and the debris accumulates in the reactor cavity at the lower part of the containment, its temperature continues to rise due to decay heat and the debris ablates the concrete floor. In the case that cooling water is injected into the containment cavity and the amount of debris is limited to 30% of the core fuel, our analyses showed that the debris could be cooled and frozen, so that the integrity of the containment holds. (author)

  4. ADAMS executive and operating system

    Science.gov (United States)

    Pittman, W. D.

    1981-01-01

    The ADAMS Executive and Operating System, a multitasking environment under which a variety of data reduction, display, and utility programs are executed, is described. The system provides a high level of isolation between programs, allowing them to be developed and modified independently. The Airborne Data Analysis/Monitor System (ADAMS) was developed to provide a real-time data monitoring and analysis capability onboard Boeing commercial airplanes during flight testing. It inputs sensor data from an airplane, derives performance data by applying transforms to the collected sensor data, and presents these data to test personnel via various display media. Current utilization and future development are addressed.

  5. Risk-based systems analysis of emerging high-level waste tank remediation technologies. Volume 1: Executive summary

    International Nuclear Information System (INIS)

    Peters, B.B.; Cameron, R.J.; McCormack, W.D.

    1994-08-01

    This report describes a System Analysis Model developed under the US Department of Energy (DOE) Office of Technology Development (OTD) Underground Storage Tank-Integrated Demonstration (UST-ID) program to aid technology development funding decisions for radioactive tank waste remediation. Current technology development selection methods evaluate new technologies in isolation from other components of an overall tank waste remediation system. These methods do not show the relative effect of new technologies on tank remediation systems as a whole. Consequently, DOE may spend its resources on technologies that promise to improve a single function but have a small, or possibly negative, impact on the overall system, or DOE may overlook a technology that does not address a high-priority problem in the system but that does, if implemented, offer sufficient overall improvements. Systems engineering and detailed analyses often conducted under the National Environmental Policy Act (NEPA 1969) use a 'whole system' approach but are costly, too time-consuming, and often not sufficiently focused to support the needs of technology program decision-makers. An alternative approach is required to evaluate these system impacts but still meet the budget and schedule needs of the technology program

  6. Building a web-based CAD server for clinical use, evaluation, and incremental learning. Implementation of analysis function based on execution result and clinical feedback

    International Nuclear Information System (INIS)

    Nomura, Yukihiro; Hayashi, Naoto; Masutani, Yoshitaka; Yoshikawa, Takeharu; Nemoto, Mitsutaka; Hanaoka, Shouhei; Maeda, Eriko; Ohtomo, Kuni; Miki, Soichiro

    2010-01-01

    Development of clinical image analysis software such as computer-assisted detection/diagnosis (CAD) involves a cycle of algorithm development, software implementation, clinical use, and refinement of the algorithm and software based on feedback. This cycle is expected to accelerate the development of CAD software. We have been building a web-based CAD server that enables radiologists to use CAD software and to give feedback in the clinical environment. The platform has been utilized in our hospital for 16 months, and more than 2,000 cases of feedback data have been accumulated. In this report, we introduce additional functions for performance evaluation based on the executed results of CAD software and clinical feedback. (author)

  7. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in a domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of possibilities offered by our project we apply the multi-parameter sensitivity analysis. To visualize the results of model analysis a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in the domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as accessibility and usability of remote services.

  8. 77 FR 51523 - Senior Executive Service Performance Review Board Membership

    Science.gov (United States)

    2012-08-24

    ... COUNCIL OF THE INSPECTORS GENERAL ON INTEGRITY AND EFFICIENCY Senior Executive Service Performance... required to establish one or more Senior Executive Service (SES) performance review boards. The purpose of these boards is to review and evaluate the initial appraisal of a senior executive's performance by the...

  9. A continuum of executive function deficits in early subcortical vascular cognitive impairment: A systematic review and meta-analysis.

    Science.gov (United States)

    Sudo, Felipe Kenji; Amado, Patricia; Alves, Gilberto Sousa; Laks, Jerson; Engelhardt, Eliasz

    2017-01-01

    Subcortical Vascular Cognitive Impairment (SVCI) is a clinical continuum of vascular-related cognitive impairment, including Vascular Mild Cognitive Impairment (VaMCI) and Vascular Dementia. Deficits in Executive Function (EF) are hallmarks of the disorder, but the best methods to assess this function have yet to be determined. The insidious and almost predictable course of SVCI and the multidimensional concept of EF suggest that a temporal dissociation of impairments in EF domains exists early in the disorder. This study aims to review and analyze data from the literature about performance of VaMCI patients on the most used EF tests through a meta-analytic approach. Medline, Web of Knowledge and PsycINFO were searched, using the terms: "vascular mild cognitive impairment" OR "vascular cognitive impairment no dementia" OR "vascular mild neurocognitive disorder" AND "dysexecutive" OR "executive function". Meta-analyses were conducted for each of the selected tests, using random-effect models. Systematic review showed major discrepancies among the results of the studies included. Meta-analyses evidenced poorer performance on the Trail-Making Test part B and the Stroop color test by VaMCI patients compared to controls. A continuum of EF impairments has been proposed in SVCI. Early deficits appear to occur in cognitive flexibility and inhibitory control.
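
    The random-effects pooling behind such meta-analyses is compact enough to sketch. Below is a DerSimonian-Laird estimate on invented effect sizes and variances (these are not the review's data):

        import numpy as np

        d = np.array([-0.62, -0.45, -0.71, -0.38, -0.55])   # per-study effects
        v = np.array([0.040, 0.055, 0.048, 0.060, 0.050])   # per-study variances

        w = 1.0 / v                                 # fixed-effect weights
        d_fe = np.sum(w * d) / np.sum(w)
        Q = np.sum(w * (d - d_fe) ** 2)             # Cochran's heterogeneity Q
        k = len(d)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1.0 / (v + tau2)                     # random-effects weights
        d_re = np.sum(w_re * d) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        print(f"pooled d = {d_re:.2f} "
              f"(95% CI {d_re - 1.96*se:.2f} to {d_re + 1.96*se:.2f}), "
              f"tau2 = {tau2:.3f}")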

  10. Strategic management: a new dimension of the nurse executive's role.

    Science.gov (United States)

    Johnson, L J

    1990-09-01

    The growth of corporate orientation for health care structures, with a focus on bottom-line management, has radically altered the role of nurse executives. With the organization's emphasis on performance, productivity, and results, successful nurse executives are now integrating the management of the delivery of nursing care with the management of complex corporate structures and relationships. The editor of Executive Development discusses the rapidly changing expectations and demands of the contemporary nurse executive's work. The nurse executive's role can be viewed from many perspectives: its scope, its value, its structure, its content. Content--"What does the nurse executive do that makes a real difference?"--is the focus here.

  11. Plant-wide integrated equipment monitoring and analysis system

    International Nuclear Information System (INIS)

    Morimoto, C.N.; Hunter, T.A.; Chiang, S.C.

    2004-01-01

    A nuclear power plant equipment monitoring system monitors plant equipment and reports deteriorating equipment conditions. The more advanced equipment monitoring systems can also provide information for understanding the symptoms and diagnosing the root cause of a problem. Maximizing the equipment availability and minimizing or eliminating consequential damages are the ultimate goals of equipment monitoring systems. GE Integrated Equipment Monitoring System (GEIEMS) is designed as an integrated intelligent monitoring and analysis system for plant-wide application for BWR plants. This approach reduces system maintenance efforts and equipment monitoring costs and provides information for integrated planning. This paper describes GEIEMS and how the current system is being upgraded to meet General Electric's vision for plant-wide decision support. (author)

  12. STINGRAY: system for integrated genomic resources and analysis.

    Science.gov (United States)

    Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R

    2014-03-07

    The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system can be faster using Sanger data, since large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.

  13. An analysis of National Health Service Trust websites on the occupational backgrounds of 'Non-Executive Directors' on England's Acute Trusts.

    Science.gov (United States)

    Pritchard, Colin; Harding, Andrew Je

    2014-05-01

    To explore the occupational backgrounds of English Non-Executive Directors (NEDs) on Acute National Health Service (NHS) Trusts. Data on NEDs' occupational backgrounds, by gender and occupation, were extrapolated from Trust websites, and an inter-rater reliability test was undertaken. Data were available on all but 24 of the 166 Acute Trusts, from all regions. Trust Chairs and NEDs were categorised by their dominant occupation, differentiating NEDs with and without health or social care leadership experience. The ratings of NEDs' occupations positively correlated (p business' (Francis, 2013) rather than developing a more patient-centred, clinically led and integrated NHS? It is suggested that Boards need more NEDs with health and social care leadership experience and methods to identify the 'patient's agenda' to create 'a common culture' that places 'patients at the centre of everything we do' (Hunt, 2012). A key context for Trust Boards' operations is funding, which Francis' terms of reference excluded, an issue that is briefly discussed.

  14. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    Science.gov (United States)

    Kornberg, Oren

    2009-01-01

    A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady-state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from some example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.
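
    The coupling at the heart of such conjugate analyses, wall conduction exchanging heat with a fluid through a convective boundary, can be sketched with an explicit finite-difference scheme. This is not GFSSP/SINDA; the geometry, properties, and cryogenic chilldown setting are illustrative:

        import numpy as np

        nx, dx, dt = 20, 5e-4, 1e-3       # wall nodes, spacing [m], time step [s]
        alpha = 1.1e-4                    # thermal diffusivity of copper [m^2/s]
        k, h = 390.0, 2000.0              # conductivity [W/m-K], film coeff [W/m^2-K]
        T = np.full(nx, 300.0)            # wall initially at 300 K
        T_fluid = 90.0                    # cryogen temperature [K]

        for _ in range(2000):             # march 2 s (FTCS stable: alpha*dt/dx^2 < 0.5)
            Tn = T.copy()
            T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2*Tn[1:-1] + Tn[:-2])
            # convective balance at the wetted face: k*(T1 - T0)/dx = h*(T0 - T_fluid)
            T[0] = (k / dx * Tn[1] + h * T_fluid) / (k / dx + h)
            T[-1] = Tn[-2]                # adiabatic outer face
        print(f"wetted-surface temperature after 2 s: {T[0]:.1f} K")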

  15. Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Britt, Phillip F [ORNL

    2015-03-01

    Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report. Summaries of conclusions, analytical processes, and analytical results. Analysis of samples taken from the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico in support of the WIPP Technical Assessment Team (TAT) activities to determine to the extent feasible the mechanisms and chemical reactions that may have resulted in the breach of at least one waste drum and release of waste material in WIPP Panel 7 Room 7 on February 14, 2014. This report integrates and summarizes the results contained in three separate reports, described below, and draws conclusions based on those results. Chemical and Radiochemical Analyses of WIPP Samples R-15 C5 SWB and R16 C-4 Lip; PNNL-24003, Pacific Northwest National Laboratory, December 2014 Analysis of Waste Isolation Pilot Plant (WIPP) Underground and MgO Samples by the Savannah River National Laboratory (SRNL); SRNL-STI-2014-00617; Savannah River National Laboratory, December 2014 Report for WIPP UG Sample #3, R15C5 (9/3/14); LLNL-TR-667015; Lawrence Livermore National Laboratory, January 2015 This report is also contained in the Waste Isolation Pilot Plant Technical Assessment Team Report; SRNL-RP-2015-01198; Savannah River National Laboratory, March 17, 2015, as Appendix C: Analysis Integrated Summary Report.

  16. Vehicle Integrated Performance Analysis, the VIPA Experience: Reconnecting with Technical Integration

    Science.gov (United States)

    McGhee, David S.

    2005-01-01

    Today's NASA is facing significant challenges and changes. The Exploration initiative indicates a large increase in projects with limited increase in budget. The Columbia report has criticized NASA for its lack of insight and technical integration impacting its ability to provide safety. The Aldridge report is advocating NASA find new ways of doing business. Very early in the Space Launch Initiative (SLI) program a small team of engineers at MSFC were asked to propose a process for performing a system level assessment of a launch vehicle. The request was aimed primarily at providing insight and making NASA a "smart buyer." Out of this effort the VIPA team was created. The difference between the VIPA effort and many integration attempts is that VIPA focuses on using experienced people from various disciplines and a process which focuses them on a technically integrated assessment. Most previous attempts have focused on developing an all encompassing software tool. In addition, VIPA anchored its process formulation in the experience of its members and in early developmental Space Shuttle experience. The primary reference for this is NASA-TP-2001-210092, "Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned," and discussions with its authors. The foundations of VIPA's process are described. The VIPA team also recognized the need to drive detailed analysis earlier in the design process. Analyses and techniques typically done in later design phases, are brought forward using improved computing technology. The intent is to allow the identification of significant sensitivities, trades, and design issues much earlier in the program. This process is driven by the T-model for Technical Integration described in the aforementioned reference. VIPA's approach to performing system level technical integration is discussed in detail. Proposed definitions are offered to clarify this discussion and the general systems integration dialog. VIPA

  17. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  18. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Garcia, Humberto; Burr, Tom; Coles, Garill A.; Edmunds, Thomas A.; Garrett, Alfred; Gorensek, Maximilian; Hamm, Luther; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Tzanos, Constantine P.; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  19. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  20. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combined qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  1. Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bhanu, Bir

    1992-01-01

    Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.

  2. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. In most existing integrative analyses, the homogeneity model has been assumed, which postulates that different datasets share the same set of markers. Several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival. This model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
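
    The defining behavior of the MCP mentioned above can be shown in a few lines: near zero it shrinks coefficients like the lasso, while large coefficients escape shrinkage entirely. The sketch below is only the univariate coordinate update on standardized data, with arbitrary lambda and gamma values; the paper's estimator embeds this idea in a sparse group coordinate descent for the AFT model.

    ```python
    # Minimal sketch of the MCP coordinate update (not the authors' code).
    # z is the univariate least-squares estimate for one coefficient on
    # standardized data; lam and gamma are assumed tuning parameters.
    import numpy as np

    def soft_threshold(z, lam):
        return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

    def mcp_update(z, lam, gamma=3.0):
        """Closed-form MCP solution for one coordinate (requires gamma > 1)."""
        if np.abs(z) <= gamma * lam:
            return soft_threshold(z, lam) / (1.0 - 1.0 / gamma)
        return z   # beyond gamma*lam the penalty is flat: no shrinkage

    for z in (0.5, 1.5, 4.0):
        print(z, "->", round(float(mcp_update(z, lam=1.0)), 3))
    ```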

  3. Simulation analysis of globally integrated logistics and recycling strategies

    Energy Technology Data Exchange (ETDEWEB)

    Song, S.J.; Hiroshi, K. [Hiroshima Inst. of Tech., Graduate School of Mechanical Systems Engineering, Dept. of Information and Intelligent Systems Engineering, Hiroshima (Japan)]

    2004-07-01

    This paper focuses on the optimal analysis of worldwide recycling activities associated with managing the logistics and production activities of global manufacturing operations that stretch across national boundaries. The globally integrated logistics and recycling strategies consist of the home country and two free-trade economic blocs, NAFTA and ASEAN, where significant differences are found in production and disassembly costs, tax rates, local content rules and regulations. Moreover, an optimal analysis of the globally integrated value chain was developed by applying simulation optimization techniques as a decision-making tool. The simulation model was developed and analyzed using the ProModel package, and the results help to identify some of the conditions required to produce well-performing logistics and recycling plans in a worldwide collaborative manufacturing environment. (orig.)

  4. Integration, warehousing, and analysis strategies of Omics data.

    Science.gov (United States)

    Gedela, Srinubabu

    2011-01-01

    "-Omics" is a current suffix for numerous types of large-scale biological data generation procedures, which naturally demand the development of novel algorithms for data storage and analysis. With next generation genome sequencing burgeoning, it is pivotal to decipher a coding site on the genome, a gene's function, and information on transcripts next to the pure availability of sequence information. To explore a genome and downstream molecular processes, we need umpteen results at the various levels of cellular organization by utilizing different experimental designs, data analysis strategies and methodologies. Here comes the need for controlled vocabularies and data integration to annotate, store, and update the flow of experimental data. This chapter explores key methodologies to merge Omics data by semantic data carriers, discusses controlled vocabularies as eXtensible Markup Languages (XML), and provides practical guidance, databases, and software links supporting the integration of Omics data.

  5. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    Science.gov (United States)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
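
    The core pattern the abstract describes, a numerical optimizer wrapped around an integrated set of discipline analyses, can be sketched generically. The one-line "structural" and "thermal" surrogates below are hypothetical stand-ins, not LLIMAS components; only the optimization pattern is illustrated.

    ```python
    # Toy multidisciplinary design optimization: minimize mass subject to
    # structural and thermal constraints. All models and limits are invented.
    from scipy.optimize import minimize

    def mass(x):                  # objective: support mass (kg)
        thickness, area = x
        return 2700.0 * thickness * area        # aluminum density * volume

    def deflection(x):            # hypothetical structural surrogate (m)
        thickness, area = x
        return 1e-7 * area / thickness**3

    def temp_rise(x):             # hypothetical thermal surrogate (K)
        thickness, area = x
        return 50.0 / area

    res = minimize(
        mass, x0=[0.05, 1.0], method="SLSQP",
        bounds=[(0.005, 0.1), (0.5, 2.0)],       # thickness (m), area (m^2)
        constraints=[
            {"type": "ineq", "fun": lambda x: 1e-4 - deflection(x)},  # <= 0.1 mm
            {"type": "ineq", "fun": lambda x: 100.0 - temp_rise(x)},  # <= 100 K
        ],
    )
    print(res.x, res.fun)
    ```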

  6. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
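
    As a concrete entry point, the simplest of the statistical approaches surveyed above is pairwise correlation between taxon abundances and metabolite levels. The sketch below uses random placeholder data and scipy's Spearman test; real studies would also handle compositionality and correct for multiple testing.

    ```python
    # Minimal metabolome-microbiome association sketch on placeholder data.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n_samples = 30
    microbes = rng.random((n_samples, 5))      # 5 taxa, relative abundances
    metabolites = rng.random((n_samples, 3))   # 3 metabolite intensities

    # Pairwise Spearman correlations; a real analysis must correct for
    # multiple testing (e.g. Benjamini-Hochberg) and for compositional data.
    for i in range(microbes.shape[1]):
        for j in range(metabolites.shape[1]):
            rho, p = spearmanr(microbes[:, i], metabolites[:, j])
            print(f"taxon {i} ~ metabolite {j}: rho={rho:+.2f}, p={p:.3f}")
    ```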

  7. [Guidelines to good execution of analysis: some applications and developments. Laboratoire d'Analyses de Biologie Me'dicale Lecoeur].

    Science.gov (United States)

    Lecoeur, Y

    1998-01-01

    The decree concerning the Guidelines for Good Execution of Analyses (GGEA) promulgated on December 4, 1994 entered into application on January 1, 1995. The definition and necessity for the GGEA is discussed in the first part of this article. Actually, the GGEA is a revolutionary change for biology laboratories which must now work within the framework of precise guidelines. This may raise certain problems for private laboratories. The goal of the GGEA is to assure good quality analyses and thus patient care. It is designed as a positive aid for the biologist. Thus after two years of application, it is time to improve the initial text taking into account experience in the field. In the future, the official authorities and leaders in the profession will have to choose between the GGEA and official approval.

  8. Sensitivity Analysis Based on Markovian Integration by Parts Formula

    Directory of Open Access Journals (Sweden)

    Yongsheng Hang

    2017-10-01

    Full Text Available Sensitivity analysis is widely applied in financial risk management and engineering; it describes the variation brought about by changes in parameters. Since the integration by parts technique for Markov chains has been well developed in recent years, in this paper we apply it to the computation of sensitivities and show closed-form expressions for two commonly used time-continuous Markovian models. By comparison, we conclude that our approach outperforms the existing techniques for computing sensitivities on Markovian models.
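
    The paper derives closed-form sensitivities via the Markovian integration by parts formula; as a baseline for comparison, the same kind of quantity can be approximated by finite differences. The sketch below differentiates a two-state continuous-time Markov chain's transient probability with respect to one rate parameter; the rates and horizon are invented.

    ```python
    # Finite-difference sensitivity of P(X_t = 1) for a two-state CTMC with
    # respect to the up-rate lambda. Values are illustrative only.
    import numpy as np
    from scipy.linalg import expm

    def p_state1(lam, mu=2.0, t=1.0):
        Q = np.array([[-lam, lam], [mu, -mu]])   # generator matrix
        return expm(Q * t)[0, 1]                 # start in state 0

    lam, h = 1.0, 1e-5
    sens = (p_state1(lam + h) - p_state1(lam - h)) / (2 * h)
    print(f"dP(X_t=1)/dlambda ~ {sens:.4f}")
    ```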

  9. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    OpenAIRE

    Maria Cleofe Giorgino; Enrico Supino; Federico Barnabè

    2017-01-01

    Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality of the information being disclosed. This article aims to address this gap, exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional imp...

  10. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...
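
    The heat cascade calculation named above has a compact generic form: shift stream temperatures by half the minimum approach, sum the net heat load in each temperature interval from top to bottom, and read the minimum hot and cold utility targets off the cascade. The two streams and DT_min below are invented, not the SMR/WGS process data.

    ```python
    # Schematic heat-cascade (pinch) calculation with invented streams.
    # Each stream: (supply T in C, target T in C, heat-capacity flowrate kW/K).
    hot = [(400.0, 120.0, 2.0)]        # streams to be cooled
    cold = [(100.0, 380.0, 1.8)]       # streams to be heated
    DT_min = 10.0

    # Shift hot streams down and cold streams up by DT_min/2.
    shifted = [(ts - DT_min/2, tt - DT_min/2, cp, "hot") for ts, tt, cp in hot] + \
              [(ts + DT_min/2, tt + DT_min/2, cp, "cold") for ts, tt, cp in cold]

    bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            overlap = max(0.0, min(hi, top) - max(lo, bot))
            net += cp * overlap if kind == "hot" else -cp * overlap
        heat += net
        cascade.append(heat)

    hot_utility = -min(min(cascade), 0.0)     # minimum external heating
    cold_utility = hot_utility + cascade[-1]  # minimum external cooling
    print(f"Q_hot,min = {hot_utility} kW, Q_cold,min = {cold_utility} kW")
    ```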

  11. Meta-Analysis of the Relationship between Deep Brain Stimulation in Patients with Parkinson’s Disease and Performance in Evaluation Tests for Executive Brain Functions

    Directory of Open Access Journals (Sweden)

    A. M. Martínez-Martínez

    2017-01-01

    Full Text Available Parkinson’s disease (PD) is a neurodegenerative condition that compromises motor functions and alters some executive brain functions. The changes in cognitive symptoms in PD could be due to the procedure of deep brain stimulation (DBS). We searched several databases for studies that compared performance in executive function tests before and after the DBS procedure in PD and then performed a meta-analysis. After the initial search, 15 articles specifically evaluated the functions of verbal fluency, working memory, cognitive flexibility, abstract thinking, and inhibition. We found differences in how the cognitive functions were evaluated in terms of protocols, which generated heterogeneity in the results of the meta-analysis. Likewise, a tendency toward diminished verbal fluency and inhibition was found, consistent with similar studies. In the other functions evaluated, no difference was found between pre- and post-surgery scores. Monitoring of this type of function is recommended after the procedure.

  12. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    Science.gov (United States)

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible to the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes it easy to run RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html .

  13. Building-integrated renewable energy policy analysis in China

    Institute of Scientific and Technical Information of China (English)

    姚春妮; 郝斌

    2009-01-01

    With the dramatic development of renewable energy all over the world, and for the purpose of adjusting its energy structure, the Ministry of Construction of China plans to promote the large-scale application of renewable energy in buildings. In order to ensure the validity of policy-making, this work first applies a cost-benefit analysis to three kinds of technologies: building-integrated solar hot water (BISHW) systems, building-integrated photovoltaic (BIPV) technology and ground water heat pumps (GWHP). By selecting a representative city from every climate region, the analysis arrives at different results for the different climate regions in China and, correspondingly, different suggestions for policy-making. On this basis, the Ministry of Construction (MOC) and the Ministry of Finance of China (MOF) jointly started the Building-integrated Renewable Energy Demonstration Projects (BIREDP) in 2006. In the demonstration projects, renewable energy takes the place of traditional energy to supply domestic hot water, electricity, air-conditioning and heating. Through carrying out the demonstration projects, the market related to renewable energy has been expanded. More and more companies and local governments take the opportunity to promote the large-scale application of renewable energy in buildings.

  14. Lectures on functional analysis and the Lebesgue integral

    CERN Document Server

    Komornik, Vilmos

    2016-01-01

    This textbook, based on three series of lectures held by the author at the University of Strasbourg, presents functional analysis in a non-traditional way by generalizing elementary theorems of plane geometry to spaces of arbitrary dimension. This approach leads naturally to the basic notions and theorems. Most results are illustrated by the small ℓp spaces. The Lebesgue integral, meanwhile, is treated via the direct approach of Frigyes Riesz, whose constructive definition of measurable functions leads to optimal, clear-cut versions of the classical theorems of Fubini-Tonelli and Radon-Nikodým. Lectures on Functional Analysis and the Lebesgue Integral presents the most important topics for students, with short, elegant proofs. The exposition style follows the Hungarian mathematical tradition of Paul Erdős and others. The order of the first two parts, functional analysis and the Lebesgue integral, may be reversed. In the third and final part they are combined to study various spaces of continuous and integ...

  15. Tabled Execution in Scheme

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, J J; Lumsdaine, A; Quinlan, D J

    2008-08-19

    Tabled execution is a generalization of memoization developed by the logic programming community. It not only saves results from tabled predicates, but also stores the set of currently active calls to them; tabled execution can thus provide meaningful semantics for programs that seemingly contain infinite recursions with the same arguments. In logic programming, tabled execution is used for many purposes, both for improving the efficiency of programs and for making tasks simpler and more direct to express than with normal logic programs. However, tabled execution is only infrequently applied in mainstream functional languages such as Scheme. We demonstrate an elegant implementation of tabled execution in Scheme, using a mix of continuation-passing style and mutable data. We also show the use of tabled execution in Scheme for a problem in formal language and automata theory, demonstrating that tabled execution can be a valuable tool for Scheme users.
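
    The paper's implementation is in Scheme with continuation-passing style and mutation; the Python sketch below only illustrates the core idea of tabling: cache completed results while tracking in-flight calls, so that a recursive re-entry with the same arguments is cut off instead of looping forever. Full logic-programming tabling computes a fixed point; here the cycle is simply broken with a bottom value, which suffices for this reachability query.

    ```python
    # Illustrative tabling-style decorator (not the paper's Scheme code).
    import functools

    def tabled(bottom=None):
        def deco(fn):
            done, active = {}, set()
            @functools.wraps(fn)
            def wrapper(*args):
                if args in done:
                    return done[args]          # completed table entry
                if args in active:             # same call already in flight:
                    return bottom              # break the cycle with bottom
                active.add(args)
                try:
                    done[args] = fn(*args)
                finally:
                    active.discard(args)
                return done[args]
            return wrapper
        return deco

    graph = {"a": ["b"], "b": ["a", "c"], "c": []}

    @tabled(bottom=False)
    def reachable(src, dst):
        """True if dst is reachable from src; the a<->b cycle would
        otherwise cause infinite recursion."""
        return src == dst or any(reachable(n, dst) for n in graph[src])

    print(reachable("a", "c"))   # True, despite the cycle between a and b
    ```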

  16. Nordic Walking Performance Analysis with an Integrated Monitoring System

    Directory of Open Access Journals (Sweden)

    Francesco Mocera

    2018-05-01

    Full Text Available There is a growing interest in Nordic walking from both the fitness and medical points of view due to its possible therapeutic applications. Proper execution of the technique is an essential requirement to maximize the benefits of this practice. This is the reason why a monitoring system for outdoor Nordic walking activity was developed. Using data obtained from synchronized sensors, it is possible to have a complete overview of the users’ movements. The system described in this paper is able to measure the pole angle during the pushing phase, the arm cycle frequency and synchronization, and the pushing force applied to the ground. Furthermore, data from a GPS module give an image of the environment where the activity session takes place, in terms of distance, slope and ground typology. A heart rate sensor is used to monitor the effort of the user through his/her Beats Per Minute (BPM). In this work, the developed monitoring system is presented, explaining how to use the gathered data to obtain the main feedback parameters for Nordic walking performance analysis. The comparison between left- and right-arm measurements allowed validating the system as a tool for technique evaluation. Finally, a procedure to estimate the peak pushing force from acceleration measurements is proposed.

  17. Integrated Modeling, Analysis, and Verification for Space Missions, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project will further MBSE technology in fundamental ways by strengthening the link between SysML tools and framework engineering execution environments. Phoenix...

  18. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Directory of Open Access Journals (Sweden)

    Valeria Toffoli

    2013-12-01

    Full Text Available The design and characteristics of a micro-system for thermogravimetric analysis (TGA in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsule. The results demonstrated that our approach provides a faster and more sensitive TGA with respect to commercial systems.

  19. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Science.gov (United States)

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsule. The results demonstrated that our approach provides a faster and more sensitive TGA with respect to commercial systems.

  20. Structural integrity analysis of an INPP building under external loading

    International Nuclear Information System (INIS)

    Dundulis, G.; Karalevicius, R.; Uspuras, E.; Kulak, R.F.; Marchertas, A.

    2005-01-01

    After the terrorist attacks in New York and Washington, D.C. using civil airplanes, the evaluation of civil airplane crashes into civil and NPP structures has become very important. The interceptions of many terrorists' communications reveal that the use of commandeered commercial aircraft is still a major part of their plans for destruction. An aircraft crash or other flying object in the territory of the Ignalina Nuclear Power Plant (INPP) represents a concern to the plant. Aircraft traveling at high velocity have a destructive potential. An aircraft crash may damage the roofs and walls of buildings, pipelines, electric motors, power supply casings, electricity transmission cables and other elements and systems that are important for safety. Therefore, the evaluation of the structural response to an aircraft crash is important and was selected for analysis. The structural integrity analysis of the effects of an aircraft crash on an NPP building structure is the subject of this paper. The finite element method was used for the structural analysis of a typical Ignalina NPP building. The structural integrity analysis was performed for a portion of the ALS using the dynamic loading of an aircraft crash impact model. The computer code NEPTUNE was used for this analysis. The local effects caused by the impact of the aircraft's engine on the building wall were evaluated independently by using an empirical formula. (authors)

  1. Business process modeling for the Virginia Department of Transportation : a demonstration with the integrated six-year improvement program and the statewide transportation improvement program : executive summary.

    Science.gov (United States)

    2005-01-01

    This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...

  2. Penalized differential pathway analysis of integrative oncogenomics studies.

    Science.gov (United States)

    van Wieringen, Wessel N; van de Wiel, Mark A

    2014-04-01

    Through integration of genomic data from multiple sources, we may obtain a more accurate and complete picture of the molecular mechanisms underlying tumorigenesis. We discuss the integration of DNA copy number and mRNA gene expression data from an observational integrative genomics study involving cancer patients. The two molecular levels involved are linked through the central dogma of molecular biology. DNA copy number aberrations abound in the cancer cell. Here we investigate how these aberrations affect gene expression levels within a pathway using observational integrative genomics data of cancer patients. In particular, we aim to identify differential edges between the regulatory networks of two groups involving these molecular levels. Motivated by the rate equations, the regulatory mechanism between DNA copy number aberrations and gene expression levels within a pathway is modeled by a simultaneous-equations model, for the one- and two-group cases. The latter facilitates the identification of differential interactions between the two groups. Model parameters are estimated by penalized least squares using the lasso (L1) penalty to obtain a sparse pathway topology. Simulations show that the inclusion of DNA copy number data benefits the discovery of gene-gene interactions. In addition, the simulations reveal that cis-effects tend to be over-estimated in a univariate (single gene) analysis. In the application to real data from integrative oncogenomic studies we show that inclusion of prior information on the regulatory network architecture benefits the reproducibility of all edges. Furthermore, analyses of the TP53 and TGFb signaling pathways between ER+ and ER- samples from an integrative genomics breast cancer study identify reproducible differential regulatory patterns that corroborate existing literature.
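
    A scaled-down version of the modeling idea, with hypothetical data and a plain lasso in place of the paper's penalized simultaneous-equations system, shows the mechanics: regress one gene's expression on the pathway's copy-number profiles and keep the sparse set of nonzero effects as candidate regulatory edges.

    ```python
    # Sparse regression of expression on copy number; data are synthetic.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)
    n, p = 80, 10                          # samples, genes in the pathway
    copy_number = rng.normal(size=(n, p))
    # true model: gene 0's expression driven by its own copy number (cis)
    # and by gene 3's copy number (trans), plus noise
    expression = 1.5 * copy_number[:, 0] + 0.8 * copy_number[:, 3] \
                 + 0.3 * rng.normal(size=n)

    fit = Lasso(alpha=0.1).fit(copy_number, expression)
    edges = {f"gene{j}": round(c, 2) for j, c in enumerate(fit.coef_) if c != 0.0}
    print(edges)   # sparse cis/trans effects on gene 0's expression
    ```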

  3. GeNNet: an integrated platform for unifying scientific workflows and graph databases for transcriptome data analysis

    Directory of Open Access Journals (Sweden)

    Raquel L. Costa

    2017-07-01

    Full Text Available There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes and these may additionally be integrated with other biological databases, such as Protein-Protein Interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties, either for posterior inspection of results, or for meta-analysis by the incorporation of new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Additionally, a great amount of effort is equally required to run in-silico experiments to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clusterization and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were

  4. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Burr, Tom; Gorensek, M.B.; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  5. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.
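
    The Bayesian reasoning the authors propose for handling uncertain evidence reduces, in its smallest form, to conditioning a hypothesized compromise on an observed alert. All probabilities below are invented placeholders, not values from the paper.

    ```python
    # Toy two-variable Bayesian inference for alert triage.
    p_compromise = 0.01                  # prior on a host being compromised
    p_alert_given_compromise = 0.90      # assumed detection rate
    p_alert_given_clean = 0.05           # assumed false-alarm rate

    p_alert = (p_alert_given_compromise * p_compromise
               + p_alert_given_clean * (1.0 - p_compromise))
    p_compromise_given_alert = p_alert_given_compromise * p_compromise / p_alert
    print(f"P(compromise | alert) = {p_compromise_given_alert:.3f}")  # ~0.154
    ```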

  6. Measure and integral an introduction to real analysis

    CERN Document Server

    Wheeden, Richard L

    2015-01-01

    Now considered a classic text on the topic, Measure and Integral: An Introduction to Real Analysis provides an introduction to real analysis by first developing the theory of measure and integration in the simple setting of Euclidean space, and then presenting a more general treatment based on abstract notions characterized by axioms and with less geometric content. Published nearly forty years after the first edition, this long-awaited Second Edition also: studies the Fourier transform of functions in the spaces L1, L2, and Lp, 1 < p < 2; shows the Hilbert transform to be a bounded operator on L2, as an application of the L2 theory of the Fourier transform in the one-dimensional case; covers fractional integration and some topics related to mean oscillation properties of functions, such as the classes of Hölder continuous functions and the space of functions of bounded mean oscillation; derives a subrepresentation formula, which in higher dimensions plays a role roughly similar to the one played by the fundamental theor...

  7. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589

  8. Editor, Executive and Entrepreneur

    DEFF Research Database (Denmark)

    Bøe-Lillegraven, Tor; Wilberg, Erik

    2016-01-01

    To survive in today’s increasingly complex business environments, firms must embrace strategic paradoxes: contradictory yet interrelated objectives that persist over time. This can be one of the toughest of all leadership challenges, as managers must accept inconsistency and contradictions. In this article, we develop and empirically test a set of hypotheses related to ambidexterity, a key example of a paradoxical strategy. Through our analysis of data from a survey of executive leaders, we find a link between organizational ambidexterity and strategic planning, suggesting that the complexities of navigating explorative ventures require more explicit strategy work than the old certainties of a legacy business. We identify and discuss inherent paradoxes and their implications for firm performance in 22 industry-specific strategies, where empirical industry data show a pattern of conflict between

  9. Dysfunctional default mode network and executive control network in people with Internet gaming disorder: Independent component analysis under a probability discounting task.

    Science.gov (United States)

    Wang, L; Wu, L; Lin, X; Zhang, Y; Zhou, H; Du, X; Dong, G

    2016-04-01

    The present study identified the neural mechanism of risky decision-making in Internet gaming disorder (IGD) under a probability discounting task. Independent component analysis was used on the functional magnetic resonance imaging data from 19 IGD subjects (22.2 ± 3.08 years) and 21 healthy controls (HC, 22.8 ± 3.5 years). Behaviorally, IGD subjects preferred the risky over the fixed options and showed shorter reaction times compared to HC. In the imaging results, the IGD subjects showed higher task-related activity in the default mode network (DMN) and less engagement of the executive control network (ECN) than HC when making risky decisions. We also found that the activity of the DMN correlates negatively with reaction time and that of the ECN correlates positively with the probability discounting rates. The results suggest that people with IGD show altered modulation in the DMN and deficits in executive control function, which might explain why IGD subjects continue to play online games despite the potential negative consequences. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
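
    For readers unfamiliar with the method, independent component analysis separates observed mixtures into statistically independent sources. The sketch below applies scikit-learn's FastICA to synthetic mixed signals rather than fMRI volumes, purely to show the decomposition step.

    ```python
    # ICA demonstration on synthetic mixtures (stand-in for fMRI data).
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(2)
    t = np.linspace(0, 8, 2000)
    sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]   # two latent signals
    mixing = np.array([[1.0, 0.5], [0.5, 1.0]])
    observed = sources @ mixing.T + 0.02 * rng.normal(size=(2000, 2))

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(observed)    # estimated independent components
    print(recovered.shape, ica.mixing_.shape)  # (2000, 2) (2, 2)
    ```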

  10. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated complex systems, with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity over the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the maturity status of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  11. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated complex systems, with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity over the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the maturity status of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
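
    The paper's dimensional variation model is built in the commercial 3DCS tool; the generic Monte Carlo stack-up below illustrates the underlying idea of propagating part tolerances to an assembly-level clearance. The three-part stack and tolerance values are invented for illustration.

    ```python
    # Monte Carlo tolerance stack-up with invented dimensions (mm).
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    # nominal dimension +/- 3-sigma tolerance, sampled as normals
    sector = rng.normal(2000.0, 1.5 / 3, n)     # vessel sector width
    shim = rng.normal(5.0, 0.2 / 3, n)          # shim thickness
    port = rng.normal(500.0, 0.8 / 3, n)        # port extension

    gap = 2510.0 - (sector + shim + port)       # assembly clearance
    print(f"mean gap = {gap.mean():.3f} mm, sigma = {gap.std():.3f} mm")
    print(f"P(interference) = {(gap < 0).mean():.4%}")
    ```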

  12. Technology integrated teaching in Malaysian schools: GIS, a SWOT analysis

    Directory of Open Access Journals (Sweden)

    Habibah Lateh, Vasugiammai Muniandy

    2011-08-01

    , articles and proceeding papers. Research has been continuously done on integrating GIS into the Geography syllabus. Thus, this article describes and discusses the barriers to and opportunities of implementing GIS in schools, with a deep focus on how GIS could enhance the process of teaching and learning geography. The purpose of the study is to determine the effectiveness of GIS in enhancing students’ interest towards the subject. Barriers that might limit the implementation of GIS in schools are also briefly discussed in this article, as are the capabilities of GIS in schools and teaching with GIS. A SWOT analysis is used to find the strengths, weaknesses, opportunities and threats of integrating GIS in Malaysian schools. A content analysis was performed using articles from local and foreign publications regarding technology integration and GIS; conference proceedings were also analyzed. This content analysis included 35 articles selected from ICT and GIS publications in Malaysia and abroad, and was done in order to identify the barriers to trying GIS in schools in Malaysia. The future of GIS in Malaysian schools is addressed in the conclusion.

  13. Analysis on working pressure selection of ACME integral test facility

    International Nuclear Information System (INIS)

    Chen Lian; Chang Huajian; Li Yuquan; Ye Zishen; Qin Benke

    2011-01-01

    An integral effects test facility, the advanced core cooling mechanism experiment (ACME) facility, was designed to verify the performance of the passive safety system and to validate the safety analysis codes of a pressurized water reactor power plant. Three test facilities for the AP1000 design were introduced and reviewed, and the problems resulting from the different working pressures of these test facilities were analyzed. A detailed description is then presented of the working pressure selection for the ACME facility and of its characteristics, and the approach to establishing the desired initial test conditions is discussed. The selected 9.3 MPa working pressure covers almost all important passive safety system operating conditions and enables the ACME to simulate LOCAs with the same pressure and property similitude as the prototype. The ACME is expected to be an advanced core cooling integral test facility design. (authors)

  14. Application of symplectic integrator to numerical fluid analysis

    International Nuclear Information System (INIS)

    Tanaka, Nobuatsu

    2000-01-01

    This paper focuses on the application of the symplectic integrator to numerical fluid analysis. For this purpose, we introduce Hamiltonian particle dynamics to simulate fluid behavior. The method is based on both the Hamiltonian formulation of a system and particle methods, and is therefore called Hamiltonian Particle Dynamics (HPD). In this paper, an example of HPD applications, namely the behavior of an incompressible inviscid fluid, is solved. To improve the spatial accuracy of HPD, it is combined with CIVA, a highly accurate interpolation method, but the combined method suffers from the problem that the invariants of the system are not conserved in long-time computations. To solve this problem, symplectic time integrators are introduced, and their effectiveness is confirmed by numerical analyses. (author)
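
    The conservation property at stake can be seen on the harmonic oscillator, the standard test case: over many steps, explicit Euler lets the energy drift while the symplectic variant keeps it bounded. This sketch is a generic illustration, not the paper's HPD/CIVA scheme.

    ```python
    # Energy behavior of explicit vs. symplectic Euler for H = (p^2 + q^2)/2.
    def explicit_euler(q, p, dt):
        return q + dt * p, p - dt * q           # both updates use old values

    def symplectic_euler(q, p, dt):
        p = p - dt * q                          # kick with old position
        return q + dt * p, p                    # drift with new momentum

    for step in (explicit_euler, symplectic_euler):
        q, p = 1.0, 0.0
        for _ in range(100_000):
            q, p = step(q, p, dt=0.01)
        energy = 0.5 * (p * p + q * q)
        print(step.__name__, f"energy after 1000 time units: {energy:.4f}")
    ```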

  15. The SMART Theory and Modeling Team: An Integrated Element of Mission Development and Science Analysis

    Science.gov (United States)

    Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.

    2005-01-01

    When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes, which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions primarily due to instrumental limitations - yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection should include this knowledge and have to ensure that all relevant scales and effects can be resolved by mission measurements. The SMART mission has responded to this need through a tight integration between instrument and theory and modeling teams. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored

  16. Tutoring executives online

    DEFF Research Database (Denmark)

    Bignoux, Stephane; Sund, Kristian J.

    2018-01-01

    Studies of learning and student satisfaction in the context of online university programmes have largely neglected programmes catering specifically to business executives. Such executives have typically been away from higher education for a number of years, and have collected substantial practical experience in the subject matters they are taught. Their expectations in terms of both content and delivery may therefore be different from non-executive students. We explore perceptions of the quality of tutoring in the context of an online executive MBA programme through participant interviews. We find that in addition to some of the tutor behaviours already discussed in the literature, executive students look specifically for practical industry knowledge and experience in tutors when judging how effective a tutor is. This has implications for both the recruitment and training of online executive MBA tutors.

  17. Tutoring Executives Online

    DEFF Research Database (Denmark)

    Bignoux, Stephane; Sund, Kristian J.

    2016-01-01

    Studies of learning and student satisfaction in the context of online university programs have largely neglected programs catering specifically to business executives. Such executives have typically been away from higher education for a number of years, and have collected substantial practical experience in the subject matters they are taught. Their expectations in terms of both content and delivery may therefore be different from non-executive students. We explore perceptions of the quality of tutoring in the context of an online executive MBA program through participant interviews. We find that in addition to some of the tutor behaviors already discussed in the literature, executive students look specifically for practical industry knowledge and experience in tutors when judging how effective a tutor is. This has implications for both the recruitment and training of online executive MBA tutors.

  18. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types, the required data manipulation capabilities, and the operational requirements resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially in providing the necessary flexibility for different data types and extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages.

  19. Vertically integrated analysis of human DNA. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Olson, M.

    1997-10-01

    This project has been oriented toward improving the vertical integration of the sequential steps associated with the large-scale analysis of human DNA. The central focus has been on an approach to the preparation of "sequence-ready" maps, referred to as multiple-complete-digest (MCD) mapping, directed primarily at cosmid clones. MCD mapping relies on simple experimental steps, supported by advanced image-analysis and map-assembly software, to produce extremely accurate restriction-site and clone-overlap maps. We believe that MCD mapping is one of the few high-resolution mapping systems that has the potential for high-level automation. Successful automation of this process would be a landmark event in genome analysis and could be extended to other higher organisms, paving the way for cost-effective sequencing of these genomes. Critically, MCD mapping has the potential to provide built-in quality control for sequencing accuracy and to make possible a highly integrated end product even if there are large numbers of discontinuities in the actual sequence.

  20. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using the developed modelling tool. With the machine parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool to perform electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behavior of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After all the results are analysed, the designed machine is used to generate an S-function block compatible with the MATLAB/SIMULINK tool for the dynamic operational characteristics. This allows the integration of existing drive systems with the new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  1. Solid waste integrated cost analysis model: 1991 project year report

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The purpose of the City of Houston's 1991 Solid Waste Integrated Cost Analysis Model (SWICAM) project was to continue the development of a computerized cost analysis model. This model is to provide solid waste managers with a tool to evaluate the dollar cost of real or hypothetical solid waste management choices. Those choices have become complicated by the implementation of Subtitle D of the Resource Conservation and Recovery Act (RCRA) and the EPA's integrated approach to managing municipal solid waste; that is, minimize generation, maximize recycling, reduce volume (incinerate), and then bury (landfill) only the remainder. Implementation of an integrated solid waste management system involving all or some of the options of recycling, waste-to-energy, composting, and landfilling is extremely complicated. Factors such as hauling distances, markets and prices for recyclables, and the costs and benefits of transfer stations and material recovery facilities must all be considered. A jurisdiction must determine the cost impacts of implementing a number of possibilities for managing, handling, processing, and disposing of waste. SWICAM employs a single Lotus 1-2-3 spreadsheet to enable a jurisdiction to predict or assess the costs of its waste management system. It allows the user to select his or her own process flow for waste material and to manipulate the model to include as few or as many options as he or she chooses. The model will calculate the estimated cost for the choices selected. The user can then change the model to include or exclude waste stream components until the mix of choices suits the user. Graphs can be produced as a visual communication aid in presenting the results of the cost analysis. SWICAM also allows future cost projections to be made.

  2. An Integrated Analysis of Changes in Water Stress in Europe

    DEFF Research Database (Denmark)

    Henrichs, T.; Lehner, B.; Alcamo, J.

    2002-01-01

    Future changes in water availability with climate change and changes in water use due to socio-economic development are expected to occur in parallel. In an integrated analysis we bring together these aspects of global change in a consistent manner and analyse the water stress situation in Europe. We find that high water stress exists today in one-fifth of the European river basin area. Under a scenario projection, increases in water use throughout Eastern Europe are accompanied by decreases in water availability in most of Southern Europe; combining these trends leads to a marked increase in water stress
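
    Statements like "high water stress in one-fifth of basin area" are typically based on the withdrawals-to-availability ratio, with values above roughly 0.4 commonly classed as severe stress. The basin numbers below are invented for illustration, not the study's data.

    ```python
    # Withdrawals-to-availability stress classification on invented basins.
    basins = {                     # basin: (withdrawals, availability) km3/yr
        "basin_A": (12.0, 20.0),
        "basin_B": (3.0, 30.0),
        "basin_C": (9.0, 18.0),
    }
    for name, (use, avail) in basins.items():
        ratio = use / avail
        level = "severe" if ratio > 0.4 else "medium" if ratio > 0.2 else "low"
        print(f"{name}: w/a = {ratio:.2f} -> {level} stress")
    ```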

  3. Case for integral core-disruptive accident analysis

    International Nuclear Information System (INIS)

    Luck, L.B.; Bell, C.R.

    1985-01-01

    Integral analysis is an approach used at the Los Alamos National Laboratory to cope with the broad multiplicity of accident paths and complex phenomena that characterize the transition phase of core-disruptive accident progression in a liquid-metal-cooled fast breeder reactor. The approach is based on the combination of a reference calculation, which is intended to represent a band of similar accident paths, and associated system- and separate-effect studies, which are designed to determine the effect of uncertainties. Results are interpreted in the context of a probabilistic framework. The approach was applied successfully in two studies; illustrations from the Clinch River Breeder Reactor licensing assessment are included

  4. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar....... The emphasis of this paper is on the signal-to-noise ratio of the detection and its relation to the sensitivity. Two absorbance cells with an optical path length of 100 μm and 1000 μm were characterized and compared in terms of sensitivity, limit of detection and effective path length for measurements...
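
    The sensitivity comparison between the two absorbance cells follows from the Beer-Lambert law; in its standard form (stated here for context, not quoted from the paper):

```latex
% Beer-Lambert law: absorbance grows linearly with optical path length, so
% the 1000 um cell is nominally ten times more sensitive than the 100 um one.
\[
  A = \log_{10}\!\frac{I_0}{I} = \varepsilon\, c\, \ell ,
\]
% where \varepsilon is the molar absorptivity, c the analyte concentration,
% and \ell the optical path length. In practice the effective path length
% can fall short of the geometric one, which is why it is reported
% separately alongside sensitivity and limit of detection.
```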

  5. Analysis of Conflict Centers in Projects Procured with Traditional and Integrated Methods in Nigeria

    OpenAIRE

    Martin O. Dada

    2012-01-01

    Conflicts in any organization can either be functional or dysfunctional and can contribute to or detract from the achievement of organizational or project objectives. This study investigated the frequency and intensity of conflicts, using five conflict centers, on projects executed with either the integrated or traditional method in Nigeria. Questionnaires were administered through purposive and snowballing techniques on 274 projects located in twelve states of Nigeria and Abuja. 94 usable ...

  6. Integrated Modeling for the James Webb Space Telescope (JWST) Project: Structural Analysis Activities

    Science.gov (United States)

    Johnston, John; Mosier, Mark; Howard, Joe; Hyde, Tupper; Parrish, Keith; Ha, Kong; Liu, Frank; McGinnis, Mark

    2004-01-01

    This paper presents viewgraphs about structural analysis activities and integrated modeling for the James Webb Space Telescope (JWST). The topics include: 1) JWST Overview; 2) Observatory Structural Models; 3) Integrated Performance Analysis; and 4) Future Work and Challenges.

  7. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Executive Information Systems are designed to improve the quality of strategic management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that organizational knowledge filters can easily associate with the data and turn it into information for the organization. These technologies are known as Business Intelligence Tools. But in order to build analytic reports for Executive Information Systems (EIS) in an organization we need to design a multidimensional model based on the organization's business model. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.
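
    A minimal sketch of the multidimensional idea behind such EIS reports, namely measures aggregated over dimensions; the dimension and measure names are hypothetical, not taken from the paper's business model:

```python
# Sketch of the multidimensional model behind EIS analytic reports: facts
# (measures) aggregated over dimensions. Names and values are hypothetical.
import pandas as pd

# A tiny fact table: each row is one sale, with three dimensions and a measure.
facts = pd.DataFrame({
    "year":    [2006, 2006, 2006, 2007, 2007, 2007],
    "region":  ["North", "South", "North", "South", "North", "South"],
    "product": ["A", "A", "B", "B", "A", "B"],
    "revenue": [120.0, 80.0, 200.0, 150.0, 130.0, 170.0],
})

# Slice the cube: revenue by year x region, the kind of aggregate an EIS
# report would present to strategic management.
cube = pd.pivot_table(facts, values="revenue",
                      index="year", columns="region", aggfunc="sum")
print(cube)
```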

  8. Integrated omics analysis of specialized metabolism in medicinal plants.

    Science.gov (United States)

    Rai, Amit; Saito, Kazuki; Yamazaki, Mami

    2017-05-01

    Medicinal plants are a rich source of highly diverse specialized metabolites with important pharmacological properties. Until recently, plant biologists were limited in their ability to explore the biosynthetic pathways of these metabolites, mainly due to the scarcity of plant genomics resources. However, recent advances in high-throughput large-scale analytical methods have enabled plant biologists to discover biosynthetic pathways for important plant-based medicinal metabolites. The reduced cost of generating omics datasets and the development of computational tools for their analysis and integration have led to the elucidation of biosynthetic pathways of several bioactive metabolites of plant origin. These discoveries have inspired synthetic biology approaches to develop microbial systems to produce bioactive metabolites originating from plants, an alternative sustainable source of medicinally important chemicals. Since the demand for medicinal compounds is increasing with the world's population, understanding the complete biosynthesis of specialized metabolites becomes important for identifying or developing reliable sources in the future. Here, we review the contributions of major omics approaches and their integration to our understanding of the biosynthetic pathways of bioactive metabolites. We briefly discuss different approaches for integrating omics datasets to extract biologically relevant knowledge and the application of omics datasets in the construction and reconstruction of metabolic models. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  9. Integrative analysis of the mitochondrial proteome in yeast.

    Directory of Open Access Journals (Sweden)

    Holger Prokisch

    2004-06-01

    Full Text Available In this study yeast mitochondria were used as a model system to apply, evaluate, and integrate different genomic approaches to define the proteins of an organelle. Liquid chromatography mass spectrometry applied to purified mitochondria identified 546 proteins. By expression analysis and comparison to other proteome studies, we demonstrate that the proteomic approach identifies primarily highly abundant proteins. By expanding our evaluation to other types of genomic approaches, including systematic deletion phenotype screening, expression profiling, subcellular localization studies, protein interaction analyses, and computational predictions, we show that an integration of approaches moves beyond the limitations of any single approach. We report the success of each approach by benchmarking it against a reference set of known mitochondrial proteins, and predict approximately 700 proteins associated with the mitochondrial organelle from the integration of 22 datasets. We show that a combination of complementary approaches like deletion phenotype screening and mass spectrometry can identify over 75% of the known mitochondrial proteome. These findings have implications for choosing optimal genome-wide approaches for the study of other cellular systems, including organelles and pathways in various species. Furthermore, our systematic identification of genes involved in mitochondrial function and biogenesis in yeast expands the candidate genes available for mapping Mendelian and complex mitochondrial disorders in humans.
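
    The benchmarking step described above can be sketched as a coverage calculation against the reference set; the gene identifiers below are made up for illustration:

```python
# Sketch of benchmarking genome-wide approaches against a reference set of
# known mitochondrial proteins. Gene identifiers are invented illustrations.
reference = {"ATP1", "COX4", "TIM23", "TOM40", "MDH1", "CIT1"}

predictions = {
    "mass_spectrometry":  {"ATP1", "COX4", "TOM40", "HSP60", "ACT1"},
    "deletion_phenotype": {"TIM23", "MDH1", "COX4", "PET9"},
}

for approach, hits in predictions.items():
    tp = hits & reference
    sensitivity = len(tp) / len(reference)
    precision = len(tp) / len(hits)
    print(f"{approach}: sensitivity={sensitivity:.2f}, precision={precision:.2f}")

# Integration: the union of complementary approaches covers more of the
# reference set than either alone, as the study reports (>75% combined).
combined = set().union(*predictions.values()) & reference
print("combined coverage:", len(combined) / len(reference))
```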

  10. Integrated intelligent instruments using supercritical fluid technology for soil analysis

    International Nuclear Information System (INIS)

    Liebman, S.A.; Phillips, C.; Fitzgerald, W.; Levy, E.J.

    1994-01-01

    Contaminated soils pose a significant challenge for characterization and remediation programs that require rapid, accurate and comprehensive data in the field or laboratory. Environmental analyzers based on supercritical fluid (SF) technology have been designed and developed for meeting these global needs. The analyzers are designated the CHAMP Systems (Chemical Hazards Automated Multimedia Processors). The prototype instrumentation features SF extraction (SFE) and on-line capillary gas chromatographic (GC) analysis with chromatographic and/or spectral identification detectors, such as ultra-violet, Fourier transform infrared and mass spectrometers. Illustrations are given for a highly automated SFE-capillary GC/flame ionization (FID) configuration to provide validated screening analysis for total extractable hydrocarbons within ca. 5--10 min, as well as a full qualitative/quantitative analysis in 25--30 min. Data analysis using optional expert system and neural networks software is demonstrated for test gasoline and diesel oil mixtures in this integrated intelligent instrument approach to trace organic analysis of soils and sediments

  11. Integrative Analysis of Prognosis Data on Multiple Cancer Subtypes

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Zhang, Yawei; Lan, Qing; Rothman, Nathaniel; Zheng, Tongzhang; Ma, Shuangge

    2014-01-01

    Summary In cancer research, profiling studies have been extensively conducted, searching for genes/SNPs associated with prognosis. Cancer is diverse. Examining the similarity and difference in the genetic basis of multiple subtypes of the same cancer can lead to a better understanding of their connections and distinctions. Classic meta-analysis methods analyze each subtype separately and then compare analysis results across subtypes. Integrative analysis methods, in contrast, analyze the raw data on multiple subtypes simultaneously and can outperform meta-analysis methods. In this study, prognosis data on multiple subtypes of the same cancer are analyzed. An AFT (accelerated failure time) model is adopted to describe survival. The genetic basis of multiple subtypes is described using the heterogeneity model, which allows a gene/SNP to be associated with prognosis of some subtypes but not others. A compound penalization method is developed to identify genes that contain important SNPs associated with prognosis. The proposed method has an intuitive formulation and is realized using an iterative algorithm. Asymptotic properties are rigorously established. Simulation shows that the proposed method has satisfactory performance and outperforms a penalization-based meta-analysis method and a regularized thresholding method. An NHL (non-Hodgkin lymphoma) prognosis study with SNP measurements is analyzed. Genes associated with the three major subtypes, namely DLBCL, FL, and CLL/SLL, are identified. The proposed method identifies genes that are different from alternatives and have important implications and satisfactory prediction performance. PMID:24766212
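
    In standard notation (not reproduced from the paper), the subtype-specific AFT model underlying the heterogeneity formulation can be written as:

```latex
% Subtype-specific accelerated failure time (AFT) model; standard notation,
% not reproduced from the paper.
\[
  \log T_{im} = \alpha_m + x_{im}^{\top}\beta_m + \varepsilon_{im},
  \qquad m = 1, \dots, M \ \text{(cancer subtypes)},
\]
% where T_{im} is the survival time of subject i in subtype m. Under the
% heterogeneity model a component of \beta_m may be nonzero for some
% subtypes and zero for others; a compound (group-plus-individual) penalty
% across the \beta_m's selects genes whose SNPs matter for at least one
% subtype.
```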

  12. Integrated Data Collection Analysis (IDCA) Program - PETN Class 4 Standard

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2012-08-01

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of PETN Class 4. The PETN was found to have: 1) an impact sensitivity (DH50) range of 6 to 12 cm, 2) a BAM friction sensitivity (F50) range of 7 to 11 kg and a TIL (0/10) of 3.7 to 7.2 kg, 3) an ABL friction sensitivity threshold of 5 psig or less at 8 fps, 4) an ABL ESD sensitivity threshold of 0.031 to 0.326 J/g, and 5) a thermal sensitivity showing an endothermic feature with Tmin = ~141°C and an exothermic feature with Tmax = ~205°C.

  13. Integrated Data Collection Analysis (IDCA) Program — Ammonium Nitrate

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Redstone Arsenal, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-05-17

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of ammonium nitrate (AN). AN was tested, in most cases, both as received from the manufacturer and dried/sieved. The participants found the AN to be: 1) insensitive in Type 12A impact testing (although with a wide range of values), 2) completely insensitive in BAM friction testing, 3) less sensitive than the RDX standard in ABL friction testing, 4) less sensitive than RDX in ABL ESD testing, and 5) less sensitive than RDX and PETN in DSC thermal analyses.

  14. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  15. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  16. Simulation analysis for integrated evaluation of technical and commercial risk

    International Nuclear Information System (INIS)

    Gutleber, D.S.; Heiberger, E.M.; Morris, T.D.

    1995-01-01

    Decisions to invest in oil- and gasfield acquisitions or participating interests often are based on the perceived ability to enhance the economic value of the underlying asset. A multidisciplinary approach integrating reservoir engineering, operations and drilling, and deal structuring with Monte Carlo simulation modeling can overcome weaknesses of deterministic analysis and significantly enhance investment decisions. This paper discusses the use of spreadsheets and Monte Carlo simulation to generate probabilistic outcomes for key technical and economic parameters for ultimate identification of the economic volatility and value of potential deal concepts for a significant opportunity. The approach differs from a simple risk analysis for an individual well by incorporating detailed, full-field simulations that vary the reservoir parameters, capital and operating cost assumptions, and schedules on timing in the framework of various deal structures
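
    A minimal sketch of the simulation idea, sampling technical and commercial parameters jointly and reading off the NPV distribution; all distributions and figures are hypothetical, not from the paper:

```python
# Minimal Monte Carlo sketch of integrated technical/commercial risk:
# sample reserves, price, and cost together and examine the resulting NPV
# distribution instead of a single deterministic case. All distributions
# and figures are hypothetical, not from the paper.
import random

def npv_once(discount_rate=0.10, years=10):
    reserves = random.lognormvariate(2.0, 0.35)   # MMbbl recoverable
    price = random.gauss(18.0, 3.0)               # $/bbl, mid-1990s scale
    opex = random.gauss(6.0, 1.0)                 # $/bbl operating cost
    capex = random.gauss(60.0, 10.0)              # $MM up-front capital
    annual_prod = reserves / years                # even production profile
    cash = [(price - opex) * annual_prod for _ in range(years)]
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash)) - capex

random.seed(1)
samples = sorted(npv_once() for _ in range(10_000))
print("10th/50th/90th percentile NPV ($MM):",
      round(samples[1000], 1), round(samples[5000], 1), round(samples[9000], 1))
```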

  17. Babelomics: an integrative platform for the analysis of transcriptomics, proteomics and genomic data with advanced functional profiling

    Science.gov (United States)

    Medina, Ignacio; Carbonell, José; Pulido, Luis; Madeira, Sara C.; Goetz, Stefan; Conesa, Ana; Tárraga, Joaquín; Pascual-Montano, Alberto; Nogales-Cadenas, Ruben; Santoyo, Javier; García, Francisco; Marbà, Martina; Montaner, David; Dopazo, Joaquín

    2010-01-01

    Babelomics is a response to the growing necessity of integrating and analyzing different types of genomic data in an environment that allows an easy functional interpretation of the results. Babelomics includes a complete suite of methods for the analysis of gene expression data that include normalization (covering most commercial platforms), pre-processing, differential gene expression (case-controls, multiclass, survival or continuous values), predictors, and clustering; it also covers large-scale genotyping assays (case-controls and TDTs) and allows population stratification analysis and correction. All these genomic data analysis facilities are integrated and connected to multiple options for the functional interpretation of the experiments. Different methods of functional enrichment or gene set enrichment can be used to understand the functional basis of the experiment analyzed. Many sources of biological information, which include functional (GO, KEGG, Biocarta, Reactome, etc.), regulatory (Transfac, Jaspar, ORegAnno, miRNAs, etc.), text-mining or protein–protein interaction modules can be used for this purpose. Finally a tool for the de novo functional annotation of sequences has been included in the system. This provides support for the functional analysis of non-model species. Mirrors of Babelomics or command line execution of its individual components are now possible. Babelomics is available at http://www.babelomics.org. PMID:20478823

  18. Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    Science.gov (United States)

    2015-03-27

    Air Force Institute of Technology master's thesis AFIT-ENV-MS-15-M-195, "Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery"; approved for public release, distribution unlimited.

  19. Sensitivity Analysis of the Integrated Medical Model for ISS Programs

    Science.gov (United States)

    Goodenow, D. A.; Myers, J. G.; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Young, M.

    2016-01-01

    Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods of conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates the sensitivity using partial correlation of the ranks of the generated input values to each generated output value. The partial part is so named because adjustments are made for the linear effects of all the other input values in the calculation of correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the sensitivity of each input, adjusted for all the other inputs, on each output. Because the relative ranking of each of the inputs and outputs is used, as opposed to the values themselves, both methods accommodate the nonlinear relationship of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of loss of crew lives (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in the visualization of the condition-related input sensitivities to each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. These efforts are an integral
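
    PRCC itself is straightforward to implement; a minimal sketch on synthetic data (not IMM outputs), rank-transforming inputs and output and correlating residuals after regressing out the other inputs:

```python
# Sketch of partial rank correlation coefficients (PRCC): rank-transform
# inputs and output, then correlate the residuals of each input and of the
# output after regressing out the other inputs. Data here are synthetic.
import numpy as np
from scipy.stats import rankdata

def prcc(X, y):
    """X: (n_samples, n_inputs) array; y: (n_samples,) output."""
    Xr = np.column_stack([rankdata(col) for col in X.T])
    yr = rankdata(y)
    coefficients = []
    for j in range(Xr.shape[1]):
        others = np.delete(Xr, j, axis=1)
        A = np.column_stack([np.ones(len(yr)), others])
        # Residuals after removing the (rank-)linear effect of other inputs.
        rx = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
        ry = yr - A @ np.linalg.lstsq(A, yr, rcond=None)[0]
        coefficients.append(np.corrcoef(rx, ry)[0, 1])
    return coefficients

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))
y = 3 * X[:, 0] ** 2 - X[:, 1] + 0.1 * rng.normal(size=500)  # nonlinear model
print([round(c, 2) for c in prcc(X, y)])  # strong +, moderate -, near zero
```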

  20. Multi - band Persistent Scatterer Interferometry data integration for landslide analysis

    Science.gov (United States)

    Bianchini, Silvia; Mateos, Rosa; Mora, Oscar; García, Inma; Sánchez, Ciscu; Sanabria, Margarita; López, Maite; Mulas, Joaquin; Hernández, Mario; Herrera, Gerardo

    2013-04-01

    We present a methodology to perform a geomorphological assessment of ground movements over wide areas by improving Persistent Scatterer Interferometry (PSI) analysis for landslide studies. The procedure relies on the integrated use of multi-band EO data acquired by different satellite sensors over different time intervals to provide a detailed investigation of ground displacements. The methodology, through the cross-comparison and integration of PS data in different microwave bands (ALOS in L-band, ERS1/2 and ENVISAT in C-band, COSMO-SkyMed in X-band), is applied to the Tramuntana Range in the northwestern part of Mallorca island (Spain), extensively affected by mass movements over time, especially during the last years. We increase the confidence level of the available interferometric data and homogenize all PS targets by implementing and classifying them through common criteria. PSI results are then combined with geo-thematic data and pre-existing landslide inventories of the study area, in order to improve the landslide database and provide additional information on the detected ground displacements. The results of this methodology are used to elaborate landslide activity maps, permitting heterogeneous PS data to be jointly exploited for analyzing landslides at the regional scale. Moreover, from a geomorphological perspective, the proposed approach exploits the implemented PS data to achieve a reliable spatial analysis of movement rates, whether related to specific landslide phenomena or to other natural processes, in order to produce ground motion activity maps over a wide area.

  1. Integrated severe accident containment analysis with the CONTAIN computer code

    International Nuclear Information System (INIS)

    Bergeron, K.D.; Williams, D.C.; Rexroth, P.E.; Tills, J.L.

    1985-12-01

    Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomenologies. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and performance of engineered safety features. In the past, this complexity has meant that a complete containment analysis would require application of suites of separate computer codes, each of which would treat only a narrow subset of these phenomena, e.g., a thermal-hydraulics code, an aerosol code, a core-concrete interaction code, etc. In this paper, we describe the development and some recent applications of the CONTAIN code, which offers an integrated treatment of the dominant containment phenomena and the interactions among them. We describe the results of a series of containment phenomenology studies, based upon realistic accident sequence analyses in actual plants. These calculations highlight various phenomenological effects that have potentially important implications for source term and/or containment loading issues, and which are difficult or impossible to treat using a less integrated code suite.

  2. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

    Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Features Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631
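
    One of the three feature selectors named above, SVM-RFE, is available off the shelf; a minimal sketch with scikit-learn on synthetic data standing in for the secretome abundances:

```python
# Sketch of SVM-RFE, one of the three feature selection methods named above,
# using scikit-learn; the synthetic data stand in for secretome protein
# abundances and are not the study's data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=200, n_informative=10,
                           random_state=0)  # proxy for proteins x samples

# Linear SVM ranks features by |weight|; RFE recursively drops the weakest.
selector = RFE(SVC(kernel="linear"), n_features_to_select=20, step=0.1)
selector.fit(X, y)
panel = np.where(selector.support_)[0]
print("candidate feature indices:", panel)
```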

  3. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  4. Integrated dynamic landscape analysis and modeling system (IDLAMS) : installation manual.

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z.; Majerus, K. A.; Sundell, R. C.; Sydelko, P. J.; Vogt, M. C.

    1999-02-24

    The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) is a prototype, integrated land management technology developed through a joint effort between Argonne National Laboratory (ANL) and the US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL). Dr. Ronald C. Sundell, Ms. Pamela J. Sydelko, and Ms. Kimberly A. Majerus were the principal investigators (PIs) for this project. Dr. Zhian Li was the primary software developer. Dr. Jeffrey M. Keisler, Mr. Christopher M. Klaus, and Mr. Michael C. Vogt developed the decision analysis component of this project. It was developed with funding support from the Strategic Environmental Research and Development Program (SERDP), a land/environmental stewardship research program with participation from the US Department of Defense (DoD), the US Department of Energy (DOE), and the US Environmental Protection Agency (EPA). IDLAMS predicts land conditions (e.g., vegetation, wildlife habitats, and erosion status) by simulating changes in military land ecosystems for given training intensities and land management practices. It can be used by military land managers to help predict the future ecological condition for a given land use based on land management scenarios of various levels of training intensity. It also can be used as a tool to help land managers compare different land management practices and further determine a set of land management activities and prescriptions that best suit the needs of a specific military installation.

  5. The practical implementation of integrated safety management for nuclear safety analysis and fire hazards analysis documentation

    International Nuclear Information System (INIS)

    COLLOPY, M.T.

    1999-01-01

    In 1995 Mr. Joseph DiNunno of the Defense Nuclear Facilities Safety Board issued an approach to describe the concept of an integrated safety management program which incorporates hazard and safety analysis to address a multitude of hazards affecting the public, workers, property, and the environment. Since then the U.S. Department of Energy (DOE) has adopted a policy to systematically integrate safety into management and work practices at all levels so that missions can be completed while protecting the public, workers, and the environment. While the DOE and its contractors possessed a variety of processes for analyzing fire hazards at a facility, activity, and job, the outcomes and assumptions of these processes have not always been consistent for similar types of hazards within the safety analysis and the fire hazard analysis. Although the safety analysis and the fire hazard analysis are driven by different DOE Orders and requirements, these analyses should not be entirely independent, and their preparation should be integrated to ensure consistency of assumptions, consequences, design considerations, and other controls. Under the DOE policy to implement an integrated safety management system, identified hazards must be evaluated and agreed upon to ensure that the public, the workers, and the environment are protected from adverse consequences. DOE program and contractor management need a uniform, up-to-date reference with which to plan, budget, and manage nuclear programs. It is crucial that DOE understand the hazards and risks necessary to authorize the work to be performed. If integrated safety management is not incorporated into the preparation of the safety analysis and the fire hazard analysis, inconsistencies among assumptions, consequences, design considerations, and controls may occur that affect safety. Furthermore, confusion created by inconsistencies may occur in the DOE process to grant authorization of the work. In accordance with

  6. Essays in Executive Compensation

    NARCIS (Netherlands)

    D. Zhang (Dan)

    2012-01-01

    textabstractThis dissertation focuses on how executive compensation is designed and its implications for corporate finance and government regulations. Chapter 2 analyzes several proposals to restrict CEO compensation and calibrates two models of executive compensation that describe how firms would

  7. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 1: Executive Summary

    International Nuclear Information System (INIS)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.Z.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-01-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while it prepares for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  8. The Integrated Data Base program: An executive-level data base of spent fuel and radioactive waste inventories, projections, and characteristics

    International Nuclear Information System (INIS)

    Klein, J.A.

    1987-01-01

    The Integrated Data Base (IDB) is the official US Department of Energy (DOE) data base for spent fuel and radioactive waste inventories and projections. As such, it should be as convenient to utilize as is practical. Examples of summary-level tables and figures are presented, as well as more-detailed graphics describing waste-form distribution by site and line charts illustrating historical and projected volume (or mass) changes. This information is readily accessible through the annual IDB publication. Other presentation formats are also available to the DOE community through a simple request to the IDB Program

  9. Greening the Grid: Pathways to Integrate 175 Gigawatts of Renewable Energy into India's Electric Grid, Vol. 1 -- National Study, Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    Cochran, Jaquelin [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-06-27

    The use of renewable energy (RE) sources, primarily wind and solar generation, is poised to grow significantly within the Indian power system. The Government of India has established a target of 175 gigawatts (GW) of installed RE capacity by 2022, including 60 GW of wind and 100 GW of solar, up from 29 GW wind and 9 GW solar at the beginning of 2017. Thanks to advanced weather and power system modeling developed for this project, the study team is able to explore operational impacts of meeting India's RE targets and identify actions that may be favorable for integration.

  10. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features and an application to a real Nuclear Power Plant (NPP) of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D and Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
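
    The weight function form of the stress intensity factor used in the final FM step is standard; stated here for context rather than quoted from the paper:

```latex
% Weight function form of the stress intensity factor for the final FM step:
% the through-wall stress profile sigma(x) from the FE step is integrated
% against a geometry-dependent weight function h(x, a).
\[
  K_I(a) = \int_0^a \sigma(x)\, h(x, a)\, \mathrm{d}x ,
  \qquad \text{integrity requires } K_I < K_{Ic} ,
\]
% where a is the crack depth and K_{Ic} the critical stress intensity
% factor (fracture toughness) at the local wall temperature.
```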

  11. SEURAT: visual analytics for the integrated analysis of microarray data.

    Science.gov (United States)

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  12. SEURAT: Visual analytics for the integrated analysis of microarray data

    Directory of Open Access Journals (Sweden)

    Bullinger Lars

    2010-06-01

    Full Text Available Abstract Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  13. International Academy of Astronautics 5th cosmic study--preparing for a 21st century program of integrated, Lunar and Martian exploration and development (executive summary).

    Science.gov (United States)

    Koelle, H H; Stephenson, D G

    2003-04-01

    This report is an initial review of plans for an extensive program to survey and develop the Moon and to explore the planet Mars during the 21st century. It presents current typical plans for separate, associated and fully integrated programs of Lunar and Martian research, exploration and development, and concludes that detailed integrated plans must be prepared and be subject to formal criticism. Before responsible politicians approve a new thrust into space they will demand attractive, defensible, and detailed proposals that explain the WHEN, HOW and WHY of each stage of an expanded program of 21st century space research, development and exploration. In particular, the claims of daring, innovative, but untried systems must be compared with the known performance of existing technologies. The time has come to supersede the present haphazard approach to strategic space studies with a formal international structure to plan for future advanced space missions under the aegis of the world's national space agencies, and supported by governments and the corporate sector. © 2002 Elsevier Science Ltd. All rights reserved.

  14. A Case Study for Executive Leadership

    Science.gov (United States)

    Hill, Phyllis J.

    1975-01-01

    A newly appointed woman dean discusses the value of a management development program involving a process of self-analysis and self-determination of leadership style and effectiveness (the University of Illinois "Executive Leadership Seminar"). (JT)

  15. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology, and the integrated valve (IV) is its key control component. • A transient flow experiment induced by the IV is conducted and the test results are analyzed to elucidate its working mechanism. • A theoretical model of the IV opening process is established and applied to obtain the variation of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. A transient flow phenomenon occurs in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high temperature conditions. The peak pressure head is consistent with that at room temperature, and the pressure fluctuation period is longer than that at room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with increasing valve opening time. The pressure fluctuation period increases with the loop pipe length, and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. These results lay the basis for the vibration reduction analysis of the CRHDS.
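
    As a first-order check on the reported trends (peak pressure rising with flow rate, falling with opening time), the classical Joukowsky water-hammer estimate applies; this is a textbook relation, not the paper's loop model:

```latex
% Joukowsky estimate for the peak of a pressure transient driven by a rapid
% change of flow velocity (textbook relation, not the paper's loop model).
\[
  \Delta p_{\max} = \rho\, c\, \Delta v ,
\]
% where \rho is the fluid density, c the pressure-wave speed in the pipe,
% and \Delta v the velocity change: the peak grows with flow rate and is
% mitigated when the valve acts slowly relative to the wave travel period.
```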

  16. Embedding Temporal Constraints For Coordinated Execution in Habitat Automation

    Science.gov (United States)

    Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles

    2013-01-01

    Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.
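
    The conversion to dispatchable form rests on Simple Temporal Network (STN) machinery; a minimal sketch, assuming hypothetical events and bounds, in which all-pairs shortest paths over the distance graph yield a dispatchable (if not yet minimal) network:

```python
# Sketch of the Simple Temporal Network (STN) machinery behind converting
# planner constraints to dispatchable form: encode [lb, ub] constraints as a
# distance graph and run Floyd-Warshall. The all-pairs-shortest-path network
# is dispatchable (further edge filtering, omitted here, makes it minimal).
INF = float("inf")

def stn_minimal(n, constraints):
    """constraints: list of (i, j, lb, ub) meaning lb <= t_j - t_i <= ub."""
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, lb, ub in constraints:
        d[i][j] = min(d[i][j], ub)    # t_j - t_i <= ub
        d[j][i] = min(d[j][i], -lb)   # t_i - t_j <= -lb
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    if any(d[i][i] < 0 for i in range(n)):
        raise ValueError("inconsistent temporal constraints")
    return d  # d[i][j] is the tightest bound on t_j - t_i

# Three events: start (0), heater-on (1), sample (2), with hypothetical bounds.
d = stn_minimal(3, [(0, 1, 5, 10), (1, 2, 2, 4), (0, 2, 0, 12)])
print("sample occurs", -d[2][0], "to", d[0][2], "time units after start")
```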

  17. Performance Analysis of a Photovoltaic-Thermal Integrated System

    International Nuclear Information System (INIS)

    Radziemska, E.

    2009-01-01

    The present commercial photovoltaic solar cells (PV) convert solar energy into electricity with a relatively low efficiency, less than 20%. More than 80% of the absorbed solar energy is dumped to the surroundings again after photovoltaic conversion. Hybrid PV/T systems consist of PV modules coupled with heat extraction devices. The PV/T collectors generate electric power and heat simultaneously. Stabilizing the temperature of photovoltaic modules at a low level is highly desirable to obtain an efficiency increase. A total efficiency of 60-80% can be achieved with the whole PV/T system provided that the T (thermal) system is operated near ambient temperature. The value of the low-T heat energy is typically much smaller than the value of the PV electricity. PV/T systems can exist in many designs, but the most common models use water or air as the working fluid. Efficiency is the most valuable parameter for the economic analysis. It has substantial meaning in the case of installations with great nominal power, such as air-cooled Building Integrated Photovoltaic Systems (BIPV). In this paper the performance analysis of a hybrid PV/T system is presented: an energetic analysis as well as an exergetic analysis. Exergy is always destroyed when a process involves a temperature change. This destruction is proportional to the entropy increase of the system together with its surroundings; the destroyed exergy has been called anergy. Exergy analysis identifies the location, the magnitude, and the sources of thermodynamic inefficiencies in a system. This information, which cannot be provided by other means (e.g., an energy analysis), is very useful for the improvement and cost-effectiveness of the system. Calculations were carried out for the tested water-cooled ASE-100-DGL-SM Solarwatt module.
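
    The energy and exergy bookkeeping described in the abstract takes the standard form below (stated for context, not quoted from the paper):

```latex
% Standard energy and exergy relations for a hybrid PV/T collector
% (usual textbook form, not quoted from the paper).
\[
  \eta_{\mathrm{total}} = \eta_{\mathrm{el}} + \eta_{\mathrm{th}}
  = \frac{P_{\mathrm{el}} + \dot{Q}_{\mathrm{th}}}{G\,A},
  \qquad
  \dot{E}x_{\mathrm{th}} = \dot{Q}_{\mathrm{th}}
  \left(1 - \frac{T_a}{T_{\mathrm{out}}}\right),
\]
% where G is the irradiance on aperture area A, and T_a, T_out are ambient
% and heat-delivery temperatures. The Carnot factor (1 - T_a/T_out) is why
% low-temperature heat carries little exergy compared with PV electricity,
% which is pure exergy.
```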

  18. Performance Analysis of a Photovoltaic-Thermal Integrated System

    Directory of Open Access Journals (Sweden)

    Ewa Radziemska

    2009-01-01

    Full Text Available The present commercial photovoltaic solar cells (PV) convert solar energy into electricity with a relatively low efficiency, less than 20%. More than 80% of the absorbed solar energy is dumped to the surroundings again after photovoltaic conversion. Hybrid PV/T systems consist of PV modules coupled with the heat extraction devices. The PV/T collectors generate electric power and heat simultaneously. Stabilizing temperature of photovoltaic modules at low level is highly desirable to obtain efficiency increase. The total efficiency of 60–80% can be achieved with the whole PV/T system provided that the T system is operated near ambient temperature. The value of the low-T heat energy is typically much smaller than the value of the PV electricity. The PV/T systems can exist in many designs, but the most common models are with the use of water or air as a working fluid. Efficiency is the most valuable parameter for the economic analysis. It has substantial meaning in the case of installations with great nominal power, as air-cooled Building Integrated Photovoltaic Systems (BIPV). In this paper the performance analysis of a hybrid PV/T system is presented: an energetic analysis as well as an exergetic analysis. Exergy is always destroyed when a process involves a temperature change. This destruction is proportional to the entropy increase of the system together with its surroundings—the destroyed exergy has been called anergy. Exergy analysis identifies the location, the magnitude, and the sources of thermodynamic inefficiencies in a system. This information, which cannot be provided by other means (e.g., an energy analysis), is very useful for the improvement and cost-effectiveness of the system. Calculations were carried out for the tested water-cooled ASE-100-DGL-SM Solarwatt module.

  19. PEA: an integrated R toolkit for plant epitranscriptome analysis.

    Science.gov (United States)

    Zhai, Jingjing; Song, Jie; Cheng, Qian; Tang, Yunjia; Ma, Chuang

    2018-05-29

    The epitranscriptome, also known as chemical modifications of RNA (CMRs), is a newly discovered layer of gene regulation, the biological importance of which emerged through analysis of only a small fraction of CMRs detected by high-throughput sequencing technologies. Understanding of the epitranscriptome is hampered by the absence of computational tools for the systematic analysis of epitranscriptome sequencing data. In addition, no tools have yet been designed for accurate prediction of CMRs in plants, or to extend epitranscriptome analysis from a fraction of the transcriptome to its entirety. Here, we introduce PEA, an integrated R toolkit to facilitate the analysis of plant epitranscriptome data. The PEA toolkit contains a comprehensive collection of functions required for read mapping, CMR calling, motif scanning and discovery, and gene functional enrichment analysis. PEA also takes advantage of machine learning technologies for transcriptome-scale CMR prediction, with high prediction accuracy, using the Positive Samples Only Learning algorithm, which addresses the two-class classification problem by using only positive samples (CMRs), in the absence of negative samples (non-CMRs). Hence PEA is a versatile epitranscriptome analysis pipeline covering CMR calling, prediction, and annotation, and we describe its application to predict N6-methyladenosine (m6A) modifications in Arabidopsis thaliana. Experimental results demonstrate that the toolkit achieved 71.6% sensitivity and 73.7% specificity, which is superior to existing m6A predictors. PEA is potentially broadly applicable to the in-depth study of epitranscriptomics. PEA Docker image is available at https://hub.docker.com/r/malab/pea, source codes and user manual are available at https://github.com/cma2015/PEA. chuangma2006@gmail.com. Supplementary data are available at Bioinformatics online.
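
    The Positive Samples Only Learning step can be illustrated with the classic Elkan-Noto calibration for positive-unlabeled data; this is a sketch of the general technique, not PEA's implementation:

```python
# Sketch of positive-unlabeled (PU) learning in the spirit of the Positive
# Samples Only Learning step described above, following the classic
# Elkan-Noto calibration; an illustration, not PEA's implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 8))
true_y = (X[:, 0] + X[:, 1] > 0).astype(int)              # hidden true labels
labeled = (true_y == 1) & (rng.uniform(size=2000) < 0.3)  # known positives

# Step 1: train a classifier to separate labeled positives from unlabeled.
s = labeled.astype(int)
Xtr, Xcal, s_tr, s_cal = train_test_split(X, s, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, s_tr)

# Step 2: c = P(labeled | positive), estimated as the mean score that the
# classifier assigns to held-out labeled positives.
c = clf.predict_proba(Xcal[s_cal == 1])[:, 1].mean()

# Step 3: corrected probability that any example is truly positive.
p_true = np.clip(clf.predict_proba(X)[:, 1] / c, 0, 1)
pred = (p_true > 0.5).astype(int)
print("recovered accuracy vs hidden labels:", (pred == true_y).mean())
```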

  20. Change in body fat mass is independently associated with executive functions in older women: a secondary analysis of a 12-month randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Elizabeth Dao

    Full Text Available OBJECTIVES: To investigate the independent contribution of change in sub-total body fat and lean mass to cognitive performance, specifically the executive processes of selective attention and conflict resolution, in community-dwelling older women. METHODS: This secondary analysis included 114 women aged 65 to 75 years old. Participants were randomly allocated to once-weekly resistance training, twice-weekly resistance training, or twice-weekly balance and tone training. The primary outcome measure was the executive processes of selective attention and conflict resolution as assessed by the Stroop Test. Sub-total body fat and lean mass were measured by dual-energy x-ray absorptiometry (DXA) to determine the independent association of change in both sub-total body fat and sub-total body lean mass with Stroop Test performance at trial completion. RESULTS: A multiple linear regression model showed reductions in sub-total body fat mass to be independently associated with better performance on the Stroop Test at trial completion after accounting for baseline Stroop performance, age, baseline global cognitive state, baseline number of comorbidities, baseline depression, and experimental group. The total variance explained was 39.5%; change in sub-total body fat mass explained 3.9% of the variance. Change in sub-total body lean mass was not independently associated with Stroop Test performance (P>0.05). CONCLUSION: Our findings suggest that reductions in sub-total body fat mass - not sub-total lean mass - are associated with better performance of selective attention and conflict resolution.
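
    The reported model is an ordinary multiple linear regression; a minimal sketch with synthetic stand-in data (variable effects and magnitudes are invented, not the study's):

```python
# Sketch of the kind of multiple linear regression reported above: Stroop
# performance at trial completion regressed on change in fat mass plus the
# listed covariates. Variable names and data are synthetic stand-ins.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 114
baseline_stroop = rng.normal(30, 5, n)
delta_fat_kg = rng.normal(-0.5, 1.5, n)   # change in sub-total fat mass
age = rng.uniform(65, 75, n)
covars = rng.normal(size=(n, 3))          # cognition, comorbidities, mood
# Synthetic outcome: fat loss (negative delta) lowers the completion score,
# i.e., better Stroop performance, on top of the baseline effect.
final_stroop = (0.6 * baseline_stroop + 1.2 * delta_fat_kg
                + rng.normal(0, 3, n))

X = sm.add_constant(np.column_stack(
    [baseline_stroop, delta_fat_kg, age, covars]))
model = sm.OLS(final_stroop, X).fit()
print(model.params[2], model.pvalues[2])  # coefficient on change in fat mass
```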

  1. HTGR-INTEGRATED COAL TO LIQUIDS PRODUCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Anastasia M Gandrik; Rick A Wood

    2010-10-01

    As part of the DOE’s Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to “shift” the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700°C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: • 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. When compared to conventional CTL production, nuclear integration decreases coal

  2. HTGR-Integrated Coal To Liquids Production Analysis

    International Nuclear Information System (INIS)

    Gandrik, Anastasia M.; Wood, Rick A.

    2010-01-01

    As part of the DOE's Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to 'shift' the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700 C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: (1) 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. When compared to conventional CTL production, nuclear integration decreases coal consumption by 66
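
    The reactor count quoted in both versions of this record can be sanity-checked with simple arithmetic; the per-barrel intensity computed below is implied by the abstract's figures rather than stated in the report.

    ```python
    # Back-of-envelope check of the HTGR count quoted in the abstract.
    reactor_mwth = 600          # thermal rating per HTGR
    n_reactors = 11             # reactors reported for the nuclear-assisted case
    capacity_bpd = 50_000       # liquid product capacity, barrels per day

    total_mwth = reactor_mwth * n_reactors
    print(f"Total nuclear heat/power input: {total_mwth} MWth")   # 6600 MWth
    print(f"Implied intensity: {total_mwth / capacity_bpd:.3f} MWth per bbl/day")
    ```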

  3. Lab-on-Valve Micro Sequential Injection: A Versatile Approach for Implementing Integrated Sample Pre-preparations and Executing (Bio)Chemical Assays

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    waste generation. Most recently, the so-called third generation of FIA has emerged, that is, the Lab-on-Valve (LOV) approach, the conceptual basis of which is to incorporate all the necessary unit operational manipulations required, and, when possible, even the detection device into a single small...... integrated microconduit, or “laboratory”, placed atop a selection valve. The lecture will detail the evolution of the three generations of FIA, emphasis being placed on the LOV approach. Having proven itself as a versatile front end to a variety of detection techniques, its utility will be exemplified by a series...... of the renewable microcolumn concept. Despite their excellent analytical chemical capabilities, ETAAS as well as ICPMS often require that the samples are subjected to suitable pretreatment in order to obtain the necessary sensitivity and selectivity. Either in order to separate the analyte from potentially...

  4. Lab-on-Valve Micro Sequential Injection: A Versatile Approach for Implementing Integrated Sample Pre-preparations and Executing (Bio)Chemical Assays

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    waste generation. Most recently, the Lab-on-Valve (LOV) approach has emerged. Termed the third generation of FIA, the conceptual basis of the LOV is to incorporate all the necessary unit operational manipulations required in a chemical assay, and, when possible, even the detection device, into a single...... small integrated microconduit, or “laboratory”, placed atop a selection valve. In the lecture emphasis will be placed on the LOV approach. Having proven itself as a versatile front end to a variety of detection techniques, its utility will be exemplified by various applications. Particular focus......-phase microcolumn concept utilising hydrophobic as well as hydrophilic bead materials. Although ETAAS and ICPMS both are characterised by excellent analytical chemical capabilities, they nevertheless often require that the samples be subjected to suitable pretreatment in order to obtain the necessary sensitivity...

  5. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Directory of Open Access Journals (Sweden)

    Amie J Radenbaugh

    Full Text Available The detection of somatic single nucleotide variants is a crucial component of the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA.
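
    A minimal sketch of the rescue idea, calling a variant when weak DNA evidence is corroborated by RNA read support, is given below. The thresholds and the function are illustrative inventions, not RADIA's published algorithm.

    ```python
    def call_somatic(normal_alt, normal_depth, tumor_dna_alt, tumor_dna_depth,
                     tumor_rna_alt, tumor_rna_depth,
                     min_vaf=0.10, min_rna_vaf=0.10, max_normal_vaf=0.02):
        """Toy DNA+RNA somatic caller: illustrative thresholds only."""
        normal_vaf = normal_alt / max(normal_depth, 1)
        dna_vaf = tumor_dna_alt / max(tumor_dna_depth, 1)
        rna_vaf = tumor_rna_alt / max(tumor_rna_depth, 1)
        if normal_vaf > max_normal_vaf:       # likely germline or artifact
            return False
        if dna_vaf >= min_vaf:                # standard DNA-only call
            return True
        # "Rescue": weak DNA evidence backed by expressed RNA support.
        return tumor_dna_alt >= 2 and rna_vaf >= min_rna_vaf

    # A variant at 5% DNA allelic frequency, strongly expressed in RNA, is rescued:
    print(call_somatic(0, 40, 3, 60, 12, 50))  # True
    ```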

  6. Integrity analysis of an upper guide structure flange

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ki Hyoung; Kang, Sung Sik; Jhung, Myung Jo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    The integrity assessment of reactor vessel internals should be conducted in the design process to secure the safety of nuclear power plants. Various loads such as self-weight, seismic load, flow-induced load, and preload are applied to the internals. Therefore, the American Society of Mechanical Engineers (ASME) Code, Section III, defines the stress limit for reactor vessel internals. The present study focused on structural response analyses of the upper guide structure upper flange. The distributions of the stress intensity in the flange body were analyzed under various design load cases during normal operation. The allowable stress intensities along the expected sections of stress concentration were derived from the results of the finite element analysis for evaluating the structural integrity of the flange design. Furthermore, seismic analyses of the upper flange were performed to identify dynamic behavior with respect to the seismic and impact input. The mode superposition and full transient methods were used to perform time–history analyses, and the displacement at the lower end of the flange was obtained. The effect of the damping ratio on the response of the flange was also evaluated, and the acceleration was obtained. The results of elastic and seismic analyses in this study will be used as basic information to judge whether a flange design meets the acceptance criteria.

  7. Development of safety analysis technology for integral reactor

    International Nuclear Information System (INIS)

    Kim, Hee Cheol; Kim, K. K.; Kim, S. H.

    2002-04-01

    A state-of-the-art review of integral reactors was performed to investigate their safety features. The safety and performance of SMART were assessed using the technologies developed during the study. For this purpose, a computer code system and an analysis methodology were developed, and safety and performance analyses of the SMART basic design were carried out for the design basis events and accidents. Experimental facilities were designed for the core flow distribution test and the self-pressurizing pressurizer performance test. Tests on two-phase critical flow with non-condensable gas were completed, and the results were used to assess the critical flow model. A Probabilistic Safety Assessment (PSA) was carried out to evaluate the safety level and to optimize the design by identifying and remedying any weaknesses in the design. A joint study with KINS was carried out to promote the licensing environment. The generic safety issues of integral reactors were identified and solutions were formulated. The economic evaluation of the SMART desalination plant and activities related to process control were also carried out within the scope of the study.

  8. Visual Data Analysis as an Integral Part of Environmental Management

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Joerg; Bethel, E. Wes; Horsman, Jennifer L.; Hubbard, Susan S.; Krishnan, Harinarayan; Romosan,, Alexandru; Keating, Elizabeth H.; Monroe, Laura; Strelitz, Richard; Moore, Phil; Taylor, Glenn; Torkian, Ben; Johnson, Timothy C.; Gorton, Ian

    2012-10-01

    The U.S. Department of Energy's (DOE) Office of Environmental Management (DOE/EM) currently supports an effort to understand and predict the fate of nuclear contaminants and their transport in natural and engineered systems. Geologists, hydrologists, physicists and computer scientists are working together to create models of existing nuclear waste sites, to simulate their behavior and to extrapolate it into the future. We use visualization as an integral part of each step in this process. In the first step, visualization is used to verify model setup and to estimate critical parameters. High-performance computing simulations of contaminant transport produce massive amounts of data, which are then analyzed using visualization software specifically designed for parallel processing of large amounts of structured and unstructured data. Finally, simulation results are validated by comparing them to measured current and historical field data. We describe in this article how visual analysis is used as an integral part of the decision-making process in the planning of ongoing and future treatment options for the contaminated nuclear waste sites. Lessons learned from visually analyzing our large-scale simulation runs will also have an impact on deciding on treatment measures for other contaminated sites.

  9. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can still give meaningful results under the same time and memory limits.
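
    The statistical core of the technique can be sketched as Monte Carlo sampling with a Bayesian (Beta-Binomial) estimate of the probability of reaching a target event. The toy program and uniform input model below are assumptions for illustration; the actual tool samples symbolic paths in Symbolic PathFinder rather than concrete inputs.

    ```python
    import random
    from math import sqrt

    def program(x, y):
        """Toy program: returns True when the 'assert violation' path is taken."""
        if x > 0.9 and y < 0.1:
            return True          # rare target event
        return False

    random.seed(1)
    alpha, beta = 1, 1           # uniform Beta(1,1) prior on the event probability
    for _ in range(20_000):
        hit = program(random.random(), random.random())
        alpha, beta = alpha + hit, beta + (1 - hit)

    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    print(f"P(event) ~ {mean:.4f} +/- {sqrt(var):.4f}  (true value 0.01)")
    ```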

  10. Integral finite element analysis of turntable bearing with flexible rings

    Science.gov (United States)

    Deng, Biao; Liu, Yunfei; Guo, Yuan; Tang, Shengjin; Su, Wenbin; Lei, Zhufeng; Wang, Pengcheng

    2018-03-01

    This paper suggests a method to calculate the internal load distribution and contact stress of a thrust angular contact ball turntable bearing by FEA. The influence of the stiffness of the bearing structure and of plastic deformation in the contact area on the internal load distribution and contact stress of the bearing is considered. In this method, the load-deformation relationship of the rolling elements is determined by finite element contact analysis of a single rolling element and the raceway. On this basis, the nonlinear contact between the rolling elements and the inner and outer ring raceways is treated as a nonlinear compression spring, and an integral finite element model of the bearing, including its support structure, was established. The effects of structural deformation and plastic deformation on the internal stress distribution of the slewing bearing are investigated by comparing the load distribution, inner and outer ring stress, contact stress, and other finite element results with traditional bearing theory, providing guidance for improving slewing bearing design.

  11. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  12. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu

    2007-03-01

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published

  13. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

    There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR-interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
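
    The simplest ingredient of the reported pattern analysis, counting how often each ordered sequence of screen visits recurs across patient cases, can be sketched as follows; the screen names and cases are invented.

    ```python
    from collections import Counter

    # Toy version of the pattern count: each patient case is the ordered sequence
    # of EHR screens a clinician visited; identical sequences form one "pattern".
    cases = [
        ("notes", "labs", "meds"),
        ("notes", "labs", "meds"),
        ("labs", "notes", "imaging"),
        ("notes", "meds"),
    ]
    patterns = Counter(cases)
    total = sum(patterns.values())
    for pattern, count in patterns.most_common():
        print(f"{count/total:5.1%}  {' -> '.join(pattern)}")
    ```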

  14. An Analysis of Information Technology Managers' and Executives' Security Concerns on Willingness to Adopt Cloud Computing Solutions

    Science.gov (United States)

    Tanque, Marcus M.

    2012-01-01

    The research conducted in this study inquires about Information Technology (IT) managers' and executives' attitudes, beliefs, and knowledge on Cloud Computing (CC) security. The study evaluated how these factors affect IT managers' and executives' willingness to adopt CC solutions in their organizations. Confidentiality,…

  15. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The method has been applied to a soft X-ray diagnostic on HL-2A to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
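
    As a rough one-dimensional stand-in for the tomographic inversion, the sketch below computes a Gaussian process posterior mean and confidence band from noisy point measurements. It uses a stationary squared-exponential kernel for brevity, whereas the article's method employs a non-stationary prior; the data and hyperparameters are assumed.

    ```python
    import numpy as np

    # 1-D stand-in for tomographic inversion: infer a smooth emission profile
    # from noisy point measurements using a Gaussian process prior.
    rng = np.random.default_rng(2)
    x_obs = np.linspace(0, 1, 15)
    y_obs = np.exp(-((x_obs - 0.5) / 0.15) ** 2) + rng.normal(0, 0.05, x_obs.size)

    def kernel(a, b, length=0.1, amp=1.0):
        """Squared-exponential covariance between point sets a and b."""
        return amp * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

    x_new = np.linspace(0, 1, 200)
    K = kernel(x_obs, x_obs) + 0.05**2 * np.eye(x_obs.size)   # data covariance
    K_s = kernel(x_new, x_obs)
    mean = K_s @ np.linalg.solve(K, y_obs)                    # posterior mean
    cov = kernel(x_new, x_new) - K_s @ np.linalg.solve(K, K_s.T)
    band = 1.96 * np.sqrt(np.clip(np.diag(cov), 0, None))     # 95% band
    print(f"posterior mean peak: {mean.max():.2f}, "
          f"avg band half-width: {band.mean():.2f}")
    ```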

  16. System integration and control strategy analysis of PEMFC car

    International Nuclear Information System (INIS)

    Sun, L.; Chen, Y.; Liu, Y.; Shi, P.

    2004-01-01

    A new fuel cell car was designed based on the prototype LN2000 hydrogen-oxygen fuel cell car. The new prototype consists of a compact fuel cell engine with a separate fuel cell stack, a nickel metal hydride battery, a motor rated at 30 kW/100 kW, and a high-efficiency inverter. Within the powertrain, a two-shift planetary gear transmission was employed, and the power performance was greatly improved. A new battery with an EMS, a new self-developed fuel cell engine, the motor propulsion system, and the electronically controlled transmission make it feasible to control the whole fuel cell car automatically and efficiently with optimization. The paper presents the system integration and control strategy analysis of the fuel cell car prototype, and can serve as a reference for engineers in the field of fuel cell vehicles. (author)

  17. Urban Integrated Industrial Cogeneration Systems Analysis. Phase II final report

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    Through the Urban Integrated Industrial Cogeneration Systems Analysis (UIICSA), the City of Chicago embarked upon an ambitious effort to identify and measure the overall industrial cogeneration market in the city and to evaluate in detail the most promising market opportunities. This report discusses the background of the work completed during Phase II of the UIICSA and presents the results of economic feasibility studies conducted for three potential cogeneration sites in Chicago. Phase II focused on the feasibility of cogeneration at the three most promising sites: the Stockyards and Calumet industrial areas, and the Ford City commercial/industrial complex. Each feasibility case study considered the energy load requirements of the existing facilities at the site and the potential for attracting and serving new growth in the area. Alternative fuels and technologies, and ownership and financing options were also incorporated into the case studies. Finally, site-specific considerations such as development incentives, zoning and building code restrictions and environmental requirements were investigated.

  18. Neutronics analysis for integration of ITER diagnostics port EP10

    Energy Technology Data Exchange (ETDEWEB)

    Colling, Bethany, E-mail: bethany.colling@ccfe.ac.uk [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Department of Engineering, Lancaster University, Lancashire LA1 4YR (United Kingdom); Eade, Tim [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Joyce, Malcolm J. [Department of Engineering, Lancaster University, Lancashire LA1 4YR (United Kingdom); Pampin, Raul; Seyvet, Fabien [Fusion for Energy, Josep Pla 2, Torres Diagonal Litoral B3, 08019 Barcelona (Spain); Turner, Andrew [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Udintsev, Victor [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St. Paul Lez Durance Cedex (France)

    2016-11-01

    Shutdown dose rate calculations have been performed on an integrated ITER C-lite neutronics model with equatorial port 10. A ‘fully shielded’ configuration, optimised for a given set of diagnostic designs (i.e. shielding in all available space within the port plug drawers), results in a shutdown dose rate in the port interspace, from the activation of materials comprising equatorial port 10, in excess of 2000 μSv/h. Achieving dose rates of 100 μSv/h or less, as required in areas where hands-on maintenance can be performed, in the port interspace region will be challenging. A combination of methods will need to be implemented, such as reducing mass and/or the use of reduced activation steel in the port interspace, optimisation of the diagnostic designs and shielding of the port interspace floor. Further analysis is required to test these options and the ongoing design optimisation of the EP10 diagnostic systems.

  19. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerful...

  20. Life-cycle analysis of product integrated polymer solar cells

    DEFF Research Database (Denmark)

    Espinosa Martinez, Nieves; García-Valverde, Rafael; Krebs, Frederik C

    2011-01-01

    A life cycle analysis (LCA) on a product integrated polymer solar module is carried out in this study. These assessments are well-known to be useful in developmental stages of a product in order to identify the bottlenecks for the up-scaling in its production phase for several aspects spanning from...... economics through design to functionality. An LCA study was performed to quantify the energy use and greenhouse gas (GHG) emissions from electricity use in the manufacture of a light-weight lamp based on a plastic foil, a lithium-polymer battery, a polymer solar cell, printed circuitry, blocking diode......, switch and a white light emitting semiconductor diode. The polymer solar cell employed in this prototype presents a power conversion efficiency in the range of 2 to 3% yielding energy payback times (EPBT) in the range of 1.3–2 years. Based on this it is worthwhile to undertake a life-cycle study...

  1. Integrated aerosol and thermalhydraulics modelling for CANDU safety analysis

    International Nuclear Information System (INIS)

    McDonald, B.H.; Hanna, B.N.

    1990-08-01

    Analysis of postulated accidents in CANDU reactors that could result in severe fuel damage requires the ability to model the formation of aerosols containing fission product materials and the transport of these aerosols from the fuel, through containment, to any leak to the atmosphere. Best-estimate calculations require intimate coupling and simultaneous solution of all the equations describing the entire range of physical and chemical phenomena involved. The prototype CATHENA/PACE-3D has been developed for integrated calculation of thermalhydraulic and aerosol events in a CANDU reactor during postulated accidents. Examples demonstrate the ability of CATHENA/PACE-3D to produce realistic flow and circulation patterns and reasonable accuracy in solution of two simple fluid-flow test cases for which analytical solutions exist

  2. Strategic Technology Investment Analysis: An Integrated System Approach

    Science.gov (United States)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make such that the end results are satisfying the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.

  3. The IMBA suite: integrated modules for bioassay analysis

    Energy Technology Data Exchange (ETDEWEB)

    Birchall, A.; Jarvis, N.S.; Peace, M.S.; Riddell, A.E.; Battersby, W.P

    1998-07-01

    The increasing complexity of models representing the biokinetic behaviour of radionuclides in the body following intake poses problems for people who are required to implement these models. The problem is exacerbated by the current paucity of suitable software. In order to remedy this situation, a collaboration between British Nuclear Fuels, Westlakes Research Institute and the National Radiological Protection Board has started with the aim of producing a suite of modules for estimating intakes and doses from bioassay measurements using the new ICRP models. Each module will have a single purpose (e.g. to calculate respiratory tract deposition) and will interface with other software using data files. The elements to be implemented initially are plutonium, uranium, caesium, iodine and tritium. It is intended to make the software available to other parties under terms yet to be decided. This paper describes the proposed suite of integrated modules for bioassay analysis, IMBA. (author)

  4. Technical assistance in relationship with the reloading analysis of the Laguna Verde Unit 2 reactor. Executive abstract

    International Nuclear Information System (INIS)

    Alonso V, G.; Castro B, M.; Gallegos E, R.; Hernandez L, H.; Montes T, J.L.; Ortiz S, J. J.; Perusquia C, R.

    1993-11-01

    The objective of the report was to carry out a comparative analysis of the costs of energy generation among the designs GE9B of General Electric, 9X9-IX of SIEMENS and SVEA-96 of ABB ATOM, proposed to be used as reload fuel in Unit 2 of the Laguna Verde Nuclear Power Station. (Author)

  5. ANALYSIS OF ENVIRONMENTAL FRAGILITY USING MULTI-CRITERIA ANALYSIS (MCE) FOR INTEGRATED LANDSCAPE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Abimael Cereda Junior

    2014-01-01

    Full Text Available The Geographic Information Systems brought greater possibilities to the representation and interpretation of the landscape, as well as to integrated analysis. However, this approach does not dispense with technical and methodological substantiation when moving into the computational universe. This work is grounded in ecodynamics and empirical analysis of natural and anthropogenic environmental fragility, and aims to propose and present an integrated paradigm of multi-criteria analysis and a fuzzy logic model of environmental fragility, taking as a case study the basin of Monjolinho Stream in São Carlos-SP. The use of this methodology allowed for a reduction in the subjective influence of decision criteria, whose factors retain their cartographic expression, respecting the complexity of the integrated landscape.

  6. Functional Module Analysis for Gene Coexpression Networks with Network Integration.

    Science.gov (United States)

    Zhang, Shuqin; Zhao, Hongyu; Ng, Michael K

    2015-01-01

    Network has been a general tool for studying the complex interactions between different genes, proteins, and other small molecules. Module as a fundamental property of many biological networks has been widely studied and many computational methods have been proposed to identify the modules in an individual network. However, in many cases, a single network is insufficient for module analysis due to the noise in the data or the tuning of parameters when building the biological network. The availability of a large amount of biological networks makes network integration study possible. By integrating such networks, more informative modules for some specific disease can be derived from the networks constructed from different tissues, and consistent factors for different diseases can be inferred. In this paper, we have developed an effective method for module identification from multiple networks under different conditions. The problem is formulated as an optimization model, which combines the module identification in each individual network and alignment of the modules from different networks together. An approximation algorithm based on eigenvector computation is proposed. Our method outperforms the existing methods, especially when the underlying modules in multiple networks are different in simulation studies. We also applied our method to two groups of gene coexpression networks for humans, which include one for three different cancers, and one for three tissues from the morbidly obese patients. We identified 13 modules with three complete subgraphs, and 11 modules with two complete subgraphs, respectively. The modules were validated through Gene Ontology enrichment and KEGG pathway enrichment analysis. We also showed that the main functions of most modules for the corresponding disease have been addressed by other researchers, which may provide the theoretical basis for further studying the modules experimentally.
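
    A much-simplified version of eigenvector-based module identification across multiple networks is sketched below: average the adjacency matrices of the individual networks and split nodes by the sign of the Fiedler vector. This is generic spectral partitioning on synthetic data, not the authors' optimization model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def planted_network(n=20, p_in=0.8, p_out=0.05):
        """Two planted modules of n/2 nodes each, as a symmetric adjacency matrix."""
        A = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                same = (i < n // 2) == (j < n // 2)
                A[i, j] = A[j, i] = rng.random() < (p_in if same else p_out)
        return A

    # Integrate three noisy networks over the same nodes by averaging.
    A_avg = sum(planted_network() for _ in range(3)) / 3
    deg = A_avg.sum(axis=1)
    L = np.diag(deg) - A_avg                 # graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    fiedler = vecs[:, 1]                     # eigenvector of 2nd-smallest eigenvalue
    print("module labels:", (fiedler > 0).astype(int))
    ```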

  7. Drugs for cardiovascular disease in India: perspectives of pharmaceutical executives and government officials on access and development-a qualitative analysis.

    Science.gov (United States)

    Newman, Charles; Ajay, Vamadevan S; Srinivas, Ravi; Bhalla, Sandeep; Prabhakaran, Dorairaj; Banerjee, Amitava

    2016-01-01

    India shoulders the greatest global burden of cardiovascular diseases (CVDs), which are the leading cause of mortality worldwide. Drugs are the bedrock of treatment and prevention of CVD. India's pharmaceutical industry is the third largest, by volume, globally, but access to CVD drugs in India is poor. There is a lack of qualitative data from government and pharmaceutical sectors regarding CVD drug development and access in India. By purposive sampling, we recruited either Indian government officials or pharmaceutical company executives. We conducted a stakeholder analysis via semi-structured, face-to-face interviews in India. Topic guides allow for the exploration of key issues across multiple interviews, along with affording the interviewer the flexibility to examine matters arising from the discussions themselves. After transcription, interviews underwent inductive thematic analysis. Ten participants were interviewed (Government Officials: n = 5, and Pharmaceutical Executives: n = 5). Two themes emerged: i) 'Policy-derived Factors'; ii) 'Patient-derived Factors', with three findings. First, both government and pharmaceutical participants felt that the focus of Indian pharma is shifting to more complex, high-quality generics and to new drug development, but production of generic drugs rather than new molecular entities will remain a major activity. Second, current trial regulations in India may restrict India's potential role in the future development of CVD drugs. Third, it is likely that the Indian government will tighten its intellectual property regime in future, with potentially far-reaching implications for CVD drug development and access. Our stakeholder analysis provides some support for present patent regulations, whilst suggesting areas for further research in order to inform future policy decisions regarding CVD drug development and availability. Whilst interviewees suggested government policy plays an important role in shaping the industry, a

  8. The Holistic Integrity Test (HIT - quantified resilience analysis

    Directory of Open Access Journals (Sweden)

    Dobson Mike

    2016-01-01

    Full Text Available The Holistic Integrity Test (HIT): quantified resilience analysis. Rising sea levels and wider climate change mean we face an increasing risk from flooding and other natural hazards. Tough economic times make it difficult to economically justify or afford the desired level of engineered risk reduction. Add to this significant uncertainty from a range of future predictions, constantly updated with new science. We therefore need to understand not just how to reduce the risk, but what could happen should above-design-standard events occur. In flood terms this includes not only the direct impacts (damage and loss of life), but the wider cascade impacts on infrastructure systems and the longer-term impacts on the economy and society. However, understanding the “what if” is only the first part of the equation; a range of improvement measures to mitigate such effects need to be identified and implemented. These measures should consider reducing the risk, lessening the consequences, aiding the response, and speeding up the recovery. However, they need to be objectively assessed through quantitative analysis, which underpins them technically and economically. Without such analysis, it cannot be predicted how measures will perform if the extreme events occur. It is also vital to consider all possible hazards, as measures for one hazard may hinder the response to another. The Holistic Integrity Test (HIT) uses quantitative system analysis and “HITs” the site, its infrastructure, contained dangers and wider regional system to determine how it copes with a range of severe shock events, Before, During and After the event, whilst also accounting for uncertainty (as illustrated in figure 1). First presented at the TINCE 2014 Nuclear Conference in Paris, it was described in terms of a nuclear facility needing to analyse the site in response to post-Fukushima needs; the HIT is, however, universally applicable. The HIT has three key risk reduction goals: The

  9. Integration

    DEFF Research Database (Denmark)

    Emerek, Ruth

    2004-01-01

    The contribution discusses the different conceptions of integration in Denmark, and what may be understood by successful integration.

  10. Practical use of the integrated reporting framework – an analysis of the content of integrated reports of selected companies

    Directory of Open Access Journals (Sweden)

    Monika Raulinajtys-Grzybek

    2017-09-01

    Full Text Available The purpose of the article is to provide a research tool for an initial assessment of whether a company's integrated reports meet the objectives set out in the IIRC Integrated Reporting Framework, together with its empirical verification. In particular, the research addresses whether the reports meet the goal of improving the quality of information available and covering all factors that influence the organization's ability to create value. The article uses the theoretical output on the principles of preparing integrated reports and analyzes the content of selected integrated reports. Based on the source analysis, a research tool has been developed for an initial assessment of whether an integrated report fulfills its objectives. It consists of 42 questions that verify the coverage of the defined elements and the implementation of the guiding principles set by the IIRC. For empirical verification of the tool, a comparative analysis was carried out for reports prepared by selected companies operating in the utilities sector. Answering questions from the research tool allows a researcher to formulate conclusions about the implementation of the guiding principles and the completeness of the presentation of the content elements. As a result of the analysis of selected integrated reports, it was stated that various elements of the report are presented with different levels of accuracy in different reports. Reports provide the most complete information on performance and strategy. The information about the business model and prospective data is in some cases presented without making a link to other parts of the report – e.g. risks and opportunities, financial data or capitals. The absence of such links limits the ability to claim that an integrated report meets its objectives, since a set of individual reports, each presenting

  11. Analysis of the total system life cycle cost for the Civilian Radioactive Waste Management Program: executive summary

    International Nuclear Information System (INIS)

    1985-04-01

    The total-system life-cycle cost (TSLCC) analysis for the Department of Energy's Civilian Radioactive Waste Management Program is an ongoing activity that helps determine whether the revenue-producing mechanism established by the Nuclear Waste Policy Act of 1982 is sufficient to cover the cost of the program. This report is an input into the third evaluation of the adequacy of the fee. The total-system cost for the reference waste-management program in this analysis is estimated to be 24 to 30 billion (1984) dollars. For the sensitivity cases studied in this report, the costs could be as high as 35 billion dollars and as low as 21 billion dollars. Because factors like repository location, the quantity of waste generated, transportation-cask technology, and repository startup dates exert substantial impacts on total-system costs, there are several tradeoffs between these factors, and these tradeoffs can greatly influence the total cost of the program. The total-system cost for the reference program described in this report is higher by 3 to 5 billion dollars, or 15 to 20%, than the cost for the reference program of the TSLCC analysis of April 1984. More than two-thirds of this increase is in the cost of repository construction and operation. These repository costs have increased because of changing design concepts, different assumptions about the effort required to perform the necessary activities, and a change in the source data on which the earlier analysis was based. Development and evaluation costs have similarly increased because of a net addition to the work content. Transportation costs have increased because of different assumptions about repository locations and several characteristics of the transportation system. It is expected that the estimates of total-system costs will continue to change in response to both an evolving program strategy and better definition of the work required to achieve the program objectives.

  12. Integrating PROOF Analysis in Cloud and Batch Clusters

    International Nuclear Information System (INIS)

    Rodríguez-Marrero, Ana Y; Fernández-del-Castillo, Enol; López García, Álvaro; Marco de Lucas, Jesús; Matorras Weinig, Francisco; González Caballero, Isidro; Cuesta Noriega, Alberto

    2012-01-01

    High Energy Physics (HEP) analyses are becoming more complex and demanding due to the large amount of data collected by the current experiments. The Parallel ROOT Facility (PROOF) provides researchers with an interactive tool to speed up the analysis of huge volumes of data by exploiting parallel processing on both multicore machines and computing clusters. The typical PROOF deployment scenario is a permanent set of cores configured to run the PROOF daemons. However, this approach is incapable of adapting to the dynamic nature of interactive usage. Several initiatives seek to improve the use of computing resources by integrating PROOF with a batch system, such as Proof on Demand (PoD) or PROOF Cluster. These solutions are currently in production at Universidad de Oviedo and IFCA and are positively evaluated by users. Although they are able to adapt to the computing needs of users, they must comply with the specific configuration, OS and software installed at the batch nodes. Furthermore, they share the machines with other workloads, which may cause disruptions in the interactive service for users. These limitations make PROOF a typical use case for cloud computing. In this work we take advantage of the cloud infrastructure at IFCA in order to provide a dynamic PROOF environment where users can control the software configuration of the machines. The Proof Analysis Framework (PAF) facilitates the development of new analyses and offers transparent access to PROOF resources. Several performance measurements are presented for the different scenarios (PoD, SGE and Cloud), showing a speed improvement closely correlated with the number of cores used.

  13. Pharmacy executive leadership issues and associated skills, knowledge, and abilities.

    Science.gov (United States)

    Meadows, Andrew B; Maine, Lucinda L; Keyes, Elizabeth K; Pearson, Kathy; Finstuen, Kenn

    2005-01-01

    To identify challenges that current and future pharmacy executives are facing or will face in the future and to define what skills, knowledge, and abilities (SKAs) are required to successfully negotiate these challenges. Delphi method for executive decision making. Civilian pharmacy profession. 110 pharmacists who graduated from the GlaxoSmithKline Executive Management Program for Pharmacy Leaders. Two iterations of the Delphi method for executive decision making separated by an expert panel content analysis. Round 1: participants were asked to identify five major issues they believed to be of greatest importance to pharmacy leaders in the next 5-10 years and name specific SKAs that might be needed by future leaders to successfully deal with those issues. An expert panel reviewed the issues, classified issues into specific domains, and titled each domain. Round 2: participants rated the SKAs on a 7-point scale according to their individual assessment of importance in each domain. For Delphi rounds 1 and 2, response rates were 21.8% and 18.2%, respectively. More than 100 total issue statements were identified. The expert panel sorted the issues into five domains: management and development of the pharmacy workforce, pharmacy finance, total quality management of work-flow systems, influences on the practice of pharmacy, and professional pharmacy leadership. Five of the top 15 SKAs, including all four highest-ranked items, came from the professional pharmacy leadership domain, including ability to see the big picture, ability to demonstrate the value of pharmacy services, ability to lead and manage in an ethical manner, and skills for influencing an organization's senior leadership. Through successful integration of communication skills, critical thinking, and problem solving techniques, future public-sector pharmacy executives will be better equipped to effectively position their organizations and the profession for the challenges that lie ahead.

  14. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  15. Reduced integrity of the uncinate fasciculus and cingulum in depression: A stem-by-stem analysis.

    Science.gov (United States)

    Bhatia, Kartik D; Henderson, Luke A; Hsu, Eugene; Yim, Mark

    2018-04-07

    The subgenual cingulate gyrus (Brodmann's Area 25: BA25) is hypermetabolic in depression and has been targeted successfully with deep brain stimulation. Two of the white matter tracts that play a role in treatment response are the uncinate fasciculus (UF) and the cingulum bundle. The UF has three prefrontal stems, the most medial of which extends from BA25 (which deals with mood regulation) and the most lateral of which extends from the dorso-lateral prefrontal cortex (concerned with executive function). The cingulum bundle has numerous fibers connecting the lobes of the cerebrum, with the longest fibers extending from BA25 to the amygdala. We hypothesize that there is reduced integrity in the UF, specific to the medial prefrontal stems, as well as in the subgenual and amygdaloid fibers of the cingulum bundle. Our secondary hypothesis is that these changes are present from the early stages of depression. Compare the white matter integrity of stems of the UF and components of the cingulum bundle in first-onset depressed, recurrent/chronic depressed, and non-depressed control subjects. Depressed patients (n = 103, first-onset = 57, chronic = 46) and non-depressed control subjects (n = 74) underwent MRI with 32-directional DTI sequences. The uncinate fasciculi and cingulum bundles were seeded, and the fractional anisotropy (FA) measured in each of the three prefrontal stems and the body of the UF, as well as the subgenual, body, and amygdaloid fiber components of the cingulum bundle. FA measurements were compared between groups using ANOVA testing with post-hoc Tukey analysis. There were significant reductions in FA in the subgenual and polar stems of the UF bilaterally, as well as the subgenual and amygdaloid fibers of the cingulum bundle, in depressed patients compared with controls (p < 0.05); no significant reductions were found in the lateral UF stem or the main body of the cingulum. No significant difference was demonstrated in any of the tracts between first-onset and chronic depression patients.
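
    The reported group comparison, a one-way ANOVA with post-hoc Tukey tests on tract FA values, can be reproduced in outline with standard libraries; the FA values below are synthetic, with group sizes taken from the abstract.

    ```python
    import numpy as np
    import pandas as pd
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(4)
    # Synthetic fractional anisotropy (FA) values for one tract, per group.
    fa = pd.DataFrame({
        "fa": np.concatenate([rng.normal(0.42, 0.03, 57),    # first-onset
                              rng.normal(0.42, 0.03, 46),    # chronic
                              rng.normal(0.46, 0.03, 74)]),  # controls
        "group": ["first_onset"] * 57 + ["chronic"] * 46 + ["control"] * 74,
    })

    groups = [g["fa"].values for _, g in fa.groupby("group")]
    print(f_oneway(*groups))                            # omnibus ANOVA
    print(pairwise_tukeyhsd(fa["fa"], fa["group"]))     # post-hoc Tukey
    ```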

  16. Waveforms and Sonic Boom Perception and Response (WSPR): Low-Boom Community Response Program Pilot Test Design, Execution, and Analysis

    Science.gov (United States)

    Page, Juliet A.; Hodgdon, Kathleen K.; Krecker, Peg; Cowart, Robbie; Hobbs, Chris; Wilmer, Clif; Koening, Carrie; Holmes, Theresa; Gaugler, Trent; Shumway, Durland L.

    2014-01-01

    The Waveforms and Sonic boom Perception and Response (WSPR) Program was designed to test and demonstrate the applicability and effectiveness of techniques to gather data relating human subjective response to multiple low-amplitude sonic booms. It was in essence a practice session for future wider scale testing on naive communities, using a purpose built low-boom demonstrator aircraft. The low-boom community response pilot experiment was conducted in California in November 2011. The WSPR team acquired sufficient data to assess and evaluate the effectiveness of the various physical and psychological data gathering techniques and analysis methods.

  17. Behavior and analysis of an integral abutment bridge.

    Science.gov (United States)

    2013-08-01

    As a result of abutment spalling on the integral abutment bridge over 400 South Street in Salt Lake City, Utah, the Utah Department of Transportation (UDOT) instigated research measures to better understand the behavior of integral abutment bridges. ...

  18. Reverse Engineering Integrated Circuits Using Finite State Machine Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oler, Kiri J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Carl H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-12

    In this paper, we present a methodology for reverse engineering integrated circuits, including a mathematical verification of a scalable algorithm used to generate minimal finite state machine representations of integrated circuits.
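
    Generating a minimal FSM representation typically reduces to merging behaviorally equivalent states. The sketch below performs generic Moore-style partition refinement on a toy machine; it stands in for, and should not be read as, the verified algorithm described in the paper.

    ```python
    def minimize_fsm(states, alphabet, delta, accepting):
        """Merge behaviorally equivalent states by partition refinement.

        delta maps (state, symbol) -> next state. Generic DFA minimization,
        standing in for the 'minimal FSM representation' step.
        """
        # Start with the accepting / non-accepting split.
        partition = [p for p in (set(accepting), set(states) - set(accepting)) if p]
        while True:
            def block_of(s):
                return next(i for i, b in enumerate(partition) if s in b)
            new_partition = []
            for block in partition:
                # Group states by the blocks their transitions land in.
                sig = {}
                for s in block:
                    key = tuple(block_of(delta[(s, a)]) for a in alphabet)
                    sig.setdefault(key, set()).add(s)
                new_partition.extend(sig.values())
            if len(new_partition) == len(partition):
                return new_partition
            partition = new_partition

    # Toy circuit FSM in which states 1 and 2 are behaviorally equivalent.
    delta = {(0, "0"): 1, (0, "1"): 2,
             (1, "0"): 3, (1, "1"): 0,
             (2, "0"): 3, (2, "1"): 0,
             (3, "0"): 3, (3, "1"): 3}
    print(minimize_fsm([0, 1, 2, 3], ["0", "1"], delta, accepting={3}))
    ```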

  19. Integrating Biological Perspectives: A Quantum Leap for Microarray Expression Analysis

    Science.gov (United States)

    Wanke, Dierk; Kilian, Joachim; Bloss, Ulrich; Mangelsen, Elke; Supper, Jochen; Harter, Klaus; Berendzen, Kenneth W.

    2009-02-01

    Biologists and bioinformatic scientists cope with the analysis of transcript abundance and the extraction of meaningful information from microarray expression data. By exploiting biological information accessible in public databases, we try to extend our current knowledge of the plant model organism Arabidopsis thaliana. Here, we give two examples of increasing the quality of information gained from large-scale expression experiments by integrating microarray-unrelated biological information. First, we use Arabidopsis microarray data to demonstrate that expression profiles are usually conserved between orthologous genes of different organisms. In an initial step of the analysis, orthology has to be inferred unambiguously, which then allows comparison of expression profiles between orthologs. We make use of the publicly available microarray expression data of Arabidopsis and barley, Hordeum vulgare. We found a generally positive correlation in expression trajectories between true orthologs, although the two organisms are only distantly related on an evolutionary time scale. Second, the extraction of clusters of co-regulated genes implies similarities in transcriptional regulation via similar cis-regulatory elements (CREs). The reverse approach, finding co-regulated gene clusters by investigating CREs, has generally not been successful. Nonetheless, in some cases the presence of CREs in a defined position or orientation, or of CRE combinations, is positively correlated with co-regulated gene clusters. Here, we make use of genes involved in the phenylpropanoid biosynthetic pathway to give one positive example of this approach.
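
    The core comparison, correlating the expression trajectories of an ortholog pair across matched conditions, reduces to a per-pair Pearson correlation; the gene identifiers, pairing, and values below are invented for illustration.

    ```python
    from scipy.stats import pearsonr

    # Hypothetical expression trajectories over the same six matched conditions
    # for an Arabidopsis gene and a barley gene assumed here to be its ortholog.
    arabidopsis = {"AT_gene": [2.1, 3.4, 5.0, 4.2, 1.9, 1.2]}
    barley      = {"Hv_gene": [1.8, 3.1, 4.7, 4.5, 2.2, 1.0]}
    ortholog_pairs = [("AT_gene", "Hv_gene")]

    for at_gene, hv_gene in ortholog_pairs:
        r, p = pearsonr(arabidopsis[at_gene], barley[hv_gene])
        print(f"{at_gene} vs {hv_gene}: r = {r:.2f}, p = {p:.3f}")
    ```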

  20. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

    Process integration techniques were applied, particularly through the Pinch Analysis method, to the sugarcane industry. Research was performed on harvest data from an agroindustrial complex which processes in excess of 3.5 million metric tons of sugarcane per year, producing motor fuel grade ethanol, standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used in assessing internal heat recovery as well as external utility demand targets, while keeping the lowest, but economically achievable, targets for entropy increase. Energy-use efficiency was evaluated for the plant as it was found (the base case) as well as for five selected process and/or plant design modifications, always with guidance from the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while the raw material supply of the base case is kept; neither case proved worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. The inexpensive, minor modifications considered in case 5 were found unable to produce a reasonable outcome gain. Nevertheless, proper changes in the cane juice evaporation section (case 6) could allow sugar and ethanol combined production to rise by up to 9.1% relative to the base case, without dropping cogenerated power. (author)
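
    Utility targeting, the step that yields the external utility demand targets mentioned above, can be sketched with the classic problem-table cascade of Pinch Analysis. The stream data and minimum approach temperature below are invented for illustration; they are not the plant's data.

    ```python
    # Problem-table algorithm: minimum hot/cold utility targets for a stream set.
    dt_min = 10.0  # minimum approach temperature, K
    streams = [  # (T_supply, T_target, CP [kW/K]); hot streams cool, cold streams heat
        (180.0, 60.0, 2.0),    # hot
        (130.0, 30.0, 2.6),    # hot
        (30.0, 100.0, 4.0),    # cold
        (80.0, 160.0, 4.0),    # cold
    ]

    def shifted(Ts, Tt):
        """Shift hot streams down and cold streams up by dt_min/2."""
        d = -dt_min / 2 if Ts > Tt else dt_min / 2
        return Ts + d, Tt + d

    bounds = sorted({t for s in streams for t in shifted(s[0], s[1])}, reverse=True)
    heat_flow = 0.0
    cascade = [heat_flow]
    for hi, lo in zip(bounds, bounds[1:]):
        net_cp = 0.0
        for Ts, Tt, cp in streams:
            s_a, s_b = shifted(Ts, Tt)
            top, bot = max(s_a, s_b), min(s_a, s_b)
            if top >= hi and bot <= lo:             # stream spans this interval
                net_cp += cp if Ts > Tt else -cp    # hot streams release heat
        heat_flow += net_cp * (hi - lo)
        cascade.append(heat_flow)

    q_hot = max(0.0, -min(cascade))      # minimum hot utility
    q_cold = cascade[-1] + q_hot         # minimum cold utility
    print(f"Q_hot,min = {q_hot:.1f} kW, Q_cold,min = {q_cold:.1f} kW")
    ```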

  1. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Phillips, Jason J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms (ATF), Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  2. Integrated Genomic Analysis of the Ubiquitin Pathway across Cancer Types

    Directory of Open Access Journals (Sweden)

    Zhongqi Ge

    2018-04-01

    Full Text Available Summary: Protein ubiquitination is a dynamic and reversible process of adding single ubiquitin molecules or various ubiquitin chains to target proteins. Here, using multidimensional omic data of 9,125 tumor samples across 33 cancer types from The Cancer Genome Atlas, we perform comprehensive molecular characterization of 929 ubiquitin-related genes and 95 deubiquitinase genes. Among them, we systematically identify top somatic driver candidates, including mutated FBXW7 with cancer-type-specific patterns and amplified MDM2 showing a mutually exclusive pattern with BRAF mutations. Ubiquitin pathway genes tend to be upregulated in cancer mediated by diverse mechanisms. By integrating pan-cancer multiomic data, we identify a group of tumor samples that exhibit worse prognosis. These samples are consistently associated with the upregulation of cell-cycle and DNA repair pathways, characterized by mutated TP53, MYC/TERT amplification, and APC/PTEN deletion. Our analysis highlights the importance of the ubiquitin pathway in cancer development and lays a foundation for developing relevant therapeutic strategies. : Ge et al. analyze a cohort of 9,125 TCGA samples across 33 cancer types to provide a comprehensive characterization of the ubiquitin pathway. They detect somatic driver candidates in the ubiquitin pathway and identify a cluster of patients with poor survival, highlighting the importance of this pathway in cancer development. Keywords: ubiquitin pathway, pan-cancer analysis, The Cancer Genome Atlas, tumor subtype, cancer prognosis, therapeutic targets, biomarker, FBXW7
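
    Not from the paper: a sketch of how a mutually exclusive alteration pattern, such as the MDM2-amplification/BRAF-mutation pattern mentioned above, can be tested with Fisher's exact test. The per-sample alteration calls below are simulated, not TCGA data.

```python
import numpy as np
from scipy.stats import fisher_exact

# Simulated per-sample alteration calls (1 = altered); exclusivity is
# forced here so the test has something to detect.
rng = np.random.default_rng(0)
mdm2_amp = rng.integers(0, 2, size=200)
braf_mut = np.where(mdm2_amp == 1, 0, rng.integers(0, 2, size=200))

# 2x2 contingency table: rows = MDM2 status, columns = BRAF status.
table = [
    [np.sum((mdm2_amp == 1) & (braf_mut == 1)), np.sum((mdm2_amp == 1) & (braf_mut == 0))],
    [np.sum((mdm2_amp == 0) & (braf_mut == 1)), np.sum((mdm2_amp == 0) & (braf_mut == 0))],
]

# Odds ratio < 1 with a small one-sided p-value indicates the alterations
# co-occur less often than expected, i.e., mutual exclusivity.
odds_ratio, p_value = fisher_exact(table, alternative="less")
print(f"odds ratio = {odds_ratio:.3f}, one-sided p = {p_value:.2e}")
```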

  3. Penalized differential pathway analysis of integrative oncogenomics studies

    NARCIS (Netherlands)

    van Wieringen, W.N.; van de Wiel, M.A.

    2014-01-01

    Through integration of genomic data from multiple sources, we may obtain a more accurate and complete picture of the molecular mechanisms underlying tumorigenesis. We discuss the integration of DNA copy number and mRNA gene expression data from an observational integrative genomics study involving

  4. An integrated internal flow analysis for ramjet propulsion system

    Science.gov (United States)

    Hsieh, Shih-Yang

    An integrated numerical analysis has been conducted to study the ramjet internal flowfield. Emphasis is placed on the establishment of a unified numerical scheme and an accurate representation of the internal flow development. The theoretical model is based on the complete conservation equations of mass, momentum, energy, and species concentration, with consideration of finite-rate chemical reactions and variable properties. Turbulence closure is achieved using a low-Reynolds-number k-epsilon two-equation model. A new computational procedure capable of treating time-accurate, chemically reacting flows over a wide range of Mach numbers was developed. This numerical scheme allows for a unified treatment of the entire flowfield in a ramjet engine, including both the supersonic inlet and the combustion chamber. The algorithm is based on scaling the pressure terms in the momentum equations and preconditioning the conservation equations to circumvent numerical difficulties at low Mach numbers. The resulting equations are solved using the lower-upper (LU) factorization method in a fully coupled manner, with the incorporation of a flux-differencing upwind TVD scheme to achieve high-order spatial accuracy. The transient behavior of the modeled system is preserved through implementation of the dual time-stepping integration technique. Calculations have been carried out for the flowfield in a typical ramjet engine consisting of an axisymmetric mixed-compression supersonic inlet and a coaxial dump combustor. Distinct shock structures in the forward section of the inlet were clearly captured. The boundary-layer thickening and flow separation behind the terminal shock due to shock/boundary-layer interactions and inlet configuration were observed. The mutual coupling between the inlet and combustor was carefully examined. In particular, strong vortices arising from the inlet shock/acoustic and shock/boundary-layer interactions may convect downstream and affect the combustion
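
    Not the paper's scheme: a toy sketch of the dual time-stepping idea named above, applied to 1-D linear advection. Each implicit (backward Euler) physical time step is converged by explicit pseudo-time iterations; all parameters are invented, and the paper's preconditioned, LU-factored TVD method is far more elaborate.

```python
import numpy as np

# Dual time-stepping for u_t + a u_x = 0 on a periodic domain.
nx, a, L = 100, 1.0, 1.0
dx = L / nx
dt = 5.0 * dx / a              # physical step beyond the explicit CFL limit
dtau = 0.4 * dx / a            # pseudo-time step (explicit, stable)

x = np.linspace(0.0, L, nx, endpoint=False)
u = np.exp(-100.0 * (x - 0.3) ** 2)   # initial Gaussian pulse

def dudx_upwind(v):
    return (v - np.roll(v, 1)) / dx   # first-order upwind difference

for step in range(10):                 # physical time steps
    u_old = u.copy()
    for it in range(400):              # pseudo-time (inner) iterations
        # Unsteady residual of the backward-Euler discretization.
        res = (u - u_old) / dt + a * dudx_upwind(u)
        u = u - dtau * res             # march in pseudo-time toward res = 0
        if np.max(np.abs(res)) < 1e-8:
            break
    print(f"step {step + 1}: inner iterations = {it + 1}")
```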

  5. Packaged integrated opto-fluidic solution for harmful fluid analysis

    Science.gov (United States)

    Allenet, T.; Bucci, D.; Geoffray, F.; Canto, F.; Couston, L.; Jardinier, E.; Broquin, J.-E.

    2016-02-01

    Advances in nuclear fuel reprocessing have led to a surging need for novel chemical analysis tools. In this paper, we present as a solution a packaged lab-on-chip approach with co-integration of optical and micro-fluidic functions on a glass substrate. A chip was built and packaged to obtain light/fluid interaction so that the entire device can make spectral measurements based on the principle of absorption spectroscopy. The interaction between the analyte solution and light takes place at the boundary between a waveguide and a fluid micro-channel, thanks to the evanescent part of the waveguide's guided mode that propagates into the fluid. The waveguide was obtained via ion exchange on a glass wafer. The input and the output of the waveguides were pigtailed with standard single-mode optical fibers. The micro-scale fluid channel was fabricated with a lithography procedure and hydrofluoric acid wet etching, resulting in a 150+/-8 μm deep channel. The channel was designed with fluidic accesses so that the chip is compatible with commercial fluidic interfaces/chip mounts. This allows analyte fluid in external capillaries to be pumped into the device through micro-pipes, resulting in a fully packaged chip. To produce this co-integrated structure, two substrates were bonded. A study of direct glass wafer-to-wafer molecular bonding was carried out to improve detector sturdiness and durability, and it put forward a bonding protocol with a bonding surface energy of γ > 2.0 J.m-2. Detector viability was shown by obtaining optical mode measurements and by detecting traces of 1.2 M neodymium (Nd) solute in 12+/-1 μL of 0.01 M, pH 2 nitric acid (HNO3) solvent via an absorption peak specific to neodymium at 795 nm.
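
    As background to the measurement principle, here is a hedged Beer-Lambert sketch of how such an evanescent-wave sensor could infer concentration from an absorbance reading. The molar absorptivity, the evanescent power fraction, the interaction length, and the optical powers are all invented placeholders, not values from the paper.

```python
import numpy as np

# Beer-Lambert estimate: A = epsilon * c * L_eff, with L_eff an effective
# interaction length scaled by the fraction of modal power in the fluid.
epsilon_nd_795 = 10.0       # assumed molar absorptivity at 795 nm [L/(mol*cm)]
geometric_len_cm = 2.0      # assumed length of the interaction zone [cm]
evanescent_fraction = 0.05  # assumed share of guided power in the fluid

P_in, P_out = 1.00, 0.87    # assumed optical powers before/after interaction
absorbance = -np.log10(P_out / P_in)

L_eff = geometric_len_cm * evanescent_fraction
concentration = absorbance / (epsilon_nd_795 * L_eff)  # [mol/L]
print(f"A = {absorbance:.3f}, estimated c = {concentration:.2f} M")
```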

  6. Spirituality and the physician executive.

    Science.gov (United States)

    Kaiser, L R

    2000-01-01

    The "s" word can now be spoken without flinching in health care organizations. Spirituality is becoming a common topic in management conferences around the world. Many U.S. corporations are recognizing the role of spirituality in creating a new humanistic capitalism that manages beyond the bottom line. Spirituality refers to a broad set of principles that transcend all religions. It is the relationship between yourself and something larger, such as the good of your patient or the welfare of the community. Spirituality means being in right relationship to all that is and understanding the mutual interdependence of all living beings. Physician executives should be primary proponents of spirituality in their organizations by: Modeling the power of spirituality in their own lives; integrating spiritual methodologies into clinical practice; fostering an integrative approach to patient care; encouraging the organization to tithe its profits for unmet community health needs; supporting collaborative efforts to improve the health of the community; and creating healing environments.

  7. Harmonic analysis in integrated energy system based on compressed sensing

    International Nuclear Information System (INIS)

    Yang, Ting; Pen, Haibo; Wang, Dan; Wang, Zhaoxia

    2016-01-01

    Highlights: • We propose a harmonic/inter-harmonic analysis scheme based on compressed sensing theory. • The sparsity of harmonic signals in electrical power systems is proved. • The ratio formula for the sparsity of fundamental and harmonic components is presented. • A Spectral Projected Gradient-Fundamental Filter reconstruction algorithm is proposed. • SPG-FF enhances the precision of harmonic detection and signal reconstruction. - Abstract: The advent of Integrated Energy Systems has enabled various distributed energy resources to access the system through different power electronic devices. This development has made the harmonic environment more complex, so harmonic detection and analysis methods of low complexity and high precision are needed to improve power quality. To overcome the large data storage requirements and the high compression complexity of sampling under the Nyquist framework, this paper presents a harmonic analysis scheme based on compressed sensing theory. The proposed scheme performs compressive sampling, signal reconstruction, and harmonic detection simultaneously. First, the sparsity of the harmonic signals in the Discrete Fourier Transform (DFT) basis is calculated numerically, followed by a proof that the necessary conditions for compressed sensing are satisfied. A binary sparse measurement matrix is then leveraged to reduce the storage space of the sampling unit. For the recovery process, a novel reconstruction algorithm, the Spectral Projected Gradient with Fundamental Filter (SPG-FF) algorithm, is proposed to enhance the reconstruction precision. An actual microgrid system is used as a simulation example. The experimental results show that the proposed scheme effectively enhances the precision of harmonic and inter-harmonic detection with low computing complexity, and has good
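
    The SPG-FF algorithm itself is not reproduced here. As a generic stand-in, the sketch below illustrates the same pipeline on synthetic data: a harmonic signal that is sparse in the DFT basis is sampled with a binary random matrix at sub-Nyquist rates and recovered by orthogonal matching pursuit (OMP). Signal, sizes, and matrix are assumptions; recovery quality depends on the sensing matrix.

```python
import numpy as np

# Compressed-sensing sketch: sparse-in-DFT harmonic signal, binary sensing
# matrix, OMP reconstruction (a generic substitute for SPG-FF).
rng = np.random.default_rng(1)
n, m, k = 256, 96, 8                       # length, measurements, sparsity budget

t = np.arange(n) / n
x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 150 * t)

Phi = rng.integers(0, 2, size=(m, n)).astype(float)   # binary sensing matrix
Psi = np.fft.ifft(np.eye(n))               # DFT synthesis basis: x = Psi @ s
A = Phi @ Psi                              # measurements are linear in s
y = Phi @ x

# OMP: greedily pick the atom most correlated with the residual, then
# re-fit all chosen atoms by least squares.
support, residual = [], y.astype(complex)
for _ in range(k):
    idx = int(np.argmax(np.abs(A.conj().T @ residual)))
    if idx not in support:
        support.append(idx)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

s_hat = np.zeros(n, dtype=complex)
s_hat[support] = coef
x_hat = np.real(Psi @ s_hat)
print(f"relative reconstruction error: "
      f"{np.linalg.norm(x - x_hat) / np.linalg.norm(x):.2e}")
```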

  8. [Integrity].

    Science.gov (United States)

    Gómez Rodríguez, Rafael Ángel

    2014-01-01

    To say that someone possesses integrity is to claim that the person is largely predictable in his or her responses to specific situations, and that he or she can judge prudently and act correctly. There is a close interrelationship between integrity and autonomy, and autonomy rests on the deeper moral claim of all humans to integrity of the person. Integrity has two senses of significance for medical ethics: one sense refers to the integrity of the person in its bodily, psychosocial, and intellectual elements; in the second sense, integrity is a virtue. Another facet of the integrity of the person is the integrity of the values we cherish and espouse. The physician must be a person of integrity if the integrity of the patient is to be safeguarded. Autonomy has reduced violations in the past, but the character and virtues of the physician are the ultimate safeguard of patient autonomy. A very important field in medicine is scientific research. It is the character of the investigator that determines the moral quality of research. Problems arise when legitimate self-interest is replaced by selfishness, particularly when human subjects are involved. The final safeguard of the moral quality of research is the character and conscience of the investigator. Teaching must remain relevant in the scientific field, but the most effective way to teach virtue ethics is through the example of a respected scientist.

  9. From organizational integration to clinical integration: analysis of the path between one level of integration to another using official documents

    Science.gov (United States)

    Mandza, Matey; Gagnon, Dominique; Carrier, Sébastien; Belzile, Louise; Demers, Louis

    2010-01-01

    Purpose Service integration comprises organizational, normative, economic, informational and clinical dimensions. Since 2004, the province of Quebec has devoted significant efforts to unifying the governance of the main health and social care organizations of its various territories. Notwithstanding the uniformity of the national plan’s prescription, the territorial integration modalities vary greatly across the province. Theory This research is based upon a conceptual model of integration that comprises six components: inter-organizational partnership, case management, standardized assessment, a single entry point, a standardized service planning tool and a shared clinical file. Methods We conducted an embedded case study in six sites contrasted in terms of their level of integration. All documents prescribing the implementation of integration were retrieved and analyzed. Results and conclusions The analyzed documents demonstrate a growing local appropriation of the current integrative reform. Interestingly, however, no link seems to exist between the quality of local prescriptions and the level of integration achieved in each site. This finding leads us to hypothesize that the variable quality of the operational support provided to implement these prescriptions is a factor in play.

  10. The International Research Experience: Executive MBA Distinctiveness.

    Science.gov (United States)

    Ambrose, David M.; Pol, Louis G.

    1995-01-01

    The University of Nebraska's Executive Master's in Business Administration (MBA) program has integrated international research activities into the curriculum. The university contracted with domestic corporations to conduct studies on prospects for international business. Research assignments include assessment of competitors, economic evaluations,…

  11. Evaluation and analysis of emergency maintenance due by third party action's, formulation and execution of contingency plans

    Energy Technology Data Exchange (ETDEWEB)

    Torres Vega, Raul; Nunez Ribera, Gary [TRANSIERRA S.A., Santa Cruz (Bolivia)

    2009-07-01

    In September 2008, atypical situations took place on the Yacuiba - Rio Grande Gas Pipeline (GASYRG), located in southern Bolivia. Owing to the political events and social mobilizations in which the country was immersed, a series of third-party actions developed that put the facilities of the gas pipeline at risk. These actions resulted in the rupture of a 1-inch instrumentation pipe, causing a leak which after some time caught fire and caused an interruption of the transportation service in that section of the pipeline; later on, another action led to a safety valve shutdown, causing a total pipeline shutdown. In addition to these events, we experienced a fuel shortage, road blocks, and a telephone communication system failure. In spite of these obstacles, the maintenance activities were carried out and emergency repairs returned the gas pipeline to operating condition, a task that was accomplished in a very short time considering the situation. Later analysis, taking into account all the adverse elements of the situation, resulted in the adoption of a series of measures and plans aimed at mitigating the risk associated with this type of event, such as mutual aid plans with fellow companies and institutions, strengthening of site security, stocking of materials for emergency repairs, etc. (author)

  12. Energy optimization and prediction of complex petrochemical industries using an improved artificial neural network approach integrating data envelopment analysis

    International Nuclear Information System (INIS)

    Han, Yong-Ming; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-01-01

    Graphical abstract: This paper proposes an energy optimization and prediction method for complex petrochemical industries based on a DEA-integrated ANN approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA-ANN prediction model is effectively verified by executing a linear comparison between all DMUs and the effective DMUs using the standard data source from the UCI (University of California at Irvine) repository. Finally, the proposed model is validated through an application in a complex ethylene production system of the Chinese petrochemical industry. Meanwhile, the optimization result and the prediction value are obtained to reduce energy consumption of the ethylene production system, guide ethylene production, and improve energy efficiency. - Highlights: • The DEA-integrated ANN approach is proposed. • The DEA-ANN prediction model is effectively verified through the standard data source from the UCI repository. • The energy optimization and prediction framework of complex petrochemical industries based on the proposed method is obtained. • The proposed method is valid and efficient in improving energy efficiency in complex petrochemical plants. - Abstract: Since complex petrochemical data are high-dimensional, uncertain, and noisy, it is difficult to accurately optimize and predict the energy usage of complex petrochemical systems. Therefore, this paper proposes a data envelopment analysis (DEA) integrated artificial neural network (ANN) approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA
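
    Not from the paper: a minimal input-oriented CCR DEA sketch on invented data, illustrating the efficiency-scoring step that a DEA-integrated approach builds on. The paper's actual DEA model with slack variables is more elaborate; everything below (data, model form) is an assumption.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR envelopment model: minimize theta subject to the
# composite unit using no more input and no less output than DMU j0.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.5], [1.2]])                      # outputs
n_dmu = X.shape[0]

def ccr_efficiency(j0):
    # Decision variables z = [theta, lambda_1 ... lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n_dmu)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
    A_in = np.c_[-X[j0].reshape(-1, 1), X.T]
    b_in = np.zeros(X.shape[1])
    # Outputs: -sum_j lambda_j * y_rj <= -y_r,j0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    b_out = -Y[j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n_dmu)
    return res.fun                      # theta* = efficiency score, 1 = efficient

for j in range(n_dmu):
    print(f"DMU {j + 1}: efficiency = {ccr_efficiency(j):.3f}")
```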

  13. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by the Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination; fracture mechanics; and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis also includes NDE and materials degradation. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to the applications of probabilistic modelling to nuclear components; Session 4 was a concluding panel discussion

  14. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    International Nuclear Information System (INIS)

    1996-01-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by the Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination; fracture mechanics; and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis also includes NDE and materials degradation. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to the applications of probabilistic modelling to nuclear components; Session 4 was a concluding panel discussion

  15. Canonical integration and analysis of periodic maps using non-standard analysis and Lie methods

    Energy Technology Data Exchange (ETDEWEB)

    Forest, E.; Berz, M.

    1988-06-01

    We describe a method and a way of thinking which is ideally suited for the study of systems represented by canonical integrators. Starting with the continuous description provided by the Hamiltonians, we replace it by a succession of preferably canonical maps. The power series representation of these maps can be extracted with a computer implementation of the tools of Non-Standard Analysis and analyzed by the same tools. For a nearly integrable system, we can define a Floquet ring in a way consistent with our needs. Using the finite time maps, the Floquet ring is defined only at the locations s_i where one perturbs or observes the phase space. At most the total number of locations is equal to the total number of steps of our integrator. We can also produce pseudo-Hamiltonians which describe the motion induced by these maps. 15 refs., 1 fig.

  16. Executable research compendia in geoscience research infrastructures

    Science.gov (United States)

    Nüst, Daniel

    2017-04-01

    From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [3]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format that closes the gap in dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entity; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with

  17. Marketing the Masters of Executive Management program

    OpenAIRE

    Barrera, Mark A.; Karriker, Timothy W.

    2007-01-01

    MBA Professional Report The purpose of this MBA project was to review the current Masters of Executive Management education curriculum at NPS. An internal analysis of the current program was conducted to fully understand the strategic goals of the program and the existing curriculum. An environmental scan of current and potential military customers was conducted to assess requirements for junior executive education and determine whether the MEM program corresponds with these requiremen...

  18. Does cortisol influence core executive functions? A meta-analysis of acute cortisol administration effects on working memory, inhibition, and set-shifting.

    Science.gov (United States)

    Shields, Grant S; Bonner, Joseph C; Moons, Wesley G

    2015-08-01

    The hormone cortisol is often believed to play a pivotal role in the effects of stress on human cognition. This meta-analysis is an attempt to determine the effects of acute cortisol administration on core executive functions. Drawing on both rodent and stress literatures, we hypothesized that acute cortisol administration would impair working memory and set-shifting but enhance inhibition. Additionally, because cortisol is thought to exert different nongenomic (rapid) and genomic (slow) effects, we further hypothesized that the effects of cortisol would differ as a function of the delay between cortisol administration and cognitive testing. Although the overall analyses were nonsignificant, after separating the rapid, nongenomic effects of cortisol from the slower, genomic effects of cortisol, the rapid effects of cortisol enhanced response inhibition, g+ = 0.113, p=.016, but impaired working memory, g+ = -0.315, p=.008, although these effects reversed over time. Contrary to our hypotheses, there was no effect of cortisol administration on set-shifting. Thus, although we did not find support for the idea that increases in cortisol influence set-shifting, we found that acute increases in cortisol exert differential effects on working memory and inhibition over time. Copyright © 2015 Elsevier Ltd. All rights reserved.
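
    As background to the reported g+ values, the sketch below shows how a single study's standardized mean difference is converted to Hedges' g (Cohen's d with a small-sample bias correction). The group statistics are invented, not data from the meta-analysis.

```python
import math

# Hedges' g for one hypothetical two-group comparison.
m_treat, sd_treat, n_treat = 7.4, 1.8, 25   # cortisol group (assumed)
m_ctrl, sd_ctrl, n_ctrl = 6.5, 1.6, 25      # placebo group (assumed)

s_pooled = math.sqrt(((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
                     / (n_treat + n_ctrl - 2))
d = (m_treat - m_ctrl) / s_pooled           # Cohen's d
df = n_treat + n_ctrl - 2
J = 1 - 3 / (4 * df - 1)                    # small-sample correction factor
g = J * d
print(f"d = {d:.3f}, g = {g:.3f}")
```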

  19. Nurse executive transformational leadership found in participative organizations.

    Science.gov (United States)

    Dunham-Taylor, J

    2000-05-01

    The study examined a national sample of 396 randomly selected hospital nurse executives to explore transformational leadership, stage of power, and organizational climate. Results from a few nurse executive studies have found nurse executives were transformational leaders. As executives were more transformational, they achieved better staff satisfaction and higher work group effectiveness. This study integrates Bass' transformational leadership model with Hagberg's power stage theory and Likert's organizational climate theory. Nurse executives (396) and staff reporting to them (1,115) rated the nurse executives' leadership style, staff extra effort, staff satisfaction, and work group effectiveness using Bass and Avolio's Multifactor Leadership Questionnaire. Executives' bosses (360) rated executive work group effectiveness. Executives completed Hagberg's Personal Power Profile and ranked their organizational climate using Likert's Profile of Organizational Characteristics. Nurse executives used transformational leadership fairly often; achieved fairly satisfied staff levels; were very effective according to bosses; were most likely at stage 3 (power by achievement) or stage 4 (power by reflection); and rated their hospital as a Likert System 3 Consultative Organization. Staff satisfaction and work group effectiveness decreased as nurse executives were more transactional. Higher transformational scores tended to occur with higher educational degrees and within more participative organizations. Transformational qualities can be enhanced by further education, by achieving higher power stages, and by being within more participative organizations.

  20. Autism Spectrum Disorder and intact executive functioning.

    Science.gov (United States)

    Ferrara, R; Ansermet, F; Massoni, F; Petrone, L; Onofri, E; Ricci, P; Archer, T; Ricci, S

    2016-01-01

    The earliest notions concerning autism (Autism Spectrum Disorders, ASD) describe a disturbance in executive functioning. Despite changing definitions, executive functions, the higher cognitive skills required for the complex behaviors linked to the prefrontal cortex, are defective in autism. Specific difficulties at the level of executive functioning have been identified in children presenting autism or verbal disabilities. Nevertheless, the developmental deficit of executive functioning in autism is highly diversified, with huge individual variation, and may even be absent. The aim of the present study is to examine the current standing of intact executive functioning in ASD. Analysis of ASD populations, whether high-functioning, Asperger's, or autism Broad Phenotype, studied over a range of executive functions including response inhibition, planning, cognitive flexibility, cognitive inhibition, and alerting networks, indicates an absence of damage/impairment compared to typically developing control subjects. These findings of intact executive functioning in ASD subjects provide a strong foundation on which to construct applications for growth environments and the rehabilitation of autistic subjects.

  1. 75 FR 55816 - Senior Executive Service Performance Review Board Membership

    Science.gov (United States)

    2010-09-14

    ... DEPARTMENT OF THE INTERIOR Council of the Inspectors General on Integrity and Efficiency Senior..., each agency is required to establish one or more Senior Executive Service (SES) performance review boards. The purpose of these boards is to review and evaluate the initial appraisal of a senior executive...

  2. 78 FR 52513 - Senior Executive Service Performance Review Board Membership

    Science.gov (United States)

    2013-08-23

    ... COUNCIL OF THE INSPECTORS GENERAL ON INTEGRITY AND EFFICIENCY Senior Executive Service Performance... Management, each agency is required to establish one or more Senior Executive Service (SES) performance review boards. The purpose of these boards is to review and evaluate the initial appraisal of a senior...

  3. Microblogging for Class: An Analysis of Affective, Cognitive, Personal Integrative, and Social Integrative Gratifications

    Science.gov (United States)

    Gant, Camilla; Hadley, Patrick D.

    2014-01-01

    This study shows that undergraduate students can gratify cognitive, affective, social integrative, and personal integrative needs by microblogging via a learning management system discussion tool. Moreover, the researchers find that microblogging about news regarding mass media events and issues via Blackboard heightened engagement, expanded…

  4. Executive dysfunction, brain aging, and political leadership.

    Science.gov (United States)

    Fisher, Mark; Franklin, David L; Post, Jerrold M

    2014-01-01

    Decision-making is an essential component of executive function, and a critical skill of political leadership. Neuroanatomic localization studies have established the prefrontal cortex as the critical brain site for executive function. In addition to the prefrontal cortex, white matter tracts as well as subcortical brain structures are crucial for optimal executive function. Executive function shows a significant decline beginning at age 60, and this is associated with age-related atrophy of prefrontal cortex, cerebral white matter disease, and cerebral microbleeds. Notably, age-related decline in executive function appears to be a relatively selective cognitive deterioration, generally sparing language and memory function. While an individual may appear to be functioning normally with regard to relatively obvious cognitive functions such as language and memory, that same individual may lack the capacity to integrate these cognitive functions to achieve normal decision-making. From a historical perspective, global decline in cognitive function of political leaders has been alternatively described as a catastrophic event, a slowly progressive deterioration, or a relatively episodic phenomenon. Selective loss of executive function in political leaders is less appreciated, but increased utilization of highly sensitive brain imaging techniques will likely bring greater appreciation to this phenomenon. Former Israeli Prime Minister Ariel Sharon was an example of a political leader with a well-described neurodegenerative condition (cerebral amyloid angiopathy) that creates a neuropathological substrate for executive dysfunction. Based on the known neuroanatomical and neuropathological changes that occur with aging, we should probably assume that a significant proportion of political leaders over the age of 65 have impairment of executive function.

  5. An example of system integration for RCRA policy analysis

    International Nuclear Information System (INIS)

    Tonn, B.; Goeltz, R.; Schmidt, K.

    1991-01-01

    This paper describes the synthesis of various computer technologies and software systems used on a project to estimate the costs of remediating Solid Waste Management Units (SWMUs) that fall under the corrective action provisions of the Resource Conservation and Recovery Act (RCRA). The project used two databases collected by Research Triangle Institute (RTI) that contain information on SWMUs and a PC-based software system called CORA that develops cost estimates for remediating SWMUs. The project team developed rules to categorize every SWMU in the databases by the kinds of technologies required to clean them up. These results were input into CORA, which estimated costs associated with the technologies. Early on, several computing challenges presented themselves. First, the databases have several hundred thousand records each. Second, the categorization rules could not be written to cover all combinations of variables. Third, CORA is run interactively and the analysis plan called for running CORA tens of thousands of times. Fourth, large data transfers needed to take place between RTI and Oak Ridge National Laboratory. Solutions to these problems required systems integration. SWMU categorization was streamlined by using INTERNET as was the data transfer. SAS was used to create files used by a program called SuperKey that was used to run CORA. Because the analysis plan required the generation of hundreds of thousands of cost estimates, memory management software was needed to allow the portable IBM P70 to do the job. During the course of the project, several other software packages were used, including: SAS System for Personal Computers (SAS/PC), DBase III, LOTUS 1-2-3, PIZAZZ PLUS, LOTUS Freelance Plus, and Word Perfect. Only the comprehensive use of all available hardware and software resources allowed this project to be completed within the time and budget constraints. 5 refs., 3 figs., 3 tabs

  6. The Venetian Ghetto: Semantic Modelling for an Integrated Analysis

    Directory of Open Access Journals (Sweden)

    Alessandra Ferrighi

    2017-12-01

    Full Text Available In the digital era, historians are embracing information technology as a research tool. New technologies offer investigation and interpretation, synthesis and communication tools that are more effective than the more traditional study methods, as they guarantee a multidisciplinary approach and the integration of analyses. Among the available technologies, the best suited for the study of urban phenomena are databases (DB), the Geographic Information System (GIS), Building Information Modelling (BIM), and multimedia tools (video, apps) for the dissemination of results. The case study described here concerns the analysis of the part of Venice that changed its appearance from 1516 onwards, with the creation of the Jewish Ghetto. This was an event that would have repercussions throughout Europe, changing the course of history. Our research confirms that the exclusive use of one of the systems mentioned above (DB, GIS, BIM) makes it possible to manage the complexity of the subject matter only partially. Consequently, it became necessary to analyse the possible interactions between such tools, so as to create a link between an alphanumeric DB and a geographical DB. The combined use of GIS and BIM, which provide for 4D time management of objects, turned out to be able to manage information and geometry in an effective and scalable way, providing a starting point for the in-depth mapping of the historical analysis. Software products for digital modelling have changed in nature over time, going from simple viewing tools to simulation tools. The reconstruction of the time phases of the three Ghettos (Nuovo, Vecchio, and Nuovissimo) and their visualisation through digital narratives of the history of that specific area of the city, for instance through videos, is making it possible for an increasing number of scholars and the general public to access the results of the study.

  7. IRRAS, Integrated Reliability and Risk Analysis System for PC

    International Nuclear Information System (INIS)

    Russell, K.D.

    1995-01-01

    1 - Description of program or function: IRRAS 4.16 is a program developed for the purpose of performing those functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA). This program includes functions to allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included in this program are features to allow the analyst to generate reports and displays that can be used to document the results of an analysis. Since this software is a very detailed technical tool, the user of this program should be familiar with PRA concepts and the methods used to perform these analyses. 2 - Method of solution: IRRAS 4.16 is written entirely in MODULA-2 and uses an integrated commercial graphics package to interactively construct and edit fault trees. The fault tree solving methods used are industry-recognized top-down algorithms. For quantification, the program uses standard methods to propagate the failure information through the generated cut sets. 3 - Restrictions on the complexity of the problem: Due to the complexity and the variety of ways in which a fault tree can be defined, it is difficult to define limits on the complexity of the problem solved by this software. It is, however, capable of solving substantial fault trees thanks to its efficient methods. At this time, the software can efficiently solve problems as large as other software currently used on mainframe computers. Does not include source code
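
    Not part of the record: a toy illustration of the top-down minimal cut set generation that a fault tree solver of this kind performs. The gate table and event names below are invented.

```python
# Top-down minimal cut set generation for a small invented fault tree.
GATES = {
    "TOP": ("OR", ["G1", "B3"]),
    "G1":  ("AND", ["B1", "G2"]),
    "G2":  ("OR", ["B2", "B3"]),
}

def cut_sets(node):
    if node not in GATES:                 # basic event
        return [frozenset([node])]
    kind, children = GATES[node]
    child_sets = [cut_sets(c) for c in children]
    if kind == "OR":                      # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    result = [frozenset()]                # AND: cross-product of cut sets
    for sets in child_sets:
        result = [a | b for a in result for b in sets]
    return result

def minimize(sets):
    # Keep only minimal cut sets (discard supersets of smaller ones).
    minimal = []
    for cs in sorted(set(sets), key=len):
        if not any(m <= cs for m in minimal):
            minimal.append(cs)
    return minimal

for cs in minimize(cut_sets("TOP")):
    print(sorted(cs))                     # prints ['B3'] and ['B1', 'B2']
```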

  8. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    Directory of Open Access Journals (Sweden)

    Maria Cleofe Giorgino

    2017-11-01

    Full Text Available Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality of the information being disclosed. This article aims to address this gap by exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). The distinguishing features of this tool are that it aims to represent the multidimensional impact of the organization's activity and that it assumes materiality as a guiding principle of report drafting. Adopting the event study methodology associated with a statistical significance test for categorical data, our results verify that an organization's release of an IR is able to produce a statistically significant impact on the related share prices. Moreover, the term "integrated" assigned to the reports plays a significant role in the impact on capital markets. Our findings have beneficial implications for both researchers and practitioners, adding new evidence for the usefulness of IR as a corporate disclosure tool and for the effect of an organization's decision to disclose material information.
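
    Not from the article: a minimal market-model event study sketch on synthetic returns, illustrating the machinery the article applies. The window lengths, return series, and the assumed IR announcement effect are all invented.

```python
import numpy as np

# Market-model event study: estimate alpha/beta in an estimation window,
# then compute abnormal returns (AR) and the cumulative abnormal return
# (CAR) over the event window.
rng = np.random.default_rng(7)
r_market = rng.normal(0.0004, 0.01, 260)                 # daily market returns
r_stock = 0.0002 + 1.1 * r_market + rng.normal(0.0, 0.008, 260)
r_stock[250:255] += 0.004                                # hypothetical IR effect

est, evt = slice(0, 250), slice(250, 255)                # windows
beta, alpha = np.polyfit(r_market[est], r_stock[est], 1) # OLS market model

resid = r_stock[est] - (alpha + beta * r_market[est])
sigma = resid.std(ddof=2)                                # residual std. dev.

ar = r_stock[evt] - (alpha + beta * r_market[evt])       # abnormal returns
car = ar.sum()
t_stat = car / (sigma * np.sqrt(ar.size))                # simple CAR t-test
print(f"CAR = {car:.4f}, t = {t_stat:.2f}")
```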

  9. Linking Ayurveda and Western medicine by integrative analysis

    Directory of Open Access Journals (Sweden)

    Fazlin Mohd Fauzi

    2013-01-01

    Full Text Available In this article, we discuss our recent work in elucidating the mode-of-action of compounds used in traditional medicine including Ayurvedic medicine. Using a computational ('in silico') approach, we predict potential targets for Ayurvedic anti-cancer compounds, obtained from the Indian Plant Anticancer Database, given their chemical structures. In our analysis, we observed that: (i) the targets predicted can be connected to cancer pathogenesis, i.e. steroid-5-alpha reductase 1 and 2 and estrogen receptor-β, and (ii) predominantly hormone-dependent cancer targets were predicted for the anti-cancer compounds. Through the use of our in silico target prediction, we conclude that understanding how traditional medicine such as Ayurveda works, by linking it with the 'western' understanding of chemistry and protein targets, can be a fruitful avenue in addition to bridging the gap between the two different schools of thinking. Given that compounds used in Ayurveda have been tested and used for thousands of years (although not in the same approach as Western medicine), they can potentially be developed into potential new drugs. Hence, to further advance the case of Ayurvedic medicine, we put forward some suggestions, namely: (a) employing and integrating novel analytical methods given the advancements of 'omics', and (b) sharing experimental data and clinical results on studies done on Ayurvedic compounds in an easy and accessible way.

  10. Extreme Wave Analysis by Integrating Model and Wave Buoy Data

    Directory of Open Access Journals (Sweden)

    Fabio Dentale

    2018-03-01

    Full Text Available Estimating the extreme values of significant wave height (HS), generally described by the return period function HS(TR) and by its confidence intervals, is a necessity in many branches of coastal science and engineering. The availability of indirect wave data generated by global and regional wind and wave model chains has brought radical changes to the estimation procedures for this probability distribution: weather and wave modeling systems are routinely run all over the world, and HS time series for each grid point are produced and published after assimilation (analysis) of the ground truth. However, while the sources of such indirect data are numerous, and generally of good quality, many aspects of their procedures are hidden from the users, who cannot evaluate the reliability and the limits of the HS(TR) estimates deriving from such data. In order to provide a simple engineering tool to evaluate the probability of extreme sea states as well as the quality of such estimates, we propose here a procedure based on integrating HS time series generated by model chains with those recorded by wave buoys in the same area.
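
    For orientation, a hedged sketch of how HS(TR) return levels can be estimated: fit a generalized extreme value (GEV) distribution to annual maxima of HS and read off the quantile for each return period. The sample below is synthetic; a real analysis would use buoy- and model-derived maxima.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual maxima of significant wave height [m] (Gumbel-like).
rng = np.random.default_rng(3)
annual_max_hs = 6.0 + 1.2 * rng.gumbel(loc=0.0, scale=1.0, size=40)

# Fit a GEV and evaluate return levels HS(TR) = F^{-1}(1 - 1/TR).
c, loc, scale = genextreme.fit(annual_max_hs)
for tr in (10, 50, 100):
    hs_tr = genextreme.ppf(1 - 1 / tr, c, loc=loc, scale=scale)
    print(f"TR = {tr:3d} yr: HS = {hs_tr:.2f} m")
```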

  11. MEASURE: An integrated data-analysis and model identification facility

    Science.gov (United States)

    Singh, Jaidip; Iyer, Ravi K.

    1990-01-01

    The first phase of the development of MEASURE, an integrated data analysis and model identification facility, is described. The facility takes system activity data as input and produces as output representative behavioral models of the system in near real time. In addition, a wide range of statistical characteristics of the measured system are also available. The usage of the system is illustrated on data collected via software instrumentation of a network of SUN workstations at the University of Illinois. Initially, statistical clustering is used to identify high-density regions of resource usage in a given environment. The identified regions form the states for building a state-transition model to evaluate system and program performance in real time. The model is then solved to obtain useful parameters such as the response-time distribution and the mean waiting time in each state. A graphical interface which displays the identified models and their characteristics (with real-time updates) was also developed. The results provide an understanding of the resource usage in the system under various workload conditions. This work is targeted for a testbed of UNIX workstations, with the initial phase ported to SUN workstations on the NASA Ames Research Center Advanced Automation Testbed.
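
    Not from the record: a toy sketch of the model-identification idea described above, assuming one-dimensional resource-usage samples. Observations are clustered into states, and a state-transition matrix is then estimated from the resulting state sequence; the data and cluster count are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
usage = np.r_[rng.normal(0.2, 0.05, 300), rng.normal(0.8, 0.05, 200)]
rng.shuffle(usage)  # interleave "low" and "high" usage observations

# 1-D k-means (two states) via simple Lloyd iterations.
centers = np.array([usage.min(), usage.max()])
for _ in range(20):
    states = np.argmin(np.abs(usage[:, None] - centers[None, :]), axis=1)
    centers = np.array([usage[states == s].mean() for s in range(2)])

# Empirical transition matrix P[i, j] = Pr(next state = j | current = i).
P = np.zeros((2, 2))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1
P /= P.sum(axis=1, keepdims=True)
print("state centers:", np.round(centers, 2))
print("transition matrix:\n", np.round(P, 3))
```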

  12. Signal Integrity Analysis in Single and Bundled Carbon Nanotube Interconnects

    International Nuclear Information System (INIS)

    Majumder, M.K.; Pandya, N.D.; Kaushik, B.K.; Manhas, S.K.

    2013-01-01

    Carbon nanotube (CNT) can be considered an emerging interconnect material in the current nanoscale regime. CNTs are more promising than other interconnect materials such as Al or Cu because of their robustness to electromigration. This research paper aims to address the crosstalk-related issues (signal integrity) in interconnect lines. Different analytical models of single- (SWCNT), double- (DWCNT), and multi-walled CNTs (MWCNT) are studied to analyze the crosstalk delay at global interconnect lengths. A capacitively coupled three-line bus architecture employing a CMOS driver is used for accurate estimation of crosstalk delay. Each line in the bus architecture is represented with the equivalent RLC models of single and bundled SWCNT, DWCNT, and MWCNT interconnects. Crosstalk delay is observed at the middle line (victim) when it switches in the opposite direction with respect to the other two lines (aggressors). Using the data predicted by ITRS 2012, a comparative analysis on the basis of crosstalk delay is performed for bundled SWCNT/DWCNT and single MWCNT interconnects. It is observed that the overall crosstalk delay is improved by 40.92% and 21.37% for single MWCNT in comparison to bundled SWCNT and bundled DWCNT interconnects, respectively.

  13. Functional analysis in the study of differential and integral equations

    International Nuclear Information System (INIS)

    Sell, G.R.

    1976-01-01

    This paper illustrates the use of functional analysis in the study of differential equations. Our particular starting point, the theory of flows or dynamical systems, originated with the work of H. Poincare, who is the founder of the qualitative theory of ordinary differential equations. In the qualitative theory one tries to describe the behaviour of a solution, or a collection of solutions, without "solving" the differential equation. As a starting point one assumes the existence, and sometimes the uniqueness, of solutions and then one tries to describe the asymptotic behaviour, as time t → +∞, of these solutions. We compare the notion of a flow with that of a C_0-group of bounded linear operators on a Banach space. We shall show how the concept of a C_0-group, or more generally a C_0-semigroup, can be used to study the behaviour of solutions of certain differential and integral equations. Our main objective is to show how the concept of a C_0-group and especially the notion of weak compactness can be used to prove the existence of an invariant measure for a flow on a compact Hausdorff space. Applications to the theory of ordinary differential equations are included. (author)
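
    For orientation, the standard definitions behind the abstract's terminology, stated here for reference rather than quoted from the paper: a family {T(t)} of bounded linear operators on a Banach space X is a C_0-semigroup if it satisfies the first two lines below, and its infinitesimal generator A is defined by the third.

```latex
\begin{align*}
&T(0) = I, \qquad T(t+s) = T(t)\,T(s) \quad \text{for } t, s \ge 0,\\
&\lim_{t \to 0^+} \lVert T(t)x - x \rVert = 0 \quad \text{for all } x \in X,\\
&Ax := \lim_{t \to 0^+} \frac{T(t)x - x}{t}, \quad
  x \in D(A) \ \text{(those $x$ for which the limit exists)}.
\end{align*}
```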

  14. iDASH: integrating data for analysis, anonymization, and sharing

    Science.gov (United States)

    Bafna, Vineet; Boxwala, Aziz A; Chapman, Brian E; Chapman, Wendy W; Chaudhuri, Kamalika; Day, Michele E; Farcas, Claudiu; Heintzman, Nathaniel D; Jiang, Xiaoqian; Kim, Hyeoneui; Kim, Jihoon; Matheny, Michael E; Resnic, Frederic S; Vinterbo, Staal A

    2011-01-01

    iDASH (integrating data for analysis, anonymization, and sharing) is the newest National Center for Biomedical Computing funded by the NIH. It focuses on algorithms and tools for sharing data in a privacy-preserving manner. Foundational privacy technology research performed within iDASH is coupled with innovative engineering for collaborative tool development and data-sharing capabilities in a private Health Insurance Portability and Accountability Act (HIPAA)-certified cloud. Driving Biological Projects, which span different biological levels (from molecules to individuals to populations) and focus on various health conditions, help guide research and development within this Center. Furthermore, training and dissemination efforts connect the Center with its stakeholders and educate data owners and data consumers on how to share and use clinical and biological data. Through these various mechanisms, iDASH implements its goal of providing biomedical and behavioral researchers with access to data, software, and a high-performance computing environment, thus enabling them to generate and test new hypotheses. PMID:22081224

  15. Linking Ayurveda and Western medicine by integrative analysis.

    Science.gov (United States)

    Fauzi, Fazlin Mohd; Koutsoukas, Alexios; Lowe, Robert; Joshi, Kalpana; Fan, Tai-Ping; Glen, Robert C; Bender, Andreas

    2013-04-01

    In this article, we discuss our recent work in elucidating the mode-of-action of compounds used in traditional medicine including Ayurvedic medicine. Using a computational ('in silico') approach, we predict potential targets for Ayurvedic anti-cancer compounds, obtained from the Indian Plant Anticancer Database, given their chemical structures. In our analysis, we observed that: (i) the targets predicted can be connected to cancer pathogenesis i.e. steroid-5-alpha reductase 1 and 2 and estrogen receptor-β, and (ii) predominantly hormone-dependent cancer targets were predicted for the anti-cancer compounds. Through the use of our in silico target prediction, we conclude that understanding how traditional medicine such as Ayurveda works, by linking it with the 'western' understanding of chemistry and protein targets, can be a fruitful avenue in addition to bridging the gap between the two different schools of thinking. Given that compounds used in Ayurveda have been tested and used for thousands of years (although not in the same approach as Western medicine), they can potentially be developed into potential new drugs. Hence, to further advance the case of Ayurvedic medicine, we put forward some suggestions namely: (a) employing and integrating novel analytical methods given the advancements of 'omics' and (b) sharing experimental data and clinical results on studies done on Ayurvedic compounds in an easy and accessible way.

  16. Thermally-induced voltage alteration for integrated circuit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cole, E.I. Jr.

    2000-06-20

    A thermally-induced voltage alteration (TIVA) apparatus and method are disclosed for analyzing an integrated circuit (IC) either from a device side of the IC or through the IC substrate to locate any open-circuit or short-circuit defects therein. The TIVA apparatus uses constant-current biasing of the IC while scanning a focused laser beam over electrical conductors (i.e. a patterned metallization) in the IC to produce localized heating of the conductors. This localized heating produces a thermoelectric potential due to the Seebeck effect in any conductors with open-circuit defects and a resistance change in any conductors with short-circuit defects, both of which alter the power demand by the IC and thereby change the voltage of a source or power supply providing the constant-current biasing. By measuring the change in the supply voltage and the position of the focused and scanned laser beam over time, any open-circuit or short-circuit defects in the IC can be located and imaged. The TIVA apparatus can be formed in part from a scanning optical microscope, and has applications for qualification testing or failure analysis of ICs.

  17. Educating Executive Function

    Science.gov (United States)

    Blair, Clancy

    2016-01-01

    Executive functions are thinking skills that assist with reasoning, planning, problem solving, and managing one’s life. The brain areas that underlie these skills are interconnected with and influenced by activity in many different brain areas, some of which are associated with emotion and stress. One consequence of the stress-specific connections is that executive functions, which help us to organize our thinking, tend to be disrupted when stimulation is too high and we are stressed out, or too low when we are bored and lethargic. Given their central role in reasoning and also in managing stress and emotion, scientists have conducted studies, primarily with adults, to determine whether executive functions can be improved by training. By and large, results have shown that they can be, in part through computer-based videogame-like activities. Evidence of wider, more general benefits from such computer-based training, however, is mixed. Accordingly, scientists have reasoned that training will have wider benefits if it is implemented early, with very young children as the neural circuitry of executive functions is developing, and that it will be most effective if embedded in children’s everyday activities. Evidence produced by this research, however, is also mixed. In sum, much remains to be learned about executive function training. Without question, however, continued research on this important topic will yield valuable information about cognitive development. PMID:27906522

  18. Bayesian Integrated Data Analysis of Fast-Ion Measurements by Velocity-Space Tomography

    DEFF Research Database (Denmark)

    Salewski, M.; Nocente, M.; Jacobsen, A.S.

    2018-01-01

    Bayesian integrated data analysis combines measurements from different diagnostics to jointly measure plasma parameters of interest such as temperatures, densities, and drift velocities. Integrated data analysis of fast-ion measurements has long been hampered by the complexity of the strongly non...... framework. The implementation for different types of diagnostics, as well as the uncertainties, is discussed, and we highlight the importance of integrated data analysis of all available detectors....

  19. An Empirical Analysis of Post-Merger Organizational Integration

    DEFF Research Database (Denmark)

    Smeets, Valerie Anne Rolande; Ierulli, Kathryn; Gibbs, Michael

    2016-01-01

    We study post-merger organizational integration using linked employer-employee data. Integration is implemented by reassigning a small number of high skilled workers, especially in R&D and management. Workforce mixing is concentrated to establishments set up after merger rather than to previously existing establishments. Worker turnover is high after merger, but new hiring yields stable total employment. Target employees have higher turnover and reassignment, particularly if the target firm is small relative to the acquiring firm. These findings may suggest integration is costly, but can be achieved by focusing on key employees. Alternatively, reassigning a few key employees is sufficient for achieving integration.

  20. Strategic Mobility 21: Integrated Tracking System Analysis and Concept Design

    National Research Council Canada - National Science Library

    Mallon, Lawrence G; Savacool, Edwin

    2007-01-01

    ... (ITS). This ITS design document identifies the technical and functional requirements for developing, procuring, and integrating components of an ITS capable of supporting an inland regional port, multi...

  1. CAD-Based Modeling of Advanced Rotary Wing Structures for Integrated 3-D Aeromechanics Analysis

    Science.gov (United States)

    Staruk, William

    This dissertation describes the first comprehensive use of integrated 3-D aeromechanics modeling, defined as the coupling of 3-D solid finite element method (FEM) structural dynamics with 3-D computational fluid dynamics (CFD), for the analysis of a real helicopter rotor. The development of this new methodology (a departure from how rotor aeroelastic analysis has been performed for 40 years), its execution on a real rotor, and the fundamental understanding of aeromechanics gained from it, are the key contributions of this dissertation. This work also presents the first CFD/CSD analysis of a tiltrotor in edgewise flight, revealing many of its unique loading mechanisms. The use of 3-D FEM, integrated with a trim solver and aerodynamics modeling, has the potential to enhance the design of advanced rotors by overcoming fundamental limitations of current generation beam-based analysis tools and offering integrated internal dynamic stress and strain predictions for design. Two primary goals drove this research effort: 1) developing a methodology to create 3-D CAD-based brick finite element models of rotors including multibody joints, controls, and aerodynamic interfaces, and 2) refining X3D, the US Army's next generation rotor structural dynamics solver featuring 3-D FEM within a multibody formulation with integrated aerodynamics, to model a tiltrotor in the edgewise conversion flight regime, which drives critical proprotor structural loads. Prior tiltrotor analysis has primarily focused on hover aerodynamics with rigid blades or forward flight whirl-flutter stability with simplified aerodynamics. The first goal was met with the development of a detailed methodology for generating multibody 3-D structural models, starting from CAD geometry, continuing to higher-order hexahedral finite element meshing, to final assembly of the multibody model by creating joints, assigning material properties, and defining the aerodynamic interface. Several levels of verification and

  2. Living PRAs [probabilistic risk analysis] made easier with IRRAS [Integrated Reliability and Risk Analysis System

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1989-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is an integrated PRA software tool that gives the user the ability to create and analyze fault trees and accident sequences using an IBM-compatible microcomputer. This program provides functions that range from graphical fault tree and event tree construction to cut set generation and quantification. IRRAS contains all the capabilities and functions required to create, modify, reduce, and analyze event tree and fault tree models used in the analysis of complex systems and processes. IRRAS uses advanced graphic and analytical techniques to achieve the greatest possible realization of the potential of the microcomputer. When the needs of the user exceed this potential, IRRAS can call upon the power of the mainframe computer. The role of the Idaho National Engineering Laboratory in the IRRAS program is that of software developer and interface to the user community. Version 1.0 of the IRRAS program was released in February 1987 to prove the concept of performing this kind of analysis on microcomputers. This version contained many of the basic features needed for fault tree analysis and was received very well by the PRA community. Since the release of Version 1.0, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version is designated "IRRAS 2.0". Version 3.0 will contain all of the features required for efficient event tree and fault tree construction and analysis. 5 refs., 26 figs
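
    To make the cut set vocabulary above concrete, here is a toy Python sketch of OR/AND cut set combination and rare-event quantification. It is illustrative only: the gate structure and probabilities are invented, and IRRAS itself uses far more sophisticated algorithms.

```python
from itertools import product

# Toy fault tree: TOP = G1 AND G2, with G1 = A OR B and G2 = B OR C.
g1 = [{"A"}, {"B"}]    # an OR gate contributes each child as its own cut set
g2 = [{"B"}, {"C"}]

# An AND gate merges every pair of child cut sets.
cut_sets = [s1 | s2 for s1, s2 in product(g1, g2)]

# Minimal cut sets: drop any set that strictly contains another.
minimal = {frozenset(c) for c in cut_sets
           if not any(other < c for other in cut_sets)}

# Rare-event approximation: P(TOP) ~ sum over minimal cut sets of the
# product of (hypothetical) basic event probabilities.
p = {"A": 1e-3, "B": 5e-4, "C": 2e-3}
p_top = 0.0
for cut in minimal:
    prob = 1.0
    for event in cut:
        prob *= p[event]
    p_top += prob

print(sorted(sorted(c) for c in minimal))   # [['A', 'C'], ['B']]
print(f"P(TOP) ~ {p_top:.2e}")              # single event B dominates
```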

  3. MIV Project: Executive Summary

    DEFF Research Database (Denmark)

    Ravazzotti, Mariolina T.; Jørgensen, John Leif; Neefs, Marc

    1997-01-01

    Under the ESA contract #11453/95/NL/JG(SC), aiming at assessing the feasibility of rendezvous and docking of unmanned spacecraft, a reference mission scenario was defined. This report gives an executive summary of the achievements and results from the project.

  4. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

    Integration of distributed generation (DG) such as wind turbines into distribution systems is increasing all around the world because of their flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and as a result bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation of the influence of full-converter-based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as external grid short circuit power capacity, WT integration location, and connection type of the WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented.

  5. Research Review: Executive function deficits in fetal alcohol spectrum disorders and attention-deficit/hyperactivity disorder – a meta-analysis

    Science.gov (United States)

    Kingdon, Danielle; Cardoso, Christopher; McGrath, Jennifer J.

    2018-01-01

    Background: Attention-deficit/hyperactivity disorder (ADHD)-like symptoms are common in fetal alcohol spectrum disorders (FASD). FASD and ADHD groups both display executive function impairments; however, there is ongoing debate whether the pattern and magnitude of executive function deficits differ between these two types of disorders. Methods: An electronic literature search was conducted (PubMed, PsycINFO; 1972–2013) to identify studies comparing the executive functioning of children with FASD with ADHD or control groups. FASD groups included those with and without dysmorphy (i.e., FAS, pFAS, ARND, and other FASD diagnoses). Effect sizes (Hedges’ g, standardized mean difference) were calculated. Random effects meta-analytic models were performed using the metafor package for R. Results: Fifty-one studies met inclusion criteria (FASD N = 2,115; ADHD N = 453; controls N = 1,990). Children with FASD showed the strongest and most consistent deficits in planning, fluency, and set-shifting compared to controls (Hedges’ g = −0.94, −0.78) and children with ADHD (Hedges’ g = −0.72, −0.32). FASD was associated with moderate to large impairments in working memory compared to controls (Hedges’ g = −0.84, −0.58) and small impairments relative to groups with ADHD (Hedges’ g = −0.26). Smaller and less consistent deficits were found on measures of inhibition and vigilance relative to controls (Hedges’ g = −0.52, −0.31); FASD and ADHD were not differentiated on these measures. Moderator analyses indicated executive dysfunction was associated with older age, dysmorphy, and larger group differences in IQ. Sex and diagnostic system were not consistently related to effect size. Conclusions: While FASD is associated with global executive impairments, executive function weaknesses are most consistent for measures of planning, fluency, and set-shifting. Neuropsychological measures assessing these executive function domains may improve differential diagnosis.
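
    Hedges' g, the effect size used throughout this meta-analysis, is Cohen's d with a small-sample bias correction. The paper fitted its models with the metafor package in R; the short Python sketch below shows only the effect size arithmetic, and the group statistics in it are invented.

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with small-sample correction."""
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    d = (m1 - m2) / s_pooled          # Cohen's d
    j = 1 - 3 / (4 * df - 1)          # Hedges' correction factor
    return j * d

# Hypothetical planning-task scores: clinical group vs. controls.
print(round(hedges_g(8.1, 2.4, 40, 10.3, 2.6, 45), 2))  # negative = deficit
```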

  6. Integrative qualitative communication analysis of consultation and patient and practitioner perspectives: towards a theory of authentic caring in clinical relationships.

    Science.gov (United States)

    Salmon, Peter; Mendick, Nicola; Young, Bridget

    2011-03-01

    We developed a method whereby relationships can be studied simultaneously from the perspectives of each party and researchers' observations of their dialogue. Then we used this method to study how to recognise authentic, caring clinical relationships. Participants were 20 patients who had recently received surgery for breast cancer and nine surgeons with whom they had a post-operative consultation. We audiorecorded consultations, before interviewing patients and surgeons about their perceptions of the consultation and each other. Cross-case qualitative analyses (analysing consultations and surgeon and patient interviews, respectively) were supplemented by integrative, within-case analysis. Surgeons and patients described their relationship as personal and emotional, but emotional talk was absent from consultations. For patients and surgeons, their relationship depended, instead, on surgeons' expertise and character. Our integrative approach suggested that authentic caring in these relationships lay in practitioners' conscientious execution of their role and, contrary to currently influential views, not in an explicit emotional engagement. Relationships between patients and practitioners cannot be described adequately using analyses of interactions between them. Researchers will need to triangulate between these observations and the patient and practitioner perspectives in order to understand what makes for authentically caring relationships. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Lexical quality and executive control predict children's first and second language reading comprehension.

    Science.gov (United States)

    Raudszus, Henriette; Segers, Eliane; Verhoeven, Ludo

    2018-01-01

    This study compared how lexical quality (vocabulary and decoding) and executive control (working memory and inhibition) predict reading comprehension directly as well as indirectly, via syntactic integration, in monolingual and bilingual fourth grade children. The participants were 76 monolingual and 102 bilingual children (mean age 10 years, SD = 5 months) learning to read Dutch in the Netherlands. Bilingual children showed lower Dutch vocabulary, syntactic integration and reading comprehension skills, but better decoding skills than their monolingual peers. There were no differences in working memory or inhibition. Multigroup path analysis showed relatively invariant connections between predictors and reading comprehension for monolingual and bilingual readers. For both groups, there was a direct effect of lexical quality on reading comprehension. In addition, lexical quality and executive control indirectly influenced reading comprehension via syntactic integration. The groups differed in that inhibition more strongly predicted syntactic integration for bilingual than for monolingual children. For a subgroup of bilingual children, for whom home language vocabulary data were available (n = 56), there was an additional positive effect of home language vocabulary on second language reading comprehension. Together, the results suggest that similar processes underlie reading comprehension in first and second language readers, but that syntactic integration requires more executive control in second language reading. Moreover, bilingual readers additionally benefit from first language vocabulary to arrive at second language reading comprehension.
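
    The direct and indirect paths described here have a simple computational core: the indirect effect is the product of the predictor-to-mediator and mediator-to-outcome regression slopes. A minimal Python sketch on synthetic data follows (all names and coefficients are invented; this does not reproduce the study's multigroup path analysis).

```python
import numpy as np

# Synthetic mediation: lexical quality -> syntactic integration -> reading.
rng = np.random.default_rng(0)
n = 200
lexical = rng.normal(size=n)
syntactic = 0.5 * lexical + rng.normal(scale=0.8, size=n)
reading = 0.3 * lexical + 0.6 * syntactic + rng.normal(scale=0.7, size=n)

def slopes(y, *predictors):
    """OLS slopes via least squares (intercept dropped from the result)."""
    X = np.column_stack([np.ones(n), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = slopes(syntactic, lexical)[0]                   # predictor -> mediator
b, c_direct = slopes(reading, syntactic, lexical)   # mediator path, direct path
print(f"direct effect:   {c_direct:.2f}")
print(f"indirect effect: {a * b:.2f} (a = {a:.2f}, b = {b:.2f})")
```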

  8. Generalized Energy Flow Analysis Considering Electricity Gas and Heat Subsystems in Local-Area Energy Systems Integration

    Directory of Open Access Journals (Sweden)

    Jiaqi Shi

    2017-04-01

    To alleviate environmental pollution and improve the efficient use of energy, energy systems integration (ESI)—covering electric power systems, heat systems and natural gas systems—has become an important trend in energy utilization. The traditional power flow calculation method, with the power system as its object, will prove difficult in meeting the requirements of coupled energy flow analysis. This paper proposes a generalized energy flow (GEF) analysis method which is suitable for an ESI containing electricity, heat and gas subsystems. First, the models of the electricity, heat, and natural gas networks in the ESI are established. In view of the complexity of the conventional method of solving a gas network including compressors, an improved practical equivalent method was adopted based on different control modes. On this basis, a hybrid method combining homotopy and the Newton-Raphson algorithm was executed to compute the nonlinear equations of GEF, and the Jacobian matrix reflecting the coupling relationship of multi-energy flows was derived, considering the grid-connected and island modes of the power system in the ESI. Finally, the validity of the proposed method in multi-energy flow calculation and the analysis of interacting characteristics was verified using practical cases.
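
    The hybrid homotopy/Newton-Raphson strategy can be illustrated on a toy system: start from a trivially solvable problem, deform it step by step toward the full nonlinear residuals, and run Newton's method at each continuation step, warm-started from the previous solution. The equations below are stand-ins, not the paper's GEF model.

```python
import numpy as np

def f(x):
    """Toy residuals standing in for coupled energy-flow mismatches."""
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0,
                     np.exp(x[0]) + x[1] - 1.0])

def jac(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [np.exp(x[0]), 1.0]])

def newton(g, jg, x, tol=1e-10, itmax=50):
    """Newton-Raphson: solve J dx = -g until the step is tiny."""
    for _ in range(itmax):
        dx = np.linalg.solve(jg(x), -g(x))
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

x0 = np.array([1.0, 1.0])              # known solution of the trivial problem
x = x0.copy()
for t in np.linspace(0.1, 1.0, 10):    # homotopy: deform trivial -> full
    g = lambda y, t=t: t * f(y) + (1.0 - t) * (y - x0)
    jg = lambda y, t=t: t * jac(y) + (1.0 - t) * np.eye(2)
    x = newton(g, jg, x)               # warm-start from the previous step

print("solution:", x, "residual:", f(x))
```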

  9. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Background: Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results: The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion: The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is…

  10. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  11. A multilayered integrated sensor for three-dimensional, micro total analysis systems

    International Nuclear Information System (INIS)

    Xiao, Jing; Song, Fuchuan; Seo, Sang-Woo

    2013-01-01

    This paper presents a layer-by-layer integration approach of different functional devices and demonstrates a heterogeneously integrated optical sensor featuring a micro-ring resonator and a high-speed thin-film InGaAs-based photodetector co-integrated with a microfluidic droplet generation device. A thin optical device structure allows a seamless integration with other polymer-based devices on a silicon platform. The integrated sensor successfully demonstrates its transient measurement capability of two-phase liquid flow in a microfluidic droplet generation device. The proposed approach represents an important step toward fully integrated micro total analysis systems.

  12. Integrated safety analysis to operate while constructing Urenco USA

    International Nuclear Information System (INIS)

    Kohrt, Rick; Su, Shiaw-Der; Lehman, Richard

    2013-01-01

    The URENCO USA (UUSA) site in Lea County, New Mexico, USA is authorized by the U.S. Nuclear Regulatory Commission (NRC) for construction and operation of a uranium enrichment facility under 10 CFR 70 (Ref 1). The facility employs the gas centrifuge process to separate natural uranium hexafluoride (UF6) feed material into a product stream enriched up to 5% U-235 and a depleted UF6 stream containing approximately 0.2 to 0.34% U-235. Initial plant operations, with a limited number of cascades on line, commenced in the second half of 2010. Construction activities continue as each subsequent cascade is commissioned and placed into service. UUSA performed an Integrated Safety Analysis (ISA) to allow the facility to operate while constructing the remainder of the facility. The ISA Team selected the What-If/Checklist method based on guidance in NUREG-1513 (Ref 2) and AIChE Guidelines (Ref 3). Of the three methods recommended for high risk events, HAZOP, What-If/Checklist, and Failure Modes and Effects Analysis (FMEA), the What-If/Checklist lends itself best to construction activities. It combines the structure of a checklist with an unstructured 'brainstorming' approach to create a list of specific accident events that could produce an undesirable consequence. The What-If/Checklist for Operate While Constructing divides the UUSA site into seven areas and creates what-if questions for sixteen different construction activities, such as site preparation, external construction cranes, and internal construction lifts. The result is a total of 112 nodes, for which the Operate While Constructing ISA Team created hundreds of what-if questions. For each what-if question the team determined the likelihood, consequences, safeguards, and acceptability of risk. What-if questions with unacceptable risk are the accident sequences and their selected safeguards are the Items Relied on For Safety (IROFS). The final ISA identified four (4) new accident sequences that, unless…

  13. A comparison of integrated safety analysis and probabilistic risk assessment

    International Nuclear Information System (INIS)

    Damon, Dennis R.; Mattern, Kevin S.

    2013-01-01

    The U.S. Nuclear Regulatory Commission conducted a comparison of two standard tools for risk informing the regulatory process, namely, the Probabilistic Risk Assessment (PRA) and the Integrated Safety Analysis (ISA). PRA is a calculation of risk metrics, such as Large Early Release Frequency (LERF), and has been used to assess the safety of all commercial power reactors. ISA is an analysis required for fuel cycle facilities (FCFs) licensed to possess potentially critical quantities of special nuclear material. A PRA is usually more detailed and uses more refined models and data than an ISA, in order to obtain reasonable quantitative estimates of risk. PRA is considered fully quantitative, while most ISAs are typically only partially quantitative. The extension of PRA methodology to augment or supplant ISAs in FCFs has long been considered. However, fuel cycle facilities have a wide variety of possible accident consequences, rather than a few surrogates like LERF or core damage as used for reactors. It has been noted that a fuel cycle PRA could be used to better focus attention on the most risk-significant structures, systems, components, and operator actions. ISA and PRA both identify accident sequences; however, their treatment is quite different. ISAs identify accidents that lead to high or intermediate consequences, as defined in 10 Code of Federal Regulations (CFR) 70, and develop a set of Items Relied on For Safety (IROFS) to assure adherence to performance criteria. PRAs identify potential accident scenarios and estimate their frequency and consequences to obtain risk metrics. It is acceptable for ISAs to provide bounding evaluations of accident consequences and likelihoods in order to establish acceptable safety; but PRA applications usually require a reasonable quantitative estimate, and often obtain metrics of uncertainty. This paper provides the background, features, and methodology associated with the PRA and ISA. The differences between the…
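
    The quantitative-versus-bounding distinction can be made concrete with a small sketch: a PRA-style calculation multiplies an initiating event frequency by barrier failure probabilities to get a number, while an ISA-style screen only places that number in a broad bin. Every value and threshold below is hypothetical, not taken from 10 CFR 70 or the paper.

```python
# PRA-style: sequence frequency = initiator frequency x barrier failures.
init_freq = 1e-2                    # initiating events per year (hypothetical)
barrier_failure = [1e-2, 5e-2]      # failure probabilities of two safeguards

seq_freq = init_freq
for p in barrier_failure:
    seq_freq *= p
print(f"PRA-style sequence frequency: {seq_freq:.1e} per year")

# ISA-style: map the same sequence onto coarse qualitative likelihood bins.
def likelihood_bin(freq):
    if freq < 1e-5:
        return "highly unlikely"
    if freq < 1e-4:
        return "unlikely"
    return "not unlikely"

print("ISA-style screening bin:", likelihood_bin(seq_freq))  # highly unlikely
```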

  14. Building-integrated PV -- Analysis and US market potential

    International Nuclear Information System (INIS)

    Frantzis, L.; Hill, S.; Teagan, P.; Friedman, D.

    1994-01-01

    Arthur D. Little, Inc., in conjunction with Solar Design Associates, conducted a study for the US Department of Energy (DOE), Office of Building Technologies (OBT) to determine the market potential for building-integrated photovoltaics (BIPV). This study defines BIPV as two types of applications: (1) where the PV modules are an integral part of the building, often serving as the exterior weathering skin, and (2) where the PV modules are mounted on the existing building exterior. Both of these systems are fully integrated with the energy usage of the building and have potential for significant market penetration in the US

  15. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it (1) does not require any licensing fees, and (2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
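
    Brent's method, named above as part of the contact window search, is a bracketing root finder that needs no derivatives. The sketch below uses SciPy's brentq on an invented elevation profile; the sine curve and the 10-degree mask are stand-ins for SCENIC's actual ephemeris and station models.

```python
import numpy as np
from scipy.optimize import brentq  # Brent's method (requires SciPy)

MASK_DEG = 10.0

def elevation_deg(t_min):
    """Invented single-pass elevation profile peaking at 40 degrees."""
    return 40.0 * np.sin(np.pi * t_min / 12.0)

def crossing(t_min):
    return elevation_deg(t_min) - MASK_DEG  # changes sign at mask crossings

t_rise = brentq(crossing, 0.0, 6.0)     # bracket before the peak
t_set = brentq(crossing, 6.0, 12.0)     # bracket after the peak
print(f"contact window: {t_rise:.2f} .. {t_set:.2f} minutes")
```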

  16. Toward synthesizing executable models in biology.

    Science.gov (United States)

    Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav

    2014-01-01

    Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
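
    In its simplest form, synthesizing a Boolean network rule is a search over candidate truth tables for those consistent with observed transitions. In the minimal Python sketch below (observations made up), one input state is deliberately left unobserved, so two rules survive; proposing a measurement of exactly that state is the kind of "disambiguating experiment" the survey mentions.

```python
from itertools import product

# Enumerate all 16 two-input Boolean update rules for one gene and keep
# those consistent with observed (state -> next value) transitions.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
observations = [((0, 0), 0), ((0, 1), 0), ((1, 1), 1)]  # (1, 0) unobserved

consistent = []
for outputs in product([0, 1], repeat=4):
    rule = dict(zip(inputs, outputs))
    if all(rule[state] == nxt for state, nxt in observations):
        consistent.append(rule)

print(f"{len(consistent)} rule(s) consistent with the data:")
for rule in consistent:
    print(rule)
```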

  17. Towards Synthesizing Executable Models in Biology

    Directory of Open Access Journals (Sweden)

    Jasmin eFisher

    2014-12-01

    Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modelling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.

  18. Executions in The Bahamas

    Directory of Open Access Journals (Sweden)

    Lofquist, William Steele

    2010-01-01

    The stories of those who have been executed in the Bahamas are heretofore untold. In telling these stories and in linking them to the changing course of Bahamian history, the present research adds an important dimension to our understanding of Bahamian history and politics. The major theme of this effort is that the changing practice of the death penalty is much more than a consequence of changes in crime. The use of the death penalty parallels the changing interests of colonial rulers, the changing practice of slavery, and the changing role of the Bahamas in colonial and regional affairs. Four distinctive eras of death penalty practice can be identified: (1) the slave era, where executions and commutations were used liberally and with a clear racial patterning; (2) a long era of stable colonialism, a period of marginalization and few executions; (3) an era of unstable colonialism characterized by intensive and efficient use of the death penalty; and (4) the current independence era of high murder rates and equally high impediments to the use of executions.

  19. Executive functions in synesthesia

    NARCIS (Netherlands)

    Rouw, Romke; van Driel, Joram; Knip, Koen; Richard Ridderinkhof, K.

    2013-01-01

    In grapheme-color synesthesia, a number or letter can evoke two different and possibly conflicting (real and synesthetic) color sensations at the same time. In this study, we investigate the relationship between synesthesia and executive control functions. First, no general skill differences were

  20. School Executive Website Study

    Science.gov (United States)

    Thiede, Robert

    2009-01-01

    The School Executive Website will be a one-stop, online site for officials who are looking for educational data, best practices, product reviews, school documents, professional opinions, and/or job-related networking. In certain sections, the format of the website is designed to be similar to that of other current and popular websites, such as Angie's List.com,…