WorldWideScience

Sample records for analysis tools workshop

  1. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan, on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following topics: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  2. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  3. Policy analysis tools for air quality and health : report from the May 19, 2005 workshop

    International Nuclear Information System (INIS)

    Gower, S.; Shortreed, J.

    2005-08-01

    The total impact of air pollution on human health is not well understood. This workshop examined key policy issues concerning air quality, and the availability of models and analyses to inform decision-makers. Attendees included stakeholders from health and environment departments of municipal, provincial and federal governments, as well as academics, consulting firms, industry and non-governmental organizations. The complexity of computer-based models was identified as a significant barrier to the development of a better understanding of the impacts of air pollution on human health, and it was noted that most models are not equipped to deal with the various levels of policy and decision-making that occur across many jurisdictions. It was observed that there is also a lack of data. It was suggested that efficient and cost-effective models are needed to identify good policy options, as well as tools that maximize the integration of information in a comprehensive manner. Evaluations of the impacts of air pollution should occur within the broad context of public health and consider both social and interactive needs. Continuing stakeholder dialogue was recommended, as well as a more in-depth exploration of policy analysis tools. A national meeting was planned to build on conclusions from the workshop. A guidance document was proposed to provide best practices to guide non-experts on health impacts, the interpretation of monitoring results, and the selection of models and appropriate analyses. Case studies of issues facing municipalities concerning planning and land use decisions were recommended, as well as various actions to mitigate the effects of poor air quality and greenhouse gas (GHG) emissions. Five presentations were given, followed by breakout sessions and discussions. refs., tabs., figs

  4. North Region ROW tool implementation workshop.

    Science.gov (United States)

    2010-08-02

    Welcome to the North Region ROW Tool Workshop. This workshop is funded under an implementation project sponsored by TxDOT's Research & Technology Implementation Office (RTI). This is the second of four regional workshops being planned for this summ...

  5. 6th International Parallel Tools Workshop

    CERN Document Server

    Brinkmann, Steffen; Gracia, José; Resch, Michael; Nagel, Wolfgang

    2013-01-01

    The latest advances in High Performance Computing hardware have significantly raised the level of available compute performance. At the same time, the growing hardware capabilities of modern supercomputing architectures have caused an increasing complexity of parallel application development. Despite numerous efforts to improve and simplify parallel programming, there is still a lot of manual debugging and tuning work required. This process is supported by special software tools that facilitate debugging, performance analysis, and optimization, and thus make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools which were presented and discussed at the 6th International Parallel Tools Workshop, held in Stuttgart, Germany, 25-26 September 2012.

  6. Object Oriented Risk Analysis Workshop

    Science.gov (United States)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students, in a role-play game, through a step-by-step process of risk identification and quantification. The aim is to assess the risk from natural hazards (rockfall, snow avalanche, flooding, etc.) in a characteristic alpine village, oriented to affected objects such as buildings and infrastructure. The workshop contains the following steps: 1. planning of the study and definition of stakeholders; 2. hazard identification; 3. risk analysis; 4. risk assessment; 5. proposition of mitigation measures; 6. risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to prompt discussion and decision making. The Risk Matrix and other graphical tools give a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, the results of different groups can be compared and a summarizing report printed out. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level, as well as by external persons dealing with risk analysis.
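
    Below is a minimal sketch of the kind of risk-matrix scoring used in steps 3 and 4. The scales, thresholds, and names are illustrative only and are not taken from the RISET tool.

        # Minimal qualitative risk matrix: score = likelihood x consequence.
        # Scales and colour-band thresholds are invented for illustration.
        LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}
        CONSEQUENCE = {"minor": 1, "moderate": 2, "major": 3, "catastrophic": 4}

        def risk_score(likelihood: str, consequence: str) -> int:
            """Combine the two qualitative ratings into a single score."""
            return LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]

        def risk_level(score: int) -> str:
            """Map a score onto a colour band of the matrix."""
            if score <= 2:
                return "low"
            if score <= 6:
                return "medium"
            return "high"

        # Example: a rockfall that is likely and would cause major building damage.
        score = risk_score("likely", "major")
        print(score, risk_level(score))  # 9 high -> prioritize countermeasures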

  7. UVI Cyber-security Workshop Workshop Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kuykendall, Tommie G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allsop, Jacob Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Benjamin Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boumedine, Marc [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carter, Cedric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Galvin, Seanmichael Yurko [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Oscar [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lee, Wellington K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Han Wei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morris, Tyler Jake [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nauer, Kevin S.; Potts, Beth A.; Ta, Kim Thanh; Trasti, Jennifer; White, David R.

    2015-07-08

    The cybersecurity consortium, which was established by DOE/NNSA's Minority Serving Institutions Partnerships Program (MSIPP), allows students from any of the partner schools (13 HBCUs, two national laboratories, and a public school district) to have all consortium options available to them, to create career paths, and to open doors to DOE sites and facilities to student members of the consortium. As part of this year's consortium activities, Sandia National Laboratories and the University of the Virgin Islands conducted a week-long cyber workshop that consisted of three courses: Digital Forensics and Malware Analysis, Python Programming, and ThunderBird Cup. These courses are designed to enhance cyber defense skills and promote learning within STEM-related fields.

  8. Workshop One : Risk Analysis

    NARCIS (Netherlands)

    Carlson, T.J.; Jong, C.A.F. de; Dekeling, R.P.A.

    2012-01-01

    The workshop looked at the assessment of risk to aquatic animals exposed to anthropogenic sound. The discussion focused on marine mammals given the worldwide attention being paid to them at the present time, particularly in relationship to oil and gas exploration, ocean power, and increases in ship...

  9. Workshop Physics and Related Curricula: "A 25-Year History of Collaborative Learning Enhanced by Computer Tools for Observation and Analysis"

    Science.gov (United States)

    Laws, Priscilla W.; Willis, Maxine C.; Sokoloff, David R.

    2015-01-01

    This article describes the 25-year history of development of the activity-based Workshop Physics (WP) at Dickinson College, its adaptation for use at Gettysburg Area High School, and its synergistic influence on curricular materials developed at the University of Oregon and Tufts University and vice versa. WP and these related curricula: 1) are…

  10. Collaboration tools for the global accelerator network Workshop Report

    CERN Document Server

    Agarwal, D; Olson, J

    2002-01-01

    The concept of a "Global Accelerator Network" (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists that study scientific collaboration.

  11. Collaboration tools for the global accelerator network: Workshop Report

    International Nuclear Information System (INIS)

    Agarwal, Deborah; Olson, Gary; Olson, Judy

    2002-01-01

    The concept of a "Global Accelerator Network" (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists that study scientific collaboration.

  12. Collaboration tools for the global accelerator network: Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Olson, Gary [Univ. of Michigan, Ann Arbor, MI (United States); Olson, Judy [Univ. of Michigan, Ann Arbor, MI (United States)

    2002-09-15

    The concept of a "Global Accelerator Network" (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists that study scientific collaboration.

  13. Applications of ion beam analysis workshop. Workshop handbook

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    A workshop on applications of ion beam analysis was held at ANSTO, immediately prior to the IBMM-95 Conference in Canberra. Its aim was to review developments and the current status of the use of ion beams for analysis, emphasizing the following aspects: fundamental ion beam research and secondary effects of ion beams; materials science, geological, life sciences, environmental and industrial applications; computing codes for use in accelerator research; high energy heavy ion scattering and recoil; and recent technological developments using ion beams. The handbook contains the workshop's program, 29 abstracts and a list of participants.

  14. Applications of ion beam analysis workshop. Workshop handbook

    International Nuclear Information System (INIS)

    1995-01-01

    A workshop on applications of ion beam analysis was held at ANSTO, immediately prior to the IBMM-95 Conference in Canberra. Its aim was to review developments and the current status of the use of ion beams for analysis, emphasizing the following aspects: fundamental ion beam research and secondary effects of ion beams; materials science, geological, life sciences, environmental and industrial applications; computing codes for use in accelerator research; high energy heavy ion scattering and recoil; and recent technological developments using ion beams. The handbook contains the workshop's program, 29 abstracts and a list of participants.

  15. Sawja: Static Analysis Workshop for Java

    Science.gov (United States)

    Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine

    Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. Efficiency and precision of such a tool rely partly on low level components which only depend on the syntactic structure of the language and therefore should not be redesigned for each implementation of a new static analysis. This paper describes the Sawja library: a static analysis workshop fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including i) efficient functional data-structures for representing a program with implicit sharing and lazy parsing, ii) an intermediate stack-less representation, and iii) fast computation and manipulation of complete programs. We provide experimental evaluations of the different features with respect to time, memory and precision.
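
    The following sketch is a generic illustration of two of the listed ideas, lazy parsing and implicit sharing of structurally equal immutable values. Sawja itself is an OCaml library; none of the names below come from its API, and the Python is only a conceptual stand-in.

        # Generic illustration (not Sawja's API): parse lazily, share immutables.
        import functools

        class LazyClassFile:
            """Decode a class file only when its contents are first needed."""
            def __init__(self, path: str):
                self.path = path                  # cheap: nothing is decoded yet

            @functools.cached_property
            def contents(self) -> bytes:
                # Expensive decoding happens once, on first access, then is cached.
                with open(self.path, "rb") as f:
                    return f.read()               # stand-in for real bytecode parsing

        _pool: dict = {}

        def intern(value):
            """Implicit sharing: equal immutable values share one representation."""
            return _pool.setdefault(value, value)

        a = intern(("method", "toString", "()Ljava/lang/String;"))
        b = intern(("method", "toString", "()Ljava/lang/String;"))
        assert a is b                             # one shared object, less memory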

  16. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  17. Soundscape actions: A tool for noise treatment based on three workshops in landscape architecture

    Directory of Open Access Journals (Sweden)

    Gunnar Cerwén

    2017-12-01

    This paper reports experiences from three workshops dealing with soundscape as a noise treatment approach in landscape architecture. The workshops were conducted between 2012 and 2016 in different contexts, for different purposes and with different participants. The paper describes the workshop approach employed and analyzes the proposals made by workshop participants to employ "soundscape actions" as an operational tool in landscape architecture projects. Through a process of keywording and clustering proposals from the workshops, 22 pragmatic soundscape actions emerged; they are described at a general level. The paper then discusses the outcomes and experiences from the workshops and relates them to landscape architecture practice.

  18. Proceedings of pollution prevention and waste minimization tools workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    Pollution Prevention (P2) has evolved into one of DOE's prime strategies to meet environmental, fiscal, and worker safety obligations. P2 program planning, opportunity identification, and implementation tools were developed under the direction of the Waste Minimization Division (EM-334). Forty experts from EM, DP, ER and DOE subcontractors attended this 2-day workshop to formulate the incentives to drive utilization of these tools. Plenary and small working group sessions were held on both days. Working Group 1 identified incentives for overcoming barriers in the area of P2 program planning and resource allocation. Working Group 2 identified mechanisms to drive the completion of P2 assessments and the generation of opportunities. Working Group 3 compiled and documented a broad range of potential P2 incentives that address fundamental barriers to implementation of cost-effective opportunities.

  19. Proceedings of pollution prevention and waste minimization tools workshop

    International Nuclear Information System (INIS)

    1995-01-01

    Pollution Prevention (P2) has evolved into one of DOE's prime strategies to meet environmental, fiscal, and worker safety obligations. P2 program planning, opportunity identification, and implementation tools were developed under the direction of the Waste Minimization Division (EM-334). Forty experts from EM, DP, ER and DOE subcontractors attended this 2-day workshop to formulate the incentives to drive utilization of these tools. Plenary and small working group sessions were held on both days. Working Group 1 identified incentives for overcoming barriers in the area of P2 program planning and resource allocation. Working Group 2 identified mechanisms to drive the completion of P2 assessments and the generation of opportunities. Working Group 3 compiled and documented a broad range of potential P2 incentives that address fundamental barriers to implementation of cost-effective opportunities.

  20. 2nd International Workshop on Isogeometric Analysis and Applications

    CERN Document Server

    Simeon, Bernd

    2015-01-01

    Isogeometric Analysis is a groundbreaking computational approach that promises the possibility of integrating the finite element method into conventional spline-based CAD design tools. It thus bridges the gap between numerical analysis and geometry, and moreover it makes it possible to tackle new cutting-edge applications at the frontiers of research in science and engineering. This proceedings volume contains a selection of outstanding research papers presented at the second International Workshop on Isogeometric Analysis and Applications, held at Annweiler, Germany, in April 2014.

  1. 3D-TRANS-2003, Workshop on Common Tools and Interfaces for Radiation Transport Codes

    International Nuclear Information System (INIS)

    2004-01-01

    Description: This volume contains the proceedings of the Workshop on Common Tools and Interfaces for Deterministic Radiation Transport, for Monte Carlo and Hybrid Codes, together with a proposal to develop the following: GERALD - A General Environment for Radiation Analysis and Design. GERALD intends to create a unifying software environment in which the user can define, solve and analyse a nuclear radiation transport problem using the available numerical tools seamlessly. This environment will serve many purposes: teaching, research, and industrial needs. It will also help to preserve the existing analytical and numerical knowledge base, which could represent a significant step towards solving the legacy problem. This activity should contribute to attracting young engineers to nuclear science and engineering and to competence and knowledge preservation and management. The proposal was made at the Workshop on Common Tools and Interfaces for Deterministic Radiation Transport, for Monte Carlo and Hybrid Codes, held on 25-26 September 2003 in connection with the SNA-2003 conference. A first success with the development of such tools was achieved with the BOT3P2.0 and 3.0 codes, which provide an easy procedure and mechanism for defining and displaying 3D geometries and materials, both in the form of refineable meshes for deterministic codes and as Monte Carlo geometries consistent with deterministic models. Advanced SUSD: improved tools for sensitivity/uncertainty analysis. The development of tools for the analysis and estimation of sensitivities and uncertainties in calculations, and for their propagation through complex computational schemes, in the fields of neutronics, thermal hydraulics and thermo-mechanics is of increasing importance for research and engineering applications. Such tools allow better margins to be established for engineering designs and for the safe operation of nuclear facilities. They are not yet sufficiently developed, but the need for them is increasingly evident in many activities.

  2. 7th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  3. 10th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Hilbrich, Tobias; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2017-01-01

    This book presents the proceedings of the 10th International Parallel Tools Workshop, held October 4-5, 2016 in Stuttgart, Germany – a forum to discuss the latest advances in parallel tools. High-performance computing plays an increasingly important role in numerical simulation and modelling in academic and industrial research. At the same time, using large-scale parallel systems efficiently is becoming more difficult. A number of tools addressing parallel program development and analysis have emerged from the high-performance computing community over the last decade, and what may have started as a collection of small helper scripts has now matured into production-grade frameworks. Powerful user interfaces and an extensive body of documentation allow easy usage by non-specialists. Some tools have been commercialized, while others are operated as open source by a growing research community.

  4. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in the design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of the European Communities (CEC) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time- and space-variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies, conducted in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs
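
    The generic technique underlying many of the compared methodologies is Monte Carlo propagation of input uncertainties through a consequence model. The sketch below illustrates it with an invented toy model and invented input distributions.

        # Minimal Monte Carlo uncertainty propagation; model and input
        # distributions are invented purely for illustration.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Uncertain inputs: a release rate (lognormal) and a dilution factor (normal).
        release = rng.lognormal(mean=0.0, sigma=0.3, size=n)
        dilution = rng.normal(loc=1.0e-4, scale=2.0e-5, size=n)

        # Propagate both samples through the (toy) consequence model.
        concentration = release * dilution

        # Summarise the resulting output uncertainty.
        print("mean  :", concentration.mean())
        print("90% CI:", np.percentile(concentration, [5, 95]))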

  5. Planning support tools and their effects in participatory urban adaptation workshops.

    Science.gov (United States)

    McEvoy, Sadie; van de Ven, Frans H M; Blind, Michiel W; Slinger, Jill H

    2018-02-01

    In the face of a changing climate, many cities are engaged in adaptation planning and are using participatory workshops to involve stakeholders in these initiatives. Different tools are being used to structure the process and content of participatory planning workshops, but it is unclear what effect the tools have on the workshops and their results. We evaluated three different tools (Group Model Building, the Adaptation Support Tool, and the Stress Test Guideline) and a tool-free approach in repeated simulated workshops, to observe and compare (1) the way workshops played out, and (2) the direct outcomes that were achieved. Tools appear to influence both aspects. Specifically, we measured differences in the learning effects in groups, in the development of shared understanding within groups, in the types of plans that are developed by groups, and in the nature of participation during the workshops. Further research is needed to translate these results into practice, but this is a first step in advancing knowledge about the influence of tools in participatory planning activities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Ninth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    This booklet contains the proceedings of the Ninth Workshop on Practical Use of Coloured Petri Nets and CPN Tools, October 20-22, 2008. The workshop is organised by the CPN group at the Department of Computer Science, Aarhus University, Denmark. Coloured Petri Nets and the CPN Tools are now licensed to more than 7,200 users in 138 countries. The aim of the workshop is to bring together some of the users and in this way provide a forum for those who are interested in the practical use of Coloured Petri Nets and their tools. The submitted papers were evaluated by a programme committee. Several of the papers put Coloured Petri Nets to practical use, often in an industrial setting; the remaining papers deal with different extensions of tools and methodology. The papers from the first eight CPN Workshops can be found via the web pages: http://www.daimi.au.dk/CPnets/. After an additional round of reviewing and revision, some of the papers...

  7. Workshop on IAEA Tools for Nuclear Energy System Assessment for Long-Term Planning and Development

    International Nuclear Information System (INIS)

    2009-01-01

    The purpose of the workshop is to present to Member States tools and methods that are available from the IAEA in support of long-term energy planning and nuclear energy system assessments, both focusing on the sustainable development of nuclear energy. This includes tools devoted to energy system planning, indicators for sustainable energy development, the INPRO methodology for Nuclear Energy System Assessment (NESA) and tools for analysing nuclear fuel cycle material balance. The workshop also intends to obtain feedback from Member States on applying the tools, share experiences and lessons learned, and identify needs for IAEA support

  8. Proceedings of the 11th Thermal and Fluids Analysis Workshop

    Science.gov (United States)

    Sakowski, Barbara

    2002-07-01

    The Eleventh Thermal & Fluids Analysis Workshop (TFAWS 2000) was held the week of August 21-25 at The Forum in downtown Cleveland. This year's annual event focused on building stronger links between the research community and the engineering design/application world, and celebrated the theme "Bridging the Gap Between Research and Design". Dr. Simon Ostrach delivered the keynote address "Research for Design (R4D)" and encouraged a more deliberate approach to performing research with near-term engineering design applications in mind. Over 100 persons attended TFAWS 2000, including participants from five different countries. This year's conference devoted a full-day seminar to the discussion of analysis and design tools associated with aeropropulsion research at the Glenn Research Center. As in previous years, the workshop also included hands-on instruction in state-of-the-art analysis tools, paper sessions on selected topics, short courses and application software demonstrations. TFAWS 2000 was co-hosted by the Thermal/Fluids Systems Design and Analysis Branch of NASA GRC and by the Ohio Aerospace Institute, and was co-chaired by Barbara A. Sakowski and James R. Yuko. The annual NASA delegates meeting is a standard component of TFAWS, where the civil servants of the various centers represented discuss current and future events which affect the Community of Applied Thermal and Fluid ANalystS (CATFANS). At this year's delegates meeting, the following goals (among others) were set by the collective body of delegates: (1) participation of all Centers in the NASA material properties database (TPSX) update; (2) developing and collaboratively supporting multi-center proposals; (3) expanding the scope of TFAWS to include other federal laboratories; (4) initiation of white papers on thermal tools and standards; and (5) formation of an Agency-wide TFAWS steering committee.

  9. Workshop

    DEFF Research Database (Denmark)

    Hess, Regitze; Lotz, Katrine

    2003-01-01

    Programme for an architecture workshop focusing on the Danish harbours. Presentation of the 57 younger Danish and international participating architects.

  10. Pollution prevention and waste minimization tools workshops: Proceedings. Part 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-31

    The purpose of the second workshop was to bring together representatives of DOE and DOE contractor organizations to discuss four topics: process waste assessments (PWAs), a continuation of one of the sessions held at the first workshop in Clearwater; waste minimization reporting requirements; procurement systems for waste minimization; and heating, ventilating, and air conditioning (HVAC) and replacements for chlorofluorocarbons (CFCs). The topics were discussed in four concurrent group sessions. Participants in each group were encouraged to work toward achieving two main objectives: establish a "clear vision" of the overall target for their session's program, focusing not just on where the program is now but on where it should go in the long term; and determine steps to be followed to carry out the target program.

  11. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level ones such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.
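
    The five stages can be pictured as a simple pipeline. In the sketch below, each stub function stands in for a real package (event generator, detector simulation, reconstruction code, and so on); all names and values are illustrative.

        # Schematic of the five analysis stages; every function is a stub.
        def generate(seed):
            """Particle generation: initial interaction and short-lived decays."""
            return {"truth_particles": ["mu+", "mu-"], "seed": seed}

        def simulate(event):
            """Detector simulation: detector response to the generated event."""
            event["hits"] = [(1.0, 2.0, 0.3), (1.1, 2.2, 0.6)]
            return event

        def reconstruct_tracks(event):
            """Track reconstruction: pattern recognition on space points."""
            event["tracks"] = [event["hits"]]
            return event

        def reconstruct_event(event):
            """Event reconstruction: assign masses, build 4-vectors."""
            event["four_vectors"] = [(45.0, 10.0, 5.0, 43.5)]
            return event

        def display_and_fit(events):
            """Display and fit: accumulate statistics over all events."""
            return len(events)

        events = [reconstruct_event(reconstruct_tracks(simulate(generate(s))))
                  for s in range(3)]
        print(display_and_fit(events))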

  12. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and selected building components stored in the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
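
    A hypothetical skeleton of that architecture might look as follows; all class names, numbers, and energy conservation measures (ECMs) are invented for illustration, and this is not the actual tool.

        # Hypothetical sketch: component library -> model -> baseline vs.
        # ECM-modified models -> ranked recommendations.
        from dataclasses import dataclass

        @dataclass
        class BuildingModel:
            components: list
            annual_kwh: float                 # modeled annual energy use

        def apply_ecm(model: BuildingModel, savings_fraction: float) -> BuildingModel:
            """Apply one energy conservation measure to get an optimized model."""
            return BuildingModel(model.components,
                                 model.annual_kwh * (1.0 - savings_fraction))

        def recommend(baseline: BuildingModel, ecms: dict) -> list:
            """Rank candidate ECMs by annual energy saved against the baseline."""
            savings = {name: baseline.annual_kwh - apply_ecm(baseline, f).annual_kwh
                       for name, f in ecms.items()}
            return sorted(savings, key=savings.get, reverse=True)

        office = BuildingModel(["walls", "HVAC", "lighting"], annual_kwh=500_000)
        print(recommend(office, {"LED retrofit": 0.08, "HVAC upgrade": 0.15}))
        # -> ['HVAC upgrade', 'LED retrofit']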

  13. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  14. Non-Targeted Analysis Challenge (Non-targeted screening workshop)

    Science.gov (United States)

    This brief presentation is intended to motivate discussion of the "Non-Targeted Analysis Challenge" at the Advancing Non-Targeted Analyses of Xenobiotics in Environmental and Biological Media workshop held at the EPA RTP campus.

  15. PREFACE: EMAS 2013 Workshop: 13th European Workshop on Modern Developments and Applications in Microbeam Analysis

    Science.gov (United States)

    Llovet, Xavier, Dr; Matthews, Mr Michael B.; Brisset, François, Dr; Guimarães, Fernanda, Dr; Vieira, Professor Joaquim M., Dr

    2014-03-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 13th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from the 12th to the 16th of May 2013 in the Centro de Congressos do Alfândega, Porto, Portugal. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a very specific format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with LNEG - Laboratório Nacional de Energia e Geologia and SPMICROS - Sociedade Portuguesa de Microscopia. The technical programme included the following topics: electron probe microanalysis, future technologies, electron backscatter diffraction (EBSD), particle analysis, and applications. As at previous workshops, there was also a special oral session for young scientists. The best presentation by a young scientist was awarded an invitation to attend the 2014 Microscopy and Microanalysis meeting in Hartford, Connecticut. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Plastic deformation studies with electron channelling contrast imaging and electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 74 posters from 21 countries were on display at the meeting and that the participants came from as far away as Japan, Canada and the USA. A

  16. The EADGENE Microarray Data Analysis Workshop

    OpenAIRE

    de Koning, Dirk-Jan; Jaffrézic, Florence; Lund, Mogens Sandø; Watson, Michael; Channing, Caroline; Hulsegge, Ina; Pool, Marco; Buitenhuis, Bart; Hedegaard, Jakob; Hornshøj, Henrik; Jiang, Li; Sørensen, Peter; Marot, Guillemette; Delmas, Céline; Lê Cao, Kim-Anh

    2007-01-01

    Microarray analyses have become an important tool in animal genomics. While their use is becoming widespread, there is still a lot of ongoing research regarding the analysis of microarray data. In the context of a European Network of Excellence, 31 researchers representing 14 research groups from 10 countries performed and discussed the statistical analyses of real and simulated 2-colour microarray data that were distributed among participants. The real data consisted of 48 microarra...

  17. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load in multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle tracing code for modeling the transport of dust particulates and molecules. It uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  18. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
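
    The hybrid simulation loop described above can be sketched as follows. The sim interface here is hypothetical; the real DCAT drives PSS/E through its Python API, whose calls are not reproduced here.

        # Hedged skeleton of a hybrid cascading-outage loop: alternate fast
        # dynamic simulation with slower steady-state checks until no further
        # protection action triggers. `sim` is a hypothetical interface.
        def run_cascade(sim, initiating_contingency, max_iterations=50):
            sim.apply_contingency(initiating_contingency)
            sequence = [initiating_contingency]
            for _ in range(max_iterations):
                tripped = sim.run_dynamics()          # fast dynamic events, protection models
                if not tripped:
                    overloads = sim.solve_steady_state()  # slower quasi-steady-state events
                    tripped = sim.trip_overloaded(overloads)
                if not tripped:
                    break                             # system reached a stable point
                sequence.extend(tripped)              # record the cascading sequence
            return sequence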

  19. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  20. 9th Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Decreusefond, Laurent

    2012-01-01

    Since the early eighties, Ali Suleyman Ustunel has been one of the main contributors to the field of Malliavin calculus. In a workshop held in Paris in June 2010, several prominent researchers gave exciting talks in honor of his 60th birthday. The present volume includes scientific contributions from this workshop. Probability theory is first and foremost aimed at solving real-life problems containing randomness. Markov processes are one of the key tools for modeling such problems and play a vital part in their treatment. Contributions on inventory control, mutation-selection in genetics and public-pri...

  1. Using equitable impact sensitive tool (EQUIST) and knowledge translation to promote evidence to policy link in maternal and child health: report of first EQUIST training workshop in Nigeria.

    Science.gov (United States)

    Uneke, Chigozie Jesse; Sombie, Issiaka; Uro-Chukwu, Henry Chukwuemeka; Johnson, Ermel; Okonofua, Friday

    2017-01-01

    The Equitable Impact Sensitive Tool (EQUIST) designed by UNICEF and knowledge translation (KT) are important strategies that can help policymakers to improve equity and evidence-informed policy making in maternal, newborn and child health (MNCH). The purpose of this study was to improve the knowledge and capacity of an MNCH implementation research team (IRT) and policy makers to use EQUIST and KT. A modified "before and after" intervention study design was used, in which outcomes were measured on the target participants both before the intervention (workshop) was implemented and after. A 5-point Likert scale according to the degree of adequacy was employed. A three-day intensive EQUIST and KT training workshop was organized in Edo State, Nigeria, with 45 participants in attendance. Some of the topics covered included: (i) Knowledge translation models, measures & tools; (ii) Policy review, analysis and contextualization; (iii) Policy formulation and legislation process; (iv) EQUIST Overview & Theory of change; (v) EQUIST's situation analysis, scenario analysis and scenario comparison. The pre-workshop mean of understanding of use of KT ranged from 2.02-3.41, while the post-workshop mean ranged from 3.24-4.30. The pre-workshop mean of understanding of use of EQUIST ranged from 1.66-2.41, while the post-workshop mean ranged from 3.56-4.54 on the 5-point scale. The percentage increase in the means of KT and EQUIST understanding at the end of the workshop ranged from 8.0%-88.1% and 65.6%-158.4%, respectively. Findings of this study suggest that policymakers' and researchers' KT and EQUIST use competence relevant to evidence-informed policymaking can be enhanced through a training workshop.

  2. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
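
    A simplified version of the core calculation the FRAT automates is shown below. BAL-003-1 defines precise pre- and post-event averaging windows (Value A and Value B) and sign conventions; this sketch assumes those two values have already been extracted from PMU or SCADA data and reports the response in MW per 0.1 Hz.

        # Simplified frequency-response calculation (sign conventions and
        # averaging windows of the actual standard are not reproduced here).
        def frequency_response(p_a_mw, p_b_mw, f_a_hz, f_b_hz):
            """Response to a loss-of-generation event, in MW per 0.1 Hz."""
            delta_p = p_b_mw - p_a_mw     # change in net actual interchange
            delta_f = f_b_hz - f_a_hz     # frequency deviation (negative for a loss)
            return delta_p / (delta_f * 10.0)

        # Example: interchange drops 120 MW while frequency falls 0.06 Hz.
        print(frequency_response(0.0, -120.0, 60.00, 59.94))  # 200.0 MW / 0.1 Hz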

  3. Summary of Training Workshop on the Use of NASA tools for Coastal Resource Management in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Judd, Chaeli; Judd, Kathleen S.; Gulbransen, Thomas C.; Thom, Ronald M.

    2009-03-01

    A two-day training workshop was held in Xalapa, Mexico, on March 10-11, 2009, with the goal of training end users from the southern Gulf of Mexico states of Campeche and Veracruz in the use of tools to support coastal resource management decision-making. The workshop was held at the computer laboratory of the Instituto de Ecología, A.C. (INECOL). This report summarizes the results of that workshop and is a deliverable to our NASA client.

  4. The EADGENE Microarray Data Analysis Workshop

    DEFF Research Database (Denmark)

    de Koning, Dirk-Jan; Jaffrézic, Florence; Lund, Mogens Sandø

    2007-01-01

    Microarray analyses have become an important tool in animal genomics. While their use is becoming widespread, there is still a lot of ongoing research regarding the analysis of microarray data. In the context of a European Network of Excellence, 31 researchers representing 14 research groups from 10 countries performed and discussed the statistical analyses of real and simulated 2-colour microarray data that were distributed among participants. The real data consisted of 48 microarrays from a disease challenge experiment in dairy cattle, while the simulated data consisted of 10 microarrays from a direct comparison of two treatments (dye-balanced). While there was broader agreement with regards to methods of microarray normalisation and significance testing, there were major differences with regards to quality control. The quality control approaches varied from none, through using...

  5. Analysis of the HLA population data (AHPD) submitted to the 15th International Histocompatibility/Immunogenetics Workshop by using the Gene[rate] computer tools accommodating ambiguous data (AHPD project report).

    Science.gov (United States)

    Nunes, J M; Riccio, M E; Buhler, S; Di, D; Currat, M; Ries, F; Almada, A J; Benhamamouch, S; Benitez, O; Canossi, A; Fadhlaoui-Zid, K; Fischer, G; Kervaire, B; Loiseau, P; de Oliveira, D C M; Papasteriades, C; Piancatelli, D; Rahal, M; Richard, L; Romero, M; Rousseau, J; Spiroski, M; Sulcebe, G; Middleton, D; Tiercy, J-M; Sanchez-Mazas, A

    2010-07-01

    During the 15th International Histocompatibility and Immunogenetics Workshop (IHIWS), 14 human leukocyte antigen (HLA) laboratories participated in the Analysis of HLA Population Data (AHPD) project where 18 new population samples were analyzed statistically and compared with data available from previous workshops. To that aim, an original methodology was developed and used (i) to estimate frequencies by taking into account ambiguous genotypic data, (ii) to test for Hardy-Weinberg equilibrium (HWE) by using a nested likelihood ratio test involving a parameter accounting for HWE deviations, (iii) to test for selective neutrality by using a resampling algorithm, and (iv) to provide explicit graphical representations including allele frequencies and basic statistics for each series of data. A total of 66 data series (1-7 loci per population) were analyzed with this standard approach. Frequency estimates were compliant with HWE in all but one population of mixed stem cell donors. Neutrality testing confirmed the observation of heterozygote excess at all HLA loci, although a significant deviation was established in only a few cases. Population comparisons showed that HLA genetic patterns were mostly shaped by geographic and/or linguistic differentiations in Africa and Europe, but not in America where both genetic drift in isolated populations and gene flow in admixed populations led to a more complex genetic structure. Overall, a fruitful collaboration between HLA typing laboratories and population geneticists allowed finding useful solutions to the problem of estimating gene frequencies and testing basic population diversity statistics on highly complex HLA data (high numbers of alleles and ambiguities), with promising applications in either anthropological, epidemiological, or transplantation studies.
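
    As a toy illustration of the HWE test described above, consider a single biallelic locus; the actual Gene[rate] tools handle multi-allelic HLA data with typing ambiguities, which this sketch ignores. With three genotype counts, the alternative model with a free deviation parameter is saturated, so its likelihood is evaluated at the observed genotype proportions.

        # Toy 1-df likelihood-ratio test of Hardy-Weinberg equilibrium for a
        # biallelic locus; not the multi-allelic, ambiguity-aware Gene[rate] code.
        from math import log
        from scipy.stats import chi2

        def hwe_lrt(n_aa: int, n_ab: int, n_bb: int) -> float:
            """Return the p-value of a likelihood-ratio test of HWE."""
            n = n_aa + n_ab + n_bb
            p = (2 * n_aa + n_ab) / (2 * n)          # allele frequency estimate
            expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) ** 2]
            observed = [n_aa, n_ab, n_bb]
            # G statistic: twice the log-likelihood ratio of saturated vs. HWE model.
            g = 2 * sum(o * log(o / e) for o, e in zip(observed, expected) if o > 0)
            return chi2.sf(g, df=1)

        print(hwe_lrt(50, 30, 20))   # small p-value -> deviation from HWE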

  6. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning

    NARCIS (Netherlands)

    Retalis, Symeon; Sloep, Peter

    2009-01-01

    Retalis, S., & Sloep, P. B. (Eds.) (2009). Collection of 4 symposium papers at EC-TEL 2009. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning. September 30, 2009, Nice, France.

  7. Proceedings of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011)

    DEFF Research Database (Denmark)

    The Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011) was held in cooperation with ACM SIGPLAN. LDTA is an application and tool-oriented workshop focused on grammarware: software based on grammars in some form. Grammarware applications are typically language processing applications; traditional examples include parsers, program analyzers, optimizers and translators. A primary focus of LDTA is grammarware that is generated from high-level, grammar-centric specifications, and thus submissions on parser generation, attribute grammar systems, term/graph rewriting systems, and other grammar-related meta-programming tools, techniques, and formalisms were encouraged. For 2011...

  8. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT, as is the TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows a wide range of meteorological phenomena to be observed from space, such as hurricanes, typhoons, tropical cyclones and mesoscale convective systems. Basic functions include selection of an area of interest.

  9. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.
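
    JRAT itself is a Java program, but its core ideas, a 3D scatter view combined with a time-ordered playback filter, are easy to illustrate. The following Python sketch uses synthetic radar-return data; all variable names and values are invented for illustration.

    ```python
    # Illustrative Python analogue (JRAT itself is written in Java) of a
    # 3D scatter view with time-ordered playback, using synthetic data.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(7)
    n = 500
    t = np.sort(rng.uniform(0.0, 600.0, n))                 # time tags (s)
    x = rng.normal(0.0, 40.0, n)                            # map east (km)
    y = rng.normal(0.0, 40.0, n)                            # map north (km)
    alt = np.linspace(60.0, 20.0, n) + rng.normal(0, 2, n)  # altitude (km)

    # "Playback" filter: show only returns received before a chosen time.
    mask = t < 300.0

    ax = plt.figure().add_subplot(projection="3d")
    ax.scatter(x[mask], y[mask], alt[mask], s=4)
    ax.set_xlabel("east (km)")
    ax.set_ylabel("north (km)")
    ax.set_zlabel("altitude (km)")
    plt.show()
    ```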

  10. Second Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Ustunel, Ali

    1990-01-01

    The Second Silivri Workshop functioned as a short summer school and a working conference, producing lecture notes and research papers on recent developments of Stochastic Analysis on Wiener space. The topics of the lectures concern short time asymptotic problems and anticipative stochastic differential equations. Research papers are mostly extensions and applications of the techniques of anticipative stochastic calculus.

  11. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany, a forum to discuss the latest advancements in parallel tools.

  12. 17th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2016)

    International Nuclear Information System (INIS)

    2016-01-01

    Preface: The 2016 edition of the International Workshop on Advanced Computing and Analysis Techniques in Physics Research took place on January 18-22, 2016, at the Universidad Técnica Federico Santa Maria (UTFSM) in Valparaiso, Chile. The present volume of IOP Conference Series is devoted to the selected scientific contributions presented at the workshop. In order to guarantee the scientific quality of the proceedings, all papers were thoroughly peer-reviewed by an ad-hoc Editorial Committee with the help of many careful reviewers. The ACAT workshop series has a long tradition starting in 1990 (Lyon, France) and takes place at intervals of a year and a half. Formerly these workshops were known under the name AIHENP (Artificial Intelligence for High Energy and Nuclear Physics). Each edition brings together experimental and theoretical physicists and computer scientists/experts from particle and nuclear physics, astronomy and astrophysics in order to exchange knowledge and experience in computing and data analysis in physics. Three tracks cover the main topics: computing technology (languages and system architectures), data analysis (algorithms and tools), and theoretical physics (techniques and methods). Although most contributions and discussions are related to particle physics and computing, other fields such as condensed matter physics, earth physics and biophysics are often addressed in the hope of sharing approaches and visions. The workshop created a forum for exchanging ideas among fields, exploring and promoting cutting-edge computing technologies and debating hot topics.

  13. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Mathur, Shivani [Formerly NREL]

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the total fuel imported to secure all energy services in Alaska's remote microgrid communities by at least 50%, without increasing system life-cycle costs, while also improving overall system reliability, security, and resilience. To this end, the Partnership is investigating whether a combination of energy efficiency and high-contribution (renewable energy) power systems can meet that target. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is drawn from the respective tool websites, tool developers, and the authors' experience.

  14. Jumpstarting Academic Careers: A Workshop and Tools for Career Development in Anesthesiology.

    Science.gov (United States)

    Yanofsky, Samuel D; Voytko, Mary Lou; Tobin, Joseph R; Nyquist, Julie G

    2011-01-01

    Career development is essential for growth in academic medicine and has the potential to assist in building a sustained faculty within academic departments of Anesthesiology. Close attention to the details involved in career management, goal setting as part of career planning, and professional networking are key elements. This article examines the specific educational strategies involved in a 120-minute workshop, divided into four 25-minute segments with 20 minutes at the end for discussion, for training junior faculty in career development. The teaching methods include 1) brief didactic presentations, 2) pre-workshop completion of two professional development tools, 3) facilitated small group discussion using trained facilitators, and 4) use of a commitment-to-change format. Three major learning tools were utilized in conjunction with the above methods: a professional network survey, a career planning and development form, and a commitment-to-change form. Forty-one participants from 2009 reported 80 projected changes in their practice behaviors in relation to career management: build or enhance a professional network and professional mentoring (36.3%); set career goals, make a plan, follow through, collaborate, publish (35.1%); increase visibility locally or nationally (10.0%); build core skills, such as clinical, teaching and leading (36.3%); identify the criteria for promotion in one's own institution (5.0%); improve methods of documentation (2.5%). Over the past two years, the workshop has been very well received by junior faculty, with over 95% marking each of the following items as excellent or good (presentation, content, audiovisuals and objectives met). The challenge for the continuing development and promotion of academic anesthesiologists lies in the explicit training of faculty for career advancement. Designing workshops that use educational tools to promote a reflective process in the faculty member is one method to meet this challenge.

  15. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in the computer, computing in the earth sciences, multivariate data analysis, automated computation in quantum field theory, and computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  16. Proceedings of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010)

    DEFF Research Database (Denmark)

    Brabrand, Claus

    2010-01-01

    This volume contains the proceedings of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010), held in Paphos, Cyprus on March 28-29, 2010. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) organized in cooperation with ACM SIGPLAN. LDTA is an application- and tool-oriented forum on meta-programming in a broad sense. A meta-program is a program that takes other programs as input or output. The focus of LDTA is on generated or otherwise efficiently implemented meta-programs, possibly using high-level descriptions of programming languages. Tools and techniques presented at LDTA are usually applicable in the context of "Language Workbenches" or "Meta Programming Systems" or simply as parts of advanced programming environments or IDEs. These proceedings include an extended abstract based on the invited talk by Jean...

  17. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  18. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel

    2014-01-01

    As governments, citizens and organizations have moved online, there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks that structure the inter-connections between these sites. Including contributions from a range of academic disciplines including Political Science, Media and Communication Studies, Economics, and Computer Science, this study showcases a new methodological approach that has been expressly designed to capture...

  19. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  20. The Astronomy Workshop: Web Tools for Astronomy Students at All Levels

    Science.gov (United States)

    Hayes-Gehrke, Melissa N.; Hamilton, D.; Deming, G.

    2010-01-01

    The Astronomy Workshop (http://janus.astro.umd.edu/) is a collection of over 20 interactive web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes and by the general public. The goal of the website is to encourage students to learn about astronomy by exploiting their fascination with the internet. Two of the tools, Scientific Notation and Solar System Collisions, have instructor materials available to facilitate their use in undergraduate, high school, and junior high classes. The Scientific Notation web tool allows students to practice conversion, addition/subtraction, and multiplication/division with scientific notation, while the Solar System Collisions web tool explores the effects of impacts on the Earth and other solar system bodies. Some web tools allow students to explore our own solar system (Solar System Visualizer) and the Sun's past and future history (The Life of the Sun). Others allow students to experiment with changes in the solar system, such as changes to the tilt of the Earth (Earth's Seasons) and to the properties of the planets (Build Your Own Solar System).

  1. Development of Workshops on Biodiversity and Evaluation of the Educational Effect by Text Mining Analysis

    Science.gov (United States)

    Baba, R.; Iijima, A.

    2014-12-01

    Conservation of biodiversity is one of the key issues in environmental studies. As a means to address this issue, education is becoming increasingly important. In previous work, we developed a course of workshops on the conservation of biodiversity. To disseminate the course as a tool for environmental education, determination of its educational effect is essential. Text mining enables analyses of the frequency and co-occurrence of words in freely described texts. This study is intended to evaluate the effect of the workshop by using a text mining technique. We hosted the originally developed workshop on the conservation of biodiversity for 22 college students. The aim of the workshop was to convey the definition of biodiversity. Generally, biodiversity refers to the diversity of ecosystems, diversity between species, and diversity within species. To facilitate discussion, supplementary materials were used. For instance, field guides of wildlife species were used to discuss the diversity of ecosystems. Moreover, a hierarchical framework in an ecological pyramid was shown for understanding the role of diversity between species. In addition, we offered a document on the historical Potato Famine in Ireland to discuss diversity within species from the genetic viewpoint. Before and after the workshop, we asked students for a free description of the definition of biodiversity, and analyzed the texts using Tiny Text Miner. This technique enables Japanese-language morphological analysis. Frequently used words were sorted into categories. Moreover, a principal component analysis was carried out. After the workshop, the frequency of words tagged to diversity between species and diversity within species had significantly increased. From the principal component analysis, the first component consists of words such as producer, consumer, decomposer, and food chain. This indicates that the students have comprehended the close relationship between
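
    The frequency-plus-PCA analysis described above is straightforward to reproduce in outline. The sketch below uses scikit-learn on hypothetical English answers; the study itself used Tiny Text Miner, which additionally performs Japanese morphological analysis.

    ```python
    # Minimal sketch of a word-frequency and principal component analysis
    # of free-text answers. The answer texts are hypothetical; the actual
    # study used Tiny Text Miner on Japanese text.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import PCA

    answers = [
        "biodiversity means many kinds of animals and plants",
        "diversity of ecosystems species and genes in a food chain",
        "producers consumers and decomposers depend on each other",
    ]

    vectorizer = CountVectorizer()                  # document-by-term counts
    X = vectorizer.fit_transform(answers).toarray()

    pca = PCA(n_components=2)                       # project onto 2 components
    scores = pca.fit_transform(X)
    print(vectorizer.get_feature_names_out())
    print(scores)
    ```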

  2. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  3. The Halden Reactor Project workshop on improved system development using case-tools based on formal methods

    International Nuclear Information System (INIS)

    Gran, Bjoern Axel; Sivertsen, Terje; Stoelen, Ketil; Thunem, Harald; Zhang, Wenhui

    1999-02-01

    The workshop 'Improved system development using case-tools based on formal methods' was organised in Halden on December 1-2, 1998. The purpose of the workshop was to present and discuss the state of the art with respect to formal approaches. The workshop had two invited presentations: 'Formality in specification and modelling: developments in software engineering practice' by John Fitzgerald (Centre for Software Reliability, UK), and 'Formal methods in industry - reaching results when correctness is not the only issue' by Oeystein Haugen (Ericsson NorARC, Norway). The workshop also had several presentations divided into three sessions on industrial experience, tools, and combined approaches. Each day there was a discussion: the first on the effect of formalization, the second on the role of formal verification. At the end of the workshop, the presentations and discussions were summarised into specific recommendations. This report summarises the presentations of the speakers, the discussions, the recommendations, and the demonstrations given at the workshop.

  4. Workshop on Analysis of Returned Comet Nucleus Samples

    International Nuclear Information System (INIS)

    1989-01-01

    This volume contains abstracts that were accepted by the Program Committee for presentation at the workshop on the analysis of returned comet nucleus samples held in Milpitas, California, January 16 to 18, 1989. The abstracts deal with the nature of cometary ices, cryogenic handling and sampling equipment, origin and composition of samples, and spectroscopic, thermal and chemical processing methods of cometary nuclei. Laboratory simulation experimental results on dust samples are reported. Some results obtained from Halley's comet are also included. Microanalytic techniques for examining trace elements of cometary particles, synchrotron X-ray fluorescence and instrumental neutron activation analysis (INAA), are presented.

  5. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year the motto was 'bridging disciplines'. The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program

  6. Workshop tools and methodologies for evaluation of energy chains and for technology perspective

    Energy Technology Data Exchange (ETDEWEB)

    Appert, O. [Institut Francais du Petrole (IFP), 92 - Rueil-Malmaison (France)]; Maillard, D. [Energy and Raw Materials, 75 - Paris (France)]; Pumphrey, D. [Energy Cooperation, US Dept. of Energy (United States)]; Sverdrup, G.; Valdez, B. [National Renewable Energy Laboratory, Golden, CO (United States)]; Schindler, J. [LB-Systemtechnik (LBST), GmbH, Ottobrunn (Germany)]; His, St.; Rozakis, St. [Centre International de Recherche sur l'Environnement et le Developpement (CIRED), 94 - Nogent sur Marne (France)]; Sagisaka, M. [LCA Research Centre (Japan)]; Bjornstad, D. [Oak Ridge National Laboratory, Oak Ridge, Tennessee (United States)]; Madre, J.L. [Institut National de Recherche sur les Transports et leur Securite, 94 - Arcueil (France)]; Hourcade, J.Ch. [Centre International de Recherche sur l'Environnement et le Developpement (CIRED), 94 - Nogent sur Marne (France)]; Ricci, A.; Criqui, P.; Chateau, B.; Bunger, U.; Jeeninga, H. [EU/DG-R (Italy)]; Chan, A. [National Research Council (Canada)]; Gielen, D. [IEA-International Energy Associates Ltd., Fairfax, VA (United States)]; Tosato, G.C. [Energy Technology Systems Analysis Programme (ETSAP), 75 - Paris (France)]; Akai, M. [Agency of Industrial Science and Technology (Japan)]; Ziesing, H.J. [Deutsches Institut fur Wirtschaftsforschung, DIW Berlin (Germany)]; Leban, R. [Conservatoire National des Arts et Metiers (CNAM), 75 - Paris (France)]

    2005-07-01

    The aim of this workshop is to better characterize the future by integrating all the dynamic interactions between the economy, the environment and society. It offers presentations on hydrogen chain evaluation, micro-economic modelling for the evaluation of biofuel options, life cycle assessment evolution and potentialities, consumer valuation of energy technology attributes, perspectives for the evaluation of changing behaviour, incentive systems and barriers to social acceptability, the internalization of external costs, endogenous technical change in long-term energy models, ETSAP/technology dynamics in partial equilibrium energy models, very long-term energy-environment modelling, ultra long-term energy technology perspectives, the socio-economic toolbox of the EU hydrogen roadmap, a combined approach using technology-oriented optimization and the evaluation of impacts of individual policy measures, and the application of a suite of basic research portfolio management tools.

  7. Young people and drug consumption: workshops to provide tools for workers in social institutions, from a collective health perspective

    Directory of Open Access Journals (Sweden)

    Cássia Baldini Soares

    2010-01-01

    The objective of this study was, through workshops, to provide tools for workers in social institutions who work with young people, so that they could understand present-day drug consumption. It started from the presupposition that the work of these institutions might be improved by approaching this topic from a collective health perspective, i.e. from an understanding of the structure of the production, distribution and consumption of drugs today. The aim was to investigate the effectiveness of workshops as tools in the educational process. The methodology consisted of systematically conducting workshops within the theoretical-methodological framework of historical-critical theory. The workers' participation evolved qualitatively, showing that the knowledge identified, along with the common sense initially brought in, evolved into a comprehension of the roots of harmful drug consumption and into the surmounting of reiterative practices that fed back into myths, prejudice and stereotypes regarding users, as well as into respect for the power and effects of drugs.

  8. VCAT: Visual Crosswalk Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy J. [Los Alamos National Laboratory]; Forslund, David W. [Los Alamos National Laboratory]; Cleland, Catherine A. [Los Alamos National Laboratory]

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  9. "I'm Not Here to Learn How to Mark Someone Else's Stuff": An Investigation of an Online Peer-to-Peer Review Workshop Tool

    Science.gov (United States)

    Wilson, Michael John; Diao, Ming Ming; Huang, Leon

    2015-01-01

    In this article, we explore the intersecting concepts of fairness, trust and temporality in relation to the implementation of an online peer-to-peer review Moodle Workshop tool at a Sydney metropolitan university. Drawing on qualitative interviews with unit convenors and online surveys of students using the Workshop tool, we seek to highlight a…

  10. Workshop on the applications of new computer tools to thermal engineering; Applications a la thermique des nouveaux outils informatiques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This workshop on the applications of new computer tools to thermal engineering has been organized by the French society of thermal engineers. Seven papers have been presented, from which two papers dealing with thermal diffusivity measurements in materials and with the optimization of dryers have been selected for ETDE. (J.S.)

  11. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  12. Finite element analysis of degraded concrete structures - Workshop proceedings

    International Nuclear Information System (INIS)

    1999-09-01

    This workshop is related to the finite element analysis of degraded concrete structures. It is composed of three sessions. The first session (entitled 'The use of finite element analysis in safety assessments') comprises six papers: Historical Development of Concrete Finite Element Modeling for Safety Evaluation of Accident-Challenged and Aging Concrete Structures; Experience with Finite Element Methods for Safety Assessments in Switzerland; Stress State Analysis of the Ignalina NPP Confinement System; Prestressed Containment: Behaviour when Concrete Cracking is Modelled; Application of FEA for Design and Support of NPP Containment in Russia; Verification Problems of Nuclear Installations Safety Software of Strength Analysis (NISS SA). The second session (entitled 'Concrete containment structures under accident loads') comprises seven papers: Two Application Examples of Concrete Containment Structures under Accident Load Conditions Using Finite Element Analysis; What Kind of Prediction for Leak Rates for Nuclear Power Plant Containments in Accidental Conditions; Influence of Different Hypotheses Used in Numerical Models for Concrete at Elevated Temperatures on the Predicted Behaviour of NPP Core Catchers under Severe Accident Conditions; Observations on the Constitutive Modeling of Concrete under Multi-Axial States at Elevated Temperatures; Analyses of a Reinforced Concrete Containment with Liner Corrosion Damage; Program of Containment Concrete Control during Operation for the Temelin Nuclear Power Plant; Static Limit Load of a Deteriorated Hyperbolic Cooling Tower. The third session (entitled 'Concrete structures under extreme environmental load') comprises five papers: Shear Transfer Mechanism of RC Plates after Cracking; Seismic Back Calculation of an Auxiliary Building of the Nuclear Power Plant Muehleberg, Switzerland; Seismic Behaviour of Slightly Reinforced Shear Wall Structures; FE Analysis of Degraded Concrete

  13. Workshops as a useful tool to better understand care professionals' views of a lean change program.

    Science.gov (United States)

    Simons, Pascale A M; Benders, Jos; Marneffe, Wim; Pijls-Johannesma, Madelon; Vandijck, Dominique

    2015-01-01

    For change programs to succeed, it is vital to have a detailed understanding of employees' views regarding the program, especially when the proposed changes are potentially contested. Gaining insight into employee perceptions helps managers to decide how to proceed. The authors conducted two workshops in a radiotherapy institute to assess the benefits and drawbacks, as well as their underlying causes, of a proposed Lean change program, and charted managers' views on the workshops' usefulness. Two workshops were organized in which employees predicted the positive and negative effects of a Lean program. The workshops combined a structured brainstorm (the KJ technique) with an evaluation of the expected effects. Eight top managers judged the workshops' value in supporting decision making. In total, 15 employees participated in the workshops. Participants in workshop 2 reported more expected effects (27 effects; 18 positive) than those in workshop 1 (14 effects; 6 positive); however, when the effects were categorized, similar results emerged. Three of the eight managers scored the results as relevant for decision making and four as neutral. Seven managers recommended future use of the instrument. Increased employee involvement and bottom-up thinking, combined with relatively low costs, were appreciated most. The workshop could serve as a simple instrument to improve decision making and enhance the successful implementation of change programs, as it was expected to enhance employees' involvement and was relatively easy and cheap to conduct.

  14. 6th International Workshop on Compositional Data Analysis

    CERN Document Server

    Thió-Henestrosa, Santiago

    2016-01-01

    The authoritative contributions gathered in this volume reflect the state of the art in compositional data analysis (CoDa). The respective chapters cover all aspects of CoDa, ranging from mathematical theory, statistical methods and techniques to its broad range of applications in geochemistry, the life sciences and other disciplines. The selected and peer-reviewed papers were originally presented at the 6th International Workshop on Compositional Data Analysis, CoDaWork 2015, held in L’Escala (Girona), Spain. Compositional data is defined as vectors of positive components and constant sum, and, more generally, all those vectors representing parts of a whole which only carry relative information. Examples of compositional data can be found in many different fields such as geology, chemistry, economics, medicine, ecology and sociology. As most of the classical statistical techniques are incoherent on compositions, in the 1980s John Aitchison proposed the log-ratio approach to CoDa. This became the foundation...
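
    The log-ratio approach mentioned above can be illustrated concisely. The sketch below implements the centered log-ratio (clr) transform, one of Aitchison's standard transforms, which maps a composition of positive parts into ordinary real coordinates where classical statistics can be applied; the example composition is hypothetical.

    ```python
    # Minimal sketch of the centered log-ratio (clr) transform from
    # compositional data analysis. Example values are hypothetical.
    import numpy as np

    def clr(x):
        """Centered log-ratio transform of a composition of positive parts."""
        x = np.asarray(x, dtype=float)
        g = np.exp(np.log(x).mean())   # geometric mean of the parts
        return np.log(x / g)

    # Hypothetical three-part composition (e.g. sand/silt/clay fractions).
    print(clr([0.6, 0.3, 0.1]))
    ```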

  15. A new educational tool to learn about hydration: taste workshops for children.

    Science.gov (United States)

    Valero Gaspar, Teresa; Rodríguez-Alonso, Paula; Ruiz Moreno, Emma; Del Pozo de la Calle, Susana; Ávila Torres, José Manuel; Varela Moreiras, Gregorio

    2016-07-13

    Nutrition education contributes to children's understanding and practice of healthy lifestyle behaviours. Good hydration status is an essential topic, especially since children are a vulnerable population, much more prone to dehydration than adults. The approval of the Report on the European Gastronomic Heritage: Cultural and Educational Aspects in 2014 served as the starting point for work on innovative audio-visual and multimedia materials for children. The Spanish Nutrition Foundation (FEN) and the Royal Academy of Gastronomy (RAG), in collaboration with the Ministry of Education, Culture and Sport in Spain (MECD), developed educational videos for schoolchildren to learn about food, nutrition and gastronomy and, especially, the importance of being hydrated. The objective was to develop a series of videos for children between 3 and 9 years old with nutrition and cooking lessons, to be used as educational resources in the official curricula. Fourteen chapters related to food, nutrition, gastronomy, physical activity and hydration were designed and tested as the basis for the videos. A nutritionist, a chef and two puppets were the main characters acting in the videos. The chapters were assembled into nine videos that included five sections: introduction, video lesson, recipes (in the case of hydration, recipes with foods of different water content were recorded), 'what have you learnt' and 'check your knowledge'. A summary of the new educational material was officially presented at the Spain Pavilion during Expo Milano 2015. Moreover, the videos are included as an educational tool for teachers in the new PANGEI Programme (Food, Nutrition and Gastronomy for Infantile Education) jointly launched by FEN, RAG and MECD. Taste workshops are useful as innovative nutrition education tools to reinforce language, listening and motor skills as well as food and nutrition concepts and, especially, the importance of being well hydrated.

  16. Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop

    Science.gov (United States)

    Park, Michael A.; Nemec, Marian

    2017-01-01

    A summary is provided for the Second AIAA Sonic Boom Workshop, held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with flow-through nacelle. An optional complete configuration with propulsion boundary conditions was also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. The submissions are propagated to the ground, and noise levels are computed, allowing the grid convergence and the statistical distribution of each noise level to be determined. While progress since the first workshop is documented, improvements to the analysis methods for a possible subsequent workshop are proposed. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness, and hence potentially lower annoyance, than the first workshop cases. These quieter models exposed weaknesses in the analysis methods, particularly in convective discretization.

  17. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

    This paper presents the evaluation, from a Decision Analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA) methodology is applied to clarify the decision and develop a decision model. One of the key advantages of IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, an addition is proposed in this work: using the strategic Means-Ends Objective Network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives, and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification), CNC (installation of an automatic lathe) and CF (installation of an automatic milling machine). The results are presented as a graph showing zones in which a particular alternative should be selected. The results show the potential of IDA to tackle technical decisions that are otherwise approached without due care.
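
    The simulation-plus-utility step of such a model can be sketched compactly. The example below compares the three alternatives by the expected value of an exponential utility over simulated profits; the distributions and the risk tolerance are hypothetical, not the paper's figures.

    ```python
    # Minimal sketch of comparing alternatives by expected utility over
    # simulated profit. All distribution parameters and the risk
    # tolerance are hypothetical, for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    def utility(profit, risk_tolerance=50_000.0):
        # Exponential utility, concave for a risk-averse decision maker.
        return 1.0 - np.exp(-profit / risk_tolerance)

    alternatives = {
        "Base": rng.normal(20_000, 5_000, 100_000),   # no modification
        "CNC":  rng.normal(35_000, 15_000, 100_000),  # automatic lathe
        "CF":   rng.normal(30_000, 8_000, 100_000),   # automatic milling machine
    }
    for name, profits in alternatives.items():
        print(f"{name}: expected utility = {utility(profits).mean():.3f}")
    ```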

  18. Low-Cost Air Quality Monitoring Tools: From Research to Practice (A Workshop Summary

    Directory of Open Access Journals (Sweden)

    Andrea L. Clements

    2017-10-01

    In May 2017, a two-day workshop was held in Los Angeles (California, U.S.A.) to gather practitioners who work with low-cost sensors used to make air quality measurements. The community of practice included individuals from academia, industry, non-profit groups, community-based organizations, and regulatory agencies. The group gathered to share knowledge developed from a variety of pilot projects in hopes of advancing the collective knowledge about how best to use low-cost air quality sensors. Panel discussion topics included: (1) best practices for deployment and calibration of low-cost sensor systems, (2) data standardization efforts and database design, (3) advances in sensor calibration, data management, and data analysis and visualization, and (4) lessons learned from research/community partnerships to encourage purposeful use of sensors and create change/action. Panel discussions summarized knowledge advances and project successes while also highlighting the questions, unresolved issues, and technological limitations that still remain within the low-cost air quality sensor arena.

  19. Research Directions for Cyber Experimentation: Workshop Discussion Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    DeWaard, Elizabeth [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Deccio, Casey [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fritz, David Jakob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Sandia National Laboratories hosted a workshop on August 11, 2017 entitled "Research Directions for Cyber Experimentation," which focused on identifying and addressing research gaps within the field of cyber experimentation, particularly emulation testbeds. This report mainly documents the discussion toward the end of the workshop, which covered research gaps such as developing a sustainable research infrastructure, expanding cyber experimentation, and making the field more accessible to subject matter experts who may not have a background in computer science. Other gaps include methodologies for rigorous experimentation, validation, and uncertainty quantification, which, if addressed, also have the potential to bridge the gap between cyber experimentation and cyber engineering. Workshop attendees presented various ways to overcome these research gaps; however, the main conclusion for overcoming these gaps is better communication through increased workshops, conferences, email lists, and Slack channels, among other opportunities.

  20. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  1. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time-consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of the reload safety analysis calculations is handled by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable HTML format. Using this set of tools for reload safety analysis simplifies the whole process.
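
    The abstract does not describe the QUADRIGA or CycleKit interfaces, but the general pattern of the automation (sweeping lattice-code inputs and collecting results) can be sketched. In the example below, the executable name, input format and output file are all hypothetical.

    ```python
    # Illustrative sketch only: this is not the QUADRIGA/CycleKit API,
    # just the general pattern of automating a parametric sweep of
    # lattice-code runs and collecting the results. The executable and
    # file names are hypothetical.
    import itertools
    import json
    import subprocess

    enrichments = [3.6, 4.0, 4.4]    # fuel enrichment, wt% U-235
    boron_ppm = [0, 600, 1200]       # coolant boron concentration, ppm

    results = []
    for enr, boron in itertools.product(enrichments, boron_ppm):
        with open("case.inp", "w") as f:
            f.write(f"fuel_enrichment {enr}\nboron {boron}\n")
        # Hypothetical lattice-code executable reading case.inp and
        # writing case_out.json.
        subprocess.run(["lattice_code", "case.inp"], check=True)
        with open("case_out.json") as f:
            results.append({"enrichment": enr, "boron": boron, **json.load(f)})

    print(f"collected {len(results)} cases")
    ```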

  2. Proceedings of the 6th International Workshop on Folk Music Analysis, 15-17 June, 2016

    OpenAIRE

    Beauguitte, Pierre; Duggan, Bryan; Kelleher, John

    2016-01-01

    The Folk Music Analysis Workshop brings together computational music analysis and ethnomusicology. Both symbolic and audio representations of music are considered, with a broad range of scientific approaches being applied (signal processing, graph theory, deep learning). The workshop features a range of interesting talks from international researchers in areas such as Indian classical music, Iranian singing, Ottoman-Turkish Makam music scores, Flamenco singing, Irish traditional music, Georgi...

  3. Climate Data Analysis Tools - (CDAT)

    Science.gov (United States)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor-intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop an intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL, will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules, and as a result of its flexibility, PCMDI has integrated other popular software components, such as the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware
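
    CDAT's own modules (CDMS, VCS) are not shown here, but the kind of scripted model diagnosis the abstract describes can be sketched with generic Python tools. The file name and variable names below are hypothetical.

    ```python
    # Generic sketch of a scripted climate diagnosis of the kind CDAT
    # enables, written with netCDF4 and NumPy rather than CDAT's own
    # CDMS/VCS modules. File and variable names are hypothetical.
    import numpy as np
    from netCDF4 import Dataset

    with Dataset("model_output.nc") as nc:
        tas = nc.variables["tas"][:]   # air temperature, dims (time, lat, lon)
        lat = nc.variables["lat"][:]

    # Area-weight by cos(latitude) before averaging over the globe.
    w = np.cos(np.deg2rad(lat))[None, :, None]
    global_mean = (tas * w).sum(axis=(1, 2)) / (w.sum() * tas.shape[2])
    print(global_mean[:5])             # first five time steps
    ```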

  4. Low-Cost Air Quality Monitoring Tools: From Research to Practice (A Workshop Summary)

    Science.gov (United States)

    In May 2017, a two-day workshop was held in Los Angeles (California, U.S.A.) to gather practitioners who work with low-cost sensors used to make air quality measurements. The community of practice included individuals from academia, industry, non-profit groups, community-based or...

  5. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  6. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  7. The EUFAR ICCP Workshop on Data Processing, Analysis and Presentation Software of Cloud Probes

    OpenAIRE

    Voigt, Christiane; Baumgardner, Darrel; McFarquhar, Greg

    2016-01-01

    The EUFAR ICCP Workshop on Data Processing, Analysis and Presentation Software of Cloud Probes took place at the University of Manchester from 23 to 24 July 2016. More than 40 cloud measurement experts and students from Europe, America, Asia and Australia participated in the workshop with the objectives to summarise current data processing algorithms for measurements made with cloud spectrometers operated on research aircraft, to discuss differences in the data processing methods, to assess o...

  8. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  9. A workshop report on the development of the Cow's Milk-related Symptom Score awareness tool for young children.

    Science.gov (United States)

    Vandenplas, Yvan; Dupont, Christophe; Eigenmann, Philippe; Host, Arne; Kuitunen, Mikael; Ribes-Koninckx, Carmen; Shah, Neil; Shamir, Raanan; Staiano, Annamaria; Szajewska, Hania; Von Berg, Andrea

    2015-04-01

    Clinicians with expertise in managing children with gastrointestinal problems and/or atopic diseases attended a workshop in Brussels in September 2014 to review the literature and determine whether a clinical score derived from symptoms associated with the ingestion of cow's milk proteins could help primary healthcare providers. The Cow's Milk-related Symptom Score (CoMiSS), which considers general manifestations, dermatological, gastrointestinal and respiratory symptoms, was developed as an awareness tool for cow's milk-related symptoms. It can also be used to evaluate and quantify the evolution of symptoms during therapeutic interventions, but does not diagnose cow's milk protein allergy and does not replace a food challenge. Its usefulness needs to be evaluated by a prospective randomised study. The CoMiSS provides primary healthcare clinicians with a simple, fast and easy-to-use awareness tool for cow's milk-related symptoms.

  10. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    During recent years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.
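
    As an illustration of the programmatic SOAP access mentioned above, the sketch below uses the zeep Python library. The WSDL URL and the operation name are hypothetical placeholders rather than actual IBIVU endpoints.

    ```python
    # Minimal sketch of programmatic SOAP access using the zeep library.
    # The WSDL URL and operation name are hypothetical placeholders, not
    # the actual IBIVU service endpoints.
    from zeep import Client

    client = Client("https://www.ibi.vu.nl/programs/example?wsdl")  # hypothetical

    # zeep builds Python proxies from the WSDL, so a service operation
    # becomes an ordinary method call (operation name is hypothetical).
    result = client.service.align(sequences=">seq1\nMKT\n>seq2\nMKS")
    print(result)
    ```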

  11. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    International Nuclear Information System (INIS)

    1996-01-01

This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by the Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination; fracture mechanics; and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis also includes NDE and materials degradation. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to the applications of probabilistic modelling to nuclear components; Session 4 is a concluding panel discussion

  12. High-Speed Research: 1994 Sonic Boom Workshop. Configuration, Design, Analysis and Testing

    Science.gov (United States)

    McCurdy, David A. (Editor)

    1999-01-01

The third High-Speed Research Sonic Boom Workshop was held at NASA Langley Research Center on June 1-3, 1994. The purpose of this workshop was to provide a forum for Government, industry, and university participants to present and discuss progress in their research. The workshop was organized into sessions dealing with atmospheric propagation; acceptability studies; and configuration design, analysis, and testing. Attendance at the workshop was by invitation only. The workshop proceedings include papers on design, analysis, and testing of low-boom high-speed civil transport configurations and experimental techniques for measuring sonic booms. Significant progress is noted in these areas in the time since the previous workshop a year earlier. The papers include preliminary results of sonic boom wind tunnel tests conducted during 1993 and 1994 on several low-boom designs. Results of a mission performance analysis of all low-boom designs are also included. Two experimental methods for measuring near-field signatures of airplanes in flight are reported.

  13. STUDENT PERFORMANCE PARAMETER ANALYSIS USING SPSS TOOL

    OpenAIRE

    Bhavesh Patel; Jyotindra Dharwa

    2017-01-01

SPSS is a statistical analysis tool used for analyzing large volumes of available data and extracting useful information and knowledge to support major decision-making processes. SPSS can be applied in the educational sector to improve the performance of students by finding the parameters that most strongly affect student performance. This research study was carried out by collecting student performance parameters and the related dataset. In this research study we have col...

  14. Machine tool accuracy characterization workshops. Final report, May 5, 1992--November 5 1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-06

The ability to assess the accuracy of machine tools is required by both tool builders and users. Builders must have this ability in order to predict the accuracy capability of a machine tool for different part geometries, to provide verifiable accuracy information for sales purposes, and to locate error sources for maintenance, troubleshooting, and design enhancement. Users require the same ability in order to make intelligent choices in selecting or procuring machine tools, to predict component manufacturing accuracy, and to perform maintenance and troubleshooting. In both instances, the ability to fully evaluate the accuracy capabilities of a machine tool and the source of its limitations is essential for using the tool to its maximum accuracy and productivity potential. This project was designed to transfer expertise in modern machine tool accuracy testing methods from LLNL to US industry, and to educate users on the use and application of emerging standards for machine tool performance testing.

  15. Applying instructional design theories to bioinformatics education in microarray analysis and primer design workshops.

    Science.gov (United States)

    Shachak, Aviv; Ophir, Ron; Rubin, Eitan

    2005-01-01

    The need to support bioinformatics training has been widely recognized by scientists, industry, and government institutions. However, the discussion of instructional methods for teaching bioinformatics is only beginning. Here we report on a systematic attempt to design two bioinformatics workshops for graduate biology students on the basis of Gagne's Conditions of Learning instructional design theory. This theory, although first published in the early 1970s, is still fundamental in instructional design and instructional technology. First, top-level as well as prerequisite learning objectives for a microarray analysis workshop and a primer design workshop were defined. Then a hierarchy of objectives for each workshop was created. Hands-on tutorials were designed to meet these objectives. Finally, events of learning proposed by Gagne's theory were incorporated into the hands-on tutorials. The resultant manuals were tested on a small number of trainees, revised, and applied in 1-day bioinformatics workshops. Based on this experience and on observations made during the workshops, we conclude that Gagne's Conditions of Learning instructional design theory provides a useful framework for developing bioinformatics training, but may not be optimal as a method for teaching it.

  16. Virtual Workshop

    DEFF Research Database (Denmark)

    Buus, Lillian; Bygholm, Ann

In relation to the Tutor course in the Mediterranean Virtual University (MVU) project, a virtual workshop, “Getting experiences with different synchronous communication media, collaboration, and group work”, was held with all partner institutions in January 2006. More than 25 key-tutors from different MVU institutions participated in the workshop. The workshop yielded experience with different communication tools and media, and with the difficulties and possibilities of collaborating virtually on group work and the development of a shared presentation, all aimed at gathering experience for the learning design of MVU courses. The workshop intended to give the participants the possibility to draw on their own experiences with issues of computer-supported collaboration, group work in a virtual environment, synchronous and asynchronous communication media, and different perspectives...

  17. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  18. Proc. of the Workshop on Agent Simulation : Applications, Models, and Tools, Oct. 15-16, 1999

    International Nuclear Information System (INIS)

    Macal, C. M.; Sallach, D.

    2000-01-01

The many motivations for employing agent-based computation in the social sciences are reviewed. It is argued that there exist three distinct uses of agent modeling techniques. One such use, the simplest, is conceptually quite close to traditional simulation in operations research. This use arises when equations can be formulated that completely describe a social process, and these equations are explicitly soluble, either analytically or numerically. In the former case, the agent model is merely a tool for presenting results, while in the latter it is a novel kind of Monte Carlo analysis. A second, more commonplace usage of computational agent models arises when mathematical models can be written down but not completely solved. In this case the agent-based model can shed significant light on the solution structure, illustrate dynamical properties of the model, serve to test the dependence of results on parameters and assumptions, and be a source of counter-examples. Finally, there are important classes of problems for which writing down equations is not a useful activity. In such circumstances, resort to agent-based computational models may be the only way available to explore such processes systematically, and this constitutes a third distinct usage of such models.

  19. A workshop report on the development of the Cow's Milk-related Symptom Score awareness tool for young children

    DEFF Research Database (Denmark)

    Vandenplas, Yvan; Dupont, Christophe; Eigenmann, Philippe

    2015-01-01

Clinicians with expertise in managing children with gastrointestinal problems and/or atopic diseases attended a workshop in Brussels in September 2014 to review the literature and determine whether a clinical score derived from symptoms associated with the ingestion of cow's milk proteins could help primary healthcare providers. The Cow's Milk-related Symptom Score (CoMiSS), which considers general manifestations, dermatological, gastrointestinal and respiratory symptoms, was developed as an awareness tool for cow's milk-related symptoms. It can also be used to evaluate and quantify the evolution of symptoms during therapeutic interventions, but does not diagnose cow's milk protein allergy and does not replace a food challenge. Its usefulness needs to be evaluated by a prospective randomised study. Conclusion: The CoMiSS provides primary healthcare clinicians with a simple, fast and easy-to-use awareness tool for cow's milk-related symptoms.

  20. VALIDATION OF THE TOOL "ENTERPRISE ARCHITECTURE ANALYSIS TOOL"

    OpenAIRE

    Österlind, Magnus

    2011-01-01

    The Enterprise Architecture Analysis Tool, EAAT, is a software tool developed by the department of Industrial Information- and Control systems, ICS, at the Royal Institute of Technology, Stockholm, Sweden. EAAT is a modeling tool that combines Enterprise Architecture (EA) modeling with probabilistic relational modeling. Therefore EAAT makes it possible to design, describe and analyze the organizational structure, business processes, information systems and infrastructure within an enterprise....

  1. Integrating Reliability Analysis with a Performance Tool

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  2. Proceedings of the workshop on small angle scattering data analysis. Micelle related topics

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Toshio [Fukuoka Univ. (Japan). Faculty of Science; Furusaka, Michihiro; Ohtomo, Toshiya [eds.

    1996-02-01

This workshop was held on December 13 and 14, 1995 at the National Laboratory for High Energy Physics. At the workshop, the purpose of the meeting was explained, and lectures were given on research into super-high molecular structure by small angle neutron scattering, verification of the reliability of WINK data (absolute intensity), the analysis of WINK data, the new data program of SAN, the small angle X-ray scattering data analysis program (SAXS), the basics of the analysis of micelle systems, the analysis software manual and practice program Q-I(Q) ver 1.0, various analysis methods for small angle scattering including the contrast modulation method, the ordering of and countermeasures to the problems of WINK, and the future of the KENS small angle scattering facility. How to treat analyses related to micelles, how to save WINK, and how to install the SAN/reflectometer were the matters discussed at the workshop. This book collects the summaries of the lectures. (K.I.)

  3. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  4. Women's Workshop.

    Science.gov (United States)

    Karelius, Karen

    The Women's Workshop Notebook is the tool used in the nine-week course designed for the mature woman returning to school at Antelope Valley College. The notebook exercises along with the group interaction and instruction stress the importance of personal assessment of strengths, weaknesses, dreams, deliberations and life history in…

  5. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  6. Short interactive workshops reduce variability in contouring treatment volumes for spine stereotactic body radiation therapy: Experience with the ESTRO FALCON programme and EduCase™ training tool.

    Science.gov (United States)

    De Bari, Berardino; Dahele, Max; Palmu, Miika; Kaylor, Scott; Schiappacasse, Luis; Guckenberger, Matthias

    2017-11-20

We report the results of four 2-hour contouring workshops on target volume definition for spinal stereotactic radiotherapy. They combined traditional teaching methods with a web-based contouring/contour-analysis platform and led to a significant reduction in delineation variability. Short, interactive workshops can reduce interobserver variability in spine SBRT target volume delineation. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. PREFACE: EMAS 2011: 12th European Workshop on Modern Developments in Microbeam Analysis

    Science.gov (United States)

    Brisset, François; Dugne, Olivier; Robaut, Florence; Lábár, János L.; Walker, Clive T.

    2012-03-01

This volume of IOP Conference Series: Materials Science and Engineering contains papers from the 12th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis, which took place from 15 to 19 May 2011 in the Angers Congress Centre, Angers, France. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a very specific format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with GN-MEBA - Groupement National de Microscopie Electronique à Balayage et de microAnalysis, France. The technical programme included the following topics: the limits of EPMA, new techniques, developments and concepts in microanalysis, microanalysis in the SEM, and new and less common applications of micro- and nanoanalysis. As at previous workshops there was also a special oral session for young scientists. The best presentation by a young scientist was rewarded with an invitation to attend the 2012 Microscopy and Microanalysis meeting at Phoenix, Arizona. The prize went to Pierre Burdet, of the Federal Institute of Technology of Lausanne (EPFL), for his talk entitled '3D EDS microanalysis by FIB-SEM: enhancement of elemental quantification'. The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 74 posters from 18 countries were on display at the meeting, and that the participants came from as far away as Japan, Canada and the USA. A selection of participants with posters was invited to give a short oral presentation.

  8. Materiality Matters: Exploring the use of design tools in innovation workshops with the craft and creative sector in the Northern Isles of Scotland

    OpenAIRE

    Broadley, Cara; Champion, Katherine; McHattie, Lynn-Sayers

    2017-01-01

    This paper presents initial reflections regarding the use of bespoke design tools within a series of innovation workshops carried out with practitioners and stakeholders active in the craft and creative industry sector in the Scottish Islands of Orkney and Shetland. We argue that by emphasising such bespoke material tools located in and inspired by the local landscape, history and culture, we encouraged engagement, provided space for innovation and enabled creative collectives in their goal o...

  9. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.

    2001-01-01

The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
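
    The component decoupling described above can be pictured with a small sketch. The real AIDA interfaces were defined in Java and C++; the Python below is only an illustration of the idea, and every name in it is a hypothetical stand-in for the Histogram and Fitter categories:

```python
from abc import ABC, abstractmethod

# Hypothetical interfaces in the spirit of the AIDA categories; the real
# abstract interfaces were defined in Java and C++, not Python.
class IHistogram1D(ABC):
    @abstractmethod
    def fill(self, x: float, weight: float = 1.0) -> None: ...

    @abstractmethod
    def entries(self) -> int: ...

class IFitter(ABC):
    @abstractmethod
    def fit(self, histogram: IHistogram1D, model: str) -> dict: ...

# Client code depends only on the interface, so concrete implementations
# (Anaphe/Lizard, OpenScientist, Java Analysis Studio in the real
# projects) remain interchangeable.
class ListHistogram(IHistogram1D):
    def __init__(self) -> None:
        self._entries: list[float] = []

    def fill(self, x: float, weight: float = 1.0) -> None:
        self._entries.append(x)

    def entries(self) -> int:
        return len(self._entries)

h = ListHistogram()
for x in (0.1, 0.4, 0.9):
    h.fill(x)
print(h.entries())  # -> 3
```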

  10. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus of the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of APAT were analyzed. The APAT show strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  11. (Re)interpreting LHC New Physics Search Results : Tools and Methods, 3rd Workshop

    CERN Document Server

    The quest for new physics beyond the SM is arguably the driving topic for LHC Run2. LHC collaborations are pursuing searches for new physics in a vast variety of channels. Although collaborations provide various interpretations for their search results, the full understanding of these results requires a much wider interpretation scope involving all kinds of theoretical models. This is a very active field, with close theory-experiment interaction. In particular, development of dedicated methodologies and tools is crucial for such scale of interpretation. Recently, a Forum was initiated to host discussions among LHC experimentalists and theorists on topics related to the BSM (re)interpretation of LHC data, and especially on the development of relevant interpretation tools and infrastructure: https://twiki.cern.ch/twiki/bin/view/LHCPhysics/InterpretingLHCresults Two meetings were held at CERN, where active discussions and concrete work on (re)interpretation methods and tools took place, with valuable cont...

  12. Fourth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    Coloured Petri Nets and the CPN tools are now used by more than 750 organisations in 50 different countries all over the world (including 150 commercial companies). The purpose of this event is to bring together some of the users and in this way provide a forum for those who are interested in the...

  13. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
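
    The merging step in the procedure above amounts to estimating the rigid transform between two cubes' coordinate systems from matched reference points. Below is a minimal sketch of that computation using the standard Kabsch/Procrustes method; it illustrates the geometry rather than the algorithm actually used in the NASA system, and all data are invented:

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t with dst ~ R @ src + t.

    src, dst: (N, 3) arrays of matched points, e.g. the same cube corners
    expressed in two different cube coordinate systems."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T      # proper rotation, det = +1
    t = dst_c - R @ src_c
    return R, t

# Invented test: cube-2 points seen in cube 2's frame and in cube 1's frame.
rng = np.random.default_rng(0)
pts2 = rng.random((4, 3))
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])  # 90 deg about z
pts1 = pts2 @ R_true.T + np.array([1.0, 2.0, 0.5])
R, t = rigid_transform(pts2, pts1)
print(np.allclose(pts2 @ R.T + t, pts1))  # -> True: cube 2 located in frame 1
```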

  14. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  15. Performance analysis of GYRO: a tool evaluation

    International Nuclear Information System (INIS)

    Worley, P; Candy, J; Carrington, L; Huck, K; Kaiser, T; Mahinthakumar, G; Malony, A; Moore, S; Reed, D; Roth, P; Shan, H; Shende, S; Snavely, A; Sreepathi, S; Wolf, F; Zhang, Y

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses

  16. A Workshop on Analysis and Evaluation of Enterprise Architectures

    Science.gov (United States)

    2010-11-01

The Open Group Architecture Framework (TOGAF) defines EA as comprising four domains (The Open Group 2009): 1. The business architecture defines the business strategy, governance... different types of products (the DoDAF, TOGAF, system architecture, spreadsheets, Visio diagrams, and other tools) and reside in different repositories... the approach would work for most parts of an EA except for business architecture (for example, the TOGAF Application, Data, and Technology...

  17. CMMI High Maturity Measurement and Analysis Workshop Report: March 2008

    National Research Council Canada - National Science Library

    Stoddard, II, Robert W; Goldenson, Dennis R; Zubrow, Dave; Harper, Erin

    2008-01-01

... In response to the need for clarification and guidance on implementing measurement and analysis in the context of high maturity processes, members of the SEI's Software Engineering Measurement and Analysis (SEMA...

  18. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
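
    As a reminder of the mechanics the book teaches, ordinary least squares chooses the coefficients that minimise the sum of squared residuals. A minimal illustration with invented data (not an example from the book):

```python
import numpy as np

# Fit y = b0 + b1 * x by ordinary least squares on invented data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
X = np.column_stack([np.ones_like(x), x])       # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # numerically stable solve
residuals = y - X @ beta
print(beta)                    # approx [0.14, 1.96]: intercept and slope
print(residuals.var(ddof=2))   # residual variance with n - 2 degrees of freedom
```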

  19. Materials characterization center workshop on compositional and microstructural analysis of nuclear waste materials. Summary report

    International Nuclear Information System (INIS)

    Daniel, J.L.; Strachan, D.M.; Shade, J.W.; Thomas, M.T.

    1981-06-01

    The purpose of the Workshop on Compositional and Microstructural Analysis of Nuclear Waste Materials, conducted November 11 and 12, 1980, was to critically examine and evaluate the various methods currently used to study non-radioactive, simulated, nuclear waste-form performance. Workshop participants recognized that most of the Materials Characterization Center (MCC) test data for inclusion in the Nuclear Waste Materials Handbook will result from application of appropriate analytical procedures to waste-package materials or to the products of performance tests. Therefore, the analytical methods must be reliable and of known accuracy and precision, and results must be directly comparable with those from other laboratories and from other nuclear waste materials. The 41 participants representing 18 laboratories in the United States and Canada were organized into three working groups: Analysis of Liquids and Solutions, Quantitative Analysis of Solids, and Phase and Microstructure Analysis. Each group identified the analytical methods favored by their respective laboratories, discussed areas needing attention, listed standards and reference materials currently used, and recommended means of verifying interlaboratory comparability of data. The major conclusions from this workshop are presented

  20. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-01-01

Work is in progress on interactive tools for linear and non-linear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  1. Rapid Benefit Indicators (RBI) Spatial Analysis Tools

    Science.gov (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  2. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  3. KAFE - A Flexible Image Analysis Tool

    Science.gov (United States)

    Burkutean, Sandra

    2017-11-01

    We present KAFE, the Keywords of Astronomical FITS-Images Explorer - a web-based FITS images post-processing analysis tool designed to be applicable in the radio to sub-mm wavelength domain. KAFE was developed to enrich selected FITS files with metadata based on a uniform image analysis approach as well as to provide advanced image diagnostic plots. It is ideally suited for data mining purposes and multi-wavelength/multi-instrument data samples that require uniform data diagnostic criteria.
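
    The metadata-enrichment idea can be pictured with a toy FITS example. This is not KAFE's code or keyword schema; it assumes the astropy library, and the keyword RMSNOISE is invented for illustration:

```python
import numpy as np
from astropy.io import fits

# Create a toy FITS image, then enrich its header with a derived
# diagnostic keyword (hypothetical name, not KAFE's real schema).
image = np.random.default_rng(1).normal(0.0, 1.0, (64, 64))
hdu = fits.PrimaryHDU(image)
hdu.header["RMSNOISE"] = (float(image.std()), "image RMS, derived metadata")
hdu.writeto("toy.fits", overwrite=True)

# Downstream tools can now select files on the enriched keyword alone.
with fits.open("toy.fits") as hdul:
    print(hdul[0].header["RMSNOISE"])
```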

  4. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

AECL undertook the development of a software tool to assist alarm system designers and maintainers, based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information and design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  5. 11th International Workshop in Model-Oriented Design and Analysis

    CERN Document Server

    Müller, Christine; Atkinson, Anthony

    2016-01-01

    This volume contains pioneering contributions to both the theory and practice of optimal experimental design. Topics include the optimality of designs in linear and nonlinear models, as well as designs for correlated observations and for sequential experimentation. There is an emphasis on applications to medicine, in particular, to the design of clinical trials. Scientists from Europe, the US, Asia, Australia and Africa contributed to this volume of papers from the 11th Workshop on Model Oriented Design and Analysis.

  6. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
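
    The multiple-branch event-tree formalism mentioned above reduces, in its simplest form, to multiplying conditional probabilities along a path. A toy sketch with invented numbers (not EXPLORIS values):

```python
# Toy volcanic event tree: P(outcome) = product of the conditional
# branch probabilities along its path. All numbers are illustrative only.
tree = {
    "unrest": 1.0,
    "eruption | unrest": 0.3,
    "explosive | eruption": 0.4,
    "flows reach town | explosive": 0.1,
}

p_explosive = tree["unrest"] * tree["eruption | unrest"] * tree["explosive | eruption"]
p_town = p_explosive * tree["flows reach town | explosive"]
print(f"P(explosive eruption) = {p_explosive:.3f}")   # 0.120
print(f"P(flows reach town)   = {p_town:.4f}")        # 0.0120
```

    A real event tree has many branches per node and uncertainty on each branch probability (for example from expert elicitation), but the path-product structure is the same.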

  7. Memory Efficient Sequence Analysis Using Compressed Data Structures (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Jared

    2011-10-13

    Wellcome Trust Sanger Institute's Jared Simpson on Memory efficient sequence analysis using compressed data structures at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  8. Duke Workshop on High-Dimensional Data Sensing and Analysis

    Science.gov (United States)

    2015-05-06

... Laplacian, Amit Singer. Motivated by problems in structural biology, specifically cryo-electron microscopy, we introduce vector diffusion maps (VDM)... Laplacian operator for vector fields over the manifold. Applications to structural biology (cryo-electron microscopy and NMR spectroscopy), computer vision... Christopher Rozell. We present an analysis of the Locally Competitive Algorithm (LCA), a Hopfield-style neural network that solves sparse approximation...

  9. Summary and Statistical Analysis of the First AIAA Sonic Boom Prediction Workshop

    Science.gov (United States)

    Park, Michael A.; Morgenstern, John M.

    2014-01-01

A summary is provided for the First AIAA Sonic Boom Workshop held 11 January 2014 in conjunction with AIAA SciTech 2014. Near-field pressure signatures extracted from computational fluid dynamics solutions are gathered from nineteen participants representing three countries for the two required cases, an axisymmetric body and simple delta wing body. Structured multiblock, unstructured mixed-element, unstructured tetrahedral, overset, and Cartesian cut-cell methods are used by the participants. Participants provided signatures computed on participant generated and solution adapted grids. Signatures are also provided for a series of uniformly refined workshop provided grids. These submissions are propagated to the ground and loudness measures are computed. This allows the grid convergence of a loudness measure and a validation metric (difference norm between computed and wind-tunnel-measured near-field signatures) to be studied for the first time. Statistical analysis is also presented for these measures. An optional configuration includes fuselage, wing, tail, flow-through nacelles, and blade sting. This full configuration exhibits more variation in eleven submissions than the sixty submissions provided for each required case. Recommendations are provided for potential improvements to the analysis methods and a possible subsequent workshop.

  10. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  11. Feedback from the European Bioanalysis Forum: focus workshop on current analysis of immunogenicity: best practices and regulatory hurdles.

    Science.gov (United States)

    Goodman, Joanne; Cowen, Simon; Devanarayan, Viswanath; Egging, David; Emrich, Thomas; Golob, Michaela; Kramer, Daniel; McNally, Jim; Munday, James; Nelson, Robert; Pedras-Vasconcelos, João A; Piironen, Timo; Sickert, Denise; Skibeli, Venke; Fjording, Marianne Scheel; Timmerman, Philip

    2018-02-01

    European Bioanalysis Forum Workshop, Lisbon, Portugal, September 2016: At the recent European Bioanalysis Forum Focus Workshop, 'current analysis of immunogenicity: best practices and regulatory hurdles', several important challenges facing the bioanalytical community in relation to immunogenicity assays were discussed through a mixture of presentations and panel sessions. The main areas of focus were the evolving regulatory landscape, challenges of assay interferences from either drug or target, cut-point setting and whether alternative assays can be used to replace neutralizing antibody assays. This workshop report captures discussions and potential solutions and/or recommendations made by the speakers and delegates.

  12. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot; a tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters, and the model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. A smaller amount of material can be removed in controlled areas of a three-dimensional workpiece.
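
    The abstract does not state the form of its fitted model, but material-removal models in polishing are commonly built on Preston's equation, where removal depth scales with pressure, velocity, and dwell time. A sketch of fitting such a coefficient to invented measurements (the paper's own model may differ):

```python
import numpy as np

# Preston-style model: removed depth h = k_p * pressure * velocity * time.
# A common starting point for polishing models; data below are invented.
pressure = np.array([10., 10., 20., 20., 30.])       # kPa
velocity = np.array([0.5, 1.0, 0.5, 1.0, 1.0])       # m/s
dwell    = np.array([60., 60., 60., 60., 60.])       # s
removed  = np.array([0.31, 0.58, 0.63, 1.21, 1.77])  # um, "measured"

x = pressure * velocity * dwell                  # regressor p * v * t
k_p, *_ = np.linalg.lstsq(x[:, None], removed, rcond=None)
rel_err = np.abs(k_p[0] * x - removed) / removed
print(f"fitted Preston coefficient k_p = {k_p[0]:.4e} um per kPa*m")
print(f"max relative deviation = {rel_err.max():.1%}")  # well under 20% here
```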

  13. Summary of the workshop on structural analysis needs for magnetic fusion energy superconducting magnets

    International Nuclear Information System (INIS)

    Reich, M.; Lehner, J.; Powell, J.

    1976-09-01

    The technical portions of the meeting were divided into three major sessions as follows: (1) Review of methods being presently used by the MFE community for structural evaluation of current designs. (2) Future structural analysis needs. (3) Open discussions dealing with adequacy of present methods, the improvements needed for MFE magnet structural analysis, and the establishment of an MFE magnet structural advisory group. Summaries of the individual talks presented on Wednesday and Thursday (i.e., items 1 and 2 above) are included following the workshop schedule given later in this synopsis

  14. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  15. Applied DMP Consultation Workshop

    OpenAIRE

    Zilinski, Lisa

    2013-01-01

    This workshop allowed participants to work through a data management plan consultation, using a real world example. The workshop covered data management plans (DMPs), described in detail the areas that are typically included in a DMP, and utilized tools and resources to help support a DMP consultation with disciplinary faculty.

  16. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

BBAT is written to meet the need for an interactive graphical tool to explore longitudinal phase space. It is intended for quickly testing new ideas or new tricks, and is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line when analyzing data. The heart of the package is a set of C routines that do the number crunching. The graphics part is wired together with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation.
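
    A flavour of the longitudinal phase-space exploration BBAT supports can be given with a generic textbook one-turn map for a single-rf stationary bucket. This is only an illustrative sketch with arbitrary units, not BBAT's own C routines:

```python
import numpy as np

# One-turn map for a single-rf stationary bucket (synchronous phase 0):
# a generic textbook difference map, not BBAT's own code. Units arbitrary.
K = 0.02    # rf kick strength per turn (voltage/energy factor)
ETA = 1.0   # phase-slip factor, scale folded into one constant

def track(phi0: float, dp0: float, turns: int = 2000) -> np.ndarray:
    """Track one particle; returns its (phi, dp) trajectory turn by turn."""
    phi, dp = phi0, dp0
    path = np.empty((turns, 2))
    for n in range(turns):
        dp -= K * np.sin(phi)   # rf restoring kick
        phi += ETA * dp         # phase slip per turn
        path[n] = phi, dp
    return path

# A particle launched inside the bucket stays bounded (synchrotron motion);
# launched beyond the separatrix at phi = pi, it would stream away in phase.
path = track(phi0=1.0, dp0=0.0)
print(f"max |dp| = {np.abs(path[:, 1]).max():.3f}")   # roughly 0.14 here
```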

  17. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while processing multiple simultaneous user tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
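
    The kind of local trend analysis LCAT offers can be pictured with a short sketch on a synthetic station series; LCAT itself is an online interactive tool, and none of the code below is drawn from it:

```python
import numpy as np

# Trend analysis on a synthetic annual-mean temperature series:
# ordinary least squares slope, reported in degrees per decade.
rng = np.random.default_rng(42)
years = np.arange(1980, 2021)
temps = 12.0 + 0.02 * (years - 1980) + rng.normal(0, 0.3, years.size)

slope, intercept = np.polyfit(years, temps, 1)   # linear trend fit
anomalies = temps - temps.mean()                 # departures from the mean
print(f"trend: {10 * slope:+.2f} deg per decade")
print(f"warmest anomaly: {anomalies.max():+.2f} deg in {years[anomalies.argmax()]}")
```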

  18. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe ... and the value of associated rewards in states of interest for a real-world example from a case company in the Danish baked goods industry. The developments are presented in a generalised fashion to make them relevant to the general problem of implementing quantitative probabilistic model checking of graph...
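
    The quantitative analysis referred to can be pictured as an expected-reward computation on a Markov chain derived from the workflow. The toy model below is invented for illustration and is not SBAT itself, which builds on full probabilistic model checking:

```python
import numpy as np

# Toy stochastic workflow: states 0=mix, 1=bake, 2=rework, 3=done.
# P[i, j] is the probability of moving from state i to state j; cost[i]
# is the reward (minutes spent) per visit. Numbers are invented; SBAT
# derives such models from BPMN diagrams.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.8],   # 20% of batches need rework
    [0.0, 1.0, 0.0, 0.0],   # rework sends the batch back to baking
    [0.0, 0.0, 0.0, 1.0],   # done is absorbing
])
cost = np.array([15.0, 40.0, 25.0, 0.0])

# Expected total cost before absorption: solve (I - Q) x = c over the
# transient states {0, 1, 2} (the fundamental-matrix identity).
Q, c = P[:3, :3], cost[:3]
expected = np.linalg.solve(np.eye(3) - Q, c)
print(f"expected minutes from 'mix' to 'done': {expected[0]:.2f}")  # 71.25
```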

  19. Science Thought and Practices: A Professional Development Workshop on Teaching Scientific Reasoning, Mathematical Modeling and Data Analysis

    Science.gov (United States)

    Robbins, Dennis; Ford, K. E. Saavik

    2018-01-01

The NSF-supported “AstroCom NYC” program, a collaboration of the City University of New York and the American Museum of Natural History (AMNH), has developed and offers hands-on workshops to undergraduate faculty on teaching science thought and practices. These professional development workshops emphasize a curriculum and pedagogical strategies that use computers and other digital devices in a laboratory environment to teach students fundamental topics, including: proportional reasoning, control of variables thinking, experimental design, hypothesis testing, reasoning with data, and drawing conclusions from graphical displays. Topics addressed here are rarely taught in-depth during the formal undergraduate years and are frequently learned only after several apprenticeship research experiences. The goal of these workshops is to provide working and future faculty with an interactive experience in science learning and teaching using modern technological tools.

  20. SEAT: A strategic engagement analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.; Michelsen, C.; Morgeson, D.

    1988-01-01

    The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

  1. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions?' Several decision making tools are suggested to help clinicians make proper judgements. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision making tools such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concept of the ORAD program.
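
    The Bayesian side of the article can be made concrete with a worked example: converting a pre-test probability (prevalence) into a post-test probability using a test's sensitivity and specificity. The numbers below are invented for illustration, not values from the article:

```python
# Bayes' theorem for a diagnostic test:
# P(D | +) = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
prevalence  = 0.10   # pre-test probability of the lesion (invented)
sensitivity = 0.90   # P(test positive | disease)
specificity = 0.80   # P(test negative | no disease)

p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
post_test = sensitivity * prevalence / p_pos
print(f"P(positive test)      = {p_pos:.3f}")       # 0.270
print(f"P(disease | positive) = {post_test:.3f}")   # 0.333
```

    Even with a fairly sensitive and specific test, the post-test probability here is only about one in three, which is exactly the kind of counter-intuition the Bayesian framework makes explicit.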

  2. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises that are not yet convinced to pursue setup improvement. The methodology was developed after research that identified the problem: companies still struggle with long setup times, yet many do nothing to reduce them, and a long setup alone is not a sufficient reason for companies to act on setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The proposed methodology can genuinely encourage management to decide on SMED implementation, as was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand that is a bottleneck with many setups; the goal is to convince management to begin actions to improve setups. The last three steps relate to a specific setup, where the goal is to reduce the setup time and the risk of problems that can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis, FMEA and others were used.
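
    The Pareto-analysis step of such a setup study can be sketched in a few lines: rank changeover activities by duration and report the cumulative share to find the vital few worth attacking with SMED. Activity names and times below are invented:

```python
# Pareto analysis of changeover activities: sort by duration and report
# the cumulative share of total setup time. All data are invented.
activities = {
    "fetch tooling": 18.0,
    "adjust fixture": 35.0,
    "first-piece inspection": 12.0,
    "clean machine": 8.0,
    "search for documentation": 27.0,
}

total = sum(activities.values())
cumulative = 0.0
for name, minutes in sorted(activities.items(), key=lambda kv: -kv[1]):
    cumulative += minutes
    print(f"{name:25s} {minutes:5.1f} min  cum. {100 * cumulative / total:5.1f}%")
```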

  3. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  4. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes a considerable administrative effort on some sectors of industry. To minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful, since standardised procedures for performing the risk investigations efficiently can then be derived from it. This approach was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development of the generic risk analysis involved an intense discussion between industry and the authorities about the assessment methodology and the acceptance criteria. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline, where this method was successfully employed, is presented. Although this pipeline traverses densely populated areas in Switzerland, the risk questions could be resolved using this established communication method without delaying the planning process. (authors)

  5. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2); a sketch of this kind of clustering step is given below. The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment
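
    A flavour of the clustering step described above, assuming nothing about ASCAT's proprietary algorithm: plain k-means on synthetic (size, composition) inclusion features.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic inclusions: columns = diameter (um), %Al2O3, %CaO (invented).
    inclusions = np.vstack([
        rng.normal([3.0, 80.0, 5.0], [0.5, 5.0, 2.0], (50, 3)),   # alumina-rich
        rng.normal([6.0, 40.0, 45.0], [1.0, 5.0, 5.0], (50, 3)),  # calcium aluminate
    ])

    def kmeans(X, k, iters=50):
        """Basic Lloyd's algorithm; keeps a center in place if its cluster empties."""
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
            centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        return labels, centers

    labels, centers = kmeans(inclusions, k=2)
    for j, c in enumerate(centers):
        print(f"cluster {j}: n={np.sum(labels == j)}, center={np.round(c, 1)}")
    ```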

  6. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  7. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics such as single and multiple point cutting processes, grinding, component accuracy and metrology, shear stress in cutting, cutting temperature and analysis, and chatter. They also address non-traditional machining such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  8. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  9. Airborne LIDAR Data Processing and Analysis Tools

    Science.gov (United States)

    Zhang, K.

    2007-12-01

    Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers with high quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, LIDAR systems collect voluminous, irregularly-spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageable sized tiles, typically 4 square kilometers in dimension; 3) filtering point data to separate measurements for the ground from those for non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster based analysis; and 5) converting the gridded data into standard GIS import formats. ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
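
    Task 4 in the list above (gridding irregular elevations) is the easiest to illustrate. Below is a minimal binned-mean gridder on synthetic points; production tools such as ALDPAT offer proper interpolators (IDW, kriging, etc.) rather than this toy.

    ```python
    import numpy as np

    def grid_points(x, y, z, cell=1.0):
        """Average point elevations into square cells of size `cell` (meters)."""
        ix = ((x - x.min()) / cell).astype(int)
        iy = ((y - y.min()) / cell).astype(int)
        total = np.zeros((iy.max() + 1, ix.max() + 1))
        count = np.zeros_like(total)
        np.add.at(total, (iy, ix), z)   # accumulate elevations per cell
        np.add.at(count, (iy, ix), 1)   # count points per cell
        return np.where(count > 0, total / np.maximum(count, 1), np.nan)

    rng = np.random.default_rng(1)
    x, y = rng.uniform(0, 10, 1000), rng.uniform(0, 10, 1000)
    z = 50 + 0.5 * x + rng.normal(0, 0.1, 1000)   # gently sloping synthetic ground
    dem = grid_points(x, y, z, cell=1.0)
    print(dem.shape, np.nanmean(dem).round(2))
    ```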

  10. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  11. Analysis of Work Assignments After Medical Ethics Workshop for First-Year Residents at Siriraj Hospital

    Directory of Open Access Journals (Sweden)

    Sakda Sathirareuangchai

    2016-11-01

    Background: Upon entering the residency training program, all 1st year residents at Siriraj Hospital must join a medical ethics workshop held by the Division of Postgraduate Studies. At the end of the workshop, the residents were given a work assignment to describe a clinical ethics situation they had encountered in their past practice. Methods: This study is an analysis of the content of those work assignments, undertaken to identify the common medical ethics dilemmas which physicians face in their early days of practice. Results: 740 work assignments were reviewed. The 4 most common ethical principles mentioned in these assignments were autonomy (144, 19.5%), palliative care (133, 18.0%), beneficence (121, 16.4%), and confidentiality (110, 14.9%). More than half of the situations described occurred during internship (474, 64.1%) and tended to be distributed equally among community hospitals (39.1%), university hospitals (28.0%), and general hospitals (24.3%). Conclusion: This study should raise the awareness of medical educators towards these medical ethics issues during curriculum planning.

  12. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  13. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talks and the key figures of each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  14. Science for Managing Riverine Ecosystems: Actions for the USGS Identified in the Workshop "Analysis of Flow and Habitat for Instream Aquatic Communities"

    Science.gov (United States)

    Bencala, Kenneth E.; Hamilton, David B.; Petersen, James H.

    2006-01-01

    Federal and state agencies need improved scientific analysis to support riverine ecosystem management. The ability of the USGS to integrate geologic, hydrologic, chemical, geographic, and biological data into new tools and models provides unparalleled opportunities to translate the best riverine science into useful approaches and usable information to address issues faced by river managers. In addition to this capability to provide integrated science, the USGS has a long history of providing long-term and nationwide information about natural resources. The USGS is now in a position to advance its ability to provide the scientific support for the management of riverine ecosystems. To address this need, the USGS held a listening session in Fort Collins, Colorado in April 2006. Goals of the workshop were to: 1) learn about the key resource issues facing DOI, other Federal, and state resource management agencies; 2) discuss new approaches and information needs for addressing these issues; and 3) outline a strategy for the USGS role in supporting riverine ecosystem management. Workshop discussions focused on key components of a USGS strategy: Communications, Synthesis, and Research. The workshop identified 3 priority actions the USGS can initiate now to advance its capabilities to support integrated science for resource managers in partner government agencies and non-governmental organizations: 1) Synthesize the existing science of riverine ecosystem processes to produce broadly applicable conceptual models, 2) Enhance selected ongoing instream flow projects with complementary interdisciplinary studies, and 3) Design a long-term, watershed-scale research program that will substantively reinvent riverine ecosystem science. In addition, topical discussion groups on hydrology, geomorphology, aquatic habitat and populations, and socio-economic analysis and negotiation identified eleven important complementary actions required to advance the state of the science and to

  15. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small body shape models, combined with advanced shape-modification properties. The current status of SBGAT is as follows. The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and the addition of new features; note that SBGAT Core can be utilized independently of SBGAT Gui. SBGAT is presently hosted in a public GitHub repository owned by SBGAT's main developer, which can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html; this documentation is constantly updated in order to reflect new functionalities. SBGAT's user's manual is available at https://github.com/bbercovici/SBGAT/wiki and contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented, so future work will consist in broadening SBGAT's capabilities with the spherical harmonics expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  16. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 to 1985 a lecture series entitled "Risk-benefit analysis" was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland contributed reports on the uses of such models to this joint volume. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account the economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP)

  17. Team-Based Peer Review as a Form of Formative Assessment--The Case of a Systems Analysis and Design Workshop

    Science.gov (United States)

    Lavy, Ilana; Yadin, Aharon

    2010-01-01

    The present study was carried out within a systems analysis and design workshop. In addition to the standard analysis and design tasks, this workshop included practices designed to enhance student capabilities related to non-technical knowledge areas, such as critical thinking, interpersonal and team skills, and business understanding. Each task…

  18. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of its Matlab(R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a complex...

  19. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  20. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  1. Predictive tools and data needs for long term performance of in-situ stabilization and containment systems: DOE/OST stabilization workshop, June 26-27, Park City, Utah

    International Nuclear Information System (INIS)

    Borns, D.J.

    1997-01-01

    This paper summarizes the discussion within the Predictive Tools and Data Needs for Long Term Performance Assessment Subgroup. This subgroup formed at the DOE Office of Science and Technology workshop to address the long-term performance of in situ stabilization and containment systems; the workshop was held in Park City, Utah, on 26 and 27 June 1996. All projects, engineering and environmental, have built-in decision processes that involve varying risk/reward scenarios. Such decision processes may be awkward to describe but are utilized every day, following approaches that range from the intuitive to the advanced mathematical and numerical. Examples are the selection of the components of a home sound system, the members of a sports team, the investments in a portfolio, and the members of a committee. Inherent in the decision method are an understanding of the function or process of the system requiring a decision or prediction, and an understanding of the criteria on which decisions are made, such as cost, performance, durability and verifiability. Finally, the process requires a means to judge or predict how the objects, activities, people and processes being analyzed will perform relative to the operations and functions of the system and relative to the decision criteria posed for the problem. These risk and decision analyses are proactive and iterative throughout the life of a remediation project. The predictions inherent in the analyses are based on intuition, experience, trial and error, and systems analysis, often using numerical approaches

  2. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March, 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information:

  3. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.
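
    At its core, the fragment and crater identification described above amounts to finding connected regions in a simulation field. Below is a toy stand-in using scipy's connected-component labelling on a synthetic density array; it is not the milestone's actual CTH pipeline.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    density = rng.random((64, 64))
    density[10:18, 12:22] += 2.0   # synthetic "fragment" 1
    density[40:45, 50:58] += 2.0   # synthetic "fragment" 2

    mask = density > 1.5                     # cells dense enough to be material
    labels, n = ndimage.label(mask)          # label connected components 1..n
    sizes = np.asarray(ndimage.sum(mask, labels, index=range(1, n + 1)))
    print(f"{n} fragments, cell counts: {sizes.astype(int)}")
    ```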

  4. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code's predictions of bearing internal load distributions, stiffness, deflections and stresses correlate closely with established results.

  5. Signal optimization and analysis using PASSER V-07 : training workshop: code IPR006.

    Science.gov (United States)

    2011-01-01

    The objective of this project was to conduct one pilot workshop and five regular workshops to teach the effective use of the enhanced PASSER V-07 arterial signal timing optimization software. PASSER V-07 and materials for conducting a one-day trainin...

  6. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest in the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistle blowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, removing any contents that may have been hidden, and removing any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available to the general public. The results presented in this work can also be seen as a useful

  7. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system's structure; the list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid, which is seen in the structure graph as the disappearance of one or more nodes. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and of the ability to make autonomous recovery should faults occur.
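
    The structure-graph idea lends itself to a compact illustration. Under the standard structural-analysis formulation (unknown variables are matched into constraints; constraints left unmatched provide the redundancy usable for fault diagnosis), a greedy augmenting-path matching suffices for small examples. The example system and names below are invented, not taken from SaTool.

    ```python
    # Structure graph: each relation (constraint) lists the unknown variables
    # it couples. The system here is a made-up four-relation example.
    relations = {
        "c1": ["x1", "x2"],   # e.g. a component model
        "c2": ["x2", "x3"],
        "c3": ["x1"],         # e.g. a sensor measuring x1
        "c4": ["x3"],         # e.g. a sensor measuring x3
    }

    def maximum_matching(relations):
        """Augmenting-path bipartite matching of variables to relations."""
        match = {}  # variable -> relation

        def assign(rel, seen):
            for var in relations[rel]:
                if var not in seen:
                    seen.add(var)
                    if var not in match or assign(match[var], seen):
                        match[var] = rel
                        return True
            return False

        matched = [rel for rel in relations if assign(rel, set())]
        return match, [rel for rel in relations if rel not in matched]

    match, redundant = maximum_matching(relations)
    print("matching:", match)                 # every variable is solvable
    print("redundant relations:", redundant)  # these provide diagnostic residuals
    ```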

  8. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
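
    The record above (and its twin below) describes spherical-coordinate FDTD for a conical antenna. As a sanity-check illustration of the method itself, not of the paper's solver, here is the standard normalized one-dimensional Yee update with a Gaussian, UWB-like soft source.

    ```python
    import numpy as np

    c0 = 3e8                       # speed of light, m/s
    nz, nt = 200, 400
    dz = 1e-3                      # 1 mm cells
    dt = dz / (2 * c0)             # Courant number 0.5: stable
    sc = c0 * dt / dz
    imp0 = 376.73                  # free-space impedance

    ez = np.zeros(nz)              # E-field at integer nodes (PEC ends)
    hy = np.zeros(nz - 1)          # H-field at half nodes

    for n in range(nt):
        hy += (ez[1:] - ez[:-1]) * sc / imp0          # update H from curl E
        ez[1:-1] += (hy[1:] - hy[:-1]) * sc * imp0    # update E from curl H
        ez[50] += np.exp(-((n - 60) / 20.0) ** 2)     # soft Gaussian (UWB) source

    print("peak |Ez| =", np.abs(ez).max().round(3))
    ```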

  9. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  10. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...
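
    A toy version of the pipeline's front end: take a state/transition table and explore the state machine it induces. The protocol, states and messages below are invented; the real tool chain emits an intermediate XML format and verifies the resulting network of state machines in UPPAAL.

    ```python
    from collections import deque

    # (state, message) -> next state; a tiny invented request/ack protocol.
    table = {
        ("idle", "request"): "waiting",
        ("waiting", "ack"): "done",
        ("waiting", "nack"): "idle",
    }

    def reachable(table, start):
        """Breadth-first exploration of the induced state machine."""
        seen, frontier = {start}, deque([start])
        while frontier:
            state = frontier.popleft()
            for (src, _msg), dst in table.items():
                if src == state and dst not in seen:
                    seen.add(dst)
                    frontier.append(dst)
        return seen

    print(reachable(table, "idle"))   # {'idle', 'waiting', 'done'}
    ```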

  11. Buffer$--An Economic Analysis Tool

    Science.gov (United States)

    Gary Bentrup

    2007-01-01

    Buffer$ is an economic spreadsheet tool for use by resource professionals in analyzing the costs and benefits of conservation buffers, which are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate the potential income derived from a buffer, including income from cost-share/incentive...

  12. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  13. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  14. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
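
    The likelihood-ratio idea can be sketched in a few lines: estimate score distributions from known-match (lab vs. lab, same tool) and known-non-match comparisons, then evaluate a questioned score under both. The Gaussian models and all numbers below are illustrative assumptions, not the authors' fitted models.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    match_scores = rng.normal(0.80, 0.05, 200)      # lab mark vs lab mark, same tool
    nonmatch_scores = rng.normal(0.45, 0.08, 200)   # marks from different tools

    m = stats.norm(match_scores.mean(), match_scores.std(ddof=1))
    nm = stats.norm(nonmatch_scores.mean(), nonmatch_scores.std(ddof=1))

    questioned = 0.74    # similarity of field mark vs suspect-tool lab mark
    lr = m.pdf(questioned) / nm.pdf(questioned)
    print(f"likelihood ratio = {lr:.1f}  (>1 favours 'same tool')")
    ```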

  15. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
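
    The one-dimensional transmission line model mentioned above reduces, in its simplest normal-incidence form, to a rigidly backed cavity whose normalized impedance is z = R - i*cot(kL). Below is a minimal frequency sweep under assumed liner dimensions; the depth and resistance values are placeholders, not taken from the patent.

    ```python
    import numpy as np

    c0 = 343.0   # speed of sound, m/s
    L = 0.038    # cavity depth, m (assumed)
    R = 0.8      # normalized facesheet resistance (assumed)

    f = np.linspace(500, 3000, 6)
    k = 2 * np.pi * f / c0
    z = R - 1j / np.tan(k * L)   # normalized input impedance of backed cavity

    for fi, zi in zip(f, z):
        print(f"{fi:6.0f} Hz   z = {zi.real:4.2f} {zi.imag:+6.2f}i")
    # Resonance (reactance near zero) occurs around f = c0/(4L), ~2250 Hz here.
    ```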

  16. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.
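
    The integration pattern described (automated, consistent execution and data transfer among discipline tools) can be as simple as a shared record passed through a fixed chain of modules. The module names and the physics below are placeholders, not the actual NASA discipline codes.

    ```python
    # Each "discipline" reads and writes one shared vehicle record (toy physics).
    def mass_sizing(vehicle):
        vehicle["mass_kg"] = 45.0 * vehicle["diameter_m"] ** 2
        return vehicle

    def flight_mechanics(vehicle):
        vehicle["peak_decel_g"] = 0.02 * vehicle["entry_speed_ms"] / vehicle["diameter_m"]
        return vehicle

    def impact_analysis(vehicle):
        vehicle["impact_ok"] = vehicle["peak_decel_g"] < 250.0
        return vehicle

    pipeline = [mass_sizing, flight_mechanics, impact_analysis]
    vehicle = {"diameter_m": 0.9, "entry_speed_ms": 11000.0}
    for step in pipeline:
        vehicle = step(vehicle)   # automated, consistent hand-off between steps
    print(vehicle)
    ```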

  17. In vitro exposure systems and dosimetry assessment tools for inhaled tobacco products: Workshop proceedings, conclusions and paths forward for in vitro model use.

    Science.gov (United States)

    Behrsing, Holger; Hill, Erin; Raabe, Hans; Tice, Raymond; Fitzpatrick, Suzanne; Devlin, Robert; Pinkerton, Kent; Oberdörster, Günter; Wright, Chris; Wieczorek, Roman; Aufderheide, Michaela; Steiner, Sandro; Krebs, Tobias; Asgharian, Bahman; Corley, Richard; Oldham, Michael; Adamson, Jason; Li, Xiang; Rahman, Irfan; Grego, Sonia; Chu, Pei-Hsuan; McCullough, Shaun; Curren, Rodger

    2017-07-01

    In 2009, the passing of the Family Smoking Prevention and Tobacco Control Act facilitated the establishment of the FDA Center for Tobacco Products (CTP), and gave it regulatory authority over the marketing, manufacture and distribution of tobacco products, including those termed 'modified risk'. On 4-6 April 2016, the Institute for In Vitro Sciences, Inc. (IIVS) convened a workshop conference entitled, In Vitro Exposure Systems and Dosimetry Assessment Tools for Inhaled Tobacco Products, to bring together stakeholders representing regulatory agencies, academia and industry to address the research priorities articulated by the FDA CTP. Specific topics were covered to assess the status of current in vitro smoke and aerosol/vapour exposure systems, as well as the various approaches and challenges to quantifying the complex exposures in in vitro pulmonary models developed for evaluating adverse pulmonary events resulting from tobacco product exposures. The four core topics covered were: a) Tobacco Smoke and E-Cigarette Aerosols; b) Air-Liquid Interface-In Vitro Exposure Systems; c) Dosimetry Approaches for Particles and Vapours/In Vitro Dosimetry Determinations; and d) Exposure Microenvironment/Physiology of Cells. The 2.5-day workshop included presentations from 20 expert speakers, poster sessions, networking discussions, and breakout sessions which identified key findings and provided recommendations to advance these technologies. Here, we will report on the proceedings, recommendations, and outcome of the April 2016 technical workshop, including paths forward for developing and validating non-animal test methods for tobacco product smoke and next generation tobacco product aerosol/vapour exposures. With the recent FDA publication of the final deeming rule for the governance of tobacco products, there is an unprecedented necessity to evaluate a very large number of tobacco-based products and ingredients. The questionable relevance, high cost, and ethical

  18. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool, and also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  19. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. Our objective was to compile a comprehensive list of the most widely used freely available software and tools used primarily in metabolomics. Tools were selected for inclusion in the review on the basis of either ≥ 50 citations on Web of Science (as of 08/09/16) or their use being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled, and each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  20. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
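
    The surrogate approach fits a rating between the surrogate parameter and sampled SSC, then applies it in real time. Below is a minimal least-squares stand-in with synthetic data; the log-linear form is one commonly assumed shape for acoustic surrogates, not necessarily SAID's output.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    backscatter_db = rng.uniform(30, 80, 40)   # acoustic surrogate parameter
    log10_ssc = 0.04 * backscatter_db - 0.5 + rng.normal(0, 0.08, 40)  # synthetic samples

    # Fit log10(SSC) = slope * backscatter + intercept by ordinary least squares.
    slope, intercept = np.polyfit(backscatter_db, log10_ssc, 1)

    def predict_ssc(db):
        return 10 ** (slope * db + intercept)   # mg/L

    print(f"SSC at 60 dB ~ {predict_ssc(60.0):.0f} mg/L")
    ```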

  1. A Lexical Analysis Tool with Ambiguity Support

    OpenAIRE

    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
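
    The core idea, enumerating every token sequence compatible with the input, fits in a few lines. Below is a toy enumerator over an invented, deliberately ambiguous vocabulary; Lamb itself builds a shared lexical analysis graph rather than an explicit list.

    ```python
    from functools import lru_cache

    TOKENS = {"in", "int", "integer", "eger", "t"}   # invented ambiguous vocabulary

    def tokenizations(s):
        """Return every segmentation of `s` into known tokens."""
        @lru_cache(maxsize=None)
        def rec(i):
            if i == len(s):
                return [[]]
            results = []
            for j in range(i + 1, len(s) + 1):
                if s[i:j] in TOKENS:
                    results += [[s[i:j]] + rest for rest in rec(j)]
            return results
        return rec(0)

    for seq in tokenizations("integer"):
        print(seq)   # ['integer'], ['int', 'eger'], ['in', 't', 'eger']
    ```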

  2. Workshop: Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis: Modeling Climate Change Impacts and Associated Economic Damages (2011 - part 2)

    Science.gov (United States)

    This workshop in the series Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis focused on conceptual and methodological issues in estimating impacts and valuing damages on a sectoral basis.

  3. Workshop: Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis: Modeling Climate Change Impacts and Associated Economic Damages (2010 - part 1)

    Science.gov (United States)

    This workshop in the series Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis focused on conceptual and methodological issues in integrated assessment modeling and valuation.

  4. PREFACE: European Microbeam Analysis Society's 14th European Workshop on Modern Developments and Applications in Microbeam Analysis (EMAS 2015), Portorož, Slovenia, 3-7 May 2015

    Science.gov (United States)

    Llovet, Xavier; Matthews, Michael B.; Čeh, Miran; Langer, Enrico; Žagar, Kristina

    2016-02-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 14th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis, which took place from the 3rd to the 7th of May 2015 in the Grand Hotel Bernardin, Portorož, Slovenia. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a unique format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants, and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with the Jožef Stefan Institute and SDM - the Slovene Society for Microscopy. The technical programme included the following topics: electron probe microanalysis, STEM and EELS, materials applications, cathodoluminescence and electron backscatter diffraction (EBSD), and their applications. As at previous workshops, there was also a special oral session for young scientists. The best presentation by a young scientist was rewarded with an invitation to attend the 2016 Microscopy and Microanalysis meeting in Columbus, Ohio. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Electron channelling contrast reconstruction with electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 71 posters from 16 countries were on display at the meeting and that the participants came from as far away as Japan, Canada, the USA, and Australia. A selection of participants with posters was invited

  5. Workshop: Economic Impacts of Aquatic Invasive Species Workshop (2005)

    Science.gov (United States)

    EPA's National Center for Environmental Economics and Office of Water jointly hosted the Economic Impacts of Aquatic Invasive Species Workshop on July 20-21, 2005 in Washington, DC. The goal was to examine conceptual frameworks and tools for valuing the impacts of invasive species.

  6. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  7. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  8. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were car...

  9. MICCAI Workshops

    CERN Document Server

    Nedjati-Gilani, Gemma; Venkataraman, Archana; O'Donnell, Lauren; Panagiotaki, Eleftheria

    2014-01-01

    This volume contains the proceedings from two closely related workshops: Computational Diffusion MRI (CDMRI’13) and Mathematical Methods from Brain Connectivity (MMBC’13), held under the auspices of the 16th International Conference on Medical Image Computing and Computer Assisted Intervention, which took place in Nagoya, Japan, in September 2013. Inside, readers will find contributions ranging from mathematical foundations and novel methods for the validation of inferring large-scale connectivity from neuroimaging data to the statistical analysis of the data, accelerated methods for data acquisition, and the most recent developments in mathematical diffusion modeling. This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity, as well as new perspectives and insights on current research challenges for those currently in the field. It will be of interest to researchers and practitioners in computer science, ...

  10. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  11. A Tool for the Concise Analysis of Patient Safety Incidents.

    Science.gov (United States)

    Pham, Julius Cuong; Hoffman, Carolyn; Popescu, Ioana; Ijagbemi, O Mayowa; Carson, Kathryn A

    2016-01-01

    Patient safety incidents, sometimes referred to as adverse events, incidents, or patient safety events, are too common an occurrence in health care. Most methods for incident analysis are time and labor intensive. Given the significant resource requirements of a root cause analysis, for example, there is a need for a more targeted and efficient method of analyzing a larger number of incidents. Although several concise incident analysis tools are in existence, there are no published studies regarding their usability or effectiveness. Building on previous efforts, a Concise Incident Analysis (CIA) methodology and tool were developed to facilitate analysis of no- or low-harm incidents. Staff from 11 hospitals in five countries-Australia, Canada, Hong Kong, India, and the United States-pilot tested the tool in two phases. The tool was evaluated and refined after each phase on the basis of user perceptions of usability and effectiveness. From September 2013 through January 2014, 52 patient safety incidents were analyzed. A broad variety of incident types were investigated, the most frequent being patient falls (25%). Incidents came from a variety of hospital work areas, the most frequent being from the medical ward (37%). Most incidents investigated resulted in temporary harm or no harm (94%). All or most sites found the tool "understandable" (100%), "easy to use" (89%), and "effective" (89%). Some 95% of participants planned to continue to use all or some parts of the tool after the pilot. Qualitative feedback suggested that the tool allowed analysis of incidents that were not currently being analyzed because of insufficient resources. The tool was described as simple to use, easy to document, and aligned with the flow of the incident analysis. A concise tool for the investigation of patient safety incidents with low or no harm was well accepted across a select group of hospitals from five countries.

  12. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors...

  13. RNAmute: RNA secondary structure mutation analysis tool

    Directory of Open Access Journals (Sweden)

    Barash Danny

    2006-04-01

    Background: RNAMute is an interactive Java application that calculates the secondary structure of all single point mutations of a given RNA sequence and organizes them into categories according to their similarity to the predicted structure of the wild type. The secondary structure predictions are performed using the Vienna RNA package. Several alternatives are used for the categorization of single point mutations: Vienna's RNAdistance based on dot-bracket representation, as well as tree edit distance and the second eigenvalue of the Laplacian matrix based on Shapiro's coarse-grained tree graph representation. Results: Selecting a category in each one of the processed tables lists all single point mutations belonging to that category. Selecting a mutation displays a graphical drawing of the single point mutation and the wild type, and includes basic information such as associated energies, representations and distances. RNAMute can be used successfully with very little previous experience and without choosing any parameter value alongside the initial RNA sequence. The package runs under the LINUX operating system. Conclusion: RNAMute is a user-friendly tool that can be used to predict single point mutations leading to conformational rearrangements in the secondary structure of RNAs. In several cases of substantial interest, notably in virology, a point mutation may lead to a loss of important functionality such as RNA virus replication or translation initiation because of a conformational rearrangement in the secondary structure.
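
    The enumeration step RNAMute begins with is easy to reproduce; structure prediction for each mutant would then be delegated to the Vienna RNA package. A minimal generator of all single point mutants:

    ```python
    def single_point_mutants(seq, alphabet="ACGU"):
        """Yield (mutation name, mutated sequence) for every single point mutation."""
        for i, base in enumerate(seq):
            for b in alphabet:
                if b != base:
                    yield f"{base}{i + 1}{b}", seq[:i] + b + seq[i + 1:]

    wild_type = "GCAUCG"
    mutants = list(single_point_mutants(wild_type))
    print(len(mutants), "mutants, e.g.", mutants[0])   # 18 mutants, ('G1A', 'ACAUCG')
    ```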

  14. Bayesian data analysis tools for atomic physics

    Science.gov (United States)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modelling uniquely. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during recent years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
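
    The first example (is a weak satellite line present?) can be miniaturized with a grid integration in place of nested sampling. The data, priors and line parameters below are synthetic stand-ins, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    x = np.linspace(0, 10, 60)
    sigma = 0.30   # known Gaussian noise level (assumed)
    data = 1.0 + 0.4 * np.exp(-0.5 * ((x - 6.0) / 0.4) ** 2) + rng.normal(0, sigma, x.size)

    def log_like(amp):
        """Gaussian log-likelihood of 'flat background + line of amplitude amp'."""
        model = 1.0 + amp * np.exp(-0.5 * ((x - 6.0) / 0.4) ** 2)
        return -0.5 * np.sum((data - model) ** 2) / sigma**2

    # Model 1: no line (amp = 0, no free parameter, Z1 = L(0)).
    # Model 2: amp uniform on [0, 1], Z2 = integral of L(amp) d(amp).
    logL0 = log_like(0.0)
    amps = np.linspace(0.0, 1.0, 400)
    logLs = np.array([log_like(a) for a in amps])
    ref = logLs.max()   # work in log space for numerical stability
    bayes_factor = np.exp(ref - logL0) * np.exp(logLs - ref).mean() * (amps[-1] - amps[0])
    print(f"Bayes factor (line vs no line) ~ {bayes_factor:.3g}")
    ```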

  15. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios

    Science.gov (United States)

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  16. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  17. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  18. Failure Modes and Effects Analysis (FMEA) Assistant Tool

    Data.gov (United States)

    National Aeronautics and Space Administration — The FMEA Assistant tool offers a new and unique approach to assist hardware developers and safety analysts perform failure analysis by using model based systems...

  19. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  20. Constructing Social Networks From Secondary Storage With Bulk Analysis Tools

    Science.gov (United States)

    2016-06-01

    Naval Postgraduate School (Monterey, California) thesis by Janina L. Green: Constructing Social Networks from Secondary Storage with Bulk Analysis Tools. A surviving fragment of the abstract notes that a single drive can display more than one social network that needs to be separated; the owner of the drive behind component d9c1 was an administrator for an email server.

  1. 3rd International Workshop on Intelligent Data Analysis and Management (IDAM)

    CERN Document Server

    Wang, Leon; Hong, Tzung-Pei; Yang, Hsin-Chang; Ting, I-Hsien

    2013-01-01

    These papers on Intelligent Data Analysis and Management (IDAM) examine issues related to the research and applications of Artificial Intelligence techniques in data analysis and management across a variety of disciplines. The papers derive from the 2013 IDAM conference in Kaohsiung, Taiwan. IDAM is an interdisciplinary research field involving academic researchers in information technologies, computer science, public policy, bioinformatics, medical informatics, and social and behavior studies, etc. The techniques studied include (but are not limited to): data visualization, data pre-processing, data engineering, database mining techniques, tools and applications, evolutionary algorithms, machine learning, neural nets, fuzzy logic, statistical pattern recognition, knowledge filtering, and post-processing, etc.

  2. An E-learning Tool as Living Book for Knowledge Preservation in Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Bode, P.; Landsberger, S.; Ridikas, D.; Iunikova, A.

    2016-01-01

    Full text: Neutron activation analysis (NAA) is one of the most common activities in research reactors, irrespective of their power size. Although it is a well-established technique, it has been observed that retirement and/or departure of experienced staff often results in gaps in knowledge of methodological principles and metrological aspects of the NAA technique employed, both within the remaining NAA team and for new recruits. Existing books are apparently not sufficient to transfer knowledge on the practice of NAA in a timely manner. As such, the IAEA has launched a project resulting in an E-learning tool for NAA, consisting of lecture notes, animations, practical exercises and self-assessments. The tool includes more than 30 modules and has been reviewed and tested during an IAEA workshop by experienced practitioners and newcomers. It is expected that the tool will be developed as a ‘living book’ which can be continually updated and extended and serve as an archive, fostering unpublished experimental experiences. (author)

  3. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  4. INDICO Workshop

    CERN Multimedia

    CERN. Geneva; Fabbrichesi, Marco

    2004-01-01

    The INtegrated DIgital COnferencing EU project has finished building a complete software solution to facilitate the management of conferences, workshops, schools or simple meetings from their announcement to their archival. Everybody involved in the organization of events is welcome to join this workshop, in order to understand the scope of the project and to see demonstrations of the various features.

  5. Stakeholder Views of Nanosilver Linings: Macroethics Education and Automated Text Analysis Through Participatory Governance Role Play in a Workshop Format.

    Science.gov (United States)

    Dempsey, Joshua; Stamets, Justin; Eggleson, Kathleen

    2017-06-01

    The Nanosilver Linings role play case offers participants first-person experience with interpersonal interaction in the context of the wicked problems of emerging technology macroethics. In the fictional scenario, diverse societal stakeholders convene at a town hall meeting to consider whether a nanotechnology-enabled food packaging industry should be offered incentives to establish an operation in their economically struggling Midwestern city. This original creative work was built with a combination of elements, selected for their established pedagogical efficacy (e.g. active learning, case-based learning) and as topical dimensions of the realistic scenario (e.g. nanosilver in food packaging, occupational safety and health). The product life cycle is used as a framework for integrated consideration of scientific, societal, and ethical issues. The Nanosilver Linings hypothetical case was delivered through the format of the 3-hour workshop Ethics when Biocomplexity meets Human Complexity, providing an immersive, holistic ethics learning experience for STEM graduate students. Through their participation in the Nanosilver Linings case and Ethics when Biocomplexity meets Human Complexity workshop, four cohorts of science and engineering doctoral students reported the achievement of specific learning objectives pertaining to a range of macroethics concepts and professional practices, including stakeholder perspectives, communication, human values, and ethical frameworks. Automated text analysis of workshop transcripts revealed differences in sentiment and in ethical framework (consequentialism/deontology) preference between societal stakeholder roles. These resources have been recognized as ethics education exemplars by the U.S. National Academy of Engineering.

  6. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for the development of these. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type

  7. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: rittner@iqm.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Lab. de Fisico-Quimica Organica; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Quimica

    2011-07-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects in the conformational equilibrium. Moreover, the huge amount of data available in the literature on this research field imposed some limitations, which will be detailed in the Introduction; it can be noted in advance that these limitations mostly concern the period when these results were published. (author)

  8. NMR spectroscopy: a tool for conformational analysis

    International Nuclear Information System (INIS)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto; Freitas, Matheus P.

    2011-01-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects in the conformational equilibrium. Moreover, the huge amount of data available in the literature on this research field imposed some limitations, which will be detailed in the Introduction; it can be noted in advance that these limitations mostly concern the period when these results were published. (author)

  9. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  10. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick, convenient, in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
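
    At the core of any such station financial analysis is discounting a cash-flow series. A minimal sketch of a net-present-value calculation; this is not H2FAST's accounting model, and the station figures and discount rate are invented:

    ```python
    # Discount a year-indexed cash-flow series (year 0 first) to present value.
    def npv(rate, cash_flows):
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

    # Invented example: capital cost up front, then 15 years of net revenue.
    station = [-1_500_000] + [210_000] * 15
    print(f"NPV at 8%: ${npv(0.08, station):,.0f}")
    ```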

  11. EMERGING MODALITIES FOR SOIL CARBON ANALYSIS: SAMPLING STATISTICS AND ECONOMICS WORKSHOP.

    Energy Technology Data Exchange (ETDEWEB)

    WIELOPOLSKI, L.

    2006-04-01

    The workshop's main objectives are (1) to present the emerging modalities for analyzing carbon in soil, (2) to assess their error propagation, (3) to recommend new protocols and sampling strategies for the new instrumentation, and (4) to compare the costs of the new methods with traditional chemical ones.

  12. TxACOL workshop : Texas asphalt concrete overlay design and analysis system.

    Science.gov (United States)

    2010-01-01

    General Information: two workshops were held, on Aug. 25 at Paris, TX, and on Oct. 6 at Austin, TX; more than 30 representatives from TxDOT attended; introduction of TxACOL software, key input parameters, and related lab and field...

  13. 77 FR 14814 - Tobacco Product Analysis; Scientific Workshop; Request for Comments

    Science.gov (United States)

    2012-03-13

    ... workshop: 1. Availability, Manufacture, and Characterization of Tobacco Reference Products. A. Discuss the ... reference products used in research and manufacturing? B. Discuss the advantages and disadvantages of using ... are applied? F. Describe characteristics of a tobacco reference product that would provide advantages ...

  14. Applying Instructional Design Theories to Bioinformatics Education in Microarray Analysis and Primer Design Workshops

    Science.gov (United States)

    Shachak, Aviv; Ophir, Ron; Rubin, Eitan

    2005-01-01

    The need to support bioinformatics training has been widely recognized by scientists, industry, and government institutions. However, the discussion of instructional methods for teaching bioinformatics is only beginning. Here we report on a systematic attempt to design two bioinformatics workshops for graduate biology students on the basis of…

  15. Comparative analysis based on the replies to the questionnaire and on the discussions which took place during the workshop

    International Nuclear Information System (INIS)

    2003-01-01

    In order to help participants best prepare for the discussions during the Workshop on the Indemnification of Damage in the Event of a Nuclear Accident, the NEA Secretariat, in co-operation with the French authorities, drafted a questionnaire on the implementation of third party liability and indemnification regimes applicable to nuclear damage resulting from a nuclear emergency situation. This questionnaire was circulated to countries invited to participate in the Workshop to serve as a basis for exchanges. The representatives of countries which, in light of their geographical situation in relation to the Gravelines nuclear power plant where the nuclear accident was simulated on 22 May 2001, would be most likely to be concerned by the application of liability regimes following a nuclear accident in France having transboundary effects were first of all asked to reply to this questionnaire. A number of other countries, referred to as 'unaffected', also agreed to reply to the questionnaire. On the basis of the replies to the questionnaire (which are reproduced in Annex I to these Proceedings) as well as the discussions which took place during the Workshop, the Secretariat has carried out a comparative study of the different mechanisms governing the emergency alert and management of a nuclear accident and the indemnification of victims in place in the countries participating in the Workshop. This study aims to provide an overview of the replies to the questionnaire. Furthermore, as this analysis is based on the replies to the questionnaire and the discussions which took place during the workshop, one should not assume that where a country is not expressly included in the Secretariat's conclusions, that country has not established measures on this subject. The countries whose alert mechanisms and measures governing indemnification of nuclear damage have been included in this analysis are as follows: Austria, Belgium, Bulgaria, Canada, the Czech Republic

  16. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  17. A computer aided tolerancing tool, II: tolerance analysis

    NARCIS (Netherlands)

    Salomons, O.W.; Haalboom, F.J.; Jonge poerink, H.J.; van Slooten, F.; van Slooten, F.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1996-01-01

    A computer aided tolerance analysis tool is presented that assists the designer in evaluating worst case quality of assembly after tolerances have been specified. In tolerance analysis calculations, sets of equations are generated. The number of equations can be restricted by using a minimum number

  18. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to mathematical modeling of physical engineering systems. In this paper connections between Dirac structures and well known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  19. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ2 = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396) Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...
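
    A minimal sketch of the kind of chi-square test of independence behind figures like those quoted above, using SciPy; the 2x2 counts of tool use by gender are invented:

    ```python
    # Test whether use of a given participatory tool is independent of
    # extension-worker gender.
    from scipy.stats import chi2_contingency

    counts = [[34, 28],   # male: used / did not use the tool
              [22, 21]]   # female
    chi2, p, dof, _ = chi2_contingency(counts)
    print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
    ```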

  20. Printed Circuit Board Signal Integrity Analysis at CERN: POSTER 1/2 Workshop

    CERN Document Server

    Evans, John

    2001-01-01

    Printed circuit board (PCB) design layout for digital circuits has become a critical issue due to increasing clock frequencies and faster signal switching times. The Cadence SPECCTRAQuest package allows the detailed signal integrity (SI) analysis of designs from the schematic-entry phase to the board level. It is fully integrated into the Cadence PCB design flow and can be used to reduce prototype iterations and improve production robustness. Examples are given on how the tool can help engineers to make design choices and how to optimise board layout for electrical performance. Case studies of work done for LHC detectors are presented.

  1. Printed Circuit Board Signal Integrity Analysis at CERN: POSTER 2/2 Workshop

    CERN Document Server

    Evans, John

    2001-01-01

    Printed circuit board (PCB) design layout for digital circuits has become a critical issue due to increasing clock frequencies and faster signal switching times. The Cadence SPECCTRAQuest package allows the detailed signal integrity (SI) analysis of designs from the schematic-entry phase to the board level. It is fully integrated into the Cadence PCB design flow and can be used to reduce prototype iterations and improve production robustness. Examples are given on how the tool can help engineers to make design choices and how to optimise board layout for electrical performance. Case studies of work done for LHC detectors are presented.

  2. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
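
    One of the analyses described, detection of unreachable code, amounts to a reachability search over the program's control-flow graph. A minimal sketch on a toy program; the three-field instruction encoding here is invented for illustration and is unrelated to the 8086/68000 analysers themselves:

    ```python
    # Build successor edges for a toy assembly listing and flag instructions
    # that a depth-first search from the entry point never reaches.
    JUMPS = {"JMP": True, "JZ": False, "JNZ": False}  # True = unconditional

    def successors(program, idx):
        label_index = {ins[0]: i for i, ins in enumerate(program) if ins[0]}
        _, op, target = program[idx]
        succ = []
        if op in JUMPS:
            succ.append(label_index[target])
            if not JUMPS[op]:            # conditional jumps may fall through
                succ.append(idx + 1)
        elif op != "RET":                # ordinary instructions fall through
            succ.append(idx + 1)
        return [s for s in succ if s < len(program)]

    def unreachable(program):
        seen, stack = set(), [0]
        while stack:
            i = stack.pop()
            if i in seen:
                continue
            seen.add(i)
            stack.extend(successors(program, i))
        return [i for i in range(len(program)) if i not in seen]

    prog = [(None, "JMP", "end"),     # 0
            (None, "MOV", None),      # 1  dead: jumped over, has no label
            ("end", "RET", None)]     # 2
    print("unreachable instruction indices:", unreachable(prog))
    ```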

  3. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  4. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    The purpose of this paper is to develop a simplified ship collision analysis tool in order to rapidly estimate the structural damage and energy absorption of both striking and struck ships as well as prediction of rupture of cargo oil tanks of struck tankers. The present tool calculates external ... to the collision scenario where a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large-scale FEA, and fairly good agreement is achieved. The applicability

  5. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  6. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions, for a given system. (author) 43 refs., 8 figs., 8 tabs.

  7. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off-line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  8. Network workshop

    DEFF Research Database (Denmark)

    Bruun, Jesper; Evans, Robert Harry

    2014-01-01

    This paper describes the background for, realisation of and author reflections on a network workshop held at ESERA2013. As a new research area in science education, networks offer a unique opportunity to visualise and find patterns and relationships in complicated social or academic network data...... research community. With this workshop, participants were offered a way into network science based on authentic educational research data. The workshop was constructed as an inquiry lesson with emphasis on user autonomy. Learning activities had participants choose to work with one of two cases of networks...

  9. Emerging methods and tools for environmental risk assessment, decision-making, and policy for nanomaterials: summary of NATO Advanced Research Workshop

    DEFF Research Database (Denmark)

    Linkov, I; Steevens, J; Adlakha-Hutcheon, G

    2009-01-01

    the environmental effects and health risks associated with nanomaterials is limited and sometimes contradictory. This article summarizes the conclusions of a 2008 NATO workshop designed to evaluate the wide-scale implications (e.g., benefits, risks, and costs) of the use of nanomaterials on human health...... and the environment. A unique feature of this workshop was its interdisciplinary nature and focus on the practical needs of policy decision makers. Workshop presentations and discussion panels were structured along four main themes: technology and benefits, human health risk, environmental risk, and policy...

  10. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    Science.gov (United States)

    2017-06-01

    Naval Postgraduate School thesis by Jonathan M. Swan, June 2017: Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool. This thesis conducts an analysis of the system requirements for the Logistics Analysis and Wargame Support Tool (LAWST). It studies

  11. CTBTO international cooperation workshop

    International Nuclear Information System (INIS)

    1999-01-01

    The International Cooperation Workshop took place in Vienna, Austria, on 16 and 17 November 1998, with the participation of 104 policy/decision makers, Research and Development managers and diplomatic representatives from 58 States Signatories to the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The Workshop attempted to develop Treaty stipulations to: promote cooperation to facilitate and participate in the fullest possible exchange relating to technologies used in the verification of the Treaty; enable member states to strengthen national implementation of verification measures, and to benefit from the application of such technologies for peaceful purposes. The potential benefits arising from the CTBT monitoring, analysis and data communication systems are multifaceted, and as yet unknown. This Workshop provided the opportunity to examine some of these possibilities. An overview of the CTBT verification regime, covering the general aspects of the four monitoring technologies (seismic, hydro-acoustic, infrasound and radionuclides), including some of the elements that are the subject of international cooperation, was presented and discussed. Questions were raised on the potential benefits that can be derived by participating in the CTBT regime and broad-based discussions took place. Several concrete proposals on ways and means to facilitate and promote cooperation among States Signatories were suggested. The main points discussed by the participants can be summarized as follows: the purpose of the CTBT Organization is to assist member states to monitor Treaty compliance; the CTBT can be a highly effective technological tool which can generate wide-ranging data, which can be used for peaceful purposes; there are differences in the levels of technology development in the member states; that is why peaceful applications should be supported by the Prep Com for the benefit of all member states, whether developed or developing, training being a key element to optimize the CTBT

  12. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quanti...

  13. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points/unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
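
    Two of the metrics AngioTool reports, vascular density and the branching index, can be sketched with generic scientific-Python tools (NumPy, SciPy, scikit-image). This illustrates the definitions only and is not AngioTool's own implementation:

    ```python
    # Vascular density = fraction of the image covered by vessels;
    # branching index = skeleton branch points per unit area.
    import numpy as np
    from scipy import ndimage
    from skimage.morphology import skeletonize

    def vessel_metrics(binary_vessels):
        area = binary_vessels.size
        density = binary_vessels.sum() / area
        skeleton = skeletonize(binary_vessels)
        # A skeleton pixel with 3+ skeleton neighbours is a branch point.
        kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
        neighbours = ndimage.convolve(skeleton.astype(int), kernel, mode="constant")
        branch_points = np.logical_and(skeleton, neighbours >= 3).sum()
        return {"vascular_density": density, "branching_index": branch_points / area}

    # Toy example: a cross-shaped "vessel" with one branch point at the centre.
    img = np.zeros((21, 21), dtype=bool)
    img[10, :] = img[:, 10] = True
    print(vessel_metrics(img))
    ```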

  14. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the ways products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regards to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout four stages of product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but they are unbalanced and not holistic. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  15. A Comparative Analysis of Life-Cycle Assessment Tools for ...

    Science.gov (United States)

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the Waste Reduction Model (WARM), the Municipal Solid Waste Decision Support Tool (MSW-DST), the Solid Waste Optimization Life-cycle Framework (SWOLF), the Environmental Assessment System for Environmental Technologies (EASETECH), and the Waste and Resources Assessment Tool for the Environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they are originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c

  16. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existent link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered ... provide the possibility for the designer to work both with the aesthetics as well as the technical aspects of architectural design.

  17. Tools for T-RFLP data analysis using Excel.

    Science.gov (United States)

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
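
    Two of the steps described above, normalisation of T-RF profiles to total signal and calculation of a diversity index, look roughly like this outside Excel. A minimal sketch with invented peak data; profiles are dicts mapping T-RF length to peak height or area:

    ```python
    # Normalise each profile to relative abundances, then compute the
    # Shannon diversity index H' per sample.
    import math

    def normalise(profile):
        total = sum(profile.values())
        return {trf: v / total for trf, v in profile.items()}

    def shannon(profile):
        rel = normalise(profile)
        return -sum(p * math.log(p) for p in rel.values() if p > 0)

    sample_a = {67: 1200, 142: 800, 310: 400}
    sample_b = {67: 300, 142: 2200, 488: 150}
    print("H' A:", round(shannon(sample_a), 3))
    print("H' B:", round(shannon(sample_b), 3))
    ```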

  18. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.
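
    A minimal sketch of the kind of roof-orientation feedback described, assuming gable roofs whose planes face perpendicular to the street; the 30-degree "solar-ready" tolerance and the azimuth list are invented, and the real tool's scoring is certainly more detailed:

    ```python
    # Estimate the share of homes on each street layout whose roofs face
    # within a tolerance of due south (azimuth 180 degrees).
    def ang_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    def south_roof_fraction(street_azimuths, tol=30.0):
        good = 0
        for az in street_azimuths:
            # Gable roofs along a street face perpendicular to it, on both sides.
            aspects = ((az + 90) % 360, (az - 90) % 360)
            if any(ang_dist(a, 180.0) <= tol for a in aspects):
                good += 1
        return good / len(street_azimuths)

    print(south_roof_fraction([0, 45, 90, 135]))  # east-west streets score best
    ```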

  19. An Online Image Analysis Tool for Science Education

    Science.gov (United States)

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  20. Tools of Audience Analysis in Contemporary Political Campaigns.

    Science.gov (United States)

    Friedenberg, Robert V.

    This paper examines two basic tools of audience analysis as they are used in contemporary political campaigning: public opinion polls and interpretations of voter statistics. The raw data used in the statistical analyses reported in this investigation come from national polls and voter statistics provided to Republican candidates running in local…

  1. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    ABSTRACT: Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a ...

  2. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  3. Rapid Benefit Indicators (RBI) Spatial Analysis Tools - Manual

    Science.gov (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  4. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides the possibility of relating histopathological data to neuropsychological and clinical variables. The aid of this interactive visualization tool brings the possibility of finding unexpected conclusions beyond the insight provided by simple statistical analysis, as well as of improving neuroscientists’ productivity.

  5. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  6. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  7. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high-value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, there is the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
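
    One common way to link easily measured temperatures to the errors that cannot be measured during production is to regress laser-measured displacement on temperature readings from a calibration run, then predict the error online from temperature alone. A minimal sketch with invented sensor data; the paper's own method may differ:

    ```python
    # Fit error = X @ w by least squares on calibration data, then estimate
    # thermal error from live temperatures when no displacement sensor is present.
    import numpy as np

    # Calibration run: rows = time steps, columns = temperature sensors (deg C).
    temps = np.array([[20.0, 20.1], [24.5, 22.0], [29.0, 24.2], [33.5, 26.1]])
    measured_error_um = np.array([0.0, 8.5, 17.2, 25.6])  # laser-measured drift

    # Least-squares fit with an intercept term appended to the sensor columns.
    X = np.column_stack([temps, np.ones(len(temps))])
    w, *_ = np.linalg.lstsq(X, measured_error_um, rcond=None)

    # In production, only temperatures are available; the model estimates error.
    live = np.array([27.0, 23.5, 1.0])
    print("predicted thermal error (um):", float(live @ w))
    ```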

  8. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  9. Comparative analysis of marine ecosystems: workshop on predator-prey interactions

    DEFF Research Database (Denmark)

    Bailey, Kevin M.; Ciannelli, Lorenzo; Hunsicker, Mary

    2010-01-01

    Climate and human influences on marine ecosystems are largely manifested by changes in predator–prey interactions. It follows that ecosystem-based management of the world's oceans requires a better understanding of food web relationships. An international workshop on predator–prey interactions in marine ecosystems was held at Oregon State University, Corvallis, OR, USA on 16–18 March 2010. The meeting brought together scientists from diverse fields of expertise including theoretical ecology, animal behaviour, fish and seabird ecology, statistics, fisheries science and ecosystem modelling.

  10. Workshop meeting

    International Nuclear Information System (INIS)

    Veland, Oeystein

    2004-04-01

    On 1-2 September 2003 the Halden Project arranged a workshop on 'Innovative Human-System Interfaces and their Evaluation'. This topic is new in the HRP 2003-2005 programme, and it is important to get feedback from member organizations on the work being performed in Halden. It is also essential that relevant activities and experiences in this area from the member organizations are shared with the Halden staff and other HRP members. Altogether 25 persons attended the workshop. The workshop had a mixture of presentations and discussions, and was chaired by Dominique Pirus of EDF, France. Day one focused on the HRP/IFE activities on Human-System Interface design, including Function-oriented displays, Ecological Interface Design, Task-oriented displays, as well as work on innovative display solutions for the oil and gas domain. There were also presentations of relevant work in France, Japan and the Czech Republic. The main focus of day two was the verification and validation of human-system interfaces, with presentations of work at HRP on Human-Centered Validation, Criteria-Based System Validation, and Control Room Verification and Validation. The chairman concluded that it was a successful workshop, although one could have had more time for discussions. The Halden Project got valuable feedback and viewpoints on this new topic during the workshop, and will consider all recommendations related to the future work in this area. (Author)

  11. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  12. Analysis for Non-Traditional Security Challenges: Methods and Tools

    Science.gov (United States)

    2006-11-20

    Glossary excerpt: Course of Action; COCOM Combatant Commander; COI Community of Interest; CPB Cultural Preparation of the Battlefield; CPM Critical Path Method; DARPA Defense ... Secretary of Defense (Program Analysis and Evaluation); PACOM United States Pacific Command; PERT Program Evaluation Review Technique; PMESII Political ... (the remainder of the scanned excerpt, apparently an availability matrix of methods and tools, is illegible)

  13. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
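
    One of the performance bounds such graph analysis determines is the critical-path length of the dataflow graph, a lower bound on the time to execute one iteration no matter how many processors are available. A minimal sketch on an invented graph with invented node costs; this is not the tool described in the paper:

    ```python
    # Longest weighted path from the entry node of an acyclic dataflow graph.
    from functools import lru_cache

    cost = {"read": 1, "fir": 5, "fft": 8, "ctrl": 3, "write": 1}
    succ = {"read": ["fir", "fft"], "fir": ["ctrl"], "fft": ["ctrl"],
            "ctrl": ["write"], "write": []}

    @lru_cache(maxsize=None)
    def longest_path_from(node):
        # Cost of this node plus the heaviest chain of dependent nodes.
        return cost[node] + max((longest_path_from(s) for s in succ[node]), default=0)

    print("critical path bound:", longest_path_from("read"))  # 1 + 8 + 3 + 1 = 13
    ```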

  14. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology, which is concerned with students’ acquisition of culture encoded in symbols and the way students’ sign consciousness, formed in the context of learning, affects their world cognition and interpersonal communication. The article introduces a set of tools that would enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy and with the formation of the younger generation’s language picture of the world.

  15. The Development of a Humanitarian Health Ethics Analysis Tool.

    Science.gov (United States)

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

    Introduction: Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools increasingly have become prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  16. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to determine whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  17. PREFACE: Proceedings of the 11th European Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis

    Science.gov (United States)

    2010-07-01

    This volume of IOP Conference Series: Materials Science and Engineering contains papers from the 11th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from 10-14 May 2009 in the Hotel Faltom, Gdynia, Poland. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on careers in microbeam analysis can meet and discuss with the established experts. The workshops have a very distinct format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. For this workshop EMAS invited speakers on the following topics: EPMA, EBSD, fast energy-dispersive X-ray spectroscopy, three-dimensional microanalysis, and micro- and nanoanalysis in the natural resources industry. The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 69 posters from 16 countries were on display at the meeting and that the participants came from as far away as Japan and the USA. A number of participants with posters were invited to give short oral presentations of their work in two dedicated sessions. As at previous workshops there was also a special oral session for young scientists. Small cash prizes were awarded for the three best posters and for the best oral presentation by a young scientist. The prize for the best poster went to the contribution by G Tylko, S Dubchak, Z Banach and K Turnau, entitled Monte Carlo simulation for an assessment of standard validity and quantitative X-ray microanalysis in plant. Joanna Wojewoda-Budka of the Institute of Metallurgy and Materials Science, Krakow, received the prize for the best oral presentation by a

  18. EDITORIAL: Proceedings of the 12th Gravitational Wave Data Analysis Workshop (GWDAW 12), Cambridge, MA, USA, 13-16 December 2007

    Science.gov (United States)

    Hughes, S.; Katsavounidis, E.

    2008-09-01

    It was a great pleasure and an honor for us to host the 12th Gravitational Wave Data Analysis Workshop (GWDAW) at MIT and the LIGO Laboratory in Cambridge, Massachusetts, the place where this workshop series started in 1996. This time the conference was held at the conference facilities of the Royal Sonesta Hotel in Cambridge from 13-16 December 2007. This 12th GWDAW found us with the ground interferometers having just completed their most sensitive search for gravitational waves and as they were starting their preparation to bring online and/or propose more sensitive instruments. Resonant mass detectors continued to observe the gravitational wave sky with instruments that have been operating now for many years. LISA, the Laser Interferometer Space Antenna, was recently reviewed by NASA's Beyond Einstein Program Assessment Committee (BEPAC) convened by the National Research Council (NRC) and found that 'on purely scientific grounds LISA is the mission that is the most promising and least scientifically risky… thus, the committee gave LISA its highest scientific ranking'. Even so, JDEM, the Joint Dark Energy Mission, was identified to go first, with LISA following a few years after. New methods, analysis ideas, results from the analysis of data collected by the instruments, as well as Mock Data Challenges for LISA were reported in this conference. While data from the most recent runs of the instruments are still being analyzed, the first upper limit results show how even non-detection statements can be interesting astrophysics. Beyond these traditional aspects of GWDAW though, for the first time in this workshop we tried to bring the non-gravitational wave physics and astronomy community on board in order to present, discuss and propose ways to work together as we pursue the first detection of gravitational waves and as we hope to transition to gravitational wave astronomy in the near future. Overview talks by colleagues leading observations in the electromagnetic

  19. SMART: Statistical Metabolomics Analysis-An R Tool.

    Science.gov (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^-4 in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^-4 in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm.
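
    The ANCOVA used above tests a metabolite-drug association while adjusting for covariates. A minimal sketch of that style of test in Python with statsmodels (SMART itself is an R tool, and the column names and values here are invented):

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical data: metabolite level, drug exposure (0/1), and
      # covariates standing in for the latent-group/substructure adjustment.
      df = pd.DataFrame({
          "metabolite": [4.1, 3.8, 5.0, 5.6, 4.9, 6.2, 3.9, 5.8],
          "drug":       [0, 0, 0, 0, 1, 1, 1, 1],
          "age":        [45, 52, 61, 48, 55, 59, 47, 63],
          "batch":      [1, 1, 2, 2, 1, 1, 2, 2],
      })

      # ANCOVA: drug effect on the metabolite, adjusted for age and batch.
      model = smf.ols("metabolite ~ drug + age + C(batch)", data=df).fit()
      print(model.pvalues["drug"])   # covariate-adjusted association p-value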

  20. Formal Analysis Tools for the Synchronous Aspect Language Larissa

    Directory of Open Access Journals (Sweden)

    Stauch David

    2008-01-01

    Full Text Available We present two tools for the formal analysis of the aspect language Larissa, which extends the simple synchronous language Argos. The first tool concerns the combination of design-by-contract with Larissa aspects, and shows how we can apply an aspect not only to a program, but to a specification of programs in form of a contract, and obtain a new contract. The second concerns aspect interferences, that is, aspects that influence each other in an unintended way if they are applied to the same program. We present a way to weave aspects in a less conflict-prone manner, and a means to detect remaining conflicts statically. These tools are quite powerful, compared to those available for other aspect languages.

  1. Formal Analysis Tools for the Synchronous Aspect Language Larissa

    Directory of Open Access Journals (Sweden)

    David Stauch

    2008-11-01

    Full Text Available We present two tools for the formal analysis of the aspect language Larissa, which extends the simple synchronous language Argos. The first tool concerns the combination of design-by-contract with Larissa aspects, and shows how we can apply an aspect not only to a program, but to a specification of programs in form of a contract, and obtain a new contract. The second concerns aspect interferences, that is, aspects that influence each other in an unintended way if they are applied to the same program. We present a way to weave aspects in a less conflict-prone manner, and a means to detect remaining conflicts statically. These tools are quite powerful, compared to those available for other aspect languages.

  2. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened the way for implementing new features in the existing tools and for developing new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component life time, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, is reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.
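
    Part-load calculation by local linearization, mentioned at the end of the abstract, replaces a component's nonlinear characteristic with its tangent at the current operating point so the heat and mass balance iteration stays cheap. A minimal numerical sketch; the component model and numbers are invented, not taken from the thesis:

      import numpy as np

      def component(m_dot):
          """Hypothetical nonlinear characteristic: pressure ratio vs. mass flow."""
          return 3.0 * np.sqrt(m_dot) - 0.05 * (m_dot / 10.0) ** 2

      def linearize(f, x0, h=1e-6):
          """Tangent model f(x0) + f'(x0)(x - x0), slope by central difference."""
          slope = (f(x0 + h) - f(x0 - h)) / (2 * h)
          return lambda x: f(x0) + slope * (x - x0)

      design_flow = 100.0                      # design-point mass flow, kg/s
      lin = linearize(component, design_flow)  # local linear model

      # Near the design point the cheap linear model tracks the full one.
      for m in (90.0, 95.0, 100.0, 105.0):
          print(f"{m:6.1f}  full={component(m):7.3f}  linear={lin(m):7.3f}")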

  3. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
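
    The core operation MIST automates, segmenting and measuring many small ROIs, can be sketched as a threshold-and-label pass. A simplified illustration on a synthetic image (MIST's actual segmentation algorithm is more robust than this):

      import numpy as np
      from scipy import ndimage

      # Synthetic "micrograph": noisy background plus bright circular ROIs.
      rng = np.random.default_rng(0)
      img = rng.normal(0.1, 0.05, size=(256, 256))
      yy, xx = np.mgrid[0:256, 0:256]
      for cy, cx in [(60, 60), (60, 180), (180, 120)]:
          img[(yy - cy) ** 2 + (xx - cx) ** 2 < 15 ** 2] += 1.0

      # Threshold, then label the connected regions of interest.
      mask = img > 0.5
      labels, n_rois = ndimage.label(mask)

      # Per-ROI measurements: area (pixel count) and centroid.
      idx = range(1, n_rois + 1)
      areas = ndimage.sum(mask, labels, index=idx)
      centroids = ndimage.center_of_mass(mask, labels, idx)
      print(n_rois, areas, centroids)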

  4. Is motion analysis a valid tool for assessing laparoscopic skill?

    Science.gov (United States)

    Mason, John D; Ansell, James; Warren, Neil; Torkington, Jared

    2013-05-01

    The use of simulation for laparoscopic training has led to the development of objective tools for skills assessment. Motion analysis represents one area of focus. This study was designed to assess the evidence for the use of motion analysis as a valid tool for laparoscopic skills assessment. Embase, MEDLINE and PubMed were searched using the following domains: (1) motion analysis, (2) validation and (3) laparoscopy. Studies investigating motion analysis as a tool for assessment of laparoscopic skill in general surgery were included. Common endpoints in motion analysis metrics were compared between studies according to a modified form of the Oxford Centre for Evidence-Based Medicine levels of evidence and recommendation. Thirteen studies were included from 2,039 initial papers. Twelve (92.3 %) reported the construct validity of motion analysis across a range of laparoscopic tasks. Of these 12, 5 (41.7 %) evaluated the ProMIS Augmented Reality Simulator, 3 (25 %) the Imperial College Surgical Assessment Device (ICSAD), 2 (16.7 %) the Hiroshima University Endoscopic Surgical Assessment Device (HUESAD), 1 (8.33 %) the Advanced Dundee Endoscopic Psychomotor Tester (ADEPT) and 1 (8.33 %) the Robotic and Video Motion Analysis Software (ROVIMAS). Face validity was reported by 1 (7.7 %) study each for ADEPT and ICSAD. Concurrent validity was reported by 1 (7.7 %) study each for ADEPT, ICSAD and ProMIS. There was no evidence for predictive validity. Evidence exists to validate motion analysis for use in laparoscopic skills assessment. Valid parameters are time taken, path length and number of hand movements. Future work should concentrate on the conversion of motion data into competency-based scores for trainee feedback.
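
    The validated metrics named above (time taken, path length, number of hand movements) are straightforward to derive from tracked instrument-tip coordinates. A sketch under invented assumptions about sampling rate, units, and the speed threshold that defines a distinct movement:

      import numpy as np

      def motion_metrics(xyz, dt=0.02, pause_speed=5.0):
          """Path length (mm), task time (s), and movement count from tip
          positions in mm sampled every dt seconds; a 'movement' is counted
          each time tip speed rises above pause_speed (mm/s)."""
          seg = np.linalg.norm(np.diff(xyz, axis=0), axis=1)
          path_length = seg.sum()
          task_time = dt * (len(xyz) - 1)
          moving = seg / dt > pause_speed
          movements = int(np.count_nonzero(moving[1:] & ~moving[:-1]))
          return path_length, task_time, movements

      # Example on a short random-walk trajectory.
      rng = np.random.default_rng(1)
      track = np.cumsum(rng.normal(0, 0.5, size=(500, 3)), axis=0)
      print(motion_metrics(track))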

  5. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
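
    The resonance condition described above can be estimated from the Strouhal relation f = St U / D, with St roughly 0.2 for a circular cylinder over a wide Reynolds-number range. A worked sketch with illustrative numbers (not taken from the tool or any specific vehicle):

      # Vortex-shedding frequency vs. a structural mode for a nearly
      # cylindrical launch vehicle; the dynamic-load risk grows as the
      # shedding frequency approaches a lightly damped bending mode.
      St = 0.2        # approximate Strouhal number, circular cylinder
      D = 3.7         # vehicle diameter, m (illustrative)
      f_mode = 0.6    # first bending-mode frequency, Hz (illustrative)

      for U in (5.0, 10.0, 15.0, 20.0):     # ground wind speed, m/s
          f_shed = St * U / D
          print(f"U = {U:4.1f} m/s   f_shed = {f_shed:.2f} Hz   "
                f"f_shed / f_mode = {f_shed / f_mode:.2f}")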

  6. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    Science.gov (United States)

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-02

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose manual optimization is time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification.
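
    The diagnostic rests on the fact that two projection pursuit solutions can differ by an irrelevant rotation or reflection; Procrustes rotation removes those differences before comparing. A minimal sketch of that comparison with synthetic scores (scipy's procrustes also removes translation and scale):

      import numpy as np
      from scipy.spatial import procrustes

      rng = np.random.default_rng(2)
      proj_a = rng.normal(size=(50, 2))           # PP scores, parameter setting A
      rot = np.array([[0.0, -1.0], [1.0, 0.0]])   # 90-degree rotation
      proj_b = proj_a @ rot + rng.normal(0, 0.01, size=(50, 2))

      # A small disparity means the two projections are essentially the
      # same solution, i.e. a stable region on the "Procrustes map".
      _, _, disparity = procrustes(proj_a, proj_b)
      print(disparity)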

  7. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive statistics as well as ANOVA and t-test analyses. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
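
    The between-tool comparison reported above can be mimicked with a one-way ANOVA across vendors. A simplified, unpaired sketch with invented measurements (the study's actual design compares the same nodules across tools):

      import numpy as np
      from scipy import stats

      # Hypothetical volumes (mm^3) of the same five nodules as reported
      # by three vendors' automated analysis tools.
      vendor_a = np.array([512.0, 260.0, 1020.0, 88.0, 340.0])
      vendor_b = np.array([498.0, 255.0, 1041.0, 90.0, 331.0])
      vendor_c = np.array([520.0, 262.0, 1005.0, 86.0, 349.0])

      # A large p-value is consistent with no systematic between-vendor
      # difference in volume, mirroring the study's finding.
      f_stat, p_value = stats.f_oneway(vendor_a, vendor_b, vendor_c)
      print(f_stat, p_value)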

  8. Multi-Spacecraft Analysis with Generic Visualization Tools

    Science.gov (United States)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  9. Evaluation of the Transfer of Permanent Formation: Analysis of an Experience of Workshops on Astronomy

    Science.gov (United States)

    Cano, Elena; Fabregat, Jaime; Ros, Rosa M.

    2016-08-01

    In the framework of a European project to bring astronomy closer to children, several permanent teacher training activities were developed. These actions included workshops with teachers from various stages of the educational system. This paper presents the process and results of the evaluation of that training program. It intends to assess the satisfaction of the participants, as well as their learning and their later transfer of the formation to the classroom. Barriers encountered in the transfer of formation, some of them linked to the type of training method chosen and others derived from personal and institutional conditions, are outlined. Finally, some guidelines for improving the transfer of scientific formation to the classroom in the future are pointed out.

  10. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  11. Analysis of functionality free CASE-tools databases design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Full Text Available The introduction of database design CASE-technologies into the educational process requires significant costs from the institution for the purchase of software. A possible solution could be the use of free software analogues. At the same time, this kind of substitution should be based on an even-handed comparison of the functional characteristics and operational features of these programs. The purpose of the article is a review of free and non-profit CASE-tools for database design, as well as their classification on the basis of an analysis of their functionality. When writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE-tools for database design was made exclusively empirically, through direct work with the software products. Analysis of the tools' functionality allows two categories of CASE-tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL-code editor, and means of exporting/importing data. CASE-systems in the first category can be used to design and develop simple databases and manage data, as well as for administering a database server. A distinctive feature of the second category of CASE-tools for database design (full-featured systems) is the presence of a visual designer, allowing the construction of the database model and the automatic creation of the database on the server based on this model. CASE-systems in this category can be used for the design and development of databases of any structural complexity, as well as for database server administration. The article concluded that the

  12. Workshop report

    African Journals Online (AJOL)

    raoul

    2011-10-10

    Oct 10, 2011 ... Nursing partnerships established to build capacity can be an important resource, especially when considering nurses' pivotal role in generating and transferring knowledge to students, who will eventually address complex changes in health care. Notably, the workshop planned in Cameroon was founded ...

  13. Poetry Workshop.

    Science.gov (United States)

    Janeczko, Paul B.

    2000-01-01

    This workshop offers activities to teach students about poetry. After describing haiku as a brief snapshot rather than a story, it explains how to teach poetry using an attached reproducible and poster. The tear-out reproducible sheet teaches students how to write their own haiku, offering a sample one as a model. The poster presents three sample…

  14. Workshop presentation.

    Science.gov (United States)

    2013-12-01

    On December 18, 2013, the research team hosted a workshop at CTR to gather feedback on and generate discussion of the mode choice model that was developed. Attendees included the project monitoring committee (PMC) and TTI personnel who staff a he...

  15. Business plan workshop

    NARCIS (Netherlands)

    Jacques Hartog

    2013-01-01

    Workshop on tips & tricks for a good plan - Startup Academy series, held on 28-05-2013. Workshop programme Value in Business, ViB050. Within CVO Groningen, the Groningen Center of Entrepreneurship Value050 stimulates valorisation by developing and supporting entrepreneurship and

  16. Workshop report

    African Journals Online (AJOL)

    raoul

    2011-11-11

    Nov 11, 2011 ... workshop was held in May 2011 in Nairobi, Kenya and was funded by the Canadian Global Health Research Initiatives (GHRI) and the US Centre for Disease Control .... data accuracy, completeness and timely reporting, leading to improvements in ARV stock management in health centres, better access to.

  17. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  18. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. The software tool is intended for use by various decision makers and analysts who need such estimates quickly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
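
    The essence of the reduced-form step is fitting one regression to many CGE simulation outputs so that a new event can be scored without rerunning the full model. A minimal sketch with synthetic stand-ins for the simulated data (variable names and values are illustrative only):

      import numpy as np

      # Synthetic "CGE runs": two threat characteristics per run and the
      # simulated GDP loss they produce.
      rng = np.random.default_rng(3)
      X = rng.uniform(0.0, 1.0, size=(200, 2))
      gdp_loss = 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 0.05, 200)

      # Reduced-form model: a single regression estimated on the runs.
      A = np.column_stack([np.ones(len(X)), X])
      coef, *_ = np.linalg.lstsq(A, gdp_loss, rcond=None)

      # Rapid consequence estimate for a new event (intercept, x1, x2).
      new_event = np.array([1.0, 0.4, 0.7])
      print(new_event @ coef)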

  19. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using micro end mills at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating the tool life, and optimizing the process. A numerical analysis and an experimental method are presented to investigate chatter stability in the micro end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formula to predict cutting forces and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills using time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation of micro end milling with straight-tooth and helical-tooth end mills is conducted, based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted, and the simulation results for the different cutter geometries are compared in depth. The simulation results have important significance for the actual milling process.

  20. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Science.gov (United States)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  1. SOFTWARE TOOLS FOR COMPUTING EXPERIMENT AIMED AT MULTIVARIATE ANALYSIS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    A. V. Tyurin

    2015-09-01

    Full Text Available A concept for the organization and planning of a computational experiment aimed at the implementation of multivariate analysis of complex multifactor models is proposed. It is based on the generation of a calculation tree. The logical and structural schemes of the tree are given, as well as software tools for automating work with it: generating calculations, carrying them out, and analyzing the obtained results. Computer modeling systems and such special-purpose systems as RACS and PRADIS do not solve the problems connected with carrying out a computational experiment effectively, namely its organization, planning, execution, and the analysis of its results. For the organization of the computational experiment, calculation data storage is proposed in the form of an input and output data tree. Each tree node holds a reference to a model calculation step performed earlier. The calculation tree is stored in a specially organized directory structure. A software tool is proposed for creating and modifying the design scheme that stores the structure of one branch of the calculation tree, with a view to effective planning of multivariate calculations. A set of special-purpose software tools makes it possible to quickly generate and modify the tree, adding calculations with step-by-step changes in the model factors. To perform calculations, a software environment has been developed in the form of a graphical user interface for creating and modifying calculation scripts. This environment makes it possible to traverse the calculation tree in a certain order and to initiate computational modules serially or in parallel. To analyze the results, a software tool has been developed that operates on the basis of a tag tree: a special tree that stores the input and output data of the calculations in the form of sets of changes to the corresponding model factors. The tool enables the user to select the factors and responses of the model at various steps
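
    The central data structure described above is a tree in which each node records the factor values changed at a step and points back to the earlier calculation it extends. A minimal sketch of such a node (field names are invented for illustration):

      from dataclasses import dataclass, field

      @dataclass
      class CalcNode:
          """One step of the calculation tree: the model factors in force at
          this step, its results, and references up and down the tree."""
          factors: dict
          results: dict = field(default_factory=dict)
          parent: "CalcNode | None" = None
          children: list = field(default_factory=list)

          def branch(self, **factor_changes):
              """Add a calculation that varies some factors of this step."""
              child = CalcNode({**self.factors, **factor_changes}, parent=self)
              self.children.append(child)
              return child

      root = CalcNode({"load": 1.0, "temp": 300})
      case = root.branch(temp=350)    # one branch of the multivariate sweep
      case.branch(load=0.8)           # step-by-step change in a model factor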

  2. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  3. The Astronomy Workshop

    Science.gov (United States)

    Hamilton, Douglas P.

    2012-05-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive online educational tools developed for use by students, educators, professional astronomers, and the general public. The more than 20 tools in the Astronomy Workshop are rated for ease-of-use, and have been extensively tested in large university survey courses as well as more specialized classes for undergraduate majors and graduate students. Here we briefly describe a few of the available tools. Solar Systems Visualizer: The orbital motions of planets, moons, and asteroids in the Solar System as well as many of the planets in exoplanetary systems are animated at their correct relative speeds in accurate to-scale drawings. Zoom in from the chaotic outer satellite systems of the giant planets all the way to their innermost ring systems. Solar System Calculators: These tools calculate a user-defined mathematical expression simultaneously for all of the Solar System's planets (Planetary Calculator) or moons (Satellite Calculator). Key physical and orbital data are automatically accessed as needed. Stellar Evolution: The "Life of the Sun" tool animates the history of the Sun as a movie, showing students how the size and color of our star has evolved and will evolve over billions of years. In "Star Race," the user selects two stars of different masses and watches their evolution in a split-screen format that emphasizes the great differences in stellar lifetimes and fates.
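
    The Planetary Calculator idea, one user-defined expression evaluated for every planet at once with orbital data filled in automatically, is easy to illustrate. A sketch using Kepler's third law (T in years equals a to the 3/2 power for a in AU); the evaluation mechanism here is invented, not the Astronomy Workshop's:

      import math

      # Semi-major axes in AU stand in for the automatically accessed data.
      planets = {"Mercury": 0.387, "Venus": 0.723, "Earth": 1.000,
                 "Mars": 1.524, "Jupiter": 5.203, "Saturn": 9.537}

      expression = "a ** 1.5"   # user-defined: orbital period in years
      for name, a in planets.items():
          period = eval(expression, {"a": a, "math": math})
          print(f"{name:8s} T = {period:6.2f} yr")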

  4. Curriculum for Development: Analysis and Review of Processes, Products and Outcomes. Final Report: Sub-Regional Curriculum Workshop (Colombo, Sri Lanka, October 1-30, 1976).

    Science.gov (United States)

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and Oceania.

    Presenting proceedings and materials covered at an Asian curriculum workshop involving 15 participants from 7 countries (Afghanistan, Bangladesh, Indonesia, Malaysia, the Philippines, India, and Sri Lanka), this document includes: a discussion of criteria for curriculum analysis re: health education and nutrition instruction for grades 6-10; a…

  5. Analysis of Design and Delivery of Critical Incident Workshops for Elementary School English as a Foreign Language Teachers in Community of Practice

    Science.gov (United States)

    Chien, Chin-Wen

    2018-01-01

    Language teachers can uncover new understanding of the teaching and learning process through reflecting on critical incidents [Richard, J.C., and T.S.C. Farrell. 2005. "Professional Development for Language Teachers." New York, NY: Cambridge University Press]. Based on the data analysis of workshop handouts, observation notes, and…

  6. Design and Application of the Exploration Maintainability Analysis Tool

    Science.gov (United States)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew

  7. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  8. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  9. GammaWorkshops Proceedings

    International Nuclear Information System (INIS)

    Ramebaeck, H.; Straelberg, E.; Klemola, S.; Nielsen, Sven P.; Palsson, S.E.

    2012-01-01

    Due to sparse interaction in recent years between practitioners of gamma-ray spectrometry in the Nordic countries, an NKS activity was started in 2009. This GammaSem was focused on seminars relevant to gamma spectrometry. A follow-up seminar was held in 2010. As an outcome of these activities it was suggested that the 2011 meeting should be focused on practical issues, e.g. the different corrections needed in gamma spectrometric measurements. This three-day meeting, GammaWorkshops, was held in September at Risoe-DTU. Experts on different topics relevant for gamma spectrometric measurements were invited to the GammaWorkshops. The topics included efficiency transfer, true coincidence summing corrections, self-attenuation corrections, measurement of natural radionuclides (natural decay series), combined measurement uncertainty calculations, and detection limits. Each topic was covered in both lectures and practical sessions. The practical sessions included demonstrations of tools for, e.g., corrections and calculations related to the above-mentioned topics. (Author)
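
    Two of the workshop topics, combined measurement uncertainty and detection limits, reduce to short standard calculations. A sketch using quadrature combination of relative uncertainty components and the widely used Currie formulas; the component values and background count are illustrative:

      import math

      # Combined standard uncertainty: quadrature sum of the relative
      # components (counting statistics, efficiency, corrections, ...).
      components = {"counting": 0.021, "efficiency": 0.030,
                    "self_attenuation": 0.015, "summing_correction": 0.010}
      u_c = math.sqrt(sum(u ** 2 for u in components.values()))
      print(f"combined relative standard uncertainty: {u_c:.3f}")

      # Currie decision threshold and a priori detection limit (counts)
      # for a peak on a background of B counts (paired-blank case).
      B = 400.0
      L_C = 2.33 * math.sqrt(B)
      L_D = 2.71 + 4.65 * math.sqrt(B)
      print(f"L_C = {L_C:.1f} counts, L_D = {L_D:.1f} counts")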

  10. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine the safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. High-level three-dimensional graphics software, incorporating various ergonomic analysis algorithms, was utilized to determine whether the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  11. Sixth international wind-diesel workshop

    International Nuclear Information System (INIS)

    1992-01-01

    At a workshop on hybrid wind/diesel power generation systems, papers were presented on international research programs, demonstration projects, wind/diesel deployment strategies and requirements, wind/diesel market development and economics, wind turbine design requirements, and wind/diesel models and analytical tools. Separate abstracts have been prepared for 11 papers from this workshop

  12. Sixth international wind-diesel workshop

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    At a workshop on hybrid wind/diesel power generation systems, papers were presented on international research programs, demonstration projects, wind/diesel deployment strategies and requirements, wind/diesel market development and economics, wind turbine design requirements, and wind/diesel models and analytical tools. Separate abstracts have been prepared for 11 papers from this workshop.

  13. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have in recent years seen an increase in complexity due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but they all boil down to the same idea: a process that extracts new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only answers the question of what is happening but at the same time shows why it is happening. In this paper we present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  14. Workshop presentations

    International Nuclear Information System (INIS)

    Sanden, Per-Olof; Edland, Anne; Reiersen, Craig; Mullins, Peter; Ingemarsson, Karl-Fredrik; Bouchard, Andre; Watts, Germaine; Johnstone, John; Hollnagel, Erik; Ramberg, Patric; Reiman, Teemu

    2009-01-01

    An important part of the workshop was a series of invited presentations. The presentations were intended to both provide the participants with an understanding of various organisational approaches and activities as well as to stimulate the exchange of ideas during the small group discussion sessions. The presentation subjects ranged from current organisational regulations and licensee activities to new organisational research and the benefits of viewing organisations from a different perspective. There were more than a dozen invited presentations. The initial set of presentations gave the participants an overview of the background, structure, and aims of the workshop. This included a short presentation on the results from the regulatory responses to the pre-workshop survey. Representatives from four countries (Sweden, Canada, Finland, and the United Kingdom) expanded upon their survey responses with detailed presentations on both regulatory and licensee safety-related organisational activities in their countries. There were also presentations on new research concerning how to evaluate safety critical organisations and on a resilience engineering perspective to safety critical organisations. Below is the list of the presentations, the slides of which being available in Appendix 2: 1 - Workshop Welcome (Per-Olof Sanden); 2 - CSNI Working Group on Human and Organisational Factors (Craig Reiersen); 3 - Regulatory expectations on justification of suitability of licensee organisational structures, resources and competencies (Anne Edland); 4 - Justifying the suitability of licensee organisational structures, resources and competencies (Karl-Fredrik Ingemarsson); 5 - Nuclear Organisational Suitability in Canada (Andre Bouchard); 6 - Designing and Resourcing for Safety and Effectiveness (Germaine Watts); 7 - Organisational Suitability - What do you need and how do you know that you've got it? (Craig Reiersen); 8 - Suitability of Organisations - UK Regulator's View

  15. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  16. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum parameter values for maximizing the liquefaction yield of the plant under the constraints on the other parameters. The analysis results give a clear idea of the parameter values to decide on before the actual plant is implemented in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
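
    For the ideal Linde-Hampson cycle underlying such a plant, the liquid yield follows from an energy balance over the recuperator, throttle valve and separator: y = (h1 - h2) / (h1 - hf), where h1 is the low-pressure gas leaving the warm end, h2 the high-pressure gas entering, and hf the saturated liquid. A worked sketch; the enthalpies are illustrative round numbers, where a simulator such as Aspen HYSYS would supply property-package values:

      # Ideal Linde-Hampson liquid yield from three enthalpies (kJ/kg).
      h1 = 426.0   # low-pressure air at ambient temperature (illustrative)
      h2 = 390.0   # high-pressure air at ambient temperature (illustrative)
      hf = 71.0    # saturated liquid air at low pressure (illustrative)

      y = (h1 - h2) / (h1 - hf)
      print(f"liquid fraction per kg of compressed gas: y = {y:.3f}")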

  17. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and

  18. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc's Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set to run against: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations are produced as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end as well as the framework's user interface and availability.

  19. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new database tables specific to infrasonic data processing. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.

  20. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolism.

  1. Software Tools for the Analysis of Functional Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Mehdi Behroozi

    2012-09-01

    Functional magnetic resonance imaging (fMRI) has become the most popular method for imaging of brain functions. Currently, there is a large variety of software packages for the analysis of fMRI data, each providing many features for users. Since there is no single package that can provide all the necessary analyses for the fMRI data, it is helpful to know the features of each software package. In this paper, several software tools have been introduced and they have been evaluated for comparison of their functionality and their features. The description of each program has been discussed and summarized.

  2. Software Tools for the Analysis of functional Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Mehdi Behroozi

    2012-12-01

    Functional magnetic resonance imaging (fMRI) has become the most popular method for imaging of brain functions. Currently, there is a large variety of software packages for the analysis of fMRI data, each providing many features for users. Since there is no single package that can provide all the necessary analyses for the fMRI data, it is helpful to know the features of each software package. In this paper, several software tools have been introduced and they have been evaluated for comparison of their functionality and their features. The description of each program has been discussed and summarized.

  3. Smart roadside initiative macro benefit analysis : user's guide for the benefit-cost analysis tool.

    Science.gov (United States)

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State ...

  4. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  5. Identifying rare disease variants in the Genetic Analysis Workshop 17 simulated data: a comparison of several statistical approaches.

    Science.gov (United States)

    Fan, Ruixue; Huang, Chien-Hsun; Lo, Shaw-Hwa; Zheng, Tian; Ionita-Laza, Iuliana

    2011-01-01

    Genome-wide association studies have been successful at identifying common disease variants associated with complex diseases, but the common variants identified have small effect sizes and account for only a small fraction of the estimated heritability for common diseases. Theoretical and empirical studies suggest that rare variants, which are much less frequent in populations and are poorly captured by single-nucleotide polymorphism chips, could play a significant role in complex diseases. Several new statistical methods have been developed for the analysis of rare variants, for example, the combined multivariate and collapsing method, the weighted-sum method and a replication-based method. Here, we apply and compare these methods to the simulated data sets of Genetic Analysis Workshop 17 and thereby explore the contribution of rare variants to disease risk. In addition, we investigate the usefulness of extreme phenotypes in identifying rare risk variants when dealing with quantitative traits. Finally, we perform a pathway analysis and show the importance of the vascular endothelial growth factor pathway in explaining different phenotypes.
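
    As an illustration of the collapsing idea behind such rare-variant methods, the sketch below implements a generic burden-style test in Python. It is a minimal illustration, not the authors' code: the MAF threshold and the use of Fisher's exact test are assumptions made for the example.

        import numpy as np
        from scipy.stats import fisher_exact

        def collapsing_test(genotypes, phenotype, maf_threshold=0.01):
            """Collapse rare variants per individual, then test carrier status.

            genotypes: (n_individuals, n_variants) allele counts in {0, 1, 2}
            phenotype: (n_individuals,) binary array, 1 = case, 0 = control
            """
            maf = genotypes.mean(axis=0) / 2.0        # per-variant minor allele frequency
            rare = genotypes[:, maf < maf_threshold]  # keep only the rare variants
            carrier = rare.sum(axis=1) > 0            # carries at least one rare allele?
            cases, controls = phenotype == 1, phenotype == 0
            table = [[int(np.sum(carrier & cases)), int(np.sum(carrier & controls))],
                     [int(np.sum(~carrier & cases)), int(np.sum(~carrier & controls))]]
            return fisher_exact(table)                # (odds ratio, p-value)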

  6. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Background: Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results: The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion: The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  7. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli. On the other hand, we demonstrate the capabilities to improve TFBS prediction in microbes. Finally, we highlight, by visualisation with multivariate techniques, the interplay between position and sequence information for effective transcription regulation.
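
    As a toy illustration of the kind of positional regularity the paper describes (a sketch of the general idea, not of GREAT:SCAN's actual algorithm), one can score a candidate period by how tightly a set of co-regulated gene positions cluster when wrapped onto a circle of that period:

        import numpy as np

        def period_score(positions, period):
            """Mean resultant length of gene positions wrapped onto a circle.

            Returns a value in [0, 1]; 1 means the positions are perfectly
            periodic with the given period.
            """
            angles = 2.0 * np.pi * (np.asarray(positions, dtype=float) % period) / period
            return float(np.abs(np.mean(np.exp(1j * angles))))

        # Hypothetical gene start coordinates (bp) of one co-regulated group:
        positions = [102_000, 205_500, 304_900, 411_200, 508_700]
        best = max(range(50_000, 150_000, 500), key=lambda p: period_score(positions, p))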

  8. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Background: Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results: We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the

  9. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  10. Bibliological analysis: security tool in collections of rare books

    Directory of Open Access Journals (Sweden)

    Raphael Diego Greenhalgh

    2015-04-01

    The historical, cultural and economic value of rare books makes them frequent targets of robbery and theft. The institutions that hold this type of collection therefore have to strengthen their security in order to inhibit such criminal practice. Through a literature review, this paper analyzes the possibility of using bibliological analysis as a security tool. A detailed description of the copies increases knowledge about an institution's collection and attributes unequivocal ownership to rare specimens, which do not always bear stamps or other marks; without such a description it is difficult to identify and return the books in cases of robbery and theft, or to recover them at all. Bibliological analysis thus individualizes each copy, besides allowing careful handling and extensive knowledge of it, all essential factors for the security of the books.

  11. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  12. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  13. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
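
    The recognition patterns and correlation rules themselves are written in the paper's domain specific language, which is not reproduced here; the Python sketch below only illustrates the general shape of such a triggering rule (pattern, sliding time window, threshold; all values invented for the example):

        import re
        from collections import deque

        class RateRule:
            """Fire when a message pattern occurs too often within a time window."""

            def __init__(self, pattern, window_s=10.0, threshold=5):
                self.pattern = re.compile(pattern)
                self.window_s = window_s
                self.threshold = threshold
                self.hits = deque()                      # timestamps of recent matches

            def feed(self, timestamp, message):
                """Return True if this message makes the rule fire."""
                if not self.pattern.search(message):
                    return False
                self.hits.append(timestamp)
                while self.hits and timestamp - self.hits[0] > self.window_s:
                    self.hits.popleft()                  # expire hits outside the window
                return len(self.hits) >= self.threshold

        # Example: five buffer-overflow messages within 10 s trigger the rule.
        rule = RateRule(r"buffer overflow on node \d+", window_s=10.0, threshold=5)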

  14. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
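
    The idea these compilers automate, propagating derivatives through a code alongside its values, can be shown with a minimal forward-mode example. The Python sketch below illustrates the principle only; GRESS and EXAP themselves instrument FORTRAN source rather than using operator overloading:

        class Dual:
            """A value together with its derivative w.r.t. one input parameter."""

            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)

            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value * other.value,
                            self.deriv * other.value + self.value * other.deriv)

            __rmul__ = __mul__

        def model(k):
            # Toy response: quadratic in the parameter k.
            return k * k + 3.0 * k + 1.0

        k = Dual(2.0, 1.0)           # seed the derivative: dk/dk = 1
        out = model(k)
        print(out.value, out.deriv)  # 11.0 and df/dk = 2k + 3 = 7.0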

  15. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    Science.gov (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we're looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  16. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching and parameterised reward annotations. SBOAT allows the optimisation of these processes by specifying optimisation goals by means of probabilistic computation tree logic (PCTL). Optimisation is performed by means of an evolutionary algorithm where stochastic model checking, in the form of the PRISM model checker, is used to compute the fitness, the performance of a candidate in terms of the specified goals, of variants of a process. Our evolutionary algorithm approach uses a matrix representation of process models to efficiently allow mutation and crossover of a process model to be performed, allowing broad

  17. metaSNV: A tool for metagenomic strain level analysis.

    Directory of Open Access Journals (Sweden)

    Paul Igor Costea

    We present metaSNV, a tool for single nucleotide variant (SNV) analysis in metagenomic samples, capable of comparing populations of thousands of bacterial and archaeal species. The tool uses as input nucleotide sequence alignments to reference genomes in standard SAM/BAM format, performs SNV calling for individual samples and across the whole data set, and generates various statistics for individual species including allele frequencies and nucleotide diversity per sample as well as distances and fixation indices across samples. Using published data from 676 metagenomic samples of different sites in the oral cavity, we show that the results of metaSNV are comparable to those of MIDAS, an alternative implementation for metagenomic SNV analysis, while data processing is faster and has a smaller storage footprint. Moreover, we implement a set of distance measures that allow the comparison of genomic variation across metagenomic samples and delineate sample-specific variants to enable the tracking of specific strain populations over time. The implementation of metaSNV is available at: http://metasnv.embl.de/.

  18. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  19. Collection Development: A Summary of Workshop Discussions.

    Science.gov (United States)

    Dudley, Norman

    1979-01-01

    Highlights from five workshop sessions held during the Preconference Institute on Collection Development in June 1977 focus on collection development policy statements, selection tools, budgeting, evaluation, and weeding. (MBR)

  20. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The new BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  1. Proceedings of the workshop on applications of synchrotron radiation to trace impurity analysis for advanced silicon processing

    Energy Technology Data Exchange (ETDEWEB)

    Laderman, S [Integrated Circuits Business Div., Hewlett Packard Co., Palo Alto, CA (United States); Pianetta, P [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1993-03-01

    Wafer surface trace impurity analysis is essential for development of competitive Si circuit technologies. Today's grazing incidence x-ray fluorescence techniques with rotating anodes fall short of requirements for the future. Hewlett Packard/Toshiba experiments indicate that with second generation synchrotron sources such as SSRL, the techniques can be extended sufficiently to meet important needs of the leading edge Si circuit industry through nearly all of the 1990s. This workshop was held to identify people interested in use of synchrotron radiation-based methods and to document needs and concerns for further development. Viewgraphs are included for the following presentations: microcontamination needs in silicon technology (M. Liehr), analytical methods for wafer surface contamination (A. Schimazaki), trace impurity analysis of liquid drops using synchrotron radiation (D. Wherry), TRXRF using synchrotron sources (S. Laderman), potential role of synchrotron radiation TRXRF in Si process R&D (M. Scott), potential development of synchrotron radiation facilities (S. Brennan), and identification of goals, needs and concerns (M. Garner).

  2. Creating Fantastic PI Workshops

    Energy Technology Data Exchange (ETDEWEB)

    Biedermann, Laura B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clark, Blythe G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Colbert, Rachel S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dagel, Amber Lynn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gupta, Vipin P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hibbs, Michael R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perkins, David Nikolaus [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); West, Roger Derek [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The goal of this SAND report is to provide guidance for other groups hosting workshops and peer-to-peer learning events at Sandia. Thus this SAND report provides detail about our team structure, how we brainstormed workshop topics and developed the workshop structure. A Workshop “Nuts and Bolts” section provides our timeline and check-list for workshop activities. The survey section provides examples of the questions we asked and how we adapted the workshop in response to the feedback.

  3. Workshop experience

    Directory of Open Access Journals (Sweden)

    Georgina Holt

    2007-04-01

    The setting for the workshop was a heady mix of history, multiculturalism and picturesque riverscapes. Within the group there was, as in many food studies, a preponderance of female scientists (or ethnographers), but the group interacted on lively, non-gendered terms, focusing instead on an appreciation of local food, an enthusiasm for research shared by all, and points of theoretical variance within that. The food provided by our hosts was of the very highest eating and local food qualities...

  4. Desnarrativas: workshop

    Directory of Open Access Journals (Sweden)

    Ivânia Marques

    2014-08-01

    This is a report of a teacher workshop. It was an encounter among dialogues, pictures and possibilities of deconstruction in multiple directions. It enables studies that inspire debate in favor of images. Images are loaded with clichés, and they risk breaking with the documentary/real character of photography. This leads us to think about the non-neutrality of an image and how place is hegemonically imposed on us. The workshop does away with blocking forces in a playful experimentation, which is extended into compositions with photographs, monotype printing, and different ways of perceiving space, dialogues, exchanges, poems and art.

  5. Communications data delivery system analysis : public workshop read-ahead document.

    Science.gov (United States)

    2012-04-09

    This document presents an overview of work conducted to date around development and analysis of communications data delivery systems for supporting transactions in the connected vehicle environment. It presents the results of technical analysis of ...

  6. SIMS applications workshop. Proceedings

    International Nuclear Information System (INIS)

    1997-04-01

    The first ANSTO/AINSE SIMS Workshop drew together a mixture of Surface Analysis experts and Surface Analysis users with the concept that SIMS analysis has to be enfolded within the spectrum of surface analysis techniques and that the user should select the technique most applicable to the problem. With this concept in mind the program was structured as sessions on SIMS Facilities; Applications to Mineral Surfaces; Applications to Biological Systems; Applications to Surfaces as Semiconductors, Catalysts and Surface Coatings; and Applications to Ceramics.

  7. Workshops for state review of site suitability criteria for high-level radioactive waste repositories: analysis and recommendations

    International Nuclear Information System (INIS)

    1978-02-01

    The responses from various discussion groups on site suitability criteria for high-level radioactive waste repositories are presented. The consensus, principal concern, and minority opinion on each issue are given. The visual aids used in the workshop are included.

  8. FCJ-209 Indigenous Knowledge Systems and Pattern Thinking: An Expanded Analysis of the First Indigenous Robotics Prototype Workshop

    Directory of Open Access Journals (Sweden)

    Angie Abdilla

    2016-12-01

    In November 2014, the lead researcher’s interest in the conceptual development of digital technology and her cultural connection to Indigenous Knowledge Systems created an opportunity to explore a culturally relevant use of technology with urban Indigenous youth: the Indigenous Robotics Prototype Workshop. The workshop achieved a sense of cultural pride and confidence in Indigenous traditional knowledge while inspiring the youth to continue with their engagement in coding and programming through building robots. Yet, the outcomes from the prototype workshop further revealed a need to investigate how Indigenous Knowledge Systems, and particularly Pattern Thinking, might hint toward a possible paradigm shift for the ethical and advanced design of new technologies. This article examines the implications of such a hypothetical shift in autonomous systems in robotics and artificial intelligence (AI), using the Indigenous Robotics Prototype Workshop as a case study and springboard.

  9. Essays in Societal System Dynamics and Transportation : Report of the Third Annual Workshop in Urban and Regional Systems Analysis

    Science.gov (United States)

    1981-03-01

    This document contains the White Papers on urban-regional modeling presented at the Workshop as well as additional research papers aimed at increasing our understanding of the relationships between transportation and society. The ultimate aim is to p...

  10. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) refers to a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application intended to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
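
    The kind of screening query the tool performs can be sketched in a few lines of Python; the file and column names below are invented for the example, and msBiodat itself is a web application rather than a script:

        import pandas as pd

        # Hypothetical inputs: an MS hit list and a public annotation export,
        # both keyed by UniProt accession.
        hits = pd.read_csv("ms_experiment.csv")               # columns: uniprot, score
        annotations = pd.read_csv("public_annotations.csv")   # columns: uniprot, go_terms

        merged = hits.merge(annotations, on="uniprot", how="left")
        selected = merged[
            (merged["score"] >= 40.0)
            & merged["go_terms"].fillna("").str.contains("GO:0006915")  # apoptosis
        ]
        selected.to_csv("filtered_proteins.csv", index=False)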

  11. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Bekavac, Ivan; Garbin Praničević, Daniela

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytic...

  12. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and user-programmable, it can reduce or even eliminate the need for software modifications when configured for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
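
    At its core, such a simulation marches an energy balance through time. The deliberately simplified sketch below conveys the flavor of that loop; it is not MMPAT's actual model or interface, and all parameters are invented:

        def simulate_power(load_w, solar_w, capacity_wh, soc_wh, dt_h=1.0 / 60.0):
            """March a trivial power-subsystem energy balance.

            load_w, solar_w: per-step load draw and solar input [W]
            capacity_wh: battery capacity [Wh]; soc_wh: initial state of charge [Wh]
            Returns the state of charge [Wh] after each time step.
            """
            history = []
            for load, solar in zip(load_w, solar_w):
                soc_wh += (solar - load) * dt_h                  # net energy this step
                soc_wh = min(max(soc_wh, 0.0), capacity_wh)      # clamp to battery limits
                history.append(soc_wh)
            return history

        # 90 one-minute steps: sunlit for the first 45, in eclipse for the rest.
        soc = simulate_power([120.0] * 90, [150.0] * 45 + [0.0] * 45,
                             capacity_wh=800.0, soc_wh=400.0)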

  13. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators’ needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  14. Geomagnetic Workshop

    Science.gov (United States)

    DeNoyer, John; Cain, Joseph C.; Banerjee, Subir; Benton, Edward R.; Blakely, Richard J.; Coe, Rob; Harrison, C. G. A.; Johnston, Malcolm; Regan, Robert D.

    A workshop on geomagnetism, sponsored by the Geologic Division of the U.S. Geological Survey, was held in the Denver West Office Complex in Golden, Colorado, April 13-15, 1982. There were 90 registered participants from government agencies, academic institutions, and industry.This effort stemmed from the realization that geomagnetism, once a small but coherent discipline, has now expanded into numerous areas of the geosciences, yet those doing research in these specialties seldom make contact outside their area of immediate interest. The impetus for this event came from the members of a committee formed to review the geomagnetic activities within the U.S. Geological Survey. They observed that the level of communication between the various elements of this now diverse discipline was inadequate, not only within their organization but also between federal agencies, academia, and the private sector. While the desire was to cover as much of geomagnetism as possible, it was necessary for a workshop of reasonable size and length to exclude some important areas of the subject: magnetic reversal chronology, studies of the externally produced variations, and most aspects of internal induction. The plan was to give emphasis to some of the newer areas: those which have recently seen a high level of activity and those with increasing activity abroad compared to that in the United States. The purpose was to evaluate the status and problems in selected areas with an eye to those whose emphasis might produce fruitful results in the next decade.

  15. Recent Workshops

    CERN Multimedia

    Wickens, F. J.

    Since the previous edition of ATLAS e-news, the NIKHEF Institute in Amsterdam has hosted not just one but two workshops related to ATLAS TDAQ activities. The first in October was dedicated to the Detector Control System (DCS). Just three institutes, CERN, NIKHEF and St Petersburg, provide the effort for the central DCS services, but each ATLAS sub-detector provides effort for their own controls. Some 30 people attended, including representatives for all of the ATLAS sub-detectors, representatives of the institutes working on the central services and the project leader of JCOP, which brings together common aspects of detector controls across the LHC experiments. During the three-day workshop the common components were discussed, and each sub-detector described their experiences and plans for their future systems. Whilst many of the components to be used are standard commercial components, a key custom item for ATLAS is the ELMB (Embedded Local Monitor Board). Prototypes for this have now been extensively test...

  16. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement.

    Science.gov (United States)

    Shatkin, J A; Ong, Kimberly J; Beaudrie, Christian; Clippinger, Amy J; Hendren, Christine Ogilvie; Haber, Lynne T; Hill, Myriam; Holden, Patricia; Kennedy, Alan J; Kim, Baram; MacDonell, Margaret; Powers, Christina M; Sharma, Monita; Sheremeta, Lorraine; Stone, Vicki; Sultan, Yasir; Turley, Audrey; White, Ronald H

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas (human health, ecological health, and exposure considerations) shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article. © 2016 Society for Risk Analysis.

  17. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  18. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    System modelling using the Unified Modelling Language (UML) is a task that has to be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is offered by a sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors’ research examines the abilities of modern UML modelling tools to lay out the UML sequence diagram automatically and analyses them according to the criteria required for diagram perception.

  19. Sensitivity analysis of an information fusion tool: OWA operator

    Science.gov (United States)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

    2007-04-01

    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
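
    For reference, the OWA operator aggregates arguments after sorting them, and the order weights encode the decision maker's optimism degree (standard definitions, stated here to make the abstract self-contained rather than quoted from the paper):

        % b_j is the j-th largest of the arguments a_1,...,a_n.
        \[
          \mathrm{OWA}(a_1,\dots,a_n) \;=\; \sum_{j=1}^{n} w_j\, b_j ,
          \qquad w_j \ge 0 ,\; \sum_{j=1}^{n} w_j = 1 .
        \]
        % Optimism (orness) degree associated with a weight vector w:
        \[
          \mathrm{orness}(w) \;=\; \frac{1}{n-1} \sum_{j=1}^{n} (n - j)\, w_j .
        \]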

  20. Earth Exploration Toolbook Workshops: Helping Teachers and Students Analyze Web-based Scientific Data

    Science.gov (United States)

    McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.

    2007-12-01

    One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download Earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an Earth Exploration Toolbook (EET) chapter and discuss ways to use Earth science datasets and tools with their students. In a follow-up second workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. They also receive support via Elluminate, a Web

  1. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
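
    The problem the toolbox addresses can be stated compactly. At steady state the stoichiometry constrains the flux vector, and interval MFA replaces exact measurements with bounds (a standard formulation, given here for orientation rather than copied from the paper):

        % S: stoichiometric matrix, v: flux vector, v_m: measured sub-vector.
        \[
          S\,v = 0
        \]
        % The interval estimate is the set of flux vectors consistent with
        % both the model and the measurement bounds:
        \[
          \mathcal{V} \;=\; \left\{\, v \;:\; S\,v = 0 ,\;
            \underline{v}_m \le v_m \le \overline{v}_m \,\right\}
        \]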

  2. ASOURCE: Source Term Analysis Tool for Advanced Fuel Cycle

    International Nuclear Information System (INIS)

    Cho, Dong Keun; Kook, Dong Hak; Choi, Jong Won; Choi, Heui Joo; Jeong, Jong Tae

    2012-01-01

    In 2007, the 3rd Comprehensive Nuclear Energy Promotion Plan, passed at the 254th meeting of the Atomic Energy Commission, was announced as an R&D action plan for the development of an advanced fuel cycle adopting a sodium-cooled fast reactor (SFR) in connection with a pyroprocess, for a sustainable, stable energy supply and a reduction in the amount of spent fuel (SF). It is expected that this fuel cycle can greatly reduce the SF inventory through a recycling process in which transuranics (TRU) and long-lived nuclides are burned in the SFR and cesium and strontium are disposed of after sufficient interim storage. For the success of the R&D plan, there are several issues related to the source term analysis. These are related to the following: (a) generation of inflow and outflow source terms of mixed SF in each process for the design of the pyroprocess facility, (b) source terms of mixed radwaste in a canister for the design of storage and disposal systems, (c) overall inventory estimation for TRU and long-lived nuclides for the design of the SFR, and (d) best estimate source terms for the practical design of the interim storage facility of SFs. A source term evaluation for a SF or radwaste with a single irradiation profile can be easily accomplished with the conventional computation tool. However, source term assessment for a batch of SFs, or for a mixture of radwastes generated from SFs with different irradiation profiles, a task that is essential to support the aforementioned activities, is not possible with the conventional tool. Therefore, a hybrid computing program for source term analysis to support the advanced fuel cycle was developed

  3. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K

  4. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and on the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal-hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also invaluable at the design stage, where they influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effects on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner-diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermal-hydraulic analysis and field data, fouling margins are calculated. The individual effects of primary- and secondary-side fouling are separated through analyses, which allows station operators to decide what type of maintenance activity to perform and when to perform it. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports. The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance

  5. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improvements in cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones arising during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  6. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools and exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either the IT or the marketing branch. The paper contributes by highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  7. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behaviour of 9528 online gamblers who voluntarily used an RG tool was analysed. The number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that the classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a greater extent and were found to have a greater risk of excessive gambling than the other classes.
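
    Latent class analysis proper works on categorical indicators, but the clustering idea can be sketched with a mixture model over the three observed usage counts (visits, self-tests, advice used). The sketch below uses scikit-learn's Gaussian mixture as a rough stand-in for LCA on synthetic data; none of the numbers come from the study.

        # Rough stand-in for LCA: cluster usage counts with a Gaussian mixture.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        # Two synthetic user groups: light users and heavy multi-function users.
        light = rng.poisson([1.5, 0.3, 0.1], size=(400, 3))
        heavy = rng.poisson([8.0, 3.0, 2.0], size=(100, 3))
        X = np.vstack([light, heavy]).astype(float)

        gm = GaussianMixture(n_components=2, random_state=0).fit(X)
        labels = gm.predict(X)
        for k in range(gm.n_components):
            print(f"class {k}: n = {np.sum(labels == k)}, "
                  f"mean usage = {gm.means_[k].round(1)}")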

  8. Activity-Centred Tool Integration

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius

    2003-01-01

    This paper is concerned with integration of heterogeneous tools for system development. We argue that such tools should support concrete activities (e.g., programming, unit testing, conducting workshops) in contrast to abstract concerns (e.g., analysis, design, implementation). A consequence...... of this is that tools — or components — that support activities well should be integrated in ad-hoc, dynamic, and heterogeneous ways. We present a peer-to-peer architecture for this based on type-based publish subscribe and give an example of its use....
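
    As a sketch of what type-based publish subscribe means in practice, the toy event bus below dispatches on the concrete type of the published event, so tools are coupled only through shared event types. This illustrates the general pattern, not the paper's actual implementation.

        # Minimal type-based publish/subscribe event bus (illustrative names).
        from collections import defaultdict
        from dataclasses import dataclass

        class EventBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, event_type, handler):
                self._subscribers[event_type].append(handler)

            def publish(self, event):
                # Dispatch on the concrete type of the published event.
                for handler in self._subscribers[type(event)]:
                    handler(event)

        @dataclass
        class FileSaved:
            path: str

        bus = EventBus()
        bus.subscribe(FileSaved, lambda e: print(f"run unit tests for {e.path}"))
        bus.publish(FileSaved("src/main.c"))  # publisher ignorant of subscribers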

  9. SisRadiologia: a new software tool for analysis of radiological accidents and incidents in industrial radiography

    International Nuclear Information System (INIS)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da; Araujo, Rilton A.

    2013-01-01

    According to the International Atomic Energy Agency (IAEA), many efforts have been made by Member States aiming at better control of radioactive sources. Accidents have mostly happened in practices regarded as high radiological risk and classified by the IAEA in categories 1 and 2, notably radiotherapy, large irradiators and industrial radiography. Worldwide, more than 40 radiological accidents have been recorded in the industrial radiography area, involving 37 workers, 110 members of the public and 12 fatalities. Records show 5 severe radiological accidents in industrial radiography activities in Brazil, in which 7 workers and 19 members of the public were involved. Such events led to radiodermatitis of the hands and fingers, but to no deaths. The purpose of this study is to present a computational program that allows data acquisition and recording in the company, so as to ease further detailed analysis of radiological events, besides providing lessons learned aimed at avoiding future occurrences. After one year of application of the 'Industrial SisRadiologia' computational program - largely based upon the workshop on Analysis and Dose Calculation of Radiological Accidents in Industrial Radiography (Workshop sobre Analise e Calculo de Dose de Acidentes Radiologicos em Radiografia Industrial - IRD 2012), in which several radiation protection officers took part - it can be concluded that the computational program is a powerful tool for data acquisition and for recording and surveying accident and incident events in industrial radiography. The program proved to be efficient in elaborating reports to the Brazilian Regulatory Authority, and very useful in training workers to fix the lessons learned from radiological events.

  10. Interaction tools for underwater shock analysis in naval platform design

    NARCIS (Netherlands)

    Aanhold, J.E.; Tuitman, J.T.; Trouwborst, W.; Vaders, J.A.A.

    2016-01-01

    In order to satisfy the need for good quality UNDerwater EXplosion (UNDEX) response estimates of naval platforms, TNO developed two 3D simulation tools: the Simplified Interaction Tool (SIT) and the hydro/structural code 3DCAV. Both tools are add-ons to LS-DYNA. SIT is a module of user routines

  11. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models...... is reported. (C) 1999 Elsevier Science S.A. All rights reserved....

  12. Discourse Analysis: A Tool for Helping Educators to Teach Science

    Directory of Open Access Journals (Sweden)

    Katerina Plakitsi

    2016-11-01

    Full Text Available This article refers to a part of a collaborative action research project in three elementary science classrooms. The project aims at the transformation of the nature and type of teachers' discursive practices into more collaborative inquiries. The basic strategy is to give the teachers the opportunity to analyze their discourse using a three-dimensional context of analysis. The teachers analyzed their discursive repertoires when teaching science. They studied the companion meaning, i.e., the different layers of explicit and tacit messages they communicate about Nature of Science (NoS, Nature of Teaching (NoT, and Nature of Language (NoL. The question investigated is the following: Could an action research program, which involves teachers in the analysis of their own discursive practices, lead to the transformation of discourse modes that take place in the science classrooms to better communicate aspects of NoS, NoT and NoL in a collaborative, inquiry-based context? Results indicate that the teachers' involvement in their discourse analysis led to a transformation in the discursive repertoires in their science classrooms. Gradually, the teachers' companion meanings that were created, implicitly/explicitly, from the dialogues taking place during science lessons were more appropriate for the establishment of a productive collaborative inquiry learning context. We argue that discourse analysis could be used for research purposes, as a training medium or as a reflective tool on how teachers communicate science. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs170168

  13. Workshop proceedings

    DEFF Research Database (Denmark)

    dedicated to all aspects of content-based recommendation. We issued a Call for Papers asking for submissions of novel research papers (both long and short) addressing recommendation in do- mains where textual content is abundant (e.g., books, news, scientific articles, jobs, educational resources, Web pages......While content-based recommendation has been applied successfully in many different domains, it has not seen the same level of attention as collaborative filtering techniques have. In recent years, competitions like the Netflix Prize, CAMRA, and the Yahoo! Music KDD Cup 2011 have spurred on advances...... investigation already, but for many other domains, such as books, news, scientific articles, and Web pages we do not know if and how these data sources should be combined to provided the best recommendation performance. The CBRecSys 2014 workshop aims to address this by providing a dedicated venue for papers...

  14. Kinematic Analysis of a 3-dof Parallel Machine Tool with Large Workspace

    Directory of Open Access Journals (Sweden)

    Shi Yan

    2016-01-01

    Full Text Available The kinematics of a 3-dof (degree of freedom) parallel machine tool with a large workspace was analyzed. The workspace volume and surface, and the boundary posture angles, of this machine tool are relatively large. Firstly, a three-dimensional simulation manipulator of the 3-dof parallel machine tool was constructed, and its joint distribution was described. Secondly, kinematic models of the machine tool were established, including displacement analysis, velocity analysis, and acceleration analysis. Finally, the kinematic models were verified by a numerical example. The results are of significance for the application of the parallel machine tool.
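
    As an illustration of the velocity-analysis step mentioned above, the sketch below obtains actuator rates from commanded platform rates through a numerically differentiated Jacobian, q_dot = J(x) x_dot. The inverse-kinematics function is an invented placeholder, not the mechanism of the paper.

        # Generic velocity analysis for a 3-dof mechanism via a numerical Jacobian.
        import numpy as np

        def inverse_kinematics(x):
            """Toy mapping from platform pose x (3-vector) to actuator lengths q."""
            a = np.array([[1.0, 0.2, 0.1], [0.1, 1.0, 0.3], [0.2, 0.1, 1.0]])
            return a @ x + 0.05 * np.sin(x)

        def jacobian(x, h=1e-6):
            """Central-difference Jacobian of the inverse kinematics."""
            n = x.size
            J = np.zeros((n, n))
            for j in range(n):
                dx = np.zeros(n)
                dx[j] = h
                J[:, j] = (inverse_kinematics(x + dx) - inverse_kinematics(x - dx)) / (2 * h)
            return J

        x = np.array([0.10, 0.05, 0.02])      # platform pose
        x_dot = np.array([0.01, 0.00, 0.02])  # commanded pose rates
        q_dot = jacobian(x) @ x_dot           # required actuator rates
        print(q_dot)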

  15. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
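
    A small example of the kind of measurement such a tool automates: drive a nonlinearity with a sine and read the harmonic levels off the FFT to estimate total harmonic distortion (THD). The "device" below is a hypothetical tanh clipper, not one of the systems from the case study, and the sketch is in Python/NumPy rather than the toolkit's Matlab.

        # FFT-based THD estimate for a toy nonlinear device.
        import numpy as np

        fs, f0, n = 48000, 1000.0, 48000
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(3.0 * x)                      # stand-in nonlinear device

        spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
        bin_at = lambda freq: int(round(freq * n / fs))
        fundamental = spectrum[bin_at(f0)]
        harmonics = [spectrum[bin_at(k * f0)] for k in range(2, 10)]
        thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental
        print(f"THD ~ {100 * thd:.1f} %")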

  16. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  17. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment - including instrument and spacecraft miniaturization, scalable launchers, secondary launches, and hosted payloads - there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad-hoc constellations, and its cost model represents an aggregate model consisting of
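
    For reference, the uniform Walker pattern mentioned above is fully determined by the triple t/p/f (total satellites, planes, phasing factor). A minimal sketch of the resulting slot geometry, with example inputs only:

        # Walker delta pattern t/p/f -> (RAAN, mean anomaly) pairs in degrees.
        def walker_delta(t, p, f):
            s = t // p                      # satellites per plane
            sats = []
            for plane in range(p):
                raan = 360.0 * plane / p
                for slot in range(s):
                    anomaly = (360.0 * slot / s + 360.0 * f * plane / t) % 360.0
                    sats.append((raan, anomaly))
            return sats

        for raan, m in walker_delta(t=24, p=3, f=1):
            print(f"RAAN {raan:7.2f} deg   M {m:7.2f} deg")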

  18. Actigraphy and motion analysis: new tools for psychiatry.

    Science.gov (United States)

    Teicher, M H

    1995-01-01

    Altered locomotor activity is a cardinal sign of several psychiatric disorders. With advances in technology, activity can now be measured precisely. Contemporary studies quantifying activity in psychiatric patients are reviewed. Studies were located by a Medline search (1965 to present; English language only) cross-referencing motor activity and major psychiatric disorders. The review focused on mood disorders and attention-deficit hyperactivity disorder (ADHD). Activity levels are elevated in mania, agitated depression, and ADHD and attenuated in bipolar depression and seasonal depression. The percentage of low-level daytime activity is directly related to severity of depression, and change in this parameter accurately mirrors recovery. Demanding cognitive tasks elicit fidgeting in children with ADHD, and precise measures of activity and attention may provide a sensitive and specific marker for this disorder. Circadian rhythm analysis enhances the sophistication of activity measures. Affective disorders in children and adolescents are characterized by an attenuated circadian rhythm and an enhanced 12-hour harmonic rhythm (diurnal variation). Circadian analysis may help to distinguish between the activity patterns of mania (dysregulated) and ADHD (intact or enhanced). Persistence of hyperactivity or circadian dysregulation in bipolar patients treated with lithium appears to predict rapid relapse once medication is discontinued. Activity monitoring is a valuable research tool, with the potential to aid clinicians in diagnosis and in prediction of treatment response.
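
    One common quantitative step behind such studies is cosinor-style harmonic regression: fitting 24-hour and 12-hour cosine terms to the activity series and reading off their amplitudes. The sketch below fits both harmonics to a synthetic trace by least squares; it illustrates the general method, not the specific analyses reviewed.

        # Cosinor-style fit of 24-h and 12-h harmonics to an activity trace.
        import numpy as np

        hours = np.arange(0, 72, 0.5)                    # 3 days, 30-min epochs
        activity = (50 + 40 * np.cos(2 * np.pi * (hours - 14) / 24)
                    + 10 * np.random.default_rng(0).normal(size=hours.size))

        cols = [np.ones_like(hours)]
        for period in (24.0, 12.0):                      # circadian + diurnal terms
            w = 2 * np.pi * hours / period
            cols.extend([np.cos(w), np.sin(w)])
        beta, *_ = np.linalg.lstsq(np.column_stack(cols), activity, rcond=None)

        amp24 = np.hypot(beta[1], beta[2])               # 24-h amplitude
        amp12 = np.hypot(beta[3], beta[4])               # 12-h harmonic amplitude
        print(f"mesor {beta[0]:.1f}, 24-h amp {amp24:.1f}, 12-h amp {amp12:.1f}")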

  19. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constantly evolving nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring about many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  20. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study of their academic achievement was carried out with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies; the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences); the mark obtained in the entrance examination to the university; and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students were those who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or
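
    The georeferencing step described above - projecting addresses to planar UTM coordinates so that distances can be computed - can be sketched as follows. It assumes the third-party pyproj library; Madrid falls in UTM zone 30N (EPSG:32630), and the coordinates are rough illustrative values, not data from the study.

        # Project WGS84 lon/lat to UTM 30N and compute a straight-line distance.
        import math
        from pyproj import Transformer

        to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32630", always_xy=True)

        college = to_utm.transform(-3.729, 40.443)   # approximate campus lon, lat
        student = to_utm.transform(-3.650, 40.500)   # invented home address

        dx, dy = student[0] - college[0], student[1] - college[1]
        print(f"distance to College: {math.hypot(dx, dy) / 1000:.1f} km")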

  1. A policy model to initiate environmental negotiations: Three hydropower workshops

    Science.gov (United States)

    Lamb, Berton Lee; Taylor, Jonathan G.; Burkardt, Nina; Ponds, Phadrea D.

    1998-01-01

    How do I get started in natural resource negotiations? Natural resource managers often face difficult negotiations when they implement laws and policies regulating such resources as water, wildlife, wetlands, endangered species, and recreation. As a result of these negotiations, managers must establish rules, grant permits, or create management plans. The Legal‐Institutional Analysis Model (LIAM) was designed to assist managers in systematically analyzing the parties in natural resource negotiations and using that analysis to prepare for bargaining. The LIAM relies on the theory that organizations consistently employ behavioral roles. The model uses those roles to predict likely negotiation behavior. One practical use of the LIAM is when all parties to a negotiation conduct a workshop as a way to open the bargaining on a note of trust and mutual understanding. The process and results of three LIAM workshops designed to guide hydroelectric power licensing negotiations are presented. Our experience with these workshops led us to conclude that the LIAM can be an effective tool to begin a negotiation and that trust built through the workshops can help create a successful result.

  2. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: The funnel diagram is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, where PET is an estimate. The FAT-PET MRA is a meta regression analysis......, on the data of the funnel, which jointly estimates the FAT and the PET. Ideal funnels are lean and symmetric. Empirical funnels are wide, and most have asymmetries biasing the plain average. Many asymmetries are due to censoring made during the research-publication process. The PET is tooled to correct...
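
    In equation form, the FAT-PET MRA regresses each reported estimate on its standard error, b_i = PET + FAT * se_i + e_i, typically by weighted least squares with weights 1/se_i^2: a FAT coefficient far from zero signals funnel asymmetry, while the intercept is the precision-corrected meta average. A minimal sketch on invented data:

        # FAT-PET meta-regression by weighted least squares.
        import numpy as np

        b = np.array([0.21, 0.35, 0.10, 0.42, 0.05, 0.30])   # reported estimates
        se = np.array([0.05, 0.15, 0.04, 0.20, 0.03, 0.12])  # their standard errors

        X = np.column_stack([np.ones_like(se), se])
        w = np.sqrt(1.0 / se**2)
        (pet, fat), *_ = np.linalg.lstsq(X * w[:, None], b * w, rcond=None)

        print(f"PET (corrected effect) = {pet:.3f}, FAT (asymmetry) = {fat:.3f}")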

  3. Workshop report

    African Journals Online (AJOL)

    abp

    2017-09-14

    Sep 14, 2017 ... Foundation (CSF), with funding from the Bill and Melinda Gates Foundation, and builds on previously-existing platforms including the Lives Saved Tool (Johns Hopkins Bloomberg School of Public Health) and the Marginal Budgeting for Bottlenecks tool (a cost estimator used by the World Bank) and on ...

  4. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

    Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the movement of the spindle unit is involved in the most frequent and important operations, such as those of the automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  5. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    OpenAIRE

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnost...

  6. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is a common practice of the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low activity samples. The authors' current research activities are focused on expanding the radiochemical applications of FIA methodology, with an ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements
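
    As a sketch of the quantification step described above - a transient detector signal evaluated by peak height or peak area - the following baseline-corrects a synthetic transient and integrates it. The signal shape and numbers are invented.

        # Peak height and trapezoidal peak area of a synthetic detector transient.
        import numpy as np

        t = np.linspace(0, 60, 601)                               # s
        signal = 5.0 + 120.0 * np.exp(-((t - 25.0) / 4.0) ** 2)   # counts/s

        baseline = np.median(signal[t < 10])      # flat pre-peak region
        corrected = signal - baseline
        peak_height = corrected.max()
        peak_area = np.trapz(corrected, t)        # counts

        print(f"height {peak_height:.1f} counts/s, area {peak_area:.0f} counts")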

  7. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as material science, biology, geochemistry and so on. Given the advantages of NAA, a sample available only in small amounts, or a precious sample, is the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, whereas carbonaceous materials and heavy metals are concentrated in PM in urban areas, because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodical monitoring of PM by the NAA method. Elemental concentrations in air change by season. For example, crustal elements increase in the dry season, and sea salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM, and increase in concentration during the winter season, when emission from heating systems is high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples, and source portion assignment techniques are useful. (author)
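
    A standard way to turn such elemental data into source attribution, in line with the concentration-ratio techniques mentioned above, is the crustal enrichment factor EF = (X/Al)_sample / (X/Al)_crust, with Al as the crustal reference element. The sketch below uses approximate upper-crust abundances and an invented PM sample; EF values far above about 10 are conventionally read as anthropogenic.

        # Crustal enrichment factors for a hypothetical PM sample.
        crust_ppm = {"Al": 80400, "Fe": 35000, "Zn": 71, "Pb": 17}   # approx. upper crust
        sample_ng_m3 = {"Al": 520, "Fe": 310, "Zn": 95, "Pb": 21}    # invented PM data

        def enrichment_factor(element, ref="Al"):
            sample_ratio = sample_ng_m3[element] / sample_ng_m3[ref]
            crust_ratio = crust_ppm[element] / crust_ppm[ref]
            return sample_ratio / crust_ratio

        for el in ("Fe", "Zn", "Pb"):
            ef = enrichment_factor(el)
            origin = "crustal" if ef < 10 else "anthropogenic"
            print(f"{el}: EF = {ef:.0f} ({origin} signature)")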

  8. PIXE and μ-PIXE analysis of glazes from terracotta sculptures of the della Robbia workshop

    International Nuclear Information System (INIS)

    Zucchiatti, Alessandro; Bouquillon, Anne; Lanterna, Giancarlo; Lucarelli, Franco; Mando, Pier Andrea; Prati, Paolo; Salomon, Joseph; Vaccari, Maria Grazia

    2002-01-01

    A series of PIXE analyses has been performed on glazes from terracotta sculptures of the Italian Renaissance and on reference standards. The problems related to the investigation of such heterogeneous materials are discussed and the experimental uncertainties are evaluated, for each element, from the PIXE analysis of standard glasses. Some examples from artefacts coming from Italian collections are given. This research has been conducted in the framework of the COST-G1 European action

  9. Proceedings of the workshop on world oil supply-demand analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, K.C. (ed.)

    1977-01-01

    Twelve papers and four panel discussions are included. A separate abstract was prepared for each paper. The panel discussions were on: technical and physical policy elements affecting world oil supply and demand; financial, tax, and tariff issues in world oil supply and demand; the world economy as influenced by world oil prices and availability; the use of models and analysis in the policy process. (DLC)

  10. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that

  11. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  12. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  13. Bioelectrical impedance analysis: A new tool for assessing fish condition

    Science.gov (United States)

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.
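
    A sketch of the calibration just described - regressing laboratory-measured dry-fat percentage on BIA readings and checking the resulting R^2 against the suggested benchmark - with invented numbers and ordinary least squares:

        # Toy BIA calibration: percent dry fat vs. resistance and reactance.
        import numpy as np

        resistance = np.array([420., 510., 380., 610., 450., 560., 490., 530.])
        reactance = np.array([55., 62., 48., 75., 58., 70., 60., 66.])
        dry_fat_pct = np.array([12., 21., 9., 35., 16., 29., 19., 24.])

        X = np.column_stack([np.ones_like(resistance), resistance, reactance])
        beta, *_ = np.linalg.lstsq(X, dry_fat_pct, rcond=None)
        pred = X @ beta
        ss_res = np.sum((dry_fat_pct - pred) ** 2)
        ss_tot = np.sum((dry_fat_pct - dry_fat_pct.mean()) ** 2)
        print(f"R^2 = {1 - ss_res / ss_tot:.2f}")  # benchmark above: ~0.8 at n = 60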

  14. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    Science.gov (United States)

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run on UNIX-like operating systems. Graphical output is often not included, or is meant to be produced with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  15. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run on UNIX-like operating systems. Graphical output is often not included, or is meant to be produced with other command line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  16. VisIt: Interactive Parallel Visualization and Graphical Analysis Tool

    Science.gov (United States)

    Department Of Energy (DOE) Advanced Simulation; Computing Initiative (ASCI)

    2011-03-01

    VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

  17. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  18. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, and 2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
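
    Brent's method slots naturally into contact-window prediction: a window edge is a root of elevation(t) minus the ground-station mask angle. The sketch below uses SciPy's brentq root finder on a toy elevation profile; nothing in it comes from ITACA itself.

        # Contact-window edges as bracketed roots, solved by Brent's method.
        import numpy as np
        from scipy.optimize import brentq

        MASK_DEG = 10.0

        def elevation_deg(t_min):
            """Toy elevation profile of a single pass peaking at t = 5 min."""
            return 60.0 * np.exp(-((t_min - 5.0) / 3.0) ** 2) - 2.0

        f = lambda t: elevation_deg(t) - MASK_DEG
        rise = brentq(f, 0.0, 5.0)       # sign change brackets the rise time
        setting = brentq(f, 5.0, 12.0)   # and the set time
        print(f"contact from {rise:.2f} to {setting:.2f} min")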

  19. 21st Century Kinematics : The 2012 NSF Workshop

    CERN Document Server

    2013-01-01

    21st Century Kinematics focuses on algebraic problems in the analysis and synthesis of mechanisms and robots, compliant mechanisms, cable-driven systems and protein kinematics. The specialist contributors provide the background for a series of presentations at the 2012 NSF Workshop. The text shows how the analysis and design of innovative mechanical systems yield increasingly complex systems of polynomials, characteristic of those systems. In doing so, it takes advantage of increasingly sophisticated computational tools developed for numerical algebraic geometry and demonstrates the now routine derivation of polynomial systems dwarfing the landmark problems of even the recent past. The 21st Century Kinematics workshop echoes the NSF-supported 1963 Yale Mechanisms Teachers Conference that taught a generation of university educators the fundamental principles of kinematic theory. As such these proceedings will provide admirable supporting theory for a graduate course in modern kinematics and should be of consid...

  20. Tools4miRs - one place to gather all the tools for miRNA analysis.

    Science.gov (United States)

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-09-01

    MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  1. Tools4miRs – one place to gather all the tools for miRNA analysis

    Science.gov (United States)

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-01-01

    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  2. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed, decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  3. PREFACE: Collapse Calderas Workshop

    Science.gov (United States)

    Gottsmann, Jo; Aguirre-Diaz, Gerardo

    2008-10-01

    Caldera-formation is one of the most awe-inspiring and powerful displays of nature's force. Resultant deposits may cover vast areas and significantly alter the immediate topography. Post-collapse activity may include resurgence, unrest, intra-caldera volcanism and potentially the start of a new magmatic cycle, perhaps eventually leading to renewed collapse. Since volcanoes and their eruptions are the surface manifestation of magmatic processes, calderas provide key insights into the generation and evolution of large-volume silicic magma bodies in the Earth's crust. Despite their potentially ferocious nature, calderas play a crucial role in modern society's life. Collapse calderas host essential economic deposits and supply power for many via the exploitation of geothermal reservoirs, and thus receive considerable scientific, economic and industrial attention. Calderas also attract millions of visitors world-wide with their spectacular scenic displays. To build on the outcomes of the 2005 calderas workshop in Tenerife (Spain) and to assess the most recent advances on caldera research, a follow-up meeting was proposed to be held in Mexico in 2008. This abstract volume presents contributions to the 2nd Calderas Workshop held at Hotel Misión La Muralla, Querétaro, Mexico, 19-25 October 2008. The title of the workshop `Reconstructing the evolution of collapse calderas: Magma storage, mobilisation and eruption' set the theme for five days of presentations and discussions, both at the venue as well as during visits to the surrounding calderas of Amealco, Amazcala and Huichapan. The multi-disciplinary workshop was attended by more than 40 scientist from North, Central and South America, Europe, Australia and Asia. Contributions covered five thematic topics: geology, geochemistry/petrology, structural analysis/modelling, geophysics, and hazards. The workshop was generously supported by the International Association of Volcanology and the Chemistry of The Earth's Interior

  4. Microfabricated tools for manipulation and analysis of magnetic microcarriers

    International Nuclear Information System (INIS)

    Tondra, Mark; Popple, Anthony; Jander, Albrecht; Millen, Rachel L.; Pekas, Nikola; Porter, Marc D.

    2005-01-01

    Tools for manipulating and detecting magnetic microcarriers are being developed with microscale features. Microfabricated giant magnetoresistive (GMR) sensors and wires are used for detection, and for creating high local field gradients. Microfluidic structures are added to control flow, and positioning of samples and microcarriers. These tools are designed for work in analytical chemistry and biology

  5. An Analysis of Teacher Selection Tools in Pennsylvania

    Science.gov (United States)

    Vitale, Tracy L.

    2009-01-01

    The purpose of this study was to examine teacher screening and selection tools currently being utilized by public school districts in Pennsylvania and to compare these tools to the research on qualities of effective teachers. The researcher developed four research questions that guided her study. The Pennsylvania Association of School Personnel…

  6. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...
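
    For readers unfamiliar with the formalism, the core object such tools construct and analyse is small: places holding tokens, and transitions that fire when their input places are sufficiently marked. A minimal sketch (the producer/consumer net is an arbitrary example):

        # Place/transition Petri net with the standard firing rule.
        class PetriNet:
            def __init__(self, marking):
                self.marking = dict(marking)     # place -> token count
                self.transitions = {}            # name -> (inputs, outputs)

            def add_transition(self, name, inputs, outputs):
                self.transitions[name] = (inputs, outputs)

            def enabled(self, name):
                inputs, _ = self.transitions[name]
                return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

            def fire(self, name):
                assert self.enabled(name), f"{name} is not enabled"
                inputs, outputs = self.transitions[name]
                for p, n in inputs.items():
                    self.marking[p] -= n
                for p, n in outputs.items():
                    self.marking[p] = self.marking.get(p, 0) + n

        net = PetriNet({"free": 2, "buffer": 0})
        net.add_transition("produce", {"free": 1}, {"buffer": 1})
        net.add_transition("consume", {"buffer": 1}, {"free": 1})
        net.fire("produce")
        print(net.marking)  # {'free': 1, 'buffer': 1}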

  7. Workshop objectives and structure

    International Nuclear Information System (INIS)

    2004-01-01

    The overall aim of the Workshop was to create a platform in order to better understand different approaches to managing uncertainty in post-closure safety cases and regulatory approaches in different national waste management programmes. The principal objectives of the Workshop were: - To identify common elements in different approaches for managing uncertainty. - To facilitate information exchange and to promote discussion on different technical approaches to the management and characterisation of uncertainty and on the role of risk. - To explore the merits of alternative approaches to risk-informed decision making. - To identify the potential for further developments of methods or strategies to support the management of uncertainties. The workshop was organised into plenary sessions and working group discussions. The first plenary session focused on establishing a framework for understanding the management of uncertainties and the use of risk. It comprised oral presentations drawing on a range of experience from both active participants in the development and assessment of safety cases and keynote presentations by external participants involved in risk management in other sectors. The working group discussions covered three technical themes: risk management and decision making; regulatory requirements and review of uncertainty and risk in safety cases; and practical approaches and tools for the management of uncertainties, in which the assignment of probabilities, the use of expert judgements, and the presentation of information on uncertainties and risk were examined. The aim of the working groups was to develop an understanding of the specific issues, and to identify any further activities that would support the development and/or evaluation of safety cases. The round-up plenary session brought together information and conclusions from each of the working groups. Common elements in the different approaches to treating uncertainty and risk were identified, along with

  8. Workshop in economics - the problem of climate change benefit-cost analysis

    International Nuclear Information System (INIS)

    Kosobud, R.F.

    1992-01-01

    Could benefit-cost analysis play a larger role in the discussion of policies to deal with the greenhouse effect? To date its influence has been limited, and the paper investigates the causes of this lack of influence. Selected forms of benefit-cost research are probed, particularly the critical discussions raised by this type of research, in an effort to suggest where the chances of greater acceptance lie. The paper begins by discussing the search for an appropriate policy: optimal, targeted, or incremental. It then describes the work being done in specifying and estimating climate change damage relationships. A consideration of the work being done in specifying and estimating abatement (both mitigation and adaptation) cost relationships follows. Finally, the paper ends with an examination of the search for the appropriate policy instrument. International and methodological concerns cut across these areas and are discussed in each section. The paper concludes that there seem to be a number of reasons that benefit-cost results play only a limited role in policy development. There is some evidence that the growing interest in market-based approaches to climate change policy and to other environmental control matters is a sign of increased acceptance. Suggestions about research directions are made throughout the paper.

  9. In-depth Analysis of Pattern of Occupational Injuries and Utilization of Safety Measures among Workers of Railway Wagon Repair Workshop in Jhansi (U.P.).

    Science.gov (United States)

    Gupta, Shubhanshu; Malhotra, Anil K; Verma, Santosh K; Yadav, Rashmi

    2017-01-01

    Occupational injuries constitute a global health challenge, yet they receive comparatively modest scientific attention. The pattern of occupational injuries and the related safety precautions among wagon repair workers is an important health issue, especially in developing countries like India. To assess the pattern of occupational injuries and the utilization of safety measures among railway wagon repair workshop workers in Jhansi (U.P.). Railway wagon repair workshop urban area, Jhansi (U.P.). Occupation-based cross-sectional study. A cross-sectional study was conducted among 309 workers of the railway workshop in Jhansi (U.P.) who were all injured during the study period of 1 year from July 2015 to June 2016. Baseline characteristics, the pattern of occupational injuries, safety measures, and their availability to and utilization by the participants were assessed using a pretested structured questionnaire. Data were collected and analyzed statistically by simple proportions and the Chi-square test. The majority of the studied workers were aged between 38 and 47 years (n = 93, 30.6%), followed by 28-37 years (n = 79, 26%). Among the patterns of occupational injuries, laceration (28.7%) was most common, followed by abrasion/scratch (21%). Safety shoes and hats were utilized 100% by all workers. Many of them had more than 5 years of experience (n = 237, 78%). Age group, education level, and utilization of safety measures were significantly associated with the pattern of occupational injuries in univariate analysis (P < 0.05). Utilization of safety measures is low among workers of the railway wagon repair workshop, which highlights the importance of strengthening safety regulatory services toward this group of workers. Younger age group workers show a significant association with open wounds and surface wounds. As the education level of workers increases, the incidence of injuries decreases. Apart from shoes, hats, and gloves, regular utilization of other personal protective equipment was not seen.

  10. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    Science.gov (United States)

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications and embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 on Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
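
    The add-in's VBA source is not reproduced in the record, but the kind of format conversion it automates is easy to illustrate. The sketch below recodes two-letter genotype strings into reference-allele counts, a common preprocessing step; the sample IDs, genotypes and allele choice are invented for the example.

    ```python
    # Recode textual SNP genotypes (e.g. "AG") into counts of a chosen
    # reference allele, the sort of conversion SNP_tools performs inside
    # Excel. All data here are illustrative.
    def genotype_to_count(genotype: str, ref_allele: str) -> int:
        """Number of copies of ref_allele in a two-letter genotype."""
        return sum(1 for allele in genotype if allele == ref_allele)

    samples = {"s1": "AA", "s2": "AG", "s3": "GG"}
    counts = {sid: genotype_to_count(g, "G") for sid, g in samples.items()}
    print(counts)  # {'s1': 0, 's2': 1, 's3': 2}
    ```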

  11. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.
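
    The real ATLAS tool interfaces are C++, but the dual-use pattern itself (analysis code depending only on an abstract tool interface, so the same tool runs under Athena or a ROOT-based framework) can be sketched in framework-neutral terms. All names below are hypothetical:

    ```python
    # Sketch of the dual-use pattern: the analysis depends only on an abstract
    # interface; either framework can construct and configure the concrete tool.
    from abc import ABC, abstractmethod

    class IJetCalibrationTool(ABC):
        @abstractmethod
        def initialize(self) -> None: ...
        @abstractmethod
        def calibrate(self, jet_pt: float) -> float: ...

    class SimpleJetCalibrationTool(IJetCalibrationTool):
        def initialize(self) -> None:
            self.scale = 1.02            # placeholder calibration constant

        def calibrate(self, jet_pt: float) -> float:
            return self.scale * jet_pt

    def run_analysis(tool: IJetCalibrationTool, jet_pts):
        # Identical whether 'tool' was instantiated by Athena or by a
        # standalone ROOT-based driver.
        tool.initialize()
        return [tool.calibrate(pt) for pt in jet_pts]

    print(run_analysis(SimpleJetCalibrationTool(), [40.0, 55.0]))
    ```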

  12. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  13. 13th International Workshop on Advanced Computing and Analysis Techniques in Physics Research

    Science.gov (United States)

    Speer, T.; Boudjema, F.; Lauret, J.; Naumann, A.; Teodorescu, L.; Uwer, P.

    data taking and analysis, experiment monitoring and complex simulations? What physics research, seizing these new technologies, may bring forward innovations that would benefit society at large? Editorial board: T. Speer (chairman), F. Boudjema, J. Lauret, A. Naumann, L. Teodorescu, P. Uwer

  14. Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering

    Science.gov (United States)

    Cavlazoglu, Baki; Stuessy, Carol

    2018-02-01

    The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.

  15. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
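
    The spreadsheet front end described above is essentially a pipeline driver. A schematic of the same three-step pattern, with hypothetical file and executable names (the actual AnalyzeHOLE interface is an Excel workbook):

    ```python
    # Schematic pipeline driver: (1) write model input, (2) run the external
    # modeling codes, (3) read results back for graphing. Names are
    # hypothetical, and the executables must exist on the PATH to run this.
    import subprocess
    from pathlib import Path

    def run_pipeline(workdir: Path) -> str:
        (workdir / "model.in").write_text("# model input would go here\n")
        for exe in ("modflow", "modpath", "pest"):
            subprocess.run([exe, "model.in"], cwd=workdir, check=True)
        return (workdir / "model.out").read_text()  # import for plotting
    ```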

  16. GammaWorkshops Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Ramebaeck, H. (ed.) (Swedish Defence Research Agency (Sweden)); Straalberg, E. (Institute for Energy Technology, Kjeller (Norway)); Klemola, S. (Radiation and Nuclear Safety Authority, STUK (Finland)); Nielsen, Sven P. (Technical Univ. of Denmark. Risoe National Lab. for Sustainable Energy, Roskilde (Denmark)); Palsson, S.E. (Icelandic Radiation Safety Authority (Iceland))

    2012-01-15

    Due to sparse interaction in recent years between practitioners in gamma-ray spectrometry in the Nordic countries, an NKS activity was started in 2009. This GammaSem activity was focused on seminars relevant to gamma spectrometry. A follow-up seminar was held in 2010. As an outcome of these activities it was suggested that the 2011 meeting should focus on practical issues, e.g. different corrections needed in gamma spectrometric measurements. This three-day meeting, GammaWorkshops, was held in September at Risoe-DTU. Experts on different topics relevant to gamma spectrometric measurements were invited to the GammaWorkshops. The topics included efficiency transfer, true coincidence summing corrections, self-attenuation corrections, measurement of natural radionuclides (natural decay series), combined measurement uncertainty calculations, and detection limits. These topics were covered in both lectures and practical sessions. The practical sessions included demonstrations of tools for corrections and calculations related to the above-mentioned topics. (Author)

  17. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  18. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    International Nuclear Information System (INIS)

    Ourghanlian, Alain

    2015-01-01

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Électricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.
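
    Abstract Interpretation, the formal approach these tools share, replaces concrete program values with sound over-approximations such as intervals. A toy sketch of the idea (not any vendor's implementation):

    ```python
    # Toy interval domain: track guaranteed bounds instead of concrete values,
    # and raise an alarm when a divisor's interval may contain zero.
    class Interval:
        def __init__(self, lo: float, hi: float):
            self.lo, self.hi = lo, hi

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def may_contain_zero(self) -> bool:
            return self.lo <= 0.0 <= self.hi

    x = Interval(-1.0, 3.0)        # input known only to lie in [-1, 3]
    y = x + Interval(1.0, 1.0)     # y = x + 1  ->  [0, 4]
    if y.may_contain_zero():
        print("alarm: '1/y' may divide by zero")  # the analyzer's finding
    ```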

  19. Assessing the Possibility of Implementing Tools of Technical Analysis for Real Estate Market Analysis

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2016-06-01

    Full Text Available Technical analysis (TA and its different aspects are widely used to study the capital market. In the traditional approach, this analysis is used to determine the probability of changes in current rates on the basis of their past changes, accounting for factors which had, have or may have an influence on shaping the supply and demand of a given asset. In the practical sense, TA is a set of techniques used for assessing the value of an asset based on the analysis of the asset's trajectories as well as statistical tools.

  20. Neutrons and magnetic structures: analysis methods and tools

    Science.gov (United States)

    Damay, Françoise

    2015-12-01

    After a short introduction on neutron diffraction and magnetic structures, this review focuses on the new computing tools available in magnetic crystallography nowadays. The appropriate neutron techniques and different steps required to determine a magnetic structure are also introduced.

  1. Workshop introduction

    International Nuclear Information System (INIS)

    Streeper, Charles

    2010-01-01

    The Department of Energy's National Nuclear Security Administration's Global Threat Reduction Initiative (GTRI) has three subprograms that directly reduce the nuclear/radiological threat; Convert (Highly Enriched Uranium), Protect (Facilities), and Remove (Materials). The primary mission of the Off-Site Source Recovery Project (OSRP) falls under the 'Remove' subset. The purpose of this workshop is to provide a venue for joint-technical collaboration between the OSRP and the Nuclear Radiation Safety Service (NRSS). Eisenhower's Atoms for Peace initiative and the Soviet equivalent both promoted the spread of the paradoxical (peaceful and harmful) properties of the atom. The focus of nonproliferation efforts has been rightly dedicated to fissile materials and the threat they pose. Continued emphasis on radioactive materials must also be encouraged. An unquantifiable threat still exists in the prolific quantity of sealed radioactive sources (sources) spread worldwide. It does not appear that the momentum of the evolution in the numerous beneficial applications of radioactive sources will subside in the near future. Numerous expert studies have demonstrated the potentially devastating economic and psychological impacts of terrorist use of a radiological dispersal or emitting device. The development of such a weapon, from the acquisition of the material to the technical knowledge needed to develop and use it, is straightforward. There are many documented accounts worldwide of accidental and purposeful diversions of radioactive materials from regulatory control. The burden of securing sealed sources often falls upon the source owner, who may not have a disposal pathway once the source reaches the end of its useful life. This disposal problem is exacerbated by some source owners not having the resources to safely and compliantly store them. US Nuclear Regulatory Commission (NRC) data suggests that, in the US alone, there are tens of thousands of high-activity (IAEA

  2. Workshop report

    African Journals Online (AJOL)

    abp

    2015-07-23

    ... the use of information and communication technology (ICT) [9]. ICTs have been defined as "tools that facilitate communication and the processing and transmission of information and the sharing of knowledge by electronic means, encompassing the full range of electronic digital and analogue ICTs" [9].

  3. Purpose of the workshop

    International Nuclear Information System (INIS)

    Brunner, H.

    1998-01-01

    The main purpose of the Workshop is to share experience on emergency data management and to review various conceptual, technical, organisational and operational aspects and problems. The problems posed by hardware and software, the interplay of software developers and users/operators, and the positive and negative experiences from both development and operation of data management systems are discussed. Emergency data management systems and their demonstrations are divided into four classes of possible applications: video games; training and simulation systems; 'history writing', i.e. post-event analysis and documentation systems; and real-time operational systems. (author)

  4. 77 FR 31371 - Public Workshop: Privacy Compliance Workshop

    Science.gov (United States)

    2012-05-25

    ... SECURITY Office of the Secretary Public Workshop: Privacy Compliance Workshop AGENCY: Privacy Office, DHS. ACTION: Notice Announcing Public Workshop. SUMMARY: The Department of Homeland Security Privacy Office will host a public workshop, ``Privacy Compliance Workshop.'' DATES: The workshop will be held on June...

  5. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  6. Scaffolding Assignments: Analysis of Assignmentor as a Tool to Support First Year Students' Academic Writing Skills

    Science.gov (United States)

    Silva, Pedro

    2017-01-01

    There are several technological tools which aim to support first year students' challenges, especially when it comes to academic writing. This paper analyses one of these tools, Wiley's AssignMentor. The Technological Pedagogical Content Knowledge framework was used to systematise this analysis. The paper showed an alignment between the tools'…

  7. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    Science.gov (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  8. ICAR 2017 - Science Communication Workshop

    OpenAIRE

    Sparks, Erin

    2017-01-01

    Conveying information in a Twitter-world: utilizing infographics to expand the reach of your research. (Presented at ICAR communications workshop 2017) This presentation is geared towards folks who are interested in using infographics to convey their research, but don’t necessarily know where to begin. I’ll first discuss what an infographic is, key features of an infographic and why these tools are becoming increasingly important for science communication. I’ll...

  9. Workshop on PSA applications, Sofia, Bulgaria, 7-11 October 1996. Lecturing materials

    International Nuclear Information System (INIS)

    1997-01-01

    The objective of this workshop was to present detailed, systematic and useful information about PSA-based tools and PSA applications. The first presentation of the workshop was titled ''The role of PSA in safety management''. This topic served to introduce the workshop and to highlight several concepts that were afterwards stressed during the week, i.e. the defence in depth principle and the use of deterministic and probabilistic approaches in a complementary way. This presentation provided a basis for the discussion of ''PSA applications''. As a complement to the theoretical lectures, there was a workshop during which three different exercises were run in parallel. For two of these, computer-based PSA tools were used. One of them was focused towards the analysis of design modifications and the other one towards demonstrating configuration control strategies. The objective of the third practice was to obtain Allowed Outage Times using different PSA-based approaches and to discuss the differences observed and the insights obtained. To conclude the workshop, stress was put on the importance of the quality of the PSA (the development of a high quality Living PSA should be the first step), the necessity to be cautious (before taking decisions both the qualitative and numerical results should be carefully analyzed), and the logical order for the implementation of PSA applications. Refs, figs, tabs

  10. Modal Analysis and Experimental Determination of Optimum Tool Shank Overhang of a Lathe Machine

    Directory of Open Access Journals (Sweden)

    Nabin SARDAR

    2008-12-01

    Full Text Available Vibration of the tool shank of a cutting tool has a large influence on the tolerances and surface finish of products. The frequency and amplitude of vibration depend on the overhang of the shank of the cutting tool. In turning operations, when the tool overhang is about 2 times the tool height, the amplitude of vibration is almost zero and the dimensional tolerances and surface finish of the product become high. In this paper, the above statement is verified, firstly by a finite element analysis of the cutting tool with the ANSYS software package and, secondly, by experimental verification with a piezoelectric sensor.
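
    The overhang effect follows from Euler-Bernoulli cantilever theory: the first natural frequency scales with the inverse square of the free length, so doubling the overhang cuts the frequency by a factor of four. A sketch with illustrative shank properties (assumed values, not taken from the paper):

    ```python
    # First natural frequency of a cantilevered tool shank:
    #   f1 = (beta*L)^2 / (2*pi*L^2) * sqrt(E*I / (rho*A)),  beta*L = 1.875
    # Steel properties and a 20 mm square section are assumed for illustration.
    import math

    E, rho = 210e9, 7850.0           # Young's modulus (Pa), density (kg/m^3)
    b = h = 0.02                     # square shank cross-section (m)
    A, I = b * h, b * h**3 / 12      # area, second moment of area

    for L in (0.04, 0.08):           # overhang = 2x and 4x the tool height
        f1 = (1.875**2 / (2 * math.pi * L**2)) * math.sqrt(E * I / (rho * A))
        print(f"overhang {L*1000:.0f} mm -> f1 = {f1:.0f} Hz")
    ```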

  11. Risk analysis for dengue suitability in Africa using the ArcGIS predictive analysis tools (PA tools).

    Science.gov (United States)

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2016-06-01

    Risk maps identifying suitable locations for infection transmission are important for public health planning. Data on dengue infection rates are not readily available in most places where the disease is known to occur. A newly available add-in to Esri's ArcGIS software package, the ArcGIS Predictive Analysis Toolset (PA Tools), was used to identify locations within Africa with environmental characteristics likely to be suitable for transmission of dengue virus. A more accurate, robust, and localized (1 km × 1 km) dengue risk map for Africa was created based on bioclimatic layers, elevation data, high-resolution population data, and other environmental factors that a search of the peer-reviewed literature showed to be associated with dengue risk. Variables related to temperature, precipitation, elevation, and population density were identified as good predictors of dengue suitability. Areas of high dengue suitability occur primarily within West Africa and parts of Central Africa and East Africa, but even in these regions the suitability is not homogenous. This risk mapping technique for an infection transmitted by Aedes mosquitoes draws on entomological, epidemiological, and geographic data. The method could be applied to other infectious diseases (such as Zika) in order to provide new insights for public health officials and others making decisions about where to increase disease surveillance activities and implement infection prevention and control efforts. The ability to map threats to human and animal health is important for tracking vectorborne and other emerging infectious diseases and modeling the likely impacts of climate change. Copyright © 2016 Elsevier B.V. All rights reserved.
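
    PA Tools itself is proprietary Esri software, but the weighted raster overlay underlying such suitability maps is straightforward to sketch. The layers and weights below are invented illustrations, not the study's values:

    ```python
    import numpy as np

    # Generic suitability overlay: each 2-D array is an environmental layer
    # rescaled to [0, 1]; the weights are illustrative assumptions.
    temperature   = np.array([[0.9, 0.7], [0.4, 0.8]])
    precipitation = np.array([[0.6, 0.8], [0.3, 0.9]])
    population    = np.array([[0.5, 0.9], [0.1, 0.7]])

    suitability = 0.4 * temperature + 0.3 * precipitation + 0.3 * population
    print(np.round(suitability, 2))   # higher cell value = more suitable
    ```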

  12. IPHE Infrastructure Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    2010-02-01

    These proceedings contain information from the IPHE Infrastructure Workshop, a two-day interactive workshop held on February 25-26, 2010, to explore the market implementation needs for hydrogen fueling station development.

  13. Workshop: LP-modellen

    DEFF Research Database (Denmark)

    Hostrup, Mathilde Nyvang

    Workshop on the LP model at the NUBU (Nationalt videnscenter om Udsatte Børn og Unge, the Danish national knowledge centre on vulnerable children and young people) educators' conference.

  14. Formaldehyde Workshop Agenda

    Science.gov (United States)

    This is the agenda for the Formaldehyde Workshop hosted by the Office of Research and Development's National Center for Environmental Assessment in cooperation with the IRIS Program. The workshop was held in April 2014.

  15. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper-division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data, ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease of use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen-alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed
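
    The multi-aperture measurements at the heart of these tasks reduce to summing pixel counts inside a radius and subtracting the sky level; a pure-NumPy sketch with a synthetic image (real coursework would use AstroImageJ or similar):

    ```python
    import numpy as np

    # Bare-bones aperture photometry: sum counts within radius r of the star
    # and subtract the median sky. Image and star position are synthetic.
    def aperture_flux(image, x0, y0, r, sky):
        yy, xx = np.indices(image.shape)
        inside = (xx - x0) ** 2 + (yy - y0) ** 2 <= r ** 2
        return image[inside].sum() - sky * inside.sum()

    rng = np.random.default_rng(0)
    image = rng.normal(100.0, 5.0, (64, 64))   # flat sky of ~100 counts
    image[30:33, 40:43] += 500.0               # fake star near (x=41, y=31)
    sky = np.median(image)
    print(f"net flux: {aperture_flux(image, 41, 31, 5, sky):.0f} counts")
    ```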

  16. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  17. The Python Spectral Analysis Tool (PySAT): A Powerful, Flexible, Preprocessing and Machine Learning Library and Interface

    Science.gov (United States)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.

    2017-12-01

    Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. The
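
    The regression workflow that PySAT wraps can be reproduced directly with the scikit-learn library it builds on. A compact sketch, with random data standing in for real LIBS spectra and reference compositions:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    # PLS regression from spectra to a concentration, scored by cross
    # validation -- the core loop such a tool automates. Data are random
    # placeholders for real spectra and wet-chemistry reference values.
    rng = np.random.default_rng(1)
    spectra = rng.random((50, 200))        # 50 spectra x 200 channels
    concentration = rng.random(50)         # reference composition values

    model = PLSRegression(n_components=5)
    scores = cross_val_score(model, spectra, concentration, cv=5)
    print("mean cross-validated R^2:", scores.mean())
    ```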

  18. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the spindle unit is the most frequently moving and most important part, particularly in connection with the automatic tool changer. The vibration detection system includes the development of hardware and software, such as the vibration meter, signal acquisition card, data processing platform, and machine control program. Meanwhile, because of differences between mechanical configurations and the desired characteristics, it is difficult for a vibration detection system to be built directly from commercially available kits. For this reason, the system was selected as an item for self-development research, along with a parametric study to identify parameters sufficient to represent the machine's characteristics and states. The functional parts of the system were developed simultaneously. Finally, the conditions and parameters generated from both the states and the characteristics were entered into the developed system to verify its feasibility.
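
    However the hardware is assembled, the core of such a monitoring function is reducing each acquired vibration window to a scalar indicator and comparing it against a limit. A minimal sketch with a synthetic signal and an invented threshold:

    ```python
    import numpy as np

    # Minimal condition-monitoring kernel: RMS level of a vibration window
    # versus an alarm threshold. Signal and limit are synthetic stand-ins
    # for spindle accelerometer data.
    def rms(signal):
        return float(np.sqrt(np.mean(np.square(signal))))

    t = np.linspace(0.0, 1.0, 2000)
    noise = 0.05 * np.random.default_rng(2).normal(size=t.size)
    window = 0.5 * np.sin(2 * np.pi * 120 * t) + noise   # 120 Hz component

    LIMIT = 0.30                      # illustrative alarm threshold
    level = rms(window)
    print(f"RMS = {level:.3f} ->", "ABNORMAL" if level > LIMIT else "normal")
    ```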

  19. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
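
    The tool's VBA source is not shown in the record, but the bootstrap-on-regression idea behind its reliability statistics can be sketched generically; the synthetic "ratios" below are idealized placeholders, not real MSP data:

    ```python
    import numpy as np

    # Generic bootstrap of an MSP-style linear regression: resample the
    # (lab field, ratio) points, refit, and collect the y-intercept
    # distribution used for reliability checks. Data are synthetic.
    rng = np.random.default_rng(3)
    fields = np.array([10.0, 20.0, 30.0, 40.0, 50.0])       # lab fields
    ratios = 0.02 * fields - 1.0 + rng.normal(0, 0.05, 5)   # idealized ratios

    intercepts = []
    for _ in range(1000):
        idx = rng.integers(0, fields.size, fields.size)     # resample
        if np.unique(idx).size < 2:
            continue                                        # degenerate draw
        slope, intercept = np.polyfit(fields[idx], ratios[idx], 1)
        intercepts.append(intercept)

    lo, hi = np.percentile(intercepts, [2.5, 97.5])
    print(f"95% bootstrap interval for the intercept: [{lo:.2f}, {hi:.2f}]")
    ```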

  20. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    Science.gov (United States)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint Finite Element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
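
    The post-processing step these tools automate is simple in outline: divide each spring's force components by the spring's tributary bond area, then rank the results. A sketch with invented numbers:

    ```python
    # Convert spring forces into adhesive shear and peel stresses via each
    # spring's tributary bond area, then sort in descending shear order.
    # All values are hypothetical.
    springs = [
        # (id, shear force N, peel force N, tributary area mm^2)
        ("s1", 120.0, 40.0, 25.0),
        ("s2", 300.0, 10.0, 25.0),
        ("s3",  80.0, 90.0, 25.0),
    ]

    stresses = [(sid, fs / a, fp / a) for sid, fs, fp, a in springs]  # MPa
    for sid, shear, peel in sorted(stresses, key=lambda s: s[1], reverse=True):
        print(f"{sid}: shear = {shear:.1f} MPa, peel = {peel:.1f} MPa")
    ```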

  1. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge of the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of spherical motion remains particularly interesting for the physicist because it gives access to the behavior of the material under deformation processes other than those imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress as functions of time in a material compressed by a plane or spherical dilatational wave. The
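
    For reference, the plane one-dimensional conservation equations the method integrates can be written, in the Lagrangian (material) coordinate h and under one common shock-physics sign convention (compressive stress sigma taken positive), as:

    ```latex
    % u: particle velocity, v: specific volume, E: specific internal energy,
    % rho_0, v_0: initial density and specific volume; sign convention assumed.
    \begin{aligned}
      \rho_0 \frac{\partial u}{\partial t} &= -\frac{\partial \sigma}{\partial h}
        && \text{(momentum)} \\
      \frac{\partial v}{\partial t} &= v_0 \frac{\partial u}{\partial h}
        && \text{(mass)} \\
      \frac{\partial E}{\partial t} &= -\sigma \frac{\partial v}{\partial t}
        && \text{(energy, adiabatic)}
    \end{aligned}
    ```

    Given measured records of stress or particle velocity at several gauge positions, these relations are integrated to recover the remaining kinematical and mechanical quantities.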

  2. Statement by Ms Ana Maria Cetto at the Workshop on IAEA Tools for Nuclear Energy System Assessment (NESA) for Long-Term Planning and Development Vienna, 23 July 2009

    International Nuclear Information System (INIS)

    Cetto, Ana Maria

    2009-01-01

    We are all aware that energy is central to sustainable development and poverty reduction efforts. A 2006 report by the Task Force for the UN Millennium Project, 'Energy Services for the Millennium Development Goals', warns that without increased investment in the energy sector, and major improvements in the quality and quantity of energy services in developing countries, it will not be possible to meet any of the Millennium Development Goals. Demand for energy continues to grow worldwide, as countries seek to improve living standards for their populations. The bulk of this growth in demand is coming from less economically advanced countries. Currently, conventional cooperation approaches are being used by Member States and the Agency to achieve the main goal of phase I of the 'milestone book', namely getting ready to decide to launch a nuclear power programme and to make an informed commitment. Most of the countries planning to introduce a nuclear programme are currently in phase I. The Agency is open to considering, for the future TC programme cycle, national projects to apply NESA tools and INPRO methodologies in an integrated approach and to help Member States in the preparatory work for the call for bids and the construction of their first NPP. Ladies and gentlemen, workshops such as this one are an important means of sharing experiences and learning from each other. These days you have had the opportunity to learn more about the tools and methods that the Agency offers to support long-term energy planning and nuclear energy system assessments, and today you will be providing us with feedback on applying these tools. By sharing your experiences, the lessons you have learned and the constraints you have faced, you will strengthen the Agency's ability to respond to your needs. Your comments will help us to further develop and refine the Agency's support to the sustainable development of nuclear energy.

  3. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    Eldridge, C.; Gagne, D.; Wilson, B.; Murray, J.; Gazze, C.; Feldman, Y.; Rorif, F.

    2015-01-01

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  4. ICP-MS Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Carman, April J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Eiden, Gregory C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-11-01

    This is a short document that explains the materials that will be transmitted to LLNL and DNN HQ regarding the ICP-MS Workshop held at PNNL on June 17-19. The goal is to pass on to LLNL information regarding the planning and preparations for the Workshop at PNNL, in preparation for the SIMS workshop at LLNL.

  5. MOOC Design Workshop

    DEFF Research Database (Denmark)

    Nørgård, Rikke Toft; Mor, Yishay; Warburton, Steven

    2016-01-01

    For the last two years we have been running a series of successful MOOC design workshops. These workshops build on previous work in learning design and MOOC design patterns. The aim of these workshops is to aid practitioners in defining and conceptualising educational innovations (predominantly, ...

  6. The Danish Scenario Workshop Report

    DEFF Research Database (Denmark)

    Brodersen, Søsser; Jørgensen, Michael Søgaard

    with informal drinks) and planned and carried out as recommended in Ahumada (2003). We have, however, not developed all the material recommended by Ahumada (2003) as informative material prior to the workshop (e.g. a SWOT analysis), due to a wish to produce only material for the participants which we found useful...

  7. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    Science.gov (United States)

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyse, parse, and manipulate files related to Metatool. Its main goal is to assist work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which seeks identical elementary modes in two metabolic networks. The suite also includes one script that generates Metatool input, CBasis2Metatool, based on a Metatool output file filtered by a list of convex-basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  8. Stability analysis of multipoint tool equipped with metal cutting ceramics

    Science.gov (United States)

    Maksarov, V. V.; Khalimonenko, A. D.; Matrenichev, K. G.

    2017-10-01

    The article highlights the issues involved in determining the stability of the cutting process for a multipoint cutting tool equipped with cutting ceramics. Based on the conducted research, recommendations are offered on the choice of parameters of replaceable ceramic cutting plates for milling. Ceramic plates for milling are proposed to be selected on the basis of the value of their electrical volume resistivity.

  9. “DRYPACK” - a calculation and analysis tool

    DEFF Research Database (Denmark)

    Andreasen, M.B.; Toftegaard, R.; Schneider, P.

    2013-01-01

    energy consumption reductions by using “DryPack” are calculated. With the “DryPack” calculation tool, it is possible to calculate four different unit operations with moist air (dehumidification of air, humidification of air, mixing of two air streams, and heating of air). In addition, a Mollier diagram...
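
    Of the four unit operations, the mixing of two moist-air streams is the easiest to show: humidity ratio and enthalpy combine as dry-air-mass-weighted averages, and the mixed temperature follows from inverting the enthalpy relation. A sketch with illustrative stream states, using standard psychrometric constants rather than DryPack's internals:

    ```python
    # Mixing of two moist-air streams: w and h mix in proportion to dry-air
    # mass flow; T is recovered from h = 1.006*T + w*(2501 + 1.86*T) (kJ/kg).
    # Stream states are illustrative.
    def enthalpy(T, w):                 # T in deg C, w in kg water / kg dry air
        return 1.006 * T + w * (2501.0 + 1.86 * T)

    def mix(m1, T1, w1, m2, T2, w2):    # m = dry-air mass flow (kg/s)
        w = (m1 * w1 + m2 * w2) / (m1 + m2)
        h = (m1 * enthalpy(T1, w1) + m2 * enthalpy(T2, w2)) / (m1 + m2)
        T = (h - 2501.0 * w) / (1.006 + 1.86 * w)
        return T, w

    T, w = mix(1.0, 80.0, 0.050, 0.5, 20.0, 0.007)
    print(f"mixed stream: {T:.1f} degC, w = {w:.4f} kg/kg")
    ```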

  10. comparative analysis of diagnostic applications of autoscan tools

    African Journals Online (AJOL)

    user

    changing the skills demanded of auto designers, engineers and production workers [1,5,6]. In automobile education, the use of autotronic simulators and demonstrators as teaching aids with computer software, auto scan tools for diagnosis, servicing and maintenance, auto-analyzers, solid work design and CAN-bus hard ...

  11. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.

    2002-01-01

    Three fractured tool dies used for industrial cold forging have been investigated using light optical microscopy and scanning electron microscopy. Two of the specimens were produced using the traditional Böhler P/M steel grade S790, while the last specimen was a third-generation P/M steel produced... resistance towards abrasive wear compared with the traditional P/M steel.

  12. Analysis of Online Quizzes as a Teaching and Assessment Tool

    Science.gov (United States)

    Salas-Morera, Lorenzo; Arauzo-Azofra, Antonio; García-Hernández, Laura

    2012-01-01

    This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the quizzes' effectiveness on student performance when used not only as an…

  13. Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey

    Science.gov (United States)

    Khamparia, Aditya; Pandey, Babita

    2017-01-01

    Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web, with the help of tools and reasoners which manage personalized information. The future of the semantic web lies in an ontology…

  14. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Science.gov (United States)

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  15. NREL Suite of Tools for PV and Storage Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Elgqvist, Emma M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Salasovich, James A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-03

    Many different factors such as the solar resource, technology costs and incentives, utility cost and consumption, space available, and financial parameters impact the technical and economic potential of a PV project. NREL has developed techno-economic modeling tools that can be used to evaluate PV projects at a site.

  16. Stakeholder Analysis of an Executable Architecture Systems Engineering (EASE) Tool

    Science.gov (United States)

    2013-06-21

    The FCR tables and stakeholder feedback are then used as the foundation of a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. Finally, the SWOT analysis and stakeholder feedback are translated into an EASE future development strategy: a series of recommendations regarding...

  17. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning

    2014-01-01

    This paper introduces an interactive tool developed to integrate structural analysis in the architectural design environment from the early conceptual design stage. The tool improves exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot...

  18. Developing workshop module of realistic mathematics education: Follow-up workshop

    Science.gov (United States)

    Palupi, E. L. W.; Khabibah, S.

    2018-01-01

    Realistic Mathematics Education (RME) is a learning approach which fits the aim of the curriculum. The success of RME in teaching mathematics concepts, triggering students' interest in mathematics and teaching higher-order thinking skills will make teachers start to learn RME. Hence, RME workshops are often offered and conducted. This study applied the development model proposed by Plomp. Based on the study by the RME team, there are three kinds of RME workshop: the start-up workshop, the follow-up workshop, and the quality boost. However, there is no standardized or validated module used in those workshops. This study aims to develop a module for the RME follow-up workshop which is valid and usable. Plomp's development model includes materials analysis, design, realization, implementation, and evaluation. Based on the validation, the developed module is valid, while the field test shows that the module can be used effectively.

  19. Risk Management Techniques and Practice Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, T; Zosel, M

    2008-12-02

    At the request of the Department of Energy (DOE) Office of Science (SC), Lawrence Livermore National Laboratory (LLNL) hosted a two-day Risk Management Techniques and Practice (RMTAP) workshop held September 18-19 at the Hotel Nikko in San Francisco. The purpose of the workshop, which was sponsored by the SC/Advanced Scientific Computing Research (ASCR) program and the National Nuclear Security Administration (NNSA)/Advanced Simulation and Computing (ASC) program, was to assess current and emerging techniques, practices, and lessons learned for effectively identifying, understanding, managing, and mitigating the risks associated with acquiring leading-edge computing systems at high-performance computing centers (HPCCs). Representatives from fifteen high-performance computing (HPC) organizations, four HPC vendor partners, and three government agencies attended the workshop. The overall workshop findings were: (1) Standard risk management techniques and tools are in the aggregate applicable to projects at HPCCs and are commonly employed by the HPC community; (2) HPC projects have characteristics that necessitate a tailoring of the standard risk management practices; (3) All HPCC acquisition projects can benefit by employing risk management, but the specific choice of risk management processes and tools is less important to the success of the project; (4) The special relationship between the HPCCs and HPC vendors must be reflected in the risk management strategy; (5) Best practices findings include developing a prioritized risk register with special attention to the top risks, establishing a practice of regular meetings and status updates with the platform partner, supporting regular and open reviews that engage the interests and expertise of a wide range of staff and stakeholders, and documenting and sharing the acquisition/build/deployment experience; and (6) Top risk categories include system scaling issues, request for proposal/contract and acceptance testing, and

  20. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement

    Energy Technology Data Exchange (ETDEWEB)

    Shatkin, J. A. [Vireo Advisors, Boston MA USA; Ong, Kimberly J. [Vireo Advisors, Boston MA USA; Beaudrie, Christian [Compass RM, Vancouver CA USA; Clippinger, Amy J. [PETA International Science Consortium Ltd, London UK; Hendren, Christine Ogilvie [Center for the Environmental Implications of NanoTechnology, Duke University, Durham NC USA; Haber, Lynne T. [TERA, Cincinnati OH USA; Hill, Myriam [Health Canada, Ottawa Canada; Holden, Patricia [UC Santa Barbara, Bren School of Environmental Science & Management, ERI, and UC CEIN, University of California, Santa Barbara CA USA; Kennedy, Alan J. [U.S. Army Engineer Research and Development Center, Environmental Laboratory, Vicksburg MS USA; Kim, Baram [Independent, Somerville MA USA; MacDonell, Margaret [Argonne National Laboratory, Environmental Science Division, Argonne IL USA; Powers, Christina M. [U.S. Environmental Protection Agency, Office of Air and Radiation, Office of Transportation and Air Quality, Ann Arbor MI USA; Sharma, Monita [PETA International Science Consortium Ltd, London UK; Sheremeta, Lorraine [Alberta Ingenuity Labs, Edmonton Alberta Canada; Stone, Vicki [John Muir Building Gait 1 Heriot-Watt University, Edinburgh Scotland UK; Sultan, Yasir [Environment Canada, Gatineau QC Canada; Turley, Audrey [ICF International, Durham NC USA; White, Ronald H. [RH White Consultants, Silver Spring MD USA

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas (human health, ecological health, and exposure considerations) shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article.

  1. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
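
    As a rough illustration of the frequency-domain analysis such a tool performs, the following Python sketch (not the authors' MATLAB code; the sampling rate and signal are invented) estimates alpha-band power in a synthetic EEG-like signal with Welch's method:

    ```python
    # Hypothetical sketch, not the published GUI: Welch power spectral density
    # of a synthetic EEG-like signal, one of the method families the tool exposes.
    import numpy as np
    from scipy.signal import welch

    fs = 256.0                                   # assumed sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    # Synthetic signal: 10 Hz alpha rhythm plus noise (illustrative only).
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

    freqs, psd = welch(eeg, fs=fs, nperseg=512)  # Welch PSD estimate
    alpha = (freqs >= 8) & (freqs <= 13)
    print(f"alpha-band power: {np.trapz(psd[alpha], freqs[alpha]):.3f}")
    ```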

  2. Evaluating a poetry workshop in medical education.

    Science.gov (United States)

    Collett, T J; McLachlan, J C

    2006-06-01

    This study aimed at evaluating how doing poetry could affect students' understanding of medical practice and at assessing the effectiveness of the evaluation method used. Qualitative research was carried out on the experiences of medical students participating in a poetry workshop, followed by some quantitative analysis. The study was conducted at Peninsula Medical School and St Ives, Cornwall, UK, with three medical students, a poet and a pathologist as participants. Data were collected by interviews, observation and web access. "Doing poetry" with a professional poet was found to assist communication between doctors and patients as it enhanced skills of observation, heightened awareness of the effect of language and fostered deep reflection. Poetry was also found to offer an outlet for medics and patients. The voluntary workshop attracted three participants; however, it might have had an effect on the wider student community because the poetry website received 493 hits in four months. Qualitative methods worked well as a tool for evaluation. "Doing poetry for poetry's sake" seemed to foster the development of skills related to empathy. The opportunity to do poetry should be made available to medical students as part of a wider arts and humanities programme.

  3. The ATLAS Electromagnetic Calorimeter Calibration Workshop

    CERN Multimedia

    Hong Ma; Isabelle Wingerter

    The ATLAS Electromagnetic Calorimeter Calibration Workshop took place at LAPP-Annecy from the 1st to the 3rd of October; 45 people attended. A detailed program was set up before the workshop. The agenda was organised around very focused presentations where questions were raised to allow arguments to be exchanged and answers to be proposed. The main topics were: electronics calibration; handling of problematic channels; cluster-level corrections for electrons and photons; absolute energy scale; streams for calibration samples; calibration constants processing; and learning from commissioning. The workshop was on the whole lively and fruitful. Based on years of experience with test beam analysis and Monte Carlo simulation, and the recent operation of the detector in the commissioning, the methods to calibrate the electromagnetic calorimeter are well known. Some of the procedures are being exercised in the commissioning, which have demonstrated the c...

  4. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams can predict chatter by providing graphical representations of the stable combinations of axial depth of cut and spindle speed. In this article, an analytical model of a spindle system is constructed, including a Timoshenko-beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. The stability lobe diagram of the model is then developed according to its dynamic properties, and the Monte Carlo method is applied to analyse the influence of bearing preload on system stability with uncertainty taken into account.
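
    The spindle model itself is not reproduced here, but the Monte Carlo step is easy to sketch. The Python snippet below propagates an assumed preload uncertainty through a toy stiffness/stability relation; every constant and the functional form are illustrative assumptions, not values from the article:

    ```python
    # Toy Monte Carlo propagation of bearing-preload uncertainty.
    import numpy as np

    rng = np.random.default_rng(0)

    def critical_depth(preload_N):
        # Assumed monotone relation: contact stiffness rises with preload^(1/3),
        # and the stability limit scales with stiffness (toy model only).
        k = 2.0e8 * (preload_N / 500.0) ** (1.0 / 3.0)  # stiffness, N/m
        return k / 5.0e7                                # critical depth, mm

    preloads = rng.normal(loc=500.0, scale=50.0, size=10_000)  # uncertain preload, N
    a_lim = critical_depth(preloads)
    lo, hi = np.percentile(a_lim, [2.5, 97.5])
    print(f"95% band for the stability limit: [{lo:.3f}, {hi:.3f}] mm")
    ```

    The output is a confidence band on the stability limit rather than a single lobe boundary, which is the practical payoff of the Monte Carlo treatment.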

  5. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  6. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them; as a result there are many misconceptions, and the tools are often incorrectly applied. This article describes these industrial engineering-based tools, explains how to use them and when they should be used (and not used), and sets out the intended purposes for their use. In addition, the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate their application in health care settings
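
    Of the tools listed, FMEA has the simplest arithmetic core: each failure mode is scored for severity, occurrence, and detection, and the product (the risk priority number) ranks the modes. A minimal Python sketch with invented radiotherapy failure modes and scores:

    ```python
    # Minimal FMEA sketch: RPN = severity x occurrence x detection.
    # The failure modes and scores below are made up for illustration.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        name: str
        severity: int    # 1 (negligible) .. 10 (catastrophic)
        occurrence: int  # 1 (rare) .. 10 (frequent)
        detection: int   # 1 (always caught) .. 10 (undetectable)

        @property
        def rpn(self) -> int:
            return self.severity * self.occurrence * self.detection

    modes = [
        FailureMode("wrong patient setup", 9, 3, 4),
        FailureMode("dose miscalculation", 10, 2, 3),
        FailureMode("late chart review", 4, 6, 2),
    ]
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"{m.name:22s} RPN={m.rpn}")
    ```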

  7. An online database for plant image analysis software tools

    OpenAIRE

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

    Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is...

  8. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    Recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner or architect-engineer approach, where an a...... provide the possibility for the designer to work with the aesthetics as well as the technical aspects of architectural design....

  9. Pyrosequencing data analysis software: a useful tool for EGFR, KRAS, and BRAF mutation analysis

    Directory of Open Access Journals (Sweden)

    Shen Shanxiang

    2012-05-01

    Background: Pyrosequencing is a new technology that can be used for mutation tests. However, its data analysis is a manual process that involves sophisticated algorithms, and human errors may occur during this process. A better way of analyzing pyrosequencing data is needed in the clinical diagnostic laboratory, and computer software is potentially useful for this purpose. We have developed such software, which is able to perform pyrosequencing mutation data analysis for epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene homolog and v-raf murine sarcoma viral oncogene homolog B1. The input data for analysis include the targeted nucleotide sequence, common mutations in the targeted sequence, pyrosequencing dispensing order, pyrogram peak order and peak heights. The output includes the mutation type and the percentage of mutant gene in the specimen. Results: The data from 1375 pyrosequencing test results were analyzed using the software in parallel with manual analysis. The software was able to generate correct results for all 1375 cases. Conclusion: The software developed is a useful molecular diagnostic tool for pyrosequencing mutation data analysis. It can increase laboratory data analysis efficiency and reduce the data analysis error rate. Virtual slides: The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/1348911657684292.
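
    The core output computation is simple to illustrate. In the hypothetical Python sketch below (the function name and peak values are assumptions, not the published software), the mutant fraction at the discriminating dispensation is estimated from the two competing peak heights:

    ```python
    # Hypothetical sketch of the mutant-percentage step: at the discriminating
    # dispensation, the mutant fraction follows from relative peak heights.
    def mutant_percentage(wild_type_peak: float, mutant_peak: float) -> float:
        """Percent mutant allele from the two discriminating pyrogram peaks."""
        total = wild_type_peak + mutant_peak
        if total <= 0:
            raise ValueError("no signal at the discriminating position")
        return 100.0 * mutant_peak / total

    print(mutant_percentage(wild_type_peak=82.0, mutant_peak=18.0))  # -> 18.0
    ```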

  10. HTTR workshop (workshop on hydrogen production technology)

    International Nuclear Information System (INIS)

    Shiina, Yasuaki; Takizuka, Takakazu

    2004-12-01

    Various research and development efforts have been performed to solve the global energy and environmental problems caused by the large consumption of fossil fuels. Research activities on advanced hydrogen production technology using nuclear heat from high temperature gas cooled reactors, for example, have flourished in universities, research institutes and companies in many countries. The Department of HTTR Project and the Department of Advanced Nuclear Heat Technology of JAERI held the HTTR Workshop (Workshop on Hydrogen Production Technology) on July 5 and 6, 2004 to grasp the present status of R and D on HTGR technology and nuclear hydrogen production in the world, and to discuss the necessity of nuclear hydrogen production and the technical problems for the future development of the technology. More than 110 participants attended the Workshop, including foreign participants from the USA, France, Korea, Germany, Canada and the United Kingdom. In the Workshop, presentations were made on such topics as R and D programs for nuclear energy and hydrogen production technologies by thermo-chemical or other processes. Also, the possibility of nuclear hydrogen production in the future society was discussed. The workshop showed that R and D on hydrogen production by the thermo-chemical process has been performed in many countries, and affirmed that nuclear hydrogen production could be one of the competitive suppliers of hydrogen in the future. The second HTTR Workshop will be held in the autumn of next year. (author)

  11. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  12. Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide

    Science.gov (United States)

    Smittle, Aaron M.; Shoberg, Thomas G.

    2017-06-16

    This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data, which have Bouguer anomaly values that are considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data. “Nearby” can be defined by the user at time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
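
    The outlier rule described above is straightforward to sketch. The Python snippet below uses synthetic coordinates and anomalies, with a simple radius standing in for the user-defined "nearby" criterion, and flags stations two or more standard deviations from their local mean:

    ```python
    # Sketch of the 2-standard-deviation outlier rule on synthetic Bouguer data.
    # The radius and all values are illustrative assumptions, not GDUS data.
    import numpy as np

    rng = np.random.default_rng(1)
    lon = rng.uniform(-95.0, -94.0, 200)
    lat = rng.uniform(37.0, 38.0, 200)
    bouguer = rng.normal(-40.0, 3.0, 200)        # mGal, synthetic
    bouguer[17] += 25.0                          # planted outlier

    radius = 0.1                                 # "nearby" threshold, degrees
    for i in range(lon.size):
        near = np.hypot(lon - lon[i], lat - lat[i]) < radius
        near[i] = False
        if near.sum() < 5:
            continue                             # too few neighbours to judge
        mu, sd = bouguer[near].mean(), bouguer[near].std()
        if abs(bouguer[i] - mu) >= 2.0 * sd:
            print(f"station {i}: {bouguer[i]:.1f} mGal vs local {mu:.1f} +/- {sd:.1f}")
    ```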

  13. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  14. Propositional Analysis: A Tool for Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce

    1989-01-01

    Reviews the use of propositional analysis in library and information science research. Evidence that different analysts produce similar judgments about texts and use the method consistently over time is presented, and it is concluded that propositional analysis is a reliable and valid research method. An example of an analysis is appended. (32…

  15. Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology

    Directory of Open Access Journals (Sweden)

    Parsuram Nayak

    2018-01-01

    There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. The advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in the analysis of disease dynamics. Disease forecasting methods using simulation models for plant diseases have great potential in practical disease control strategies. Common mathematical tools such as monomolecular, exponential, logistic, Gompertz and linked differential equations take an important place in growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through construction of box and whisker plots has been suggested. The probable applications of recent advanced tools of linear and non-linear mixed models, such as the linear mixed model, generalized linear model, and generalized linear mixed models, have been presented. The most recent technologies, such as micro-array analysis, though cost effective, provide estimates of gene expression for thousands of genes simultaneously and need attention by molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. The rice research scientists should take advantage of these new opportunities adequately in
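
    As one concrete example of the growth-curve analyses mentioned, the sketch below fits a logistic disease-progress curve to synthetic severity data; all values and starting guesses are invented:

    ```python
    # Sketch: fit a logistic disease-progress curve y = K / (1 + exp(-r(t - t0)))
    # to synthetic severity data, as in the growth-curve analyses cited above.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, K, r, t0):
        return K / (1.0 + np.exp(-r * (t - t0)))

    t = np.linspace(0, 60, 13)                   # days after inoculation
    rng = np.random.default_rng(2)
    y = logistic(t, 0.9, 0.15, 30) + rng.normal(0, 0.02, t.size)  # synthetic

    (K, r, t0), _ = curve_fit(logistic, t, y, p0=[1.0, 0.1, 25.0])
    print(f"K={K:.2f}, r={r:.3f}/day, t0={t0:.1f} days")
    ```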

  16. Geothermal systems materials: a workshop/symposium

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Sixteen papers are included. A separate abstract was prepared for each. Summaries of workshops on the following topics are also included in the report: non-metallic materials, corrosion, materials selection, fluid chemistry, and failure analysis. (MHR)

  17. An analysis of the "accidental painting" technique of D.A. Siqueiros: the Rayleigh-Taylor instability as a tool to create explosive textures

    Science.gov (United States)

    Zetina, S.; Zenit, R.

    2012-11-01

    In the spring of 1936, the famous Mexican muralist David Alfaro Siqueiros organized an experimental painting workshop in New York: a group of artists focused on developing painting techniques through empirical experimentation with modern, industrial materials and tools. Among the young artists attending the workshop was Jackson Pollock. They tested different lacquers and a number of experimental techniques. One of the techniques, named by Siqueiros a "controlled accident", consisted in pouring layers of paint of different colors on top of each other. After a brief time, the paint from the lower layer emerged from bottom to top, creating a relatively regular pattern of blobs. This technique led to the creation of explosion-inspired textures and catastrophic images. We conducted an analysis of this process: we experimentally reproduced the patterns "discovered" by Siqueiros and analyzed the behavior of the flow. We found that the flow is driven by the well-known Rayleigh-Taylor instability: paints of different colors have different densities, and a heavy layer on top of a light one is an unstable configuration. The blobs and plumes that result from the instability create the aesthetically pleasing patterns. We discuss the importance of fluid mechanics in artistic creation.
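
    The mechanism invoked here has a standard linear-stability description. For two superposed inviscid fluids, the textbook growth rate of a perturbation with wavenumber k is

    ```latex
    \sigma(k) = \sqrt{A\, g\, k}, \qquad
    A = \frac{\rho_{\mathrm{top}} - \rho_{\mathrm{bottom}}}{\rho_{\mathrm{top}} + \rho_{\mathrm{bottom}}},
    ```

    where A is the Atwood number and g the gravitational acceleration. This is the classical inviscid result, not the authors' full analysis; in real paint layers, viscosity damps short wavelengths and thereby selects the finite blob spacing seen in the patterns.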

  18. An auditory display tool for DNA sequence analysis.

    Science.gov (United States)

    Temple, Mark D

    2017-04-24

    DNA Sonification refers to the use of an auditory display to convey the information content of DNA sequence data. Six sonification algorithms are presented that each produce an auditory display. These algorithms are logically designed from the simple through to the more complex. Three of these parse individual nucleotides, nucleotide pairs or codons into musical notes to give rise to 4, 16 or 64 notes, respectively. Codons may also be parsed degenerately into 20 notes with respect to the genetic code. Lastly, nucleotide pairs can be parsed as two separate frames or codons can be parsed as three reading frames, giving rise to multiple streams of audio. The most informative sonification algorithm reads the DNA sequence as codons in three reading frames to produce three concurrent streams of audio in an auditory display. This approach is advantageous since start and stop codons in either frame have a direct effect to start or stop the audio in that frame, leaving the other frames unaffected. Using these methods, DNA sequences such as open reading frames or repetitive DNA sequences can be distinguished from one another. These sonification tools are available through a webpage interface in which an input DNA sequence can be processed in real time to produce an auditory display playable directly within the browser. The potential of this approach as an analytical tool is discussed with reference to auditory displays derived from test sequences including simple nucleotide sequences, repetitive DNA sequences and coding or non-coding genes. This study presents a proof-of-concept that some properties of a DNA sequence can be identified through sonification alone and argues for their inclusion within the toolkit of DNA sequence browsers as an adjunct to existing visual and analytical tools.
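
    The codon-to-note parsing is easy to sketch. In the toy Python version below, each codon is encoded base-4 into one of 64 note numbers and stop codons silence their frame; this specific mapping is an assumption for illustration, not the published algorithm:

    ```python
    # Toy sonification sketch: codons in three reading frames mapped to one of
    # 64 note numbers via base-4 encoding; stop codons silence the frame.
    BASE = {"A": 0, "C": 1, "G": 2, "T": 3}
    STOP = {"TAA", "TAG", "TGA"}

    def codon_note(codon: str) -> int:
        """Encode a codon as 0..63 (e.g., an offset into a MIDI scale)."""
        return 16 * BASE[codon[0]] + 4 * BASE[codon[1]] + BASE[codon[2]]

    def sonify(seq: str):
        for frame in range(3):                   # three concurrent reading frames
            codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
            notes = [None if c in STOP else codon_note(c) for c in codons]
            print(f"frame {frame}: {notes}")     # None = audio stops in this frame

    sonify("ATGGCGTAAGGC")
    ```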

  19. Fifth International Microgravity Combustion Workshop

    Science.gov (United States)

    Sacksteder, Kurt (Compiler)

    1999-01-01

    This conference proceedings document is a compilation of 120 papers presented orally or as poster displays to the Fifth International Microgravity Combustion Workshop held in Cleveland, Ohio on May 18-20, 1999. The purpose of the workshop is to present and exchange research results from theoretical and experimental work in combustion science using the reduced-gravity environment as a research tool. The results are contributed by researchers funded by NASA throughout the United States at universities, industry and government research agencies, and by researchers from at least eight international partner countries that are also participating in the microgravity combustion science research discipline. These research results are intended for use by public and private sector organizations for academic purposes, for the development of technologies needed for the Human Exploration and Development of Space, and to improve Earth-bound combustion and fire-safety related technologies.

  20. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    Directory of Open Access Journals (Sweden)

    Drechsel Marion

    2009-10-01

    Background: Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research, and scientists prefer facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in MS-Excel format, but this software lacks genetics- and epidemiology-related functions, so a general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. Findings: The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge and basic statistical analysis requirements. Conclusion: Our implementation for Microsoft Excel 2000-2007 on Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
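
    The add-in itself is VBA; for illustration, the same kind of basic genetic analysis (allele frequencies and a Hardy-Weinberg chi-square from genotype counts) is sketched below in Python, with invented counts:

    ```python
    # Basic genetic analysis sketch: allele frequencies and a Hardy-Weinberg
    # equilibrium chi-square from genotype counts (counts are made up).
    n_AA, n_Aa, n_aa = 180, 95, 25
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)              # frequency of allele A
    q = 1.0 - p

    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_AA, n_Aa, n_aa]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    print(f"p(A)={p:.3f}, HWE chi-square={chi2:.2f} (1 df)")
    ```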

  1. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  2. A Comparative Analysis of Cockpit Display Development Tools

    National Research Council Canada - National Science Library

    Gebhardt, Matthew

    2000-01-01

    ..., Virtual Application Prototyping System (VAPS) and Display Editor. The comparison exploits the analysis framework establishing the advantages and disadvantages of the three software development suites...

  3. From beans to breams: how participatory workshops can contribute ...

    African Journals Online (AJOL)

    From beans to breams: how participatory workshops can contribute to marine conservation planning. ... a valuable new tool for marine conservation planning. Keywords: marine conservation planning; participatory mapping and scoring methods; Sparidae; workshops. African Journal of Marine Science 2008, 30(3): 475–487 ...

  4. Data visualization and analysis tools for the MAVEN mission

    Science.gov (United States)

    Harter, B.; De Wolfe, A. W.; Putnam, B.; Brain, D.; Chaffin, M.

    2016-12-01

    The Mars Atmospheric and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. We have developed new software tools for exploring and analyzing the science data. Our open-source Python toolkit for working with data from MAVEN and other missions is based on the widely-used "tplot" IDL toolkit. We have replicated all of the basic tplot functionality in Python, and use the bokeh and matplotlib libraries to generate interactive line plots and spectrograms, providing additional functionality beyond the capabilities of IDL graphics. These Python tools are generalized to work with missions beyond MAVEN, and our software is available on Github. We have also been exploring 3D graphics as a way to better visualize the MAVEN science data and models. We have constructed a 3D visualization of MAVEN's orbit using the CesiumJS library, which not only allows viewing of MAVEN's orientation and position, but also allows the display of selected science data sets and their variation over time.
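
    The "tplot"-style display (stacked line plots over spectrograms sharing a time axis) can be mocked up with matplotlib alone; the sketch below uses synthetic data and is not the MAVEN toolkit's own API:

    ```python
    # Generic "tplot"-style stacked display with matplotlib; synthetic data only.
    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0, 3600, 3600)                   # one hour, 1 s cadence
    density = 5 + np.sin(2 * np.pi * t / 600) + 0.2 * np.random.randn(t.size)
    energies = np.logspace(1, 4, 64)                 # eV bins
    flux = np.outer(np.exp(-energies / 300), 1 + 0.5 * np.sin(2 * np.pi * t / 900))

    fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
    ax1.plot(t, density)                             # line panel
    ax1.set_ylabel("density")
    mesh = ax2.pcolormesh(t, energies, flux, shading="auto")  # spectrogram panel
    ax2.set_yscale("log")
    ax2.set_ylabel("energy (eV)")
    ax2.set_xlabel("time (s)")
    fig.colorbar(mesh, ax=ax2, label="flux")
    plt.show()
    ```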

  5. Cemented carbide cutting tool: Laser processing and thermal stress analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yilbas, B.S. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)]. E-mail: bsyilbas@kfupm.edu.sa; Arif, A.F.M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia); Karatas, C. [Engineering Faculty, Hacettepe University, Ankara (Turkey); Ahsan, M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)

    2007-04-15

    Laser treatment of a cemented carbide tool surface consisting of W, C, TiC and TaC is examined, and the thermal stress developed due to temperature gradients in the laser-treated region is predicted numerically. Temperature rise in the substrate material is computed numerically using the Fourier heating model. An experiment is carried out to treat the tool surfaces using a CO{sub 2} laser, while SEM, XRD and EDS are carried out for morphological and structural characterization of the treated surface. The selected laser parameters include the laser output power, duty cycle, assisting gas pressure, scanning speed, and nominal focus setting of the focusing lens. It is found that the temperature gradient attains significantly high values below the surface, particularly for titanium and tantalum carbides, which in turn results in high thermal stress generation in this region. SEM examination of the laser-treated surface and its cross section reveals that crack initiation occurs below the surface and the crack extends over the depth of the laser-treated region.
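
    The "Fourier heating model" named above is, in essence, the classical conduction equation with a volumetric laser source; a generic statement (not the paper's exact geometry or boundary conditions) is

    ```latex
    \rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \left( k \, \nabla T \right) + S(\mathbf{x}, t),
    \qquad S = I_0 \, \delta \, e^{-\delta z} f(x, y, t),
    ```

    where, under the usual conventions assumed here, I_0 is the absorbed peak intensity, delta the absorption coefficient, z the depth, and f the spatial and temporal beam profile. The steep decay of S with depth is what produces the strong sub-surface gradients the abstract reports.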

  6. Pathway-based analysis tools for complex diseases: a review.

    Science.gov (United States)

    Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

    2014-10-01

    Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating the complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which have rapidly accumulated in biomedical fields. This article is a comprehensive review of pathway-based analysis methods, powerful methods with the potential to uncover the biological depths of complex diseases. The general concepts and procedures for pathway-based analysis are introduced, and then a comprehensive review of the major approaches is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases. Copyright © 2014 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.
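
    The arithmetic at the heart of the simplest family of pathway methods (over-representation analysis) is a hypergeometric tail test; a minimal sketch with invented gene counts:

    ```python
    # Over-representation analysis sketch: hypergeometric tail test.
    # All counts are invented for illustration.
    from scipy.stats import hypergeom

    M = 20_000   # genes in the genome (background)
    n = 150      # genes annotated to the pathway
    N = 500      # differentially expressed (DE) genes
    k = 12       # DE genes that fall in the pathway

    # P(X >= k): survival function evaluated at k - 1
    p_value = hypergeom.sf(k - 1, M, n, N)
    print(f"enrichment p-value: {p_value:.3g}")
    ```

    In practice this test is repeated over hundreds of pathways, so multiple-testing correction is applied to the resulting p-values.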

  7. Workshops for state review of site suitability criteria for high-level radioactive waste repositories: analysis and recommendations

    International Nuclear Information System (INIS)

    1978-02-01

    The purpose of this report is to present the views and recommendations of invited State officials and legislators participating in a workshop concerned with preliminary site suitability criteria for high level radioactive waste repositories. The workshops were open to the public and were conducted by the U.S. Nuclear Regulatory Commission (NRC) during September 1977 in three regional locations across the United States. This contractor report is the second of two reports and consolidates the discussion by State officials on the role of a State in siting a repository, NRC's waste management program, the transportation of high level wastes, the number and location of repositories and concerns with the socio-economic impacts of siting a repository in a community. The recommendations to the NRC can be categorized into four areas. These were: (1) general recommendations, (2) procedural recommendations, (3) recommendations for improving communications, and (4) specific recommendations on the preliminary siting criteria. The recommendations emphasized the need for early State involvement in the siting process, the need for an impacted State to assess repository operations, the need for early solution of waste transportation concerns, and the requirement that any repository developed ensure the protection of the public health and safety as its most important characteristic. Other participant recommendations are included in the body of the report

  8. Workshops for state review of site suitability criteria for high-level radioactive waste repositories. analysis and recommendation. Volume 1

    International Nuclear Information System (INIS)

    Kress, H.W.

    1978-02-01

    The purpose of this report is to present the views and recommendations of State officials and legislators participating in a workshop concerned with preliminary site suitability criteria for high-level radioactive waste repositories. The workshops were open to the public and were conducted by the U.S. Nuclear Regulatory Commission (NRC) during September 1977 in three regional locations across the United States. This contractor report is the second of two reports and consolidates the discussion by State officials on the role of a State in siting a repository, NRC's waste management program, the transportation of high-level wastes, the number and location of repositories and concerns with the socio-economic impacts of siting a repository in a community. The recommendations to the NRC can be categorized into four areas. These are: (1) general recommendations, (2) procedural recommendations, (3) recommendations for improving communications, and (4) specific recommendations on the preliminary siting criteria. The recommendations emphasized the need for early State involvement in the siting process, the need for an impacted State to assess repository operations, and the requirement that any repository developed ensure the protection of the public health and safety as its most important characteristic

  9. Mind maps and network analysis to evaluate conceptualization of complex issues: A case example evaluating systems science workshops for childhood obesity prevention.

    Science.gov (United States)

    Frerichs, Leah; Young, Tiffany L; Dave, Gaurav; Stith, Doris; Corbie-Smith, Giselle; Hassmiller Lich, Kristen

    2018-03-05

    Across disciplines, it is common practice to bring together groups to solve complex problems. Facilitators are often asked to help groups organize information about and better understand the problem in order to develop and prioritize solutions. However, despite the existence of several methods to elicit and characterize how individuals and groups think about and conceptualize an issue, many are difficult to implement in practice-based settings where resources such as technology and participant time are limited and research questions shift over time. This paper describes an easy-to-implement diagramming technique for eliciting conceptualization and a flexible network analysis method for characterizing changes in both individual and group conceptualization. We use a case example to illustrate how we used the methods to evaluate African American adolescents' conceptual understanding of obesity before and after participating in a series of four systems thinking workshops. The methods produced results that were sensitive to changes in conceptualization that were likely driven by the specific activities employed during the workshop sessions. The methods appear strong for capturing salient levels of conceptualization at both individual and collective levels. The paper concludes with a critical examination of strengths and weaknesses of the methods and implications for future practice and research. Copyright © 2018 Elsevier Ltd. All rights reserved.
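
    A minimal sketch of the network step: treat each participant's map as a directed graph of concept-to-concept links and compare simple metrics before and after the workshops. The example maps below are invented:

    ```python
    # Sketch of the network-analysis step on invented pre/post concept maps.
    import networkx as nx

    pre = nx.DiGraph([("fast food", "obesity"), ("obesity", "diabetes")])
    post = nx.DiGraph([
        ("fast food", "obesity"), ("obesity", "diabetes"),
        ("advertising", "fast food"), ("stress", "fast food"),
        ("obesity", "stress"),                   # a feedback link
    ])

    for label, g in [("pre", pre), ("post", post)]:
        cycles = list(nx.simple_cycles(g))       # feedback loops = systems thinking
        print(f"{label}: {g.number_of_nodes()} concepts, "
              f"{g.number_of_edges()} links, {len(cycles)} feedback loop(s)")
    ```

    Growth in the number of links and, especially, in feedback loops is the kind of signal such an analysis would attribute to the systems-thinking activities.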

  10. Integration of management control tools. Analysis of a case study

    Directory of Open Access Journals (Sweden)

    Raúl Comas Rodríguez

    2015-09-01

    The objective of this article is to design and implement a procedure that integrates management control tools with a focus on process, in order to improve efficiency and efficacy. An experimental study was carried out in which a procedure, based on the Balanced Scorecard, is defined that integrates process management into strategic planning and its evaluation. As a result of this work, we define the key factors of success associated with the four perspectives of the Balanced Scorecard, linked through cause-effect relations to obtain the strategic map that allows visualizing and communicating the enterprise strategy. The indicators evaluate the key factors of success, integrating the process with the assistance of software. The implementation of the procedure in a commercialization enterprise helped integrate the process definition into strategic planning. The alignment was evaluated, and the efficiency and efficacy indicators improved the company's performance.

  11. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report

    Science.gov (United States)

    2013-01-01

    The Capability Portfolio Analysis Tool (CPAT) applies a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight to ... used to measure the returns of scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME ...

  12. The Effectiveness of Engineering Workshops in Attracting Females into Engineering Fields: A Review of the Literature

    Science.gov (United States)

    Sinkele, Carrie Nicole; Mupinga, Davison M.

    2011-01-01

    All-girl engineering workshops are increasing in popularity as a means to attract females into the male-dominated field of engineering. However, the effectiveness of such workshops as a recruitment tool is unclear. This report summarizes several research studies on this topic with the intent of showing the effectiveness of such workshops and other…

  13. Building Blocks for Academic Papers Graduate Writing Center Workshop [video

    OpenAIRE

    Naval Postgraduate School (U.S.)

    2015-01-01

    Naval Postgraduate School, Graduate Writing Center Workshop, Building Blocks for Academic Papers. Presented by George Lober, MA, Senior Lecturer, Department of Defense Analysis, Naval Postgraduate School.

  14. ATENA – A tool for engineering analysis of fracture in concrete

    Indian Academy of Sciences (India)

    Cervenka, Vladimir; Cervenka, Jan; Pukl, Radomir

    ATENA – A tool for engineering analysis of fracture in concrete. VLADIMIR CERVENKA, JAN CERVENKA and RADOMIR PUKL. Cervenka Consulting, Prague, Czech Republic e-mail: cervenka@cervenka.cz. Abstract. Advanced constitutive models implemented in the finite element system ATENA serve as rational tools to ...

  15. HAWCStab2 with super element foundations: A new tool for frequency analysis of offshore wind turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Hansen, Anders Melchior; Kragh, Knud Abildgaard

    2013-01-01

    HAWCStab2 is a linear frequency domain aero-elastic tool, developed by DTU Wind Energy, suitable for frequency and stability analysis of horizontal axis 3 bladed wind turbines [1]. This tool has now been extended to also handle complex offshore foundation types, such as jacket structures...

  16. DEBRISK, a Tool for Re-Entry Risk Analysis

    Science.gov (United States)

    Omaly, P.; Spel, M.

    2012-01-01

    An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to assure the risk mitigation of their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. Using this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads; this altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its own birth criterion. In the simplest approach the child is born after demise of the parent object, as in the case of an object A containing an object B that is in the interior of object A and thus not exposed to the atmosphere. Each object is defined by its shape, attitude and dimensions; its materials, along with their physical properties; and its state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring, and a newborn object inherits the state vector of the parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached, and the mass of melted material is computed from the excess heat and the material properties. After each step the amount of ablated material is determined using the lumped-mass approach and is peeled off from the object, updating mass and shape of the
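
    The demise loop is simple to caricature. The toy Python sketch below heats a lumped mass at a constant (assumed) rate until melt, then peels off molten material; none of the constants are DEBRISK values:

    ```python
    # Toy lumped-mass demise loop; every constant is an illustrative assumption.
    q_dot = 2.0e5        # heating rate per unit area, W/m^2 (assumed constant)
    area = 0.5           # exposed area, m^2
    c_p = 900.0          # specific heat, J/(kg K)  (aluminium-like)
    T_melt = 870.0       # melting temperature, K
    h_melt = 3.9e5       # heat of fusion, J/kg
    mass, T = 12.0, 300.0
    dt, t = 0.1, 0.0

    while mass > 0.0 and t < 300.0:
        if T < T_melt:                       # heat the lump until it melts
            T += q_dot * area * dt / (mass * c_p)
        else:                                # peel off molten material
            mass -= q_dot * area * dt / h_melt
        t += dt

    print(f"t={t:.1f} s, residual mass={max(mass, 0.0):.2f} kg")
    ```

    The real tool couples this energy balance to a 3DOF trajectory, so the heating rate varies along the re-entry path instead of being constant as assumed here.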

  17. 7th International Workshop on Statistical Simulation

    CERN Document Server

    Mignani, Stefania; Monari, Paola; Salmaso, Luigi

    2014-01-01

    The Department of Statistical Sciences of the University of Bologna in collaboration with the Department of Management and Engineering of the University of Padova, the Department of Statistical Modelling of Saint Petersburg State University, and INFORMS Simulation Society sponsored the Seventh Workshop on Simulation. This international conference was devoted to statistical techniques in stochastic simulation, data collection, analysis of scientific experiments, and studies representing broad areas of interest. The previous workshops took place in St. Petersburg, Russia in 1994, 1996, 1998, 2001, 2005, and 2009. The Seventh Workshop took place in the Rimini Campus of the University of Bologna, which is in Rimini’s historical center.

  18. Tandem mirror theory workshop

    International Nuclear Information System (INIS)

    1981-05-01

    The workshop was divided into three sections which were constituted according to subject matter: RF Heating, MHD Equilibrium and Stability, and Transport and Microstability. An overview from Livermore's point of view was given at the beginning of each session. Each session was assigned a secretary to take notes. These notes have been used in preparing this report on the workshop. The report includes the activities, conclusions, and recommendations of the workshop

  19. Innovative confinement concepts workshop

    International Nuclear Information System (INIS)

    Kirkpatrick, R.C.

    1998-01-01

    The Innovative Confinement Concepts Workshop occurred in California during the week preceding the Second Symposium on Current Trends in International Fusion Research. An informal report was made to the Second Symposium. A summary of the Workshop concluded that some very promising ideas were presented, that innovative concept development is a central element of the restructured US DOE Fusion Energy Sciences program, and that the Workshop should promote real scientific progress in fusion

  20. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    This paper presents instrumental research on a powerful multivariate data analysis method that researchers can use to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and highlights the ability of Principal Component Analysis (PCA) to reduce a set of possibly correlated variables to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables is used.
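
    A minimal sketch of that process, assuming synthetic survey items driven by one latent factor:

    ```python
    # PCA sketch: collinear survey-style items reduced to a few uncorrelated
    # components. Data are synthetic (one latent "satisfaction" factor).
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    satisfaction = rng.normal(size=200)
    # Five observed items, deliberately collinear with the latent factor:
    X = np.column_stack([satisfaction + 0.3 * rng.normal(size=200)
                         for _ in range(5)])

    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)                # uncorrelated component scores
    print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
    ```

    With data like these, the first component absorbs most of the variance, which is exactly the multicollinearity reduction the paper describes.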

  1. 2nd Ralf Yorque Workshop

    CERN Document Server

    1985-01-01

    These are the proceedings of the Second R. Yorque Workshop on Resource Management, which took place in Ashland, Oregon on July 23-25, 1984. The purpose of the workshop is to provide an informal atmosphere for the discussion of resource assessment and management problems. Each participant presented a one-hour morning talk; afternoons were reserved for informal chatting. The workshop was successful in stimulating ideas and interaction. The papers by R. Deriso, R. Hilborn and C. Walters all address the same basic issue, so they are lumped together. Other than that, the order of the papers in this volume was determined in the same fashion as the order of speakers during the workshop: by random draw. Marc Mangel, Department of Mathematics, University of California, Davis, California, June 1985. TABLE OF CONTENTS: A General Theory for Fishery Modeling (Jon Schnute); Data Transformations in Regression Analysis with Applications to Stock-Recruitment Relationships (David Ruppert and Raymond J. Carroll) ...

  2. dada - a web-based 2D detector analysis tool

    Science.gov (United States)

    Osterhoff, Markus

    2017-06-01

    The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored by different detectors, in different file formats, and saved with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, from pixel binning through azimuthal integration to raster scan processing. Users commonly interact with dada through a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
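
    Azimuthal integration, one of the routines named above, reduces a 2D image to an intensity-versus-radius profile; a numpy sketch with a synthetic image and an assumed beam centre (not dada's actual implementation):

    ```python
    # Azimuthal integration sketch: average a 2D detector image over rings of
    # constant radius. Image and beam centre are synthetic assumptions.
    import numpy as np

    img = np.random.poisson(5, size=(512, 512)).astype(float)  # fake detector
    cy, cx = 255.5, 255.5                                      # beam centre

    y, x = np.indices(img.shape)
    r = np.hypot(y - cy, x - cx).astype(int)                   # ring index per pixel

    counts = np.bincount(r.ravel(), weights=img.ravel())       # summed intensity
    npix = np.bincount(r.ravel())                              # pixels per ring
    profile = counts / np.maximum(npix, 1)                     # mean I(r)
    print(profile[:10])
    ```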

  3. SCALE 5: Powerful new criticality safety analysis tools

    International Nuclear Information System (INIS)

    Bowman, Stephen M.; Hollenbach, Daniel F.; Dehart, Mark D.; Rearden, Bradley T.; Gauld, Ian C.; Goluoglu, Sedat

    2003-01-01

    Version 5 of the SCALE computer software system developed at Oak Ridge National Laboratory, scheduled for release in December 2003, contains several significant new modules and sequences for criticality safety analysis and marks the most important update to SCALE in more than a decade. This paper highlights the capabilities of these new modules and sequences, including continuous energy flux spectra for processing multigroup problem-dependent cross sections; one- and three-dimensional sensitivity and uncertainty analyses for criticality safety evaluations; two-dimensional flexible mesh discrete ordinates code; automated burnup-credit analysis sequence; and one-dimensional material distribution optimization for criticality safety. (author)

  4. Nuclear physics workshop

    International Nuclear Information System (INIS)

    1988-01-01

    This Workshop in Nuclear Physics related to the TANDAR took place in Buenos Aires on April 23-26, 1987, with the attendance of foreign scientists. Four seminars were presented, along with many studies dealing with the following fields: nuclear physics at medium energies, nuclear structure, nuclear reactions, nuclear matter, instrumentation and methodology for nuclear spectroscopy, classical physics, quantum mechanics and field theory. It must be emphasized that the electrostatic accelerator TANDAR allows work with heavy ions of high energy, which opens a new field of work in PIXE (particle-induced X-ray emission). This powerful analytic technique makes possible the analysis of nearly all the elements of the periodic table with the same accuracy. (M.E.L.)

  5. 2014 MICCAI Workshop

    CERN Document Server

    Nedjati-Gilani, Gemma; Rathi, Yogesh; Reisert, Marco; Schneider, Torben

    2014-01-01

    This book contains papers presented at the 2014 MICCAI Workshop on Computational Diffusion MRI, CDMRI’14. Detailing new computational methods applied to diffusion magnetic resonance imaging data, it offers readers a snapshot of the current state of the art and covers a wide range of topics from fundamental theoretical work on mathematical modeling to the development and evaluation of robust algorithms and applications in neuroscientific studies and clinical practice.   Inside, readers will find information on brain network analysis, mathematical modeling for clinical applications, tissue microstructure imaging, super-resolution methods, signal reconstruction, visualization, and more. Contributions include both careful mathematical derivations and a large number of rich full-color visualizations.   Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into the clinic. This volume will offer a valuable starting point for anyone interested i...

  6. Cluster analysis as a prediction tool for pregnancy outcomes.

    Science.gov (United States)

    Banjari, Ines; Kenjerić, Daniela; Šolić, Krešimir; Mandić, Milena L

    2015-03-01

    Considering the specific physiological changes during gestation and thinking of pregnancy as a "critical window", classification of pregnant women in early pregnancy can be considered crucial. The paper demonstrates the use of a method based on an approach from intelligent data mining, cluster analysis. Cluster analysis is a statistical method which makes it possible to group individuals based on sets of identifying variables. The method was chosen in order to determine the possibility of classifying pregnant women in early pregnancy and to analyze unknown correlations between different variables so that certain outcomes could be predicted. 222 pregnant women from two general obstetric offices were recruited. The main focus was set on the characteristics of these pregnant women: their age, pre-pregnancy body mass index (BMI) and haemoglobin value. Cluster analysis reached a 94.1% classification accuracy rate, with three branches or groups of pregnant women showing statistically significant correlations with pregnancy outcomes. The results show that pregnant women of both older age and higher pre-pregnancy BMI have a significantly higher incidence of delivering babies of higher birth weight but gain significantly less weight during pregnancy. Their babies are also longer, and these women have a significantly higher probability of complications during pregnancy (gestosis) and a higher probability of induced or caesarean delivery. We can conclude that the cluster analysis method can appropriately classify pregnant women in early pregnancy to predict certain outcomes.
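
    The grouping step can be sketched with an off-the-shelf clusterer; the snippet below runs k-means on synthetic age, BMI and haemoglobin values, standardized first so no variable dominates (the paper's actual clustering variant and data are not reproduced):

    ```python
    # Clustering sketch on synthetic age / pre-pregnancy BMI / haemoglobin data.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)
    age = rng.normal(29, 5, 222)
    bmi = rng.normal(24, 4, 222)
    hb = rng.normal(125, 10, 222)               # g/L
    X = StandardScaler().fit_transform(np.column_stack([age, bmi, hb]))

    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    for c in range(3):
        sel = labels == c
        print(f"cluster {c}: n={sel.sum()}, mean age={age[sel].mean():.1f}, "
              f"mean BMI={bmi[sel].mean():.1f}")
    ```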

  7. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic...

  8. EMGTools, an adaptive and versatile tool for detailed EMG analysis

    DEFF Research Database (Denmark)

    Nikolic, M; Krarup, C

    2010-01-01

    We have developed an EMG decomposition system called EMGTools that can extract the constituent MUAPs and firing patterns for quantitative analysis from the EMG signal recorded at slight effort for clinical evaluation. The aim was to implement a robust system able to handle the challenges...

  9. Gipsy 3D : Analysis, Visualization and Vo-Tools

    NARCIS (Netherlands)

    Ruiz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; Hulst, J. M. van der

    2009-01-01

    The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing

  10. Television as an Instructional Tool for Concept Analysis

    Science.gov (United States)

    Benwari, Nnenna Ngozi

    2015-01-01

    This is a study of the perception of teachers on the use of television for concept analysis in the classroom. The population of the study is all the 9,784 Secondary School teachers in Bayelsa State of Nigeria out of which 110 teachers were randomly selected using the proportional sampling method. The instrument is a questionnaire designed by the…

  11. Principal Component Analysis: Most Favourite Tool in Chemometrics

    Indian Academy of Sciences (India)

    differentiate olive oils from non-olive vegetable oils. Moreover, manual analysis of such a large volume of data is laborious and time consuming, and may not provide any meaningful interpretation. [Figure 4: amount of variance captured by different principal components (PCs); the plot indicates that the first two PCs are sufficient to ...]

  12. Principal Component Analysis: Most Favourite Tool in Chemometrics

    Indian Academy of Sciences (India)

    Abstract. Principal component analysis (PCA) is the most commonly used chemometric technique. It is an unsupervised pattern recognition technique. PCA has found applications in chemistry, biology, medicine and economics. The present work attempts to understand how PCA works and how we can interpret its results.

  13. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be scalable, supporting business needs as well as operational and end-user environments; modular, with an architecture that delivers maximum operational flexibility and the ability to add complementary analytics; interoperable, integrating with existing environments and easing information sharing across partner agencies; and extendable, providing an open-source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  14. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    Johnson, P.E.; Lester, P.B.

    1998-05-01

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model.
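
    The record does not describe TRAGIS's internals; as a hedged sketch of the core operation behind any route-calculation tool of this kind, the snippet below runs Dijkstra shortest-path routing on a small weighted network. Node names and distances are invented for illustration.

```python
# Minimal routing sketch: Dijkstra shortest path over a weighted
# network, the basic computation underlying route-analysis tools.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("Oak Ridge", "Nashville", 180),   # hypothetical links and
    ("Nashville", "Memphis", 210),     # mileages, not TRAGIS data
    ("Oak Ridge", "Knoxville", 25),
    ("Knoxville", "Memphis", 390),
])
route = nx.dijkstra_path(G, "Oak Ridge", "Memphis", weight="weight")
dist = nx.dijkstra_path_length(G, "Oak Ridge", "Memphis", weight="weight")
print(route, dist)   # shortest route and its total distance
```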

  15. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    Science.gov (United States)

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 (www.comparativego.com). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
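
    The record does not state which statistics Comparative GO applies; as a hedged sketch of the over-representation test commonly at the heart of GO analysis, the snippet below runs a hypergeometric test for a single GO term. All counts are invented.

```python
# Sketch of GO over-representation testing: hypergeometric test for
# one GO term. Counts are illustrative, not from the Comparative GO tool.
from scipy.stats import hypergeom

M = 20000   # total annotated genes in the zebrafish background
n = 300     # background genes annotated with the GO term of interest
N = 150     # genes in the user's differentially expressed list
k = 12      # list genes carrying that GO term

# P(X >= k): probability of observing at least k hits by chance
p_value = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p-value: {p_value:.3g}")
```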

  16. Evaluation of fatigue damage in nuclear power plants: evolution and new tools of analysis

    International Nuclear Information System (INIS)

    Cicero, R.; Corchon, F.

    2011-01-01

    This paper presents new fatigue mechanisms requiring analysis, the tools developed for their evaluation, and the latest trends and studies currently under way in the nuclear field, which together allow proper management of this degradation mechanism in the facilities concerned.
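
    The record does not name the evaluation tools; as a hedged worked example of the standard basis for fatigue usage-factor evaluation, the snippet below applies the Palmgren-Miner cumulative damage rule. The cycle counts and allowable cycles are invented, not from the paper.

```python
# Worked example of the Palmgren-Miner rule: the cumulative usage
# factor is the sum over load cases of applied cycles n_i divided by
# allowable cycles N_i from the design fatigue curve.
load_cases = [
    # (applied cycles n_i, allowable cycles N_i) -- illustrative values
    (200,  10_000),
    (1500, 200_000),
    (50,   2_000),
]

usage_factor = sum(n / N for n, N in load_cases)
print(f"cumulative usage factor U = {usage_factor:.4f}")
# Acceptance typically requires U <= 1.0 (here U = 0.0525, acceptable).
```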

  17. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
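
    The paper uses Zernike moments and Fourier transforms; as a simplified sketch of the same descriptor idea, the snippet below reduces two synthetic full-field strain maps to a handful of low-order 2D Fourier coefficients and compares those instead of ~10^5 pixels. The fields, sizes, and threshold are invented.

```python
# Sketch of image decomposition for model validation: compare a few
# low-frequency Fourier descriptors of "model" vs "experiment" fields
# rather than every pixel. The FFT stands in for the paper's methods.
import numpy as np

def descriptors(field, k=5):
    """Magnitudes of the k x k lowest-frequency Fourier coefficients."""
    F = np.fft.fft2(field)
    return np.abs(F[:k, :k]).ravel()   # 25 numbers instead of 65,536

y, x = np.mgrid[0:256, 0:256] / 256.0
model = np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)        # "prediction"
experiment = model + 0.02 * np.random.default_rng(1).normal(size=model.shape)

d_model, d_exp = descriptors(model), descriptors(experiment)
rel_diff = np.linalg.norm(d_model - d_exp) / np.linalg.norm(d_model)
print(f"relative descriptor difference: {rel_diff:.3f}")
```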

  18. Pressurized water reactor simulator. Workshop material

    International Nuclear Information System (INIS)

    2003-01-01

    The International Atomic Energy Agency (IAEA) has established an activity in nuclear reactor simulation computer programs to assist its Member States in education. The objective is to provide, for a variety of advanced reactor types, insight and practice in their operational characteristics and their response to perturbations and accident situations. To achieve this, the IAEA arranges for the development and distribution of simulation programs and educational material and sponsors courses and workshops. The workshops are in two parts: techniques and tools for reactor simulator development, and the use of reactor simulators in education. Workshop material for the first part is covered in the IAEA Training Course Series No. 12, 'Reactor Simulator Development' (2001). Course material for workshops using a WWER-1000 reactor department simulator from the Moscow Engineering and Physics Institute, the Russian Federation, is presented in the IAEA Training Course Series No. 21, 'WWER-1000 Reactor Simulator' (2002). Course material for workshops using a boiling water reactor simulator developed for the IAEA by Cassiopeia Technologies Incorporated of Canada (CTI) is presented in the IAEA publication Training Course Series No. 23, 'Boiling Water Reactor Simulator' (2003). This report consists of course material for workshops using a pressurized water reactor simulator.

  19. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are tools that help researchers develop qualitative research projects. These software packages help users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  20. Accessibility Analyst: an integrated GIS tool for accessibility analysis in urban transportation planning

    OpenAIRE

    Suxia Liu; Xuan Zhu

    2004-01-01

    The authors present an integrated GIS tool, Accessibility Analyst, for accessibility analysis in urban transportation planning, built as an extension to the desktop GIS software package ArcView. Accessibility Analyst incorporates a number of accessibility measures, ranging from catchment profile analysis to cumulative-opportunity measures, gravity-type measures, and utility-based measures, and contains several travel-impedance measurement tools for estimating the travel distance, time, or cost b...
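
    The record lists the measure families without formulas; as a hedged sketch of one of them, the snippet below computes a gravity-type accessibility score A_i = sum_j O_j * exp(-beta * c_ij), where O_j is the opportunities at zone j and c_ij the travel cost from zone i. The zones, costs, and impedance parameter are invented, not Accessibility Analyst's data or API.

```python
# Sketch of a gravity-type accessibility measure: each origin zone's
# score sums opportunities at all destination zones, discounted by an
# exponential function of travel cost.
import numpy as np

opportunities = np.array([500, 120, 300])        # jobs at zones j = 0..2
cost = np.array([                                # travel time (minutes)
    [5.0, 20.0, 35.0],                           # from zone i = 0
    [20.0, 5.0, 15.0],                           # from zone i = 1
    [35.0, 15.0, 5.0],                           # from zone i = 2
])
beta = 0.1                                       # impedance parameter

accessibility = (opportunities * np.exp(-beta * cost)).sum(axis=1)
print(accessibility)   # one accessibility score per origin zone
```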