WorldWideScience

Sample records for analysis tools workshop

  1. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan, on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following ground: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  2. Physics Analysis Tools Workshop 2007

    CERN Multimedia

Elizabeth Gallas

The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools, with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well-organised tutorials were well attended, and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  3. Video Analysis and Modeling Tool for Physics Education: A workshop for Redesigning Pedagogy

    CERN Document Server

    Wee, Loo Kang

    2012-01-01

This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables and empowers teachers to be learners so that we can be leaders in our teaching practice. Through this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experiences with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.

  4. Policy analysis tools for air quality and health : report from the May 19, 2005 workshop

    International Nuclear Information System (INIS)

The total impact of air pollution on human health is not well understood. This workshop examined key policy issues concerning air quality, and the availability of models and analyses to inform decision-makers. Attendees included stakeholders from health and environment departments of municipal, provincial and federal governments, as well as academics, consulting firms, industry and non-governmental organizations. The complexity of computer-based models was identified as a significant barrier to the development of a better understanding of the impacts of air pollution on human health, and it was noted that most models are not equipped to deal with the various levels of policy and decision-making that occur across many jurisdictions. It was observed that there is also a lack of data. It was suggested that efficient and cost-effective models are needed to identify good policy options, as well as tools that maximize the integration of information in a comprehensive manner. Evaluations of the impacts of air pollution should occur within the broad context of public health and consider both social and interactive needs. Continuing stakeholder dialogue was recommended, as well as a more in-depth exploration of policy analysis tools. A national meeting was planned to build on conclusions from the workshop. A guidance document was proposed to provide best practices to guide non-experts on health impacts, the interpretation of monitoring results, and the selection of models and appropriate analyses. Case studies of issues facing municipalities concerning planning and land use decisions were recommended, as well as various actions to mitigate the effects of poor air quality and greenhouse gas (GHG) emissions. Five presentations were given, followed by breakout sessions and discussions. refs., tabs., figs

  5. 6th International Parallel Tools Workshop

    CERN Document Server

    Brinkmann, Steffen; Gracia, José; Resch, Michael; Nagel, Wolfgang

    2013-01-01

The latest advances in High Performance Computing hardware have significantly raised the level of available compute performance. At the same time, the growing hardware capabilities of modern supercomputing architectures have caused an increasing complexity of parallel application development. Despite numerous efforts to improve and simplify parallel programming, there is still a lot of manual debugging and tuning work required. This process is supported by special software tools, facilitating debugging, performance analysis, and optimization and thus making a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 6th International Parallel Tools Workshop, held in Stuttgart, Germany, 25-26 September 2012.

  6. Object Oriented Risk Analysis Workshop

    Science.gov (United States)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop guides a group of students (in a role-play game) through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village with regard to natural hazards (rockfall, snow avalanche, flooding…), and the assessment is oriented to affected objects such as buildings and infrastructure. The workshop contains the following steps: 1.- Planning of the study and definition of stakeholders 2.- Hazard identification 3.- Risk analysis 4.- Risk assessment 5.- Proposition of mitigation measures 6.- Risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to prompt discussion and decision making. The Risk Matrix and other graphical tools provide a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, groups can compare their results and print out a summary report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level, as well as by external persons dealing with risk analysis.
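A risk matrix of the kind used in the risk assessment step can be sketched in a few lines. The probability/consequence ratings, thresholds, and example objects below are invented for illustration and are not taken from the workshop itself:

```python
# Toy risk matrix (hypothetical classes): derive a qualitative risk level
# from probability and consequence ratings, then rank objects by risk score
# to prioritize countermeasures.

def risk_level(probability, consequence):
    # Both inputs are ratings from 1 (lowest) to 3 (highest); the product
    # is bucketed into a qualitative risk level.
    score = probability * consequence
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Example objects in an alpine village exposed to natural hazards,
# each with an assumed (probability, consequence) rating.
objects = {
    "school (rockfall)": (2, 3),
    "bridge (flooding)": (3, 1),
    "barn (snow avalanche)": (1, 1),
}

# Rank objects by risk score, highest first, to prioritize mitigation.
ranked = sorted(objects, key=lambda k: objects[k][0] * objects[k][1], reverse=True)
```

Such a matrix is deliberately coarse: its value in a workshop setting is that the bucketing rules are transparent enough to argue about.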

  7. Workshop Physics and Related Curricula: A 25-Year History of Collaborative Learning Enhanced by Computer Tools for Observation and Analysis

    Science.gov (United States)

    Laws, Priscilla W.; Willis, Maxine C.; Sokoloff, David R.

    2015-10-01

    This article describes the 25-year history of development of the activity-based Workshop Physics (WP) at Dickinson College, its adaptation for use at Gettysburg Area High School, and its synergistic influence on curricular materials developed at the University of Oregon and Tufts University and vice versa. WP and these related curricula: 1) are based on Physics Education Research (PER) findings and are PER-validated; 2) feature active, collaborative learning; and 3) use computer-based tools that enable students to learn by making predictions and then collecting, displaying, and analyzing data from their experiments.

  8. UVI Cyber-security Workshop Workshop Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kuykendall, Tommie G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allsop, Jacob Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Benjamin Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boumedine, Marc [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carter, Cedric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Galvin, Seanmichael Yurko [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Oscar [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lee, Wellington K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Han Wei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morris, Tyler Jake [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nauer, Kevin S.; Potts, Beth A.; Ta, Kim Thanh; Trasti, Jennifer; White, David R.

    2015-07-08

The cybersecurity consortium, which was established by DOE/NNSA's Minority Serving Institutions Partnerships Program (MSIPP), allows students from any of the partner schools (13 HBCUs, two national laboratories, and a public school district) to have all consortium options available to them, to create career paths, and to open doors to DOE sites and facilities for student members of the consortium. As part of this year's consortium activities, Sandia National Laboratories and the University of the Virgin Islands conducted a week-long cyber workshop that consisted of three courses: Digital Forensics and Malware Analysis, Python Programming, and ThunderBird Cup. These courses are designed to enhance cyber defense skills and promote learning within STEM-related fields.

  9. Workshop Physics and Related Curricula: "A 25-Year History of Collaborative Learning Enhanced by Computer Tools for Observation and Analysis"

    Science.gov (United States)

    Laws, Priscilla W.; Willis, Maxine C.; Sokoloff, David R.

    2015-01-01

    This article describes the 25-year history of development of the activity-based Workshop Physics (WP) at Dickinson College, its adaptation for use at Gettysburg Area High School, and its synergistic influence on curricular materials developed at the University of Oregon and Tufts University and vice versa. WP and these related curricula: 1) are…

  10. Collaboration tools for the global accelerator network: Workshop Report

    International Nuclear Information System (INIS)

    The concept of a ''Global Accelerator Network'' (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists that study scientific collaboration

  11. Collaboration tools for the global accelerator network: Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Olson, Gary [Univ. of Michigan, Ann Arbor, MI (United States); Olson, Judy [Univ. of Michigan, Ann Arbor, MI (United States)

    2002-09-15

    The concept of a ''Global Accelerator Network'' (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists that study scientific collaboration.

  12. Workshop on Tool Criticism in the Digital Humanities

    OpenAIRE

Traub, Myriam; van Ossenbruggen, Jacco

    2015-01-01

    This document reports on the discussions and results of the Workshop on Tool Criticism in the Digital Humanities, that took place on May 22, 2015 in Pand 020, Amsterdam. The workshop was co-organized by Centrum Wiskunde & Informatica, the eHumanities group of KNAW and the Amsterdam Data Science Center.

  13. Applications of ion beam analysis workshop. Workshop handbook

    International Nuclear Information System (INIS)

A workshop on applications of ion beam analysis was held at ANSTO, immediately prior to the IBMM-95 Conference in Canberra. Its aim was to review developments and the current status of the use of ion beams for analysis, emphasizing the following aspects: fundamental ion beam research and secondary effects of ion beams; materials science, geological, life sciences, environmental and industrial applications; computing codes for use in accelerator research; high energy heavy ion scattering and recoil; and recent technological developments using ion beams. The handbook contains the workshop's program, 29 abstracts and a list of participants.

  14. Sawja: Static Analysis Workshop for Java

    Science.gov (United States)

    Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine

    Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. Efficiency and precision of such a tool rely partly on low level components which only depend on the syntactic structure of the language and therefore should not be redesigned for each implementation of a new static analysis. This paper describes the Sawja library: a static analysis workshop fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including i) efficient functional data-structures for representing a program with implicit sharing and lazy parsing, ii) an intermediate stack-less representation, and iii) fast computation and manipulation of complete programs. We provide experimental evaluations of the different features with respect to time, memory and precision.

  15. Proceedings of pollution prevention and waste minimization tools workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

Pollution Prevention (P2) has evolved into one of DOE's prime strategies to meet environmental, fiscal, and worker safety obligations. P2 program planning, opportunity identification, and implementation tools were developed under the direction of the Waste Minimization Division (EM-334). Forty experts from EM, DP, ER and DOE subcontractors attended this 2-day workshop to formulate the incentives to drive utilization of these tools. Plenary and small working group sessions were held both days. Working Group 1 identified incentives for overcoming barriers in the area of P2 program planning and resource allocation. Working Group 2 identified mechanisms to drive the completion of P2 assessments and generation of opportunities. Working Group 3 compiled and documented a broad range of potential P2 incentives that address fundamental barriers to implementation of cost-effective opportunities.

  16. Proceedings of pollution prevention and waste minimization tools workshop

    International Nuclear Information System (INIS)

Pollution Prevention (P2) has evolved into one of DOE's prime strategies to meet environmental, fiscal, and worker safety obligations. P2 program planning, opportunity identification, and implementation tools were developed under the direction of the Waste Minimization Division (EM-334). Forty experts from EM, DP, ER and DOE subcontractors attended this 2-day workshop to formulate the incentives to drive utilization of these tools. Plenary and small working group sessions were held both days. Working Group 1 identified incentives for overcoming barriers in the area of P2 program planning and resource allocation. Working Group 2 identified mechanisms to drive the completion of P2 assessments and generation of opportunities. Working Group 3 compiled and documented a broad range of potential P2 incentives that address fundamental barriers to implementation of cost-effective opportunities

  17. Proceedings of the workshop on precursor analysis

    International Nuclear Information System (INIS)

The purpose of the meeting was to provide a forum for open discussion on the state of the art of Precursor Analysis to all professional parties involved, i.e. industry, regulators and support organizations. The meeting was intended to discuss insights both from the operating experience point of view and from the PA methodology point of view. Therefore, it was considered interesting to bring together specialists in the use and application of operational feedback (mainly covered via NEA CSNI/WGOE) and specialists in methodological aspects of Precursor Analysis (mainly covered via CSNI/WGRISK). Cross-fertilization of insights and experiences was expected to be beneficial for both sides. The major effort in Precursor Analysis is carried out on a limited set of safety significant events that occurred at commercial nuclear power plants. Thus, an objective of the Workshop was also to explore to what extent Precursor Analysis is already applied to other types of nuclear installations (fuel fabrication, research reactors, etc.). In session 1, introductory remarks were given by representatives of WGOE and WGRISK, the two OECD/NEA working groups on Operating Experience and Risk, respectively. Sessions 2, 5 and 6 focused on national programs of probabilistic precursor analysis. Contributions from the US, Germany, Switzerland, Finland, the Czech Republic, Belgium and Japan were presented. Sessions 7 and 8 brought further perspectives on the national programs, combined with information on more specific applications. These sessions contained contributions from Hungary, the US, Spain, and France. Session 3 mainly dealt with the more classical (non-probabilistic) approach to the analysis of operational events, with contributions from Finland, Belgium, and Sweden. A consulting company (Enconet) presented a newly developed method and a computer tool. Session 4 focused on methodological aspects, and the development of guidelines and models for probabilistic precursor analysis. It included a...

  18. 9th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  19. 7th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  20. 2nd International Workshop on Isogeometric Analysis and Applications

    CERN Document Server

    Simeon, Bernd

    2015-01-01

Isogeometric Analysis is a groundbreaking computational approach that promises the possibility of integrating the finite element method into conventional spline-based CAD design tools. It thus bridges the gap between numerical analysis and geometry, and moreover it allows one to tackle new cutting-edge applications at the frontiers of research in science and engineering. This proceedings volume contains a selection of outstanding research papers presented at the second International Workshop on Isogeometric Analysis and Applications, held at Annweiler, Germany, in April 2014.

  1. Proceedings Fifth Workshop on Formal Languages and Analysis of Contract-Oriented Software

    CERN Document Server

    Pimentel, Ernesto

    2011-01-01

This volume consists of the proceedings of the 5th Workshop on Formal Languages and Analysis of Contract-Oriented Software (FLACOS'11). The FLACOS Workshops serve as annual meeting places to bring together researchers and practitioners working on language-based solutions to contract-oriented software development. High-level models of contracts are needed as a tool to negotiate contracts and provide services conforming to them. This Workshop provides language-based solutions to the above issues through formalization of contracts, design of appropriate abstraction mechanisms, and formal analysis of contract languages and software. The program of this edition consists of 5 regular papers and 3 invited presentations. Detailed information about the FLACOS 2011 Workshop can be found at http://flacos2011.lcc.uma.es/. The 5th edition of the FLACOS Workshop was organized by the University of Málaga. It took place in Málaga, Spain, during September 22-23, 2011.

  2. Workshop on IAEA Tools for Nuclear Energy System Assessment for Long-Term Planning and Development

    International Nuclear Information System (INIS)

    The purpose of the workshop is to present to Member States tools and methods that are available from the IAEA in support of long-term energy planning and nuclear energy system assessments, both focusing on the sustainable development of nuclear energy. This includes tools devoted to energy system planning, indicators for sustainable energy development, the INPRO methodology for Nuclear Energy System Assessment (NESA) and tools for analysing nuclear fuel cycle material balance. The workshop also intends to obtain feedback from Member States on applying the tools, share experiences and lessons learned, and identify needs for IAEA support

  3. BENCHMARKING WORKSHOPS AS A TOOL TO RAISE BUSINESS EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Milos Jelic

    2011-03-01

Full Text Available The annual competition for the national business excellence award is a good opportunity for participating organizations to demonstrate their practices, particularly those that enable them to excel. The national quality award competition in Serbia (and Montenegro), "OSKAR KVALITETA", started in 1995 but was initially limited to the competition cycle only. However, upon the establishment of the Fund for Quality Culture and Excellence (FQCE) in 2002, which took over the OSKAR KVALITETA model, several changes took place. OSKAR KVALITETA remained an annual competition in business excellence, but at the same time FQCE began to offer a much wider portfolio of services, including levels-of-excellence programs, assessment and self-assessment training courses, and benchmarking workshops. These benchmarking events have been hosted by award winners or other laureates of the OSKAR KVALITETA competition who demonstrated excellence against particular criteria and were thus in a position to share their practice with other organizations. Over six years of organizing benchmarking workshops, FQCE has held 31 workshops covering the major part of the model's issues. The increasing level of participation in the workshops and the distinctly positive trend in participants' expressed satisfaction serve as reliable indicators that the workshops have been effective in prompting people to think and move in the direction of business excellence.

  4. Physics analysis tools

    International Nuclear Information System (INIS)

There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level ones such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed
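The five stages described above can be sketched as a simple pipeline. The function names, the single "pt" observable, and the smearing model below are toy assumptions for illustration only, not any experiment's actual framework:

```python
# Illustrative five-stage HEP analysis pipeline (hypothetical, simplified).
import random

def generate(n_events):
    # Particle generation: simulate initial interactions (toy stand-in).
    return [{"true_pt": random.uniform(1.0, 100.0)} for _ in range(n_events)]

def simulate_detector(events):
    # Detector simulation: smear true quantities into measured ones.
    for ev in events:
        ev["meas_pt"] = ev["true_pt"] * random.gauss(1.0, 0.05)
    return events

def reconstruct_tracks(events):
    # Track reconstruction: pattern recognition on measured points
    # (trivial here: the measurement is taken as the track).
    for ev in events:
        ev["track_pt"] = ev["meas_pt"]
    return events

def reconstruct_event(events):
    # Event reconstruction: assign masses and build 4-vectors (toy tuple).
    for ev in events:
        ev["four_vector"] = (ev["track_pt"], 0.0, 0.0, ev["track_pt"])
    return events

def display_and_fit(events):
    # Display and fit: accumulate statistics (here, just the mean pt).
    pts = [ev["track_pt"] for ev in events]
    return sum(pts) / len(pts)

mean_pt = display_and_fit(
    reconstruct_event(reconstruct_tracks(simulate_detector(generate(1000)))))
```

The point of the sketch is the staging itself: each function consumes the previous stage's output, which is what makes the integration and portability questions raised above concrete.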

  5. 100-KE REACTOR CORE REMOVAL PROJECT ALTERNATIVE ANALYSIS WORKSHOP REPORT

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON RA

    2010-01-15

In brief, the Path Forward was developed to reconsider potential open-air demolition areas; characterize to determine if any zircaloy exists; evaluate existing concrete data to determine additional characterization needs; size the new building to accommodate human-machine interface and tooling; consider the bucket thumb and use of shape-charges in design; and finally to utilize complex-wide and industry explosive demolition lessons learned in the design approach. Appendix B documents the results from the team's use of Value Engineering process tools entitled Weighted Analysis Alternative Matrix, Matrix Conclusions, Evaluation Criteria, and Alternative Advantages and Disadvantages. These results were further supported by the team's validation of parking-lot information sheets: memories (potential ideas to consider), issues/concerns, and assumptions, contained in Appendix C. Appendix C also includes the recorded workshop flipchart notes taken from the SAR Alternatives and Project Overview presentations. The SAR workshop presentations, including a 3-D graphic illustration demonstration video, have been retained in the CHPRC project file and were not included in this report due to size limitations. The workshop concluded with a round-robin close-out where each member was engaged for any last-minute items and meeting utility. In summary, the team felt the session was value added and looked forward to proceeding with the recommended actions and conceptual design.

  6. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis, using building spatial data and selected building components from the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
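The architecture described above can be sketched minimally in code. Every class, component name, load figure, and energy-conservation-measure factor below is a hypothetical illustration, not the actual system's API:

```python
# Minimal sketch of a baseline-vs-optimized energy model comparison
# (hypothetical names and numbers throughout).
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    annual_kwh: float  # assumed annual energy use attributable to the component

def baseline_energy(components):
    # Building analysis engine: sum component loads into a baseline model.
    return sum(c.annual_kwh for c in components)

def apply_ecm(components, name, factor):
    # Energy conservation measure: scale one component's load (toy model).
    return [Component(c.name, c.annual_kwh * factor) if c.name == name else c
            for c in components]

def recommend(components, ecms):
    # Recommendation tool: rank ECMs by estimated savings over the baseline.
    base = baseline_energy(components)
    savings = {n: base - baseline_energy(apply_ecm(components, n, f))
               for n, f in ecms.items()}
    return sorted(savings.items(), key=lambda kv: kv[1], reverse=True)

# Toy building component library and candidate ECMs (name -> load factor).
library = [Component("lighting", 12000.0), Component("hvac", 30000.0)]
ranked = recommend(library, {"hvac": 0.5, "lighting": 0.75})
# ranked[0] is the ECM with the largest estimated savings
```

The real system operates a full building model rather than summing static loads, but the baseline/optimized/recommend split mirrors the engine-plus-recommendation-tool structure the abstract describes.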

  7. Fourth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

Coloured Petri Nets and the CPN tools are now used by more than 750 organisations in 50 different countries all over the world (including 150 commercial companies). The purpose of this event is to bring together some of the users and in this way provide a forum for those who are interested in the practical use of Coloured Petri Nets and the CPN tools. This booklet contains the proceedings of the Fourth Workshop on Practical Use of Coloured Petri Nets and the CPN Tools, August 28-30, 2002. The workshop is organised by the CPN group at the Department of Computer Science, University of Aarhus, Denmark.

  8. Proceedings of the of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011)

    DEFF Research Database (Denmark)

For 2011, 25 papers were submitted, of which two were tool demonstration papers. A program committee of 20, with the help of 7 additional reviewers, provided three reviews for each paper. An electronic PC meeting was held using EasyChair to discuss the papers, and from those submitted 9 full papers and two tool demonstration papers were selected. In addition to these 11 papers, the proceedings include the paper "Getting a Grip on Tasks that Coordinate Tasks" by Rinus Plasmeijer of Radboud University Nijmegen in The Netherlands, which accompanied his invited talk at the workshop. This year LDTA ... languages. Tool challenge participants presented highlights of their solutions during special sessions of the workshop and will contribute to a joint paper on the Tool Challenge and proposed solutions, to be co-authored by all participants after the workshop.

  9. Pollution prevention and waste minimization tools workshops: Proceedings. Part 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-31

    The purpose of the second workshop was to bring together representatives of DOE and DOE contractor organizations to discuss four topics: process waste assessments (PWAs), a continuation of one of the sessions held at the first workshop in Clearwater; waste minimization reporting requirements; procurement systems for waste minimization; and heating, ventilating, and air conditioning (HVAC) and replacements for chlorofluorocarbons (CFCs). The topics were discussed in four concurrent group sessions. Participants in each group were encouraged to work toward two main objectives: establish a "clear vision" of the overall target for their session's program, focusing not just on where the program is now but on where it should go in the long term; and determine the steps to be followed to carry out the target program.

  10. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
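    The cascading-outage loop the abstract describes (trip the worst protection violation, redistribute flow, repeat until the system settles) can be illustrated with a toy model. The network, the crude proportional redistribution rule, and all names below are invented for illustration; the real DCAT drives full PSS/E dynamic and power-flow simulations instead.

```python
def cascade(flows: dict[str, float], limits: dict[str, float]) -> list[str]:
    """Toy cascading-outage loop: returns the branches tripped, in order."""
    flows = dict(flows)
    tripped = []
    while True:
        over = [l for l, f in flows.items() if abs(f) > limits[l]]
        if not over:
            return tripped  # system has settled, cascade over
        worst = max(over, key=lambda l: abs(flows[l]) / limits[l])
        shed = flows.pop(worst)      # protection trips the worst overload
        tripped.append(worst)
        if flows:                    # crude proportional redistribution
            for l in flows:
                flows[l] += shed / len(flows)

# Branch A starts overloaded; its flow spreads to B and C without
# overloading them, so the cascade stops after one trip.
print(cascade({"A": 90.0, "B": 50.0, "C": 40.0},
              {"A": 80.0, "B": 100.0, "C": 100.0}))  # -> ['A']
```

    Real tools replace the redistribution step with a power-flow solution and interleave dynamic simulation for the fast events, but the outer trip-and-resolve loop has this structure.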

  11. Workshop

    DEFF Research Database (Denmark)

    Hess, Regitze; Lotz, Katrine

    2003-01-01

    Programme for an architecture workshop focusing on the Danish harbours. Presentation of the 57 younger Danish and international participating architects.

  12. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, culminating in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details of the methodology and the main features of the FRAT. The tool manages a database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and a balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
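    A minimal sketch of the FRM arithmetic referenced above, assuming the usual BAL-003-1 convention of MW per 0.1 Hz computed from net actual interchange (NIA) and frequency at the pre-event point A and the stabilization point B. The sample values and sign convention are illustrative only; the authoritative definitions are in the standard's FRS forms.

```python
def frm_mw_per_0_1hz(nia_a_mw: float, nia_b_mw: float,
                     f_a_hz: float, f_b_hz: float) -> float:
    """Change in net actual interchange per 0.1 Hz of frequency change
    between point A (pre-event) and point B (stabilization)."""
    return (nia_b_mw - nia_a_mw) / ((f_b_hz - f_a_hz) * 10.0)

# A 0.05 Hz frequency drop met by a 100 MW increase in net export:
r = frm_mw_per_0_1hz(nia_a_mw=500.0, nia_b_mw=600.0,
                     f_a_hz=60.00, f_b_hz=59.95)
print(r)  # about -200; negative values indicate supportive response
          # under the sign convention assumed here
```

    A tool like FRAT would extract the A and B values automatically from PMU or SCADA time series around each under-frequency event before applying this calculation.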

  13. Neutron multiplicity analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Scott L [Los Alamos National Laboratory

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This
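    As a rough illustration of two of the steps named above (background subtraction and a passive calibration curve), not of EXCOM itself: the calibration coefficients below are invented, whereas real curves are fitted to measurements of known standards.

```python
def net_rate(measured_cps: float, background_cps: float) -> float:
    """Background-subtracted count rate (counts per second)."""
    return measured_cps - background_cps

def pu240eff_mass_g(doubles_cps: float, a: float = 0.5, b: float = 1e-4) -> float:
    """Passive calibration curve: effective 240Pu mass from the net
    doubles rate, here an assumed quadratic fit mass = a*D + b*D**2.
    Coefficients are hypothetical, not from any real detector."""
    return a * doubles_cps + b * doubles_cps ** 2

d = net_rate(120.0, 20.0)   # net doubles rate, cps
print(pu240eff_mass_g(d))   # about 51 g with these assumed coefficients
```

    EXCOM additionally applies deadtime corrections and point-model analyses before this step; those require detector-specific parameters and are omitted here.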

  14. Tools for Project Management, Workshops and Consulting A Must-Have Compendium of Essential Tools and Techniques

    CERN Document Server

    Andler, Nicolai

    2012-01-01

    Typically, today's tasks in management and consulting include project management, running workshops and strategic work - all complex activities which require a multitude of skills and competencies. This standard work, which is also well accepted amongst consultants, gives you reference- or cookbook-style access to the most important tools, including a rating of each tool in terms of applicability, ease of use and effectiveness. In his book, Nicolai Andler presents about 120 such tools, grouped into task-specific categories entitled Define Situation, Gather Information, Information Consolida

  15. 77 FR 14814 - Tobacco Product Analysis; Scientific Workshop; Request for Comments

    Science.gov (United States)

    2012-03-13

    ... days before the workshop. Comments: Regardless of attendance at the public workshop, interested persons... within the same class of constituents into a single analysis. Particularly discuss the benefits...

  16. PREFACE: EMAS 2013 Workshop: 13th European Workshop on Modern Developments and Applications in Microbeam Analysis

    Science.gov (United States)

    Llovet, Xavier; Matthews, Michael B.; Brisset, François; Guimarães, Fernanda; Vieira, Joaquim M.

    2014-03-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 13th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis, which took place from the 12th to the 16th of May 2013 in the Centro de Congressos do Alfândega, Porto, Portugal. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a very specific format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants, and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with LNEG - Laboratório Nacional de Energia e Geologia and SPMICROS - Sociedade Portuguesa de Microscopia. The technical programme included the following topics: electron probe microanalysis, future technologies, electron backscatter diffraction (EBSD), particle analysis, and applications. As at previous workshops, there was also a special oral session for young scientists. The best presentation by a young scientist was awarded an invitation to attend the 2014 Microscopy and Microanalysis meeting in Hartford, Connecticut. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Plastic deformation studies with electron channelling contrast imaging and electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 74 posters from 21 countries were on display at the meeting and that the participants came from as far away as Japan, Canada and the USA. A

  17. Ninth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    to more than 7,200 users in 138 countries. The aim of the workshop is to bring together some of the users and in this way provide a forum for those who are interested in the practical use of Coloured Petri nets and their tools. The submitted papers were evaluated by a programme committee with the following members: Daniel Moldt, Germany; Laure Petrucci, France; Rüdiger Valk, Germany; Lee Wagenhals, USA; Karsten Wolf, Germany; Jianli Xu, Finland. The programme committee accepted 10 papers for presentation. Most of these deal with different projects in which Coloured Petri Nets and their tools have been put to practical use, often in an industrial setting. The remaining papers deal with different extensions of tools and methodology. The papers from the first eight CPN Workshops can be found via the web pages: http://www.daimi.au.dk/CPnets/. After an additional round of reviewing and revision, some of the papers …

  18. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  19. Workshop in Moodle: a tool for peer critiquing

    OpenAIRE

    Brown, C.; Honeychurch, S.; Munro, J

    2011-01-01

    This paper will begin with a brief discussion of the benefits of peer assessment and peer critiquing. In particular, it will examine how both can be beneficial in helping to introduce, and reinforce, valuable graduate attributes in students throughout their university careers. It will then examine the tools available at the University of Glasgow and evaluate them in terms of their strengths and weaknesses. In order to explain this in detail, a real-life case study from a third-year c...

  20. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally-merged pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  1. Summary of Training Workshop on the Use of NASA tools for Coastal Resource Management in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Judd, Chaeli; Judd, Kathleen S.; Gulbransen, Thomas C.; Thom, Ronald M.

    2009-03-01

    A two-day training workshop was held in Xalapa, Mexico, on March 10-11, 2009, with the goal of training end users from the southern Gulf of Mexico states of Campeche and Veracruz in the use of tools to support coastal resource management decision-making. The workshop was held at the computer laboratory of the Instituto de Ecología, A.C. (INECOL). This report summarizes the results of that workshop and is a deliverable to our NASA client.

  2. A Pluralistic, Longitudinal Method: Using Participatory Workshops, Interviews and Lexicographic Analysis to Investigate Relational Evolution

    DEFF Research Database (Denmark)

    Evers, Winie; Marroun, Sana; Young, Louise

    2016-01-01

    analysis. Longitudinal research considers a Danish advertising and communication firm looking for new ideas by involving their network in order to help them compete in their environment of rapid globalization and emergence of new technologies. A five-stage research design considered how network development facilitates the ability to cope with these changes. Network development was given impetus via two workshops where network members discussed the firm's opportunities and challenges, using a number of tools to facilitate brainstorming and general discussion. The impact of these workshops on …

  3. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  4. 9th Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Decreusefond, Laurent; Stochastic Analysis and Related Topics

    2012-01-01

    Since the early eighties, Ali Suleyman Ustunel has been one of the main contributors to the field of Malliavin calculus. In a workshop held in Paris in June 2010, several prominent researchers gave exciting talks in honor of his 60th birthday. The present volume includes scientific contributions from this workshop. Probability theory is first and foremost aimed at solving real-life problems containing randomness, and Markov processes are one of the key modeling tools for such problems. Contributions on inventory control, mutation-selection in genetics and public-pri

  5. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    International Nuclear Information System (INIS)

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document

  6. Failure Environment Analysis Tool (FEAT)

    Science.gov (United States)

    Lawler, D. G.

    1991-01-01

    Information is given in viewgraph form on the Failure Environment Analysis Tool (FEAT), a tool designed to demonstrate advanced modeling and analysis techniques to better understand and capture the flow of failures within and between elements of the Space Station Freedom (SSF) and other large complex systems. Topics covered include objectives, development background, the technical approach, SSF baseline integration, and FEAT growth and evolution.

  7. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  8. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany – which is a forum to discuss the latest advancements in the parallel tools.

  9. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel;

    2014-01-01

    As governments, citizens and organizations have moved online, there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks

  10. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting...

  11. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  12. Proceedings of the of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010)

    DEFF Research Database (Denmark)

    Brabrand, Claus

    2010-01-01

    This volume contains the proceedings of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010), held in Paphos, Cyprus on March 28-29, 2010. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software), organized in cooperation with ACM SIGPLAN. LDTA is an application- and tool-oriented forum on meta-programming in a broad sense. A meta-program is a program that takes other programs as input or output. The focus of LDTA is on generated or otherwise efficiently implemented meta-programs, possibly using high-level descriptions of programming languages. Tools and techniques presented at LDTA are usually applicable in the context of "Language Workbenches" or "Meta Programming Systems", or simply as parts of advanced programming environments or IDEs. These proceedings include an extended abstract based on the invited talk by …

  13. Second Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Ustunel, Ali

    1990-01-01

    The Second Silivri Workshop functioned as a short summer school and a working conference, producing lecture notes and research papers on recent developments of Stochastic Analysis on Wiener space. The topics of the lectures concern short time asymptotic problems and anticipative stochastic differential equations. Research papers are mostly extensions and applications of the techniques of anticipative stochastic calculus.

  14. The EADGENE Microarray Data Analysis Workshop

    DEFF Research Database (Denmark)

    de Koning, Dirk-Jan; Jaffrézic, Florence; Lund, Mogens Sandø;

    2007-01-01

    10 countries performed and discussed the statistical analyses of real and simulated 2-colour microarray data that were distributed among participants. The real data consisted of 48 microarrays from a disease challenge experiment in dairy cattle, while the simulated data consisted of 10 microarrays from a direct comparison of two treatments (dye-balanced). While there was broad agreement with regard to methods of microarray normalisation and significance testing, there were major differences with regard to quality control. The quality control approaches varied from none, through using statistical weights, to omitting a large number of spots or omitting entire slides. Surprisingly, these very different approaches gave quite similar results when applied to the simulated data, although not all participating groups analysed both real and simulated data. The workshop was very successful in …

  15. The Astronomy Workshop: Computer Assisted Learning Tools with Instructor Support Materials and Student Activities

    Science.gov (United States)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2006-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes, as supplementary materials appropriate for grades 9-12, and by the general public. The philosophy of the website is to foster student and public interest in astronomy by capitalizing on their fascination with computers and the internet. Many of the tools were developed by graduate and undergraduate students at UMD. This website contains over 20 tools on topics including scientific notation, giant impacts, extrasolar planets, astronomical distances, planets, moons, comets, and asteroids. Educators around the country at universities, colleges, and secondary schools have used the Astronomy Workshop’s tools and activities as homework assignments, in-class demos, or extra credit. Since 2005, Grace Deming has assessed several of the Astronomy Workshop’s tools for clarity and effectiveness by interviewing students as they used tools on the website. Based on these interviews, Deming wrote student activities and instructor support materials and posted them to the website. Over the next three years, we will continue to interview students, develop web materials, and field-test activities. We are targeting classes in introductory undergraduate astronomy courses and grades 11-12 for our Spring 2007 field tests. We are interested in hearing your ideas on how we can make the Astronomy Workshop more appealing to educators, museum directors, specialty programs, and professors. This research is funded by NASA EPO grants NNG04GM18G and NNG06GGF99G.

  16. Atlas Distributed Analysis Tools

    Science.gov (United States)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, only a few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  17. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, only a few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  18. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  19. VCAT: Visual Crosswalk Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  20. Development of Workshops on Biodiversity and Evaluation of the Educational Effect by Text Mining Analysis

    Science.gov (United States)

    Baba, R.; Iijima, A.

    2014-12-01

    Conservation of biodiversity is one of the key issues in environmental studies. As a means to address this issue, education is becoming increasingly important. In previous work, we developed a course of workshops on the conservation of biodiversity. To disseminate the course as a tool for environmental education, determining its educational effect is essential. Text mining enables analyses of the frequency and co-occurrence of words in freely described texts. This study is intended to evaluate the effect of the workshop using a text-mining technique. We hosted the originally developed workshop on the conservation of biodiversity for 22 college students. The aim of the workshop was to convey the definition of biodiversity. Generally, biodiversity refers to the diversity of ecosystems, diversity between species, and diversity within species. To facilitate discussion, supplementary materials were used. For instance, field guides of wildlife species were used to discuss the diversity of ecosystems. Moreover, a hierarchical framework in an ecological pyramid was shown to aid understanding of the role of diversity between species. In addition, we offered a document on the historical case of the Potato Famine in Ireland to discuss diversity within species from the genetic viewpoint. Before and after the workshop, we asked students for free descriptions of the definition of biodiversity, and analyzed them using Tiny Text Miner. This technique enables Japanese-language morphological analysis. Frequently used words were sorted into categories. Moreover, a principal component analysis was carried out. After the workshop, the frequency of words tagged to diversity between species and diversity within species significantly increased. From the principal component analysis, the first component consists of words such as producer, consumer, decomposer, and food chain. This indicates that the students have comprehended the close relationship between
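The frequency part of such an evaluation can be sketched simply: count category-tagged keywords in free-text answers before and after the workshop. Tiny Text Miner itself performs Japanese morphological analysis; this toy version just splits English text on whitespace, and the tag dictionary and sample answers below are invented.

```python
# Hedged sketch of keyword-frequency comparison for workshop evaluation.
from collections import Counter

CATEGORIES = {          # hypothetical tag dictionary, invented for this sketch
    "species": "diversity between species",
    "gene": "diversity within species",
    "ecosystem": "diversity of ecosystems",
}

def category_counts(answers):
    """Count occurrences of each tagged keyword across free-text answers."""
    words = Counter(w for a in answers for w in a.lower().split())
    return {cat: words[key] for key, cat in CATEGORIES.items()}

before = ["biodiversity means many species", "ecosystem variety"]
after = ["gene level diversity within species",
         "species and ecosystem diversity",
         "gene pools differ"]

print(category_counts(before))
print(category_counts(after))
```

A rise in a category's count after the workshop is the kind of signal the study reports for the two species-diversity categories.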

  1. Sawja: Static Analysis Workshop for Java

    OpenAIRE

    Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine

    2010-01-01

    Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. This paper describes the Sawja library: a static analysis framework fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including (i) efficient functional data-structures for representing program ...

  2. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    Science.gov (United States)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on computational tools and facilities for the analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  3. Workshop tools and methodologies for evaluation of energy chains and for technology perspective

    Energy Technology Data Exchange (ETDEWEB)

    Appert, O. [Institut Francais du Petrole (IFP), 92 - Rueil-Malmaison (France); Maillard, D. [Energy and Raw Materials, 75 - Paris (France); Pumphrey, D. [Energy Cooperation, US Dept. of Energy (United States); Sverdrup, G.; Valdez, B. [National Renewable Energy Laboratory, Golden, CO (United States); Schindler, J. [LB-Systemtechnik (LBST), GmbH, Ottobrunn (Germany); His, St.; Rozakis, St. [Centre International de Recherche sur l'Environnement et le Developpement (CIRED), 94 - Nogent sur Marne (France); Sagisaka, M. [LCA Research Centre (Japan); Bjornstad, D. [Oak Ridge National Laboratory, Oak Ridge, Tennessee (United States); Madre, J.L. [Institut National de Recherche sur les Transports et leur Securite, 94 - Arcueil (France); Hourcade, J.Ch. [Centre International de Recherche sur l'Environnement et le Developpement (CIRED), 94 - Nogent sur Marne (France); Ricci, A.; Criqui, P.; Chateau, B.; Bunger, U.; Jeeninga, H. [EU/DG-R (Italy); Chan, A. [National Research Council (Canada); Gielen, D. [IEA-International Energy Associates Ltd., Fairfax, VA (United States); Tosato, G.C. [Energy Technology Systems Analysis Programme (ETSAP), 75 - Paris (France); Akai, M. [Agency of Industrial Science and Technology (Japan); Ziesing, H.J. [Deutsches Institut fur Wirtschaftsforschung, DIW Berlin (Germany); Leban, R. [Conservatoire National des Arts et Metiers (CNAM), 75 - Paris (France)

    2005-07-01

    The aim of this workshop is to better characterize the future by integrating all the dynamic interactions between the economy, the environment and society. It offers presentations on hydrogen chain evaluation, micro-economic modelling for the evaluation of bio-fuel options, the evolution and potential of life cycle assessment, consumer valuation of energy technology attributes, perspectives for the evaluation of changing behaviour, incentive systems and barriers to social acceptability, the internalization of external costs, endogenous technical change in long-term energy models, ETSAP/technology dynamics in partial equilibrium energy models, very long-term energy-environment modelling, ultra long-term energy technology perspectives, the socio-economic toolbox of the EU hydrogen road-map, a combined approach using technology-oriented optimization and evaluation of the impacts of individual policy measures, and the application of a suite of basic research portfolio management tools. (A.L.B.)

  4. Summary of the CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    There is uncertainty in all aspects of assessing the consequences of accidental releases of radioactive material, from understanding and describing the environmental and biological transfer processes to modeling emergency response. A wide range of scientific disciplines is involved in these assessments, and diverse approaches have been adopted to take account of uncertainties in the different areas. The need for an exchange of views and a comparison of approaches between the diverse disciplines led to the organization of a CEC/USDOE Workshop on Uncertainty Analysis held in Santa Fe, New Mexico, in November 1989. The workshop brought together specialists in a number of disciplines, including those expert in the mathematics and statistics of uncertainty analysis, in expert judgment elicitation and evaluation, and in all aspects of assessing the radiological and environmental consequences of accidental releases of radioactive material. In addition, there was participation from users of the output of accident consequences assessment in decision making and/or regulatory frameworks. The main conclusions that emerged from the workshop are summarized in this paper. These are discussed in the context of three different types of accident consequence assessment: probabilistic assessments of accident consequences undertaken as inputs to risk analyses of nuclear installations, assessments of accident consequences in real time to provide inputs to decisions on the introduction of countermeasures, and the reconstruction of doses and risks resulting from past releases of radioactive material

  5. Proceedings of the CEC/USDOE workshop on uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Elderkin, C.E. (Pacific Northwest Lab., Richland, WA (USA)); Kelly, G.N. (eds.)(Commission of the European Communities, Brussels (Belgium))

    1990-09-01

    In recent years it has become increasingly important to specify the uncertainty inherent in consequence assessments and in the models that trace radionuclides from their source, through the environment, to their impacts on human health. European and US scientists have been independently developing and applying methods for analyzing uncertainty. It recently became apparent that a scientific exchange on this subject would be beneficial as improvements are sought and as uncertainty methods find broader application. The Commission of the European Communities (CEC) and the Office of Health and Environmental Research of the US Department of Energy (OHER/DOE), through their continuing agreement for cooperation, decided to co-sponsor the CEC/USDOE Workshop on Uncertainty Analysis. CEC's Radiation Protection Research Programme and OHER's Atmospheric Studies in Complex Terrain Program collaborated in planning and organizing the workshop, which was held in Santa Fe, New Mexico, on November 13 through 16, 1989. As the workshop progressed, the perspectives of individual participants, each with their particular background and interests in some segment of consequence assessment and its uncertainties, contributed to a broader view of how uncertainties are introduced and handled. These proceedings contain, first, the editors' introduction to the problem of uncertainty analysis and their general summary and conclusions. These are then followed by the results of the working groups, and the abstracts of individual presentations.

  6. Development of Student Exercises with Instructor Support at the Astronomy Workshop Solar System Collisions Web Tool

    Science.gov (United States)

    Deming, G. L.; Hamilton, D. P.

    2005-12-01

    During the spring 2005 semester, seven students taking ASTR101 General Astronomy for non-science majors at the University of Maryland were interviewed while completing an assignment using the Astronomy Workshop Solar System Collisions web tool (http://janus.umd.edu/astro/impact/). The Astronomy Workshop Solar System Collisions web tool can be used to investigate how different variables affect collisions in a fun but informative manner. Based on the 2005 spring interviews, three web-based activities were developed as appropriate for homework or as enrichment to coursework. The first activity explores how the impactor's mass affects energy released, crater diameter, frequency of similar impacts, and magnitude of the earthquake generated by the impact. The second activity investigates the energy released and damage done when the impactor's density is changed. Collisions by icy bodies are compared to those of rocky and metallic materials. The third activity compares collisions on different planets. In addition to masses and densities, velocities vary in these collisions. The activities are written so that introductory astronomy students will interpret the differences observed in terms of kinetic energy. During the fall 2005 semester, ASTR101 students at the University of Maryland were interviewed and observed as they completed the three activities described above using the Solar System Collisions website. The twelve students in this study were selected based on pretest scores on the Astronomy Diagnostic Test. An effort was made to include students of diverse backgrounds and mathematical experiences. Based on these interviews, final revisions have been made. Student exercises on the website and the directions on how instructors can use these materials in their courses are ready for field-testing at other institutions. Faculty interested in participating in the field-test for this project during spring 2006 are encouraged to contact the authors.
This research is funded
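The activities ask students to interpret the observed differences in terms of kinetic energy; a minimal worked example (with illustrative impactor parameters, not values taken from the web tool):

```python
# Kinetic energy E = 1/2 m v^2 for a spherical impactor; at fixed size
# and speed, the energy ratio between two impactors is their density ratio.
import math

def impactor_energy(diameter_m, density_kg_m3, speed_m_s):
    """Kinetic energy [J] of a spherical impactor."""
    volume = (math.pi / 6.0) * diameter_m ** 3   # sphere volume
    mass = density_kg_m3 * volume
    return 0.5 * mass * speed_m_s ** 2

rock = impactor_energy(100.0, 3000.0, 17000.0)   # rocky body
ice = impactor_energy(100.0, 1000.0, 17000.0)    # icy body, same size and speed
print(rock / ice)  # density ratio carries straight through: 3.0
```

This is the reasoning behind the second activity: icy impactors of the same size and speed deliver proportionally less energy than rocky or metallic ones.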

  7. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning

    NARCIS (Netherlands)

    Retalis, Symeon; Sloep, Peter

    2009-01-01

    Retalis, S., & Sloep, P. B. (Eds.) (2009). Collection of 4 symposium papers at EC-TEL 2009. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning. September, 30, 2009, Nice, France. http://sunsite.informatik.rwt

  8. Workshop on the applications of new computer tools to thermal engineering; Applications a la thermique des nouveaux outils informatiques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This workshop on the applications of new computer tools to thermal engineering was organized by the French society of thermal engineers. Seven papers were presented, of which two, dealing with thermal diffusivity measurements in materials and with the optimization of dryers, have been selected for ETDE. (J.S.)

  9. Tools for income mobility analysis

    OpenAIRE

    Philippe Kerm

    2002-01-01

    A set of Stata routines to help the analysis of `income mobility' is presented and illustrated. Income mobility is taken here as the pattern of income change from one time period to another within an income distribution. Multiple approaches have been advocated to assess the magnitude of income mobility. The macros presented provide tools for estimating several measures of income mobility, e.g. the Shorrocks (JET 1978) or King (Econometrica 1983) indices, or summary statistics for transition matrices.
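The Shorrocks index mentioned in the abstract has a simple closed form: M(P) = (n - trace(P)) / (n - 1) for an n-class income transition matrix P whose rows sum to 1. A minimal sketch (in Python rather than Stata, with invented matrices):

```python
# Shorrocks (1978) mobility index: 0 = complete immobility (identity
# matrix), 1 when the origin class carries no information (uniform rows).

def shorrocks_index(P):
    """M(P) = (n - trace(P)) / (n - 1) for an n-state transition matrix."""
    n = len(P)
    trace = sum(P[i][i] for i in range(n))
    return (n - trace) / (n - 1)

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
uniform = [[1 / 3] * 3 for _ in range(3)]
print(shorrocks_index(identity))  # nobody changes income class
print(shorrocks_index(uniform))   # destination independent of origin
```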

  10. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different instrument-observed atmospheric and oceanic data sets. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  11. Finite element analysis of degraded concrete structures - Workshop proceedings

    International Nuclear Information System (INIS)

    This workshop is related to the finite element analysis of degraded concrete structures. It is composed of three sessions. The first session (title: The Use of Finite Element Analysis in Safety Assessments) comprises six papers: Historical Development of Concrete Finite Element Modeling for Safety Evaluation of Accident-Challenged and Aging Concrete Structures; Experience with Finite Element Methods for Safety Assessments in Switzerland; Stress State Analysis of the Ignalina NPP Confinement System; Prestressed Containment: Behaviour when Concrete Cracking is Modelled; Application of FEA for Design and Support of NPP Containment in Russia; Verification Problems of Nuclear Installations Safety Software of Strength Analysis (NISS SA). The second session (title: Concrete Containment Structures under Accident Loads) comprises seven papers: Two Application Examples of Concrete Containment Structures under Accident Load Conditions Using Finite Element Analysis; What Kind of Prediction for Leak Rates for Nuclear Power Plant Containments in Accidental Conditions; Influence of Different Hypotheses Used in Numerical Models for Concrete at Elevated Temperatures on the Predicted Behaviour of NPP Core Catchers under Severe Accident Conditions; Observations on the Constitutive Modeling of Concrete under Multi-Axial States at Elevated Temperatures; Analyses of a Reinforced Concrete Containment with Liner Corrosion Damage; Program of Containment Concrete Control During Operation for the Temelin Nuclear Power Plant; Static Limit Load of a Deteriorated Hyperbolic Cooling Tower. The third session (title: Concrete Structures under Extreme Environmental Load) comprises five papers: Shear Transfer Mechanism of RC Plates After Cracking; Seismic Back Calculation of an Auxiliary Building of the Nuclear Power Plant Muehleberg, Switzerland; Seismic Behaviour of Slightly Reinforced Shear Wall Structures; FE Analysis of Degraded Concrete

  12. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.
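Collective N-version results of this kind are often summarized with robust statistics; a hedged sketch of one such summary (a median with a MAD-based scatter estimate; the drag values below are invented, not workshop data):

```python
# Robust summary of drag coefficients reported by N different codes:
# median plus a MAD-based scatter band, used here to flag an outlier code.
import statistics

def median_and_scatter(values, k=1.4826):
    """Median and scaled MAD (k rescales MAD to ~sigma for normal data)."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return med, k * mad

drag = [0.0271, 0.0268, 0.0274, 0.0269, 0.0302, 0.0270]  # one code far off
med, scatter = median_and_scatter(drag)
outliers = [v for v in drag if abs(v - med) > 3 * scatter]
print(med, scatter, outliers)
```

The MAD is preferred over the standard deviation in such comparisons because a single outlying code does not inflate the scatter estimate.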

  13. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Performing core physics calculations for reload safety analysis is a very demanding and time-consuming process. It generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly run it and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters through a user-friendly GUI. The second part of the reload safety analysis calculations is done by CycleKit, a code linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable HTML format. Using this set of tools for reload safety analysis simplifies
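The internals of QUADRIGA and CycleKit are not described in the abstract; as a generic sketch of the final step they automate, one can check computed core parameters, with uncertainty penalties applied, against safety limits (all parameter names, values, and limits below are hypothetical):

```python
# Generic limit-verification step: apply a multiplicative uncertainty
# penalty to each computed parameter, then compare against its limit.

def verify_limits(results, limits, uncertainty):
    """Return (name, penalized value, limit, ok) for each parameter."""
    report = []
    for name, value in results.items():
        penalized = value * (1.0 + uncertainty.get(name, 0.0))
        limit = limits[name]
        report.append((name, penalized, limit, penalized <= limit))
    return report

results = {"peaking_factor": 1.48, "boron_ppm": 1350.0}    # hypothetical
limits = {"peaking_factor": 1.55, "boron_ppm": 1400.0}
uncertainty = {"peaking_factor": 0.05}                     # 5 % penalty

for name, val, lim, ok in verify_limits(results, limits, uncertainty):
    print(f"{name}: {val:.3f} vs {lim} -> {'OK' if ok else 'VIOLATED'}")
```

The point of automating this step is exactly what the example shows: a value that satisfies its limit before uncertainties are applied can violate it afterwards.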

  14. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  15. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  16. Dialogue and Roles in a Strategy Workshop: Discovering Patterns through Discourse Analysis

    OpenAIRE

    Duffy, Martin

    2010-01-01

    Strategy workshops are frequently used by Executive management teams to discuss and formulate strategy but are under-researched and under-reported in the academic literature. This study uses Discourse Analysis to discover participant roles and dialogic patterns in an Executive management team’s strategy workshop, together with their effect on the workshop’s operation and outcome. The study shows how the workshop participants adopt different roles through their language and content. It then...

  17. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

    This paper presents the evaluation, from a decision analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA) methodology is applied to clarify the decision and develop a decision model. One of the key advantages of IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, this work proposes an addition to it: using the strategic means-ends objective network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives, and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification), CNC (installation of an automatic lathe) and CF (installation of an automatic milling machine). The results are presented as a graph showing zones in which a particular alternative should be selected. They show the potential of IDA to tackle technical decisions that are otherwise approached without due care.
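The utility-function step of such a model can be sketched as ranking alternatives by expected utility under an exponential utility function that encodes risk aversion. All probabilities and payoffs below are invented for illustration, not taken from the paper:

```python
# Expected-utility ranking with exponential utility U(x) = 1 - exp(-x/R),
# where a smaller risk tolerance R means a more risk-averse decision maker.
import math

def expected_utility(outcomes, risk_tolerance):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * (1.0 - math.exp(-x / risk_tolerance)) for p, x in outcomes)

alternatives = {                      # hypothetical payoff lotteries
    "Base": [(1.0, 100.0)],           # no modification: certain, modest payoff
    "CNC": [(0.6, 300.0), (0.4, -50.0)],  # automatic lathe: risky upgrade
    "CF": [(0.5, 250.0), (0.5, 0.0)],     # automatic milling machine
}
R = 200.0
best = max(alternatives, key=lambda a: expected_utility(alternatives[a], R))
print(best)
```

With a sufficiently risk-averse decision maker, the certain Base alternative can beat upgrades with higher expected payoff, which is exactly the kind of zone boundary the paper's result graph depicts.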

  18. Exploration tools in formal concept analysis

    OpenAIRE

    Stumme, Gerd

    1996-01-01

    The development of conceptual knowledge systems specifically requires knowledge acquisition tools within the framework of formal concept analysis. In this paper, the existing tools are presented, and further developments are discussed.

  19. Assessing the interactivity and prescriptiveness of faculty professional development workshops: The Real-Time Professional Development Observation Tool (R-PDOT)

    CERN Document Server

    Olmstead, Alice

    2016-01-01

    Professional development workshops are one of the primary mechanisms used to help faculty improve their teaching, and they draw in many STEM instructors every year. Although workshops serve a critical role in changing instructional practices within our community, we rarely assess them through careful consideration of how they engage faculty. Initial evidence suggests that workshop leaders often overlook central tenets of education research that are well established in classroom contexts, such as the role of interactivity in enabling student learning. As such, there is a need to develop more robust, evidence-based models of how best to support faculty learning in professional development contexts, and to actively support workshop leaders in relating their design decisions to familiar ideas from other educational contexts. In response to these needs, we have developed an observation tool, the Real-Time Professional Development Observation Tool (R-PDOT), to document the form and focus of faculty's engagement dur...

  20. A Workshop in the Analysis of Teaching; Interaction Analysis, Nonverbal Communication, Microteaching, Simulation.

    Science.gov (United States)

    Frymier, Jack R., Ed.

    1968-01-01

    Articles in this issue represent the substantive content of a series of 25 workshops sponsored by the American Association of Colleges for Teacher Education (AACTE). The four major articles discuss innovative models based on four approaches for improving teacher performance: (1) "Interaction Analysis" by Edmund J. Amidon, San Francisco State…

  1. Multi-mission telecom analysis tool

    Science.gov (United States)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission, it is critically important to have fast, easy-to-use, easy-to-integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies, not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  2. Workshop on Thermal Emission Spectroscopy and Analysis of Dust, Disk, and Regoliths

    Science.gov (United States)

    Sprague, Ann L. (Editor); Lynch, David K. (Editor); Sitko, Michael (Editor)

    1999-01-01

    This volume contains abstracts that have been accepted for presentation at the Workshop on Thermal Emission Spectroscopy and Analysis of Dust, Disks, and Regoliths, held April 28-30, 1999, in Houston, Texas.

  3. TECHNICAL NOTES Modelica - Language, Libraries, Tools, Workshop and EU-Project

    OpenAIRE

    Otter, Martin; Elmqvist, Hilding

    2000-01-01

    Modelica is a new language for the convenient modeling of physical systems. This article gives an overview of the language features, the organisation behind the language development, available Modelica libraries and Modelica simulation environments, the recent first workshop on Modelica, and the EU project RealSim, which aims to enhance hardware-in-the-loop simulation and design optimization techniques on the basis of Modelica.

  4. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  5. Machine tool accuracy characterization workshops. Final report, May 5, 1992--November 5, 1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-06

    The ability to assess the accuracy of machine tools is required by both tool builders and users. Builders must have this ability in order to predict the accuracy capability of a machine tool for different part geometries, to provide verifiable accuracy information for sales purposes, and to locate error sources for maintenance, troubleshooting, and design enhancement. Users require the same ability in order to make intelligent choices in selecting or procuring machine tools, to predict component manufacturing accuracy, and to perform maintenance and troubleshooting. In both instances, the ability to fully evaluate the accuracy capabilities of a machine tool and the source of its limitations is essential for using the tool to its maximum accuracy and productivity potential. This project was designed to transfer expertise in modern machine tool accuracy testing methods from LLNL to US industry, and to educate users on the use and application of emerging standards for machine tool performance testing.

  6. ADAPT-A Drainage Analysis Planning Tool

    OpenAIRE

    Boelee, Leonore; Kellagher, Richard

    2015-01-01

    HR Wallingford are a partner in the EU-funded TRUST project. They are involved in Work Package 4.3, Wastewater and stormwater systems, to produce a model and a report on system sustainability analysis and the potential for improvements for stormwater systems as Deliverable 4.3.2. This report is Deliverable 4.3.2. It details the development of the tool ADAPT (A Drainage Analysis and Planning Tool). The objective of the tool is to evaluate the improvement requirements of a stormwat...

  7. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    International Nuclear Information System (INIS)

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by the Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination; fracture mechanics; and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis includes NDE and materials degradation also. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to the applications of probabilistic modelling to nuclear components; Session 4 is a concluding panel discussion

  8. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.

  9. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA

  10. Abstract Interfaces for Data Analysis —Component Architecture for Data Analysis Tools

    Institute of Scientific and Technical Information of China (English)

    G.Barrand; P.Binko; 等

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  11. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. 
The current state of microwear analysis is discussed, as are future directions in the study of microwear on stone and bone tools.

  12. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
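    The merging procedure amounts to chaining pairwise rigid transforms between cube coordinate frames: if cube 2 is known in cube 1's frame and cube 3 in cube 2's, composing the two transforms expresses cube 3 in the global (cube 1) frame. A minimal sketch under invented transforms, in no way derived from the actual NASA software:

```python
# Chain pairwise rigid transforms (R, t) to express every cube's frame
# in cube 1's "global" coordinate system. All values are hypothetical.

def mat_vec(R, v):
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

def mat_mul(A, B):
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(3))
                       for j in range(3)) for i in range(3))

def compose(outer, inner):
    """If inner maps frame c -> b and outer maps b -> a, result maps c -> a."""
    R1, t1 = outer
    R2, t2 = inner
    R = mat_mul(R1, R2)
    t = tuple(mat_vec(R1, t2)[i] + t1[i] for i in range(3))
    return R, t

I = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
Rz90 = ((0, -1, 0), (1, 0, 0), (0, 0, 1))  # 90 degrees about z

T21 = (Rz90, (1, 0, 0))  # cube 2 expressed in cube 1's frame
T32 = (I, (0, 1, 0))     # cube 3 expressed in cube 2's frame
T31 = compose(T21, T32)  # cube 3 in the global (cube 1) frame

def to_global(p):
    """Map a point measured in cube 3's frame to global coordinates."""
    R, t = T31
    return tuple(mat_vec(R, p)[i] + t[i] for i in range(3))
```

    With differential GPS on the cubes, the same composition could anchor the global frame to absolute coordinates instead of to cube 1.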

  13. A workshop report on the development of the Cow's Milk-related Symptom Score awareness tool for young children

    DEFF Research Database (Denmark)

    Vandenplas, Yvan; Dupont, Christophe; Eigenmann, Philippe; Host, Arne; Kuitunen, Mikael; Ribes-Koninck, Carmen; Shah, Neil; Shamir, Raanan; Staiano, Annamaria; Szajewska, Hania; Von Berg, Andrea

    2015-01-01

    Clinicians with expertise in managing children with gastrointestinal problems and/or atopic diseases attended a workshop in Brussels in September 2014 to review the literature and determine whether a clinical score derived from symptoms associated with the ingestion of cow's milk proteins could help primary healthcare providers. The Cow's Milk-related Symptom Score (CoMiSS), which considers general manifestations and dermatological, gastrointestinal and respiratory symptoms, was developed as an awareness tool for cow's milk related symptoms. It can also be used to evaluate and quantify the evolution of symptoms during therapeutic interventions, but it does not diagnose cow's milk protein allergy and does not replace a food challenge. Its usefulness needs to be evaluated by a prospective randomised study. Conclusion: The CoMiSS provides primary healthcare clinicians with a simple, fast and easy...

  14. Tools & Strategies for Social Data Analysis

    OpenAIRE

    Willett, Wesley Jay

    2012-01-01

    Data analysis is often a complex, iterative process that involves a variety of stakeholders and requires a range of technical and professional competencies. However, in practice, tools for visualizing, analyzing, and communicating insights from data have primarily been designed to support individual users. In the past decade a handful of research systems like sense.us and Many Eyes have begun to explore how web-based visualization tools can allow larger groups of users to participate in analyse...

  15. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  16. Performance analysis of GYRO: a tool evaluation

    International Nuclear Information System (INIS)

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses

  17. Virtual Workshop

    DEFF Research Database (Denmark)

    Buus, Lillian; Bygholm, Ann

    In relation to the Tutor course in the Mediterranean Virtual University (MVU) project, a virtual workshop, “Getting experiences with different synchronous communication media, collaboration, and group work”, was held with all partner institutions in January 2006. More than 25 key-tutors from different MVU institutions participated in the workshop. The result of the workshop was experience with different communication tools and media, and with the difficulties and possibilities of collaborating virtually on group work and the development of a shared presentation, all aimed at gathering experience for the learning design of MVU courses. The workshop intended to give the participants the opportunity to gain their own experience with computer-supported collaboration, group work in a virtual environment, synchronous and asynchronous communication media, and different perspectives...

  18. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  19. Quantitative Text Analysis for Literary History - Report on a DARIAH-DE Expert Workshop

    OpenAIRE

    Schöch, Christof; Jannidis, Fotis

    2013-01-01

    The workshop on Quantitative Text Analysis for Literary History was the first in a series of DARIAH-DE expert workshops and took place from November 22 to 23 at the University of Würzburg, Germany. It was organized by Fotis Jannidis and Christof Schöch of the Department for Literary Computing, University of Würzburg, and brought together experts in the computational analysis of collections of literary texts from France, Germany, Poland and the US. This report first briefly provides some conte...

  20. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
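    The two digraph queries described above, effects of a failure set and possible causes of an observed failure, reduce to forward and reverse reachability. A sketch using a simple OR-logic digraph (the component names are invented, and FEAT's actual models and file formats are not reproduced here):

```python
# Illustrative OR-logic failure digraph: an edge X -> Y means
# "failure of X can cause failure of Y". Hypothetical components.
from collections import deque

edges = {
    "power_bus": ["pump_A", "pump_B", "sensor_1"],
    "pump_A": ["coolant_loop"],
    "pump_B": ["coolant_loop"],
    "coolant_loop": ["reactor_trip"],
    "sensor_1": [],
    "reactor_trip": [],
}

def effects(failed):
    """All nodes reachable from the initially failed set (forward BFS)."""
    seen, queue = set(failed), deque(failed)
    while queue:
        for nxt in edges[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def possible_causes(node):
    """All other nodes whose failure could propagate to `node`."""
    return {n for n in edges if n != node and node in effects({n})}
```

    A real digraph tool would also distinguish AND from OR nodes so that redundant components (both pumps here) only propagate a failure jointly; pure OR-logic is the simplifying assumption of this sketch.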

  1. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  2. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  3. Proceedings of the workshop on small angle scattering data analysis. Micelle related topics

    International Nuclear Information System (INIS)

    This workshop was held on December 13 and 14, 1995, at the National Laboratory for High Energy Physics. The purpose of the workshop was explained, and lectures were given on the research on superhigh molecular structure by small angle neutron scattering, the verification of the reliability of WINK data (absolute intensity), the analysis of WINK data, the new data program of SAN, the small angle X-ray scattering data analysis program (SAXS), the basis of the analysis of micelle systems, the analysis software manual and practice program Q-I(Q) ver 1.0, various analysis methods for small angle scattering including the contrast modulation method and others, the ordering of and countermeasures to the problems of WINK, and the future of the KENS small angle scattering facility. How to treat analyses related to micelles, how to save WINK, and how to install the SAN/reflectometer were the matters to be discussed at the workshop. This book collects the summaries of the lectures. (K.I.)

  4. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  5. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  6. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  7. From sensor networks to connected analysis tools

    Science.gov (United States)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  8. Workshop proceedings of ISAMM 2009 - Decision-making, tools, training, risk targets and entrance to SAM

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-10-15

    Five papers were presented during Session 6. These papers covered a broad range of topics including: (i) Transition criteria used for moving from emergency procedures to severe accident procedures; (ii) An accident diagnosis and emergency response decision-making tool; (iii) Views on the use of risk targets and safety goals in view of IAEA recommendations; (iv) Development, validation and training of SAM. (author)

  9. Proceedings Fourth International Workshop on Testing, Analysis and Verification of Web Software

    CERN Document Server

    Salaün, Gwen; Hallé, Sylvain; 10.4204/EPTCS.35

    2010-01-01

    This volume contains the papers presented at the fourth international workshop on Testing, Analysis and Verification of Web Software, which was associated with the 25th IEEE/ACM International Conference on Automated Software Engineering (ASE 2010). The collection of papers includes research on formal specification, model-checking, testing, and debugging of Web software.

  10. 78 FR 33424 - Tobacco Product Analysis; Scientific Workshop; Request for Comments

    Science.gov (United States)

    2013-06-04

    ... tobacco reference products and general testing methods are used to analyze tobacco products (77 FR 14814... HUMAN SERVICES Food and Drug Administration Tobacco Product Analysis; Scientific Workshop; Request for.... The Food and Drug Administration (FDA), Center for Tobacco Products, is announcing a...

  11. Energy demand analysis in the workshop on alternative energy strategies

    Energy Technology Data Exchange (ETDEWEB)

    Carhart, S C

    1978-04-01

    The Workshop on Alternative Energy Strategies, conducted from 1974 through 1977, was an international study group formed to develop consistent national energy alternatives within a common analytical framework and global assumptions. A major component of this activity was the demand program, which involved preparation of highly disaggregated demand estimates based upon estimates of energy-consuming activities and energy requirements per unit of activity reported on a consistent basis for North America, Europe, and Japan. Comparison of the results of these studies reveals that North America requires more energy per unit of activity in many consumption categories, that major improvements in efficiency will move North America close to current European and Japanese efficiencies, and that further improvements in European and Japanese efficiencies may be anticipated as well. When contrasted with expected availabilities of fuels, major shortfalls of oil relative to projected demands emerge in the eighties and nineties. Some approaches to investment in efficiency improvements which will offset these difficulties are discussed.

  12. PREFACE: EMAS 2011: 12th European Workshop on Modern Developments in Microbeam Analysis

    Science.gov (United States)

    Brisset, François; Dugne, Olivier; Robaut, Florence; Lábár, János L.; Walker, Clive T.

    2012-03-01

    This volume of IOP Conference Series: Materials Science and Engineering contains papers from the 12th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis, which took place from the 15-19 May 2011 in the Angers Congress Centre, Angers, France. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a very specific format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with GN-MEBA - Groupement National de Microscopie Electronique à Balayage et de microAnalysis, France. The technical programme included the following topics: the limits of EPMA, new techniques, developments and concepts in microanalysis, microanalysis in the SEM, and new and less common applications of micro- and nanoanalysis. As at previous workshops there was also a special oral session for young scientists. The best presentation by a young scientist was awarded with an invitation to attend the 2012 Microscopy and Microanalysis meeting at Phoenix, Arizona. The prize went to Pierre Burdet, of the Federal Institute of Technology of Lausanne (EPFL), for his talk entitled '3D EDS microanalysis by FIB-SEM: enhancement of elemental quantification'. The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 74 posters from 18 countries were on display at the meeting, and that the participants came from as far away as Japan, Canada and the USA. A selection of participants with posters were invited to give a short oral

  13. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of uncontrolled spacecraft may involve breakup into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments can pose a great hazard and risk to people and property on the ground. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. object-oriented and spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the set of object shapes to 15 types and supports 51 predefined motions together with the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
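The Monte Carlo approach to uncertainty analysis mentioned above can be illustrated with a toy sketch: sample uncertain fragment parameters, apply a demise criterion, and estimate the surviving fraction. All distributions, the heat-load proxy and the threshold below are invented for illustration; they are not the validated aerothermal models used in DRAPS or similar tools.

```python
import random

def sample_fragment(rng):
    # Hypothetical input uncertainties (illustrative ranges only):
    # breakup altitude [km] and ballistic coefficient [kg/m^2].
    h_breakup = rng.uniform(75.0, 85.0)
    beta = rng.uniform(10.0, 500.0)
    return h_breakup, beta

def survives(h_breakup, beta):
    # Toy demise criterion: fragments with a low ballistic coefficient
    # absorb more heat per unit mass and are assumed to ablate fully.
    # The proxy and the 0.2 threshold are invented, not a physical model.
    heat_per_mass = (95.0 - h_breakup) / beta
    return heat_per_mass < 0.2

def survival_fraction(n_samples, seed=1):
    # Monte Carlo estimate of the fraction of fragments reaching the ground
    rng = random.Random(seed)
    hits = sum(survives(*sample_fragment(rng)) for _ in range(n_samples))
    return hits / n_samples
```

Real tools replace the proxy criterion with trajectory integration and ablation modelling for each sampled fragment, and propagate the surviving impacts into a ground-risk estimate.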

  14. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe a formalised variant of this language extended to support the addition of intention-preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest are specified using the temporal logic Probabilistic Computation Tree Logic (PCTL) and we employ stochastic model checking, by means of the model checker PRISM, to compute their exact values. We present a simplified example of a distributed stochastic system where we determine a reachability property...
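The PCTL reachability queries that SBAT delegates to PRISM (of the form `P=? [ F goal ]`) reduce, for a finite discrete-time Markov chain, to a fixed-point computation. A minimal sketch, independent of SBAT and PRISM; the three-state workflow and its probabilities are invented for illustration:

```python
def reach_probability(P, goal, iters=2000):
    # Fixed-point iteration for P(eventually reach a goal state)
    # in a discrete-time Markov chain with transition matrix P.
    n = len(P)
    x = [0.0] * n
    for _ in range(iters):
        x = [1.0 if s in goal else sum(P[s][t] * x[t] for t in range(n))
             for s in range(n)]
    return x

# Toy workflow step: from state 0 the task succeeds (state 1) with
# probability 0.7, retries (back to 0) with 0.1, aborts (state 2) with 0.2.
P = [[0.1, 0.7, 0.2],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
probs = reach_probability(P, goal={1})   # probs[0] converges to 0.7/0.9
```

PRISM solves the same equations exactly (by linear algebra rather than iteration) and additionally handles bounded-until and reward operators.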

  15. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are available: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section covers time-response simulations, including the time response to a random white-noise disturbance. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
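The closed-loop eigenvalue evaluation mentioned above amounts to checking the spectrum of A - BK for a state-feedback design. A minimal sketch independent of MATRIXx; the double-integrator plant and the gain K are illustrative values, not from the paper:

```python
import cmath

def eig2(M):
    # Eigenvalues of a 2x2 matrix from its characteristic polynomial.
    (a, b), (c, d) = M
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

# Hypothetical plant x' = A x + B u (a double integrator)
A = [[0.0, 1.0], [0.0, 0.0]]
B = [0.0, 1.0]
K = [2.0, 3.0]                  # assumed state-feedback gain, u = -K x

# Closed-loop matrix A - B K and its eigenvalues
Acl = [[A[i][j] - B[i] * K[j] for j in range(2)] for i in range(2)]
lams = eig2(Acl)
stable = all(l.real < 0 for l in lams)   # stable iff all poles in LHP
```

For this choice of K the closed-loop poles land at -1 and -2, so the loop is stable; packages such as MATRIXx perform the same check for arbitrary state dimension.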

  16. A tool for subjective analysis of TTOs

    OpenAIRE

    Resende, David Nunes; Gibson, David V.; Jarrett, James

    2011-01-01

    The objective of this article is to present a proposal (working paper) for a quantitative analysis tool to help technology transfer offices (TTOs) improve their structures, processes and procedures. Our research started from the study of internal practices and structures that facilitate the interaction between R&D institutions, their TTOs and regional surroundings. We wanted to identify “bottlenecks” in those processes, procedures, and structures. We mapped the bottlenecks in a set of “...

  17. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March, 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information: http://homer.ou.nl/lsa-workshop0

  18. Application of collapsing methods for continuous traits to the Genetic Analysis Workshop 17 exome sequence data

    OpenAIRE

    Sung Yun; Rice Treva K; Rao Dabeeru C

    2011-01-01

    Abstract Genetic Analysis Workshop 17 used real sequence data from the 1000 Genomes Project and simulated phenotypes influenced by a large number of rare variants. Our aim is to evaluate the performance of various collapsing methods that were developed for analysis of multiple rare variants. We apply collapsing methods to continuous phenotypes Q1 and Q2 for all 200 replicates of the unrelated individuals data. Within each gene, we collapse (1) all SNPs, (2) all SNPs with minor allele frequenc...
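A collapsing method of the kind evaluated here replaces many rare-variant genotypes within a gene by a single burden score per individual, which is then regressed against the continuous phenotype. A minimal sketch of the collapsing step; the MAF cutoff, genotypes and frequencies are invented for illustration and are not the workshop data:

```python
def collapse_burden(genotypes, maf, maf_cut=0.05):
    # genotypes: one row per individual, minor-allele counts per SNP (0/1/2)
    # maf: per-SNP minor allele frequency; only rare SNPs enter the burden
    rare = [j for j, f in enumerate(maf) if f < maf_cut]
    return [sum(person[j] for j in rare) for person in genotypes]

maf = [0.01, 0.20, 0.04]                      # SNP 2 is common: excluded
genotypes = [[1, 2, 0], [0, 1, 1], [2, 0, 1]]
burden = collapse_burden(genotypes, maf)      # one score per individual
```

Variants of this scheme collapse to an indicator (any rare allele present) rather than a count, or weight SNPs inversely by frequency; the burden vector then serves as the predictor in a linear model for Q1 or Q2.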

  19. Data Base Directions: Information Resource Management - Strategies and Tools. Proceedings of the Workshop of the National Bureau of Standards and the Association for Computing Machinery (Ft. Lauderdale, Florida, October 20-22, 1980).

    Science.gov (United States)

    Goldfine, Alan H., Ed.

    This workshop investigated how managers can evaluate, select, and effectively use information resource management (IRM) tools, especially data dictionary systems (DDS). An executive summary, which provides a definition of IRM as developed by workshop participants, precedes the keynote address, "Data: The Raw Material of a Paper Factory," by John…

  20. A content analysis of advertisements for psychotherapy workshops: implications for disseminating empirically supported treatments.

    Science.gov (United States)

    Cook, Joan M; Weingardt, Kenneth R; Jaszka, Jacqueline; Wiesner, Michael

    2008-03-01

    This study involved a content analysis of 261 unique advertisements for psychotherapy workshops that appeared in two bimonthly clinical magazines, Psychotherapy Networker and Counselor, during a 2-year period. Two independent judges coded each advertisement and documented the type and prevalence of the advertising appeals used. Rogers' (2003) five perceived characteristics of innovations, drawn from the seminal diffusion of innovations model and found to influence adoption in diverse fields, were not well represented in these workshop appeals, each appearing in fewer than 10% of advertisements. Few advertisements cited specific empirically supported treatments or presented any evidence of treatment effectiveness beyond expert testimonials. The most frequently noted appeals were benefits to the clinician (e.g., earning education credit or developing skills), characteristics that enhance the credibility of the workshop (e.g., reference to a storied history or mention of faculty), and features of the advertisement itself (e.g., use of superlatives and exclamation points). Promotional strategies used to advertise psychotherapy workshops can inform the dissemination of empirically supported treatments. PMID:18271002

  1. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various user communities, including energy and health. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to reconcile human and computer perspectives on how data and scientific techniques are applied, while processing multiple users' tasks simultaneously. Future development includes expanding the data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and will thus allow climate studies to be conducted for international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System Reanalysis (CFSR) data and NOAA model output, including output from the National Multi-Model Ensemble Prediction System (NMME) and longer-term projection models. We also plan to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
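Of the analyses LCAT offers, trend analysis is the simplest to sketch: an ordinary least-squares slope over a local climate time series, commonly reported per decade. A minimal illustration with synthetic data; LCAT's actual implementation is not shown here:

```python
def trend_per_decade(years, values):
    # Ordinary least-squares slope of values vs. years, scaled per decade.
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return 10.0 * num / den

# Synthetic annual-mean temperatures warming by 0.02 degrees per year
years = list(range(1980, 2020))
temps = [14.0 + 0.02 * (y - 1980) for y in years]
slope = trend_per_decade(years, temps)   # expected ~0.2 deg per decade
```

A production tool would add significance testing and handle missing years; the slope itself is the quantity a user compares across stations.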

  2. Microfracturing and new tools improve formation analysis

    Energy Technology Data Exchange (ETDEWEB)

    McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))

    1992-12-07

    This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wireline logs, and temperature logging in air-filled holes, new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are usually created at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open-hole parameters for designing the main fracture treatment.

  3. DEVELOPING NEW TOOLS FOR POLICY ANALYSIS

    International Nuclear Information System (INIS)

    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U.S. Department of Energy (DOE) regulations and directives related to safeguards and security (S and S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire S and S directives set will be conducted using software tools to analyze the present directives with a view toward (1) identifying areas of positive synergism among topical areas, (2) identifying areas of unnecessary duplication within and among topical areas, and (3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

  4. Materials characterization center workshop on compositional and microstructural analysis of nuclear waste materials. Summary report

    International Nuclear Information System (INIS)

    The purpose of the Workshop on Compositional and Microstructural Analysis of Nuclear Waste Materials, conducted November 11 and 12, 1980, was to critically examine and evaluate the various methods currently used to study non-radioactive, simulated, nuclear waste-form performance. Workshop participants recognized that most of the Materials Characterization Center (MCC) test data for inclusion in the Nuclear Waste Materials Handbook will result from application of appropriate analytical procedures to waste-package materials or to the products of performance tests. Therefore, the analytical methods must be reliable and of known accuracy and precision, and results must be directly comparable with those from other laboratories and from other nuclear waste materials. The 41 participants representing 18 laboratories in the United States and Canada were organized into three working groups: Analysis of Liquids and Solutions, Quantitative Analysis of Solids, and Phase and Microstructure Analysis. Each group identified the analytical methods favored by their respective laboratories, discussed areas needing attention, listed standards and reference materials currently used, and recommended means of verifying interlaboratory comparability of data. The major conclusions from this workshop are presented

  5. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium-sized enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research that delineated the problem: companies still struggle with long setup times, and many of them do nothing to decrease them. A long setup alone is not sufficient reason for companies to undertake any action toward setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The proposed methodology can genuinely encourage management to decide on a SMED implementation, as was verified in a production company. The setup analysis methodology consists of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a workstand that is a bottleneck with many setups; there, the goal is to convince management to begin actions concerning setup improvement. The last three steps relate to a particular setup, where the goal is to reduce the setup time and the risk of problems that can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA were used.
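The Pareto analysis step of such a methodology can be sketched as ranking setup-loss causes by time lost and accumulating their share, so that the "vital few" causes covering roughly 80% of lost time stand out. A minimal illustration; the cause names and minutes are invented, not from the paper's case study:

```python
def pareto(causes):
    # causes: {cause name: minutes lost per changeover (illustrative)}
    # Returns causes ranked by loss, each with cumulative share of total.
    total = sum(causes.values())
    ranked = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)
    out, cum = [], 0.0
    for name, minutes in ranked:
        cum += minutes / total
        out.append((name, minutes, cum))
    return out

losses = {"searching for tools": 50, "cleaning": 25,
          "adjustment": 15, "paperwork": 10}
ranking = pareto(losses)
# The 'vital few': causes up to ~80% cumulative share of lost time
vital_few = [name for name, _, cum in ranking if cum <= 0.8]
```

In a SMED study, the vital few typically become the first candidates for conversion from internal to external setup activities.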

  6. 11th International Workshop in Model-Oriented Design and Analysis

    CERN Document Server

    Müller, Christine; Atkinson, Anthony

    2016-01-01

    This volume contains pioneering contributions to both the theory and practice of optimal experimental design. Topics include the optimality of designs in linear and nonlinear models, as well as designs for correlated observations and for sequential experimentation. There is an emphasis on applications to medicine, in particular, to the design of clinical trials. Scientists from Europe, the US, Asia, Australia and Africa contributed to this volume of papers from the 11th Workshop on Model Oriented Design and Analysis.

  7. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make an optimal medical decision?' To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concepts of the ORAD program.
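The Bayesian analysis discussed here turns a test's sensitivity and specificity, together with a pre-test (prior) probability, into a post-test probability of disease. A minimal sketch; the numerical values are illustrative, not from the article:

```python
def post_test_probability(prior, sensitivity, specificity, positive=True):
    # Bayes' rule: P(disease | test result) from test characteristics.
    if positive:
        p_r_d, p_r_nd = sensitivity, 1.0 - specificity
    else:
        p_r_d, p_r_nd = 1.0 - sensitivity, specificity
    num = p_r_d * prior
    return num / (num + p_r_nd * (1.0 - prior))

# Illustrative numbers: 10% pre-test probability, sens 0.90, spec 0.80
p_pos = post_test_probability(0.10, 0.90, 0.80, positive=True)
p_neg = post_test_probability(0.10, 0.90, 0.80, positive=False)
```

With these numbers a positive result raises the probability from 10% to about 33%, while a negative result lowers it to about 1.4%, which is exactly the kind of revision a decision matrix or ROC threshold analysis builds upon.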

  8. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. To minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and the authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  9. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized 'intelligent' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  10. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized 'intelligent' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  11. Workshop proceedings of ISAMM 2009 - Code analysis for supporting SAMGs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-10-15

    Nine papers were presented in Sessions 4 and 5 on the general subject of 'Code Analysis for Supporting SAMGs'. The papers addressed a wide spectrum of topics and described deterministic analyses of several different reactor designs. The papers stimulated several useful and interesting comments and discussions. Discussion on the various topics can be summarized in three broad areas: (1) deterministic consequence analysis, (2) use of deterministic analysis to identify and verify SAM measures, and (3) investigations of particular severe accident phenomena. General conclusions are as follows: (i) Technical progress was reported in modelling severe accident and consequence analysis, permitting realistic updates or revisions to past analyses of quantitative estimates of offsite radiological accident consequences. However, applications of these models to representative severe accident sequences led to very different conclusions regarding the extent to which quantitative health objectives (QHOs) have been achieved after severe accident measure implementation. (ii) Differences of opinion remain in the confidence with which SAM guidance can recommend the reintroduction of water to molten core debris, during either the in-vessel or the ex-vessel phase of the accident. Concerns regarding the detrimental side effects of the interaction between water and core debris (especially at low coolant flow rates) have not been fully resolved. (iii) Severe accident sequences involving induced steam generator tube rupture (SGTR) remain an important contributor to risk for some PWRs and a significant challenge for developing SAM measures. The relationship between the time of this event and the time at which creep rupture occurs at other locations in the RCS (e.g., hot leg) has a first-order impact on the radiological source term to the environment. If hot leg creep rupture occurs soon after tube rupture, a substantial fraction of fission products are discharged into

  12. Scientific Ballooning Technologies Workshop STO-2 Thermal Design and Analysis

    Science.gov (United States)

    Ferguson, Doug

    2016-01-01

    The heritage thermal model for the full STO-2 (Stratospheric Terahertz Observatory II), vehicle has been updated to model the CSBF (Columbia Scientific Balloon Facility) SIP-14 (Scientific Instrument Package) in detail. Analysis of this model has been performed for the Antarctica FY2017 launch season. Model temperature predictions are compared to previous results from STO-2 review documents.

  13. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes grinding components accuracy and metrology shear stress in cutting cutting temperature and analysis chatter They also address non-traditional machining, such as: electrical discharge machining electrochemical machining laser and electron beam machining A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  14. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    Applications of ultrasonic vibration to material removal processes exist predominantly in the vertical processing of hard and brittle materials, because vertically vibrating oscillators deliver the greatest direct penetration, enabling material removal from the workpiece by abrasive grains. For milling processes, however, vertical vibration must be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to use ultrasonic lateral transformation theory to optimize processing efficiency, employing the finite element method for the design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to obtain the best performance under consistent conditions, namely using the same piezoelectric ceramics.

  15. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology-related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  16. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of a tool created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  17. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis and the key figures of each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  18. Summary of the workshop on structural analysis needs for magnetic fusion energy superconducting magnets

    International Nuclear Information System (INIS)

    The technical portions of the meeting were divided into three major sessions as follows: (1) Review of methods being presently used by the MFE community for structural evaluation of current designs. (2) Future structural analysis needs. (3) Open discussions dealing with adequacy of present methods, the improvements needed for MFE magnet structural analysis, and the establishment of an MFE magnet structural advisory group. Summaries of the individual talks presented on Wednesday and Thursday (i.e., items 1 and 2 above) are included following the workshop schedule given later in this synopsis

  19. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    From 1983 to 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports to this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP)

  20. ISHM Decision Analysis Tool: Operations Concept

    Science.gov (United States)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses from which the crew can select. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  1. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and similar problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties, and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.

  2. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, Klenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
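
    The orbit-integrated fluence described above is, at its core, a numerical integration of flux samples over mission time. A minimal sketch of that step (the function name and data layout are illustrative, not SAVANT's actual interface):

```python
def mission_fluence(times_s, flux_cm2s):
    """Total omnidirectional fluence (particles/cm^2) from sampled
    flux (particles/cm^2/s) via trapezoidal integration over mission
    elapsed time. Illustrative sketch only; SAVANT's internals are
    not described in this abstract.
    """
    total = 0.0
    for i in range(len(times_s) - 1):
        dt = times_s[i + 1] - times_s[i]
        total += 0.5 * (flux_cm2s[i] + flux_cm2s[i + 1]) * dt
    return total
```

    Shielding by the coverglass and adhesive would then attenuate this fluence before a displacement damage model is applied.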

  3. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state-of-the-art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  4. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system's properties. Measured and controlled quantities in the system are related to variables through functional relations, which need only be stated as names; their explicit composition need not be described to the tool. The user enters a list of these relations that together describe the entirety of the system. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  5. Proceedings of the International Workshop on: methods and tools for water-related adaptation to climate change and climate proofing

    NARCIS (Netherlands)

    Moerwanto, A.S.; Driel, van W.; Susandi, A.; Schrevel, A.; Meer, van der P.J.; Jacobs, C.

    2010-01-01

    The workshop fits in the National Water Plan of the Netherlands’ government, of which the international chapter includes the strengthening of cooperation with other delta countries, including Indonesia, Vietnam and Bangladesh, and is part of the work plan of the Cooperative Programme on Water and Climate.

  6. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    We present an “Integrated Protein Analysis Tool” (IPAT) that is able to perform the following tasks in segregating and annotating genomic data. Protein Editor: enables the entry of nucleotide/amino acid sequences. Utilities: IPAT enables conversion of a given nucleotide sequence to its equivalent amino acid sequence. Secondary Structure Prediction: available via three algorithms (GOR-I, the Gibrat method, and DPM, the Double Prediction Method), with graphical display. Profiles and Properties: calculates eight physico-chemical profiles and properties, viz. hydrophobicity, hydrophilicity, antigenicity, transmembranous regions, solvent accessibility, molecular weight, absorption factor and amino acid content. IPAT has a provision for viewing a helical-wheel projection of a selected region of a given protein sequence and a 2D alpha-carbon representation. IPAT was developed using UML (the Unified Modeling Language) for modeling the project elements, coded in Java, and subjected to unit testing, path testing, and integration testing. This project mainly concentrates on butyrylcholinesterase, predicting its secondary structure and its physico-chemical profiles and properties.
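
    Physico-chemical profiles of the kind listed above are typically computed as sliding-window averages over per-residue scales. A minimal sketch of a hydrophobicity profile using the standard Kyte-Doolittle scale (the window size is an assumed choice; this is not IPAT's actual code):

```python
# Sliding-window hydropathy profile (Kyte-Doolittle scale).
# Illustrative sketch of one of the eight profiles the abstract
# mentions; window size and scale choice are assumptions.
KD = {
    'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
    'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
    'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
    'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2,
}

def hydropathy_profile(seq, window=9):
    """Mean hydropathy over a window centred on each residue."""
    half = window // 2
    scores = [KD[res] for res in seq]
    return [sum(scores[i - half:i + half + 1]) / window
            for i in range(half, len(seq) - half)]
```

    The other profiles (hydrophilicity, antigenicity, and so on) follow the same pattern with different per-residue scales.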

  7. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  8. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection and stresses.

  9. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  10. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary

    Science.gov (United States)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.

    2013-04-01

    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of directly observing foraging behavior, data from time-depth recorders and other types of behavioral data recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. Often the definitions of actual behavioral units and analytical approaches may vary substantially which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of various methods and approaches that are used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries from a diversity of backgrounds including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases when temporal autocorrelation structure is useful for identifying behaviors of interest. 
Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques in order to ensure that results are comparable across studies and taxonomic groups.
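
    Much of this discussion of behavioural units starts from the most basic one, the dive itself. A minimal sketch of threshold-based dive extraction from a regularly sampled depth record (the 2 m threshold and list-based layout are illustrative assumptions, not a workshop recommendation):

```python
def extract_dives(depths, threshold=2.0):
    """Split a regularly sampled depth record (metres) into dives.

    A dive is a maximal run of samples deeper than `threshold`;
    returns (start_index, end_index, max_depth) per dive. The 2 m
    threshold is an arbitrary choice to skip surface noise.
    """
    dives, start = [], None
    for i, d in enumerate(depths):
        if d > threshold and start is None:
            start = i
        elif d <= threshold and start is not None:
            dives.append((start, i - 1, max(depths[start:i])))
            start = None
    if start is not None:  # record ends mid-dive
        dives.append((start, len(depths) - 1, max(depths[start:])))
    return dives
```

    Per-dive metrics such as duration, maximum depth, and shape can then be derived from these segments, which is where the definitional choices debated at the workshop begin to matter.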

  11. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  12. Interactive Graphics Tools for Analysis of MOLA and Other Data

    Science.gov (United States)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

  13. Predictive tools and data needs for long term performance of in-situ stabilization and containment systems: DOE/OST stabilization workshop, June 26-27, Park City, Utah

    International Nuclear Information System (INIS)

    This paper summarizes the discussion within the Predictive Tools and Data Needs for Long Term Performance Assessment Subgroup. This subgroup formed at the DOE Office of Science and Technology workshop to address long-term performance of in situ stabilization and containment systems. The workshop was held in Park City, Utah, 26 and 27 June, 1996. All projects, engineering and environmental, have built-in decision processes that involve varying risk/reward scenarios. Such decision processes may be awkward to describe but are utilized every day, following approaches that range from the intuitive to the advanced mathematical and numerical. Examples are the selection of the components of a home sound system, the members of a sports team, the investments in a portfolio, and the members of a committee. Inherent in the decision method are an understanding of the function or process of the system requiring a decision or prediction, and an understanding of the criteria on which decisions are made, such as cost, performance, durability and verifiability. Finally, this process requires a means to judge or predict how the objects, activities, people and processes being analyzed will perform relative to the operations and functions of the system and relative to the decision criteria posed for the problem. These risk and decision analyses are proactive and iterative throughout the life of a remediation project. Predictions inherent to the analyses are based on intuition, experience, trial and error, and systems analysis, often using numerical approaches.

  14. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    An example of a prototype for a digital conceptual design tool with integrated real time structural analysis is presented and compared with a more common Building Information Modelling (BIM) approach. It is concluded that a digital conceptual design tool with embedded real time structural analysis...

  15. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  16. A Multidimensional Analysis Tool for Visualizing Online Interactions

    Science.gov (United States)

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  17. The physics analysis tools project for the ATLAS experiment

    International Nuclear Information System (INIS)

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≅1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, covering a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis tool-kits (frameworks) whose goal is to aid physicists in performing their analyses while hiding the details of the ATHENA framework. (authors)

  18. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
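
    The likelihood ratio test is built on pairwise comparison statistics; a common choice for striated-mark profilometry data is the maximum correlation between two depth profiles over a window of relative shifts. A hedged sketch (the lag window is an assumed parameter, and real pipelines first level and filter the profiles; this is not the authors' actual statistic):

```python
def max_corr(a, b, max_lag=50):
    """Maximum Pearson correlation between two equal-length depth
    profiles over integer lags in [-max_lag, max_lag]. Assumes the
    overlapping segments are non-constant (non-zero variance)."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        sxx = sum((xi - mx) ** 2 for xi in x)
        syy = sum((yi - my) ** 2 for yi in y)
        return sxy / (sxx * syy) ** 0.5

    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[lag:], b[:len(b) - lag]
        else:
            x, y = a[:len(a) + lag], b[-lag:]
        if len(x) > 2:
            best = max(best, pearson(x, y))
    return best
```

    Samples of such scores for known-match and known-non-match pairs are what a likelihood ratio test would then compare.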

  19. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  20. Interactive Decision Analysis; Proceedings of an International Workshop on Interactive Decision Analysis and Interpretative Computer Intelligence, Laxenburg, Austria, September 20-23, 1983

    OpenAIRE

    Grauer, M.; A.P. Wierzbicki

    1984-01-01

    An International Workshop on Interactive Decision Analysis and Interpretative Computer Intelligence was held at IIASA in September 1983. The Workshop was motivated, firstly, by the realization that the rapid development of computers, especially microcomputers, will greatly increase the scope and capabilities of computerized decision-support systems. It is important to explore the potential of these systems for use in handling the complex technological, environmental, economic and social problems...

  1. Papers presented during the Narodowy Bank Polski Workshop: Recent trends in the real estate market and its analysis, 2013, Volume 2.

    OpenAIRE

    Hanna Augustyniak; Jacek Łaszek; Krzysztof Olszewski; Joanna Waszczuk; Robert Leszczyński; Franz Fuerst; Wayne T. Lim; Matysiak, George A.; Wojciech Doliński; María Jesús Bárcena; Patricia Menéndez; María Blanca Palacios; Fernando Tusell; Branimir Jovanovic; Martin Lux

    2014-01-01

    The Narodowy Bank Polski organized an international workshop during November 14-15, 2013 to discuss current issues in the field of real estate analysis from the central bank’s point of view. The development of residential real estate prices as well as commercial real estate prices and real estate financing were also covered during the workshop. The workshop was aimed at researchers who work in academia, private firms and central banks. The conference focused on topics: Real estate finance, fi...

  2. Papers presented during the Narodowy Bank Polski Workshop: Recent trends in the real estate market and its analysis, 2013. Volume 1

    OpenAIRE

    Hanna Augustyniak; Jacek Łaszek; Krzysztof Olszewski; Derry O’Brien; Thomas Westermann; Zbigniew Krysiak; Kazimierz Kirejczyk; Michael Lea; Guenter Karl; Andrey Tumanov; Evgeniya Zhelezova; Otmar M. Stöcker; Hans-Joachim Dübel; Agnieszka Tułodziecka; Agnieszka Nierodka

    2014-01-01

    The Narodowy Bank Polski organized an international workshop during November 14-15, 2013 to discuss current issues in the field of real estate analysis from the central bank’s point of view. The development of residential real estate prices as well as commercial real estate prices and real estate financing were also covered during the workshop. The workshop was aimed at researchers who work in academia, private firms and central banks. The conference focused on topics: Real estate finance, fi...

  3. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
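
    The patent summary does not state the model equations, but a standard lossless one-dimensional transmission-line result for a rigidly terminated channel is Z_in = -j Z0 cot(kL), which might be sketched as follows (the function, its defaults, and the lossless assumption are illustrative, not the tool's actual model):

```python
import cmath
import math

def channel_impedance(length_m, freq_hz, c=343.0, rho=1.21):
    """Input impedance of a rigidly terminated acoustic channel under
    the lossless 1-D transmission-line model: Z = -j*rho*c*cot(k*L).
    Illustrative only; a practical liner model adds visco-thermal
    losses and facesheet effects."""
    k = 2 * math.pi * freq_hz / c       # wavenumber
    z0 = rho * c                        # characteristic impedance
    return -1j * z0 / cmath.tan(k * length_m)
```

    At the quarter-wave resonance (kL = pi/2) the magnitude of the input impedance collapses toward zero, which is the absorption mechanism a liner channel exploits.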

  4. Using Visual Tools for Analysis and Learning

    OpenAIRE

    Burton, Rob; Barlow, Nichola; Barker, Caroline

    2010-01-01

    This pack is intended as a resource for lecturers and students to facilitate the further development of their learning and teaching strategies. Visual tools were initially introduced within a module of the Year 3 nursing curriculum at the University of Huddersfield by Dr Rob Burton. Throughout 2007-2008, a small team of lecturers with a keen interest in this teaching and learning strategy engaged in exploring and reviewing the literature. They also attended a series of loc...

  5. Tools for Physics Analysis in CMS

    International Nuclear Information System (INIS)

    The CMS Physics Analysis Toolkit (PAT) is presented. The PAT is a high-level analysis layer enabling the development of common analysis efforts across and within physics analysis groups. It aims at fulfilling the needs of most CMS analyses, providing both ease-of-use for the beginner and flexibility for the advanced user. The main PAT concepts are described in detail and some examples from realistic physics analyses are given.

  6. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.
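
    As a toy illustration of the logic-model evaluation such tools automate, here is top-event probability for a small fault tree of AND/OR gates over independent basic events (the gate layout and probabilities are invented for the example, not taken from STARS):

```python
# Minimal fault-tree evaluation: top-event probability for a tree of
# AND/OR gates over independent basic events. The tree and numbers
# below are invented for illustration.
def evaluate(node, probs):
    if isinstance(node, str):              # basic event
        return probs[node]
    gate, children = node[0], node[1:]
    p = [evaluate(c, probs) for c in children]
    if gate == 'AND':                      # all children must fail
        out = 1.0
        for pi in p:
            out *= pi
        return out
    if gate == 'OR':                       # any child failing suffices
        out = 1.0
        for pi in p:
            out *= (1.0 - pi)
        return 1.0 - out
    raise ValueError(gate)

# Top event: pump fails AND (valve stuck OR sensor dead)
tree = ('AND', 'pump', ('OR', 'valve', 'sensor'))
p_top = evaluate(tree, {'pump': 0.01, 'valve': 0.05, 'sensor': 0.02})
```

    An expert-system front end of the kind STARS describes would construct such trees from hazard knowledge rather than require them to be entered by hand.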

  7. Total life cycle management - assessment tool an exploratory analysis

    OpenAIRE

    Young, Brad de

    2008-01-01

    It is essential for the Marine Corps to ensure the successful supply, movement and maintenance of an armed force in peacetime and combat. Integral to an effective, long-term logistics plan is the ability to accurately forecast future requirements to sustain materiel readiness. The Total Life Cycle Management Assessment Tool (TLCM-AT) is a simulation tool combining operations, maintenance, and logistics. This exploratory analysis gives insight into the factors used by TLCM-AT beyond the tool's emb...

  8. Dynamics of the 1054 UT March 22, 1979, substorm event - CDAW 6. [Coordinated Data Analysis Workshop

    Science.gov (United States)

    Mcpherron, R. L.; Manka, R. H.

    1985-01-01

    The Coordinated Data Analysis Workshop (CDAW 6) has the primary objective to trace the flow of energy from the solar wind through the magnetosphere to its ultimate dissipation in the ionosphere. An essential role in this energy transfer is played by magnetospheric substorms; however, the details are not yet completely understood. The International Magnetospheric Study (IMS) has provided an ideal data base for the study conducted by CDAW 6. The present investigation is concerned with the 1054 UT March 22, 1979, substorm event, which had been selected for detailed examination in connection with the studies performed by CDAW 6. The observations of this substorm are discussed, taking into account solar wind conditions, ground magnetic activity on March 22, 1979, observations at synchronous orbit, observations in the near geomagnetic tail, and the onset of the 1054 UT expansion phase. Substorm development and magnetospheric dynamics are discussed on the basis of a synthesis of the observations.

  9. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  10. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  11. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, to analyze single and multiple fault scenarios, and to automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are presented.

  12. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient model checking, model-checking tools, symbolic model checking, visual tools, and verification of critical systems.

  13. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  14. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a database for storing system designs and results of analysis.

  15. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  17. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    The purpose of this paper is to develop a simplified ship collision analysis tool to rapidly estimate the structural damage and energy absorption of both the striking and the struck ship, as well as to predict rupture of the cargo oil tanks of struck tankers. The present tool calculates the external and internal dynamics independently. The two-dimensional motions of both ships in the horizontal plane are taken into account. Structural deformation of both the striking and the struck ship is evaluated independently using a rigid-plastic simplified analysis procedure. The developed tool...

  18. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
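
    A station-level financial analysis of this kind rests on discounted-cash-flow arithmetic. The sketch below shows a generic net-present-value calculation with invented figures; it is not H2FAST's actual model, which covers far more detail:

```python
# Generic discounted-cash-flow helper; the capex and revenue numbers
# are invented for illustration, not H2FAST assumptions.

def npv(rate, cashflows):
    # cashflows[0] is the year-0 (investment) flow, then one flow per year
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical station: $1M capex, then $250k/year net revenue for 10 years
station = [-1_000_000] + [250_000] * 10
print(round(npv(0.08, station), 2))  # positive NPV at an 8% discount rate
```

    A scenario tool essentially repeats this calculation across many input combinations and presents the sensitivity of the result.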

  19. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete...
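
    A common foundation for such pointer analyses is an inclusion-based (Andersen-style) points-to computation. The toy two-statement language below is a simplification for illustration, not the thesis's JavaScript analysis:

```python
# Flow-insensitive points-to analysis over a tiny assignment language:
# ("new", x, obj) means x = new obj; ("copy", x, y) means x = y.
# Iterates the inclusion constraints to a fixed point.

def points_to(stmts):
    pts = {}
    changed = True
    while changed:
        changed = False
        for s in stmts:
            if s[0] == "new":
                tgt = pts.setdefault(s[1], set())
                if s[2] not in tgt:
                    tgt.add(s[2])
                    changed = True
            else:  # "copy": everything y may point to, x may point to
                src = pts.get(s[2], set())
                tgt = pts.setdefault(s[1], set())
                if not src <= tgt:
                    tgt |= src
                    changed = True
    return pts

prog = [("new", "a", "o1"), ("copy", "b", "a"), ("new", "b", "o2")]
print(sorted(points_to(prog)["b"]))  # ['o1', 'o2']
```

    Real JavaScript analyses must additionally model objects, prototypes, and dynamic property accesses, which is where the complication the abstract mentions comes from.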

  20. A New Web-based Tool for Aerosol Data Analysis: the AERONET Data Synergy Tool

    Science.gov (United States)

    Giles, D. M.; Holben, B. N.; Slutsker, I.; Welton, E. J.; Chin, M.; Schmaltz, J.; Kucsera, T.; Diehl, T.

    2006-12-01

    The Aerosol Robotic Network (AERONET) provides important aerosol microphysical and optical properties via an extensive distribution of continental sites and sparsely-distributed coastal and oceanic sites among the major oceans and inland seas. These data provide only spatial point measurements while supplemental data are needed for a complete aerosol analysis. Ancillary data sets (e.g., MODIS true color imagery and back trajectory analyses) are available by navigating to several web data sources. In an effort to streamline aerosol data discovery and analysis, a new web data tool called the "AERONET Data Synergy Tool" was launched from the AERONET web site. This tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. The Data Synergy Tool user can access these data sources to obtain properties such as the optical depth, composition, absorption, size, spatial and vertical distribution, and source region of aerosols. AERONET Ascension Island and COVE platform site data will be presented to highlight the Data Synergy Tool capabilities in analyzing urban haze, smoke, and dust aerosol events over the ocean. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.

  1. A 3D image analysis tool for SPECT imaging

    Science.gov (United States)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
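
    The simplest of the ideas above, intensity-based thresholding followed by voxel-count volume estimation, can be sketched as follows; the threshold fraction and voxel size are arbitrary choices for illustration, not the paper's values:

```python
# Threshold an image at a fraction of its peak intensity and estimate
# the volume of the segmented region by counting voxels.

def segment_and_volume(image, frac=0.5, voxel_ml=0.1):
    # image: dict mapping (x, y, z) voxel coordinates to intensity
    peak = max(image.values())
    mask = {v for v, i in image.items() if i >= frac * peak}
    return mask, len(mask) * voxel_ml    # volume = voxel count * voxel volume

# A 4x4x4 test image containing a 2x2x2 bright "organ"
img = {(x, y, z): 0.0 for x in range(4) for y in range(4) for z in range(4)}
for x in range(1, 3):
    for y in range(1, 3):
        for z in range(1, 3):
            img[(x, y, z)] = 10.0
mask, vol = segment_and_volume(img)
print(len(mask), round(vol, 3))  # 8 voxels, 0.8 ml
```

    Fuzzy-connectedness segmentation, as explored in the paper, replaces the hard threshold with a connectivity strength between voxels, which is more robust near diffuse boundaries.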

  2. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  3. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, Judo competition is an intriguing complex nonlinear system, with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed beforehand, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team), like Judo, to assist in a useful way the difficult task of the coach, or better of national or Olympic coaching teams. This paper presents a short summary of the most important methodological achievements in judo match analysis. It also presents, in light of the latest technological improvements, the first systematization toward new fiel...

  4. Development of microfluidic tools for cell analysis

    Czech Academy of Sciences Publication Activity Database

    Václavek, Tomáš; Křenková, Jana; Foret, František

    Brno: Ústav analytické chemie AV ČR, v. v. i, 2015 - (Foret, F.; Křenková, J.; Drobníková, I.; Klepárník, K.), s. 209-211 ISBN 978-80-904959-3-7. [CECE 2015. International Interdisciplinary Meeting on Bioanalysis /12./. Brno (CZ), 21.09.2015-23.09.2015] R&D Projects: GA ČR(CZ) GBP206/12/G014; GA ČR(CZ) GA14-06319S Institutional support: RVO:68081715 Keywords : microfluidic device * 3D- printing * single cell analysis Subject RIV: CB - Analytical Chemistry, Separation http://www.ce-ce.org/CECE2015/CECE%202015%20proceedings_full.pdf

  5. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work — not only with textual representations of Petri nets — but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area. It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools.

  6. Development of data analysis tool for combat system integration

    Science.gov (United States)

    Shin, Seung-Chun; Shin, Jong-Gye; Oh, Dae-Kyun

    2013-03-01

    System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT by presenting the test results.

  7. Application of Multivariate Analysis Tools to Industrial Scale Fermentation Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregård, Rasmus; Stocks, Stuart M.;

    The analysis of batch process data can provide insight into the process operation, and there is a vast amount of historical data available for data mining. Empirical modelling utilising this data is desirable where there is a lack of understanding regarding the underlying process (Formenti et al. ...) ... concentration (Nomikos and MacGregor 1995). Multivariate analysis is a powerful tool for investigating large data sets by identification of trends in the data. However, there are also challenges associated with the application of multivariate analysis tools to batch process data. This is due to issues related ... application of multivariate methods to industrial scale process data to cover these considerations.

  8. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirements. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis.
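
    Independent of any particular harmonic-balance package, the two-tone intermodulation products that such an analysis must resolve are easy to enumerate. A sketch (frequencies chosen arbitrarily, in MHz):

```python
# Enumerate the distinct intermodulation product frequencies |m*f1 + n*f2|
# for a two-tone excitation, up to a given mixing order |m| + |n|.

def intermod_products(f1, f2, max_order=3):
    freqs = set()
    for m in range(-max_order, max_order + 1):
        for n in range(-max_order, max_order + 1):
            if 0 < abs(m) + abs(n) <= max_order:
                freqs.add(abs(m * f1 + n * f2))
    return sorted(freqs)

# The troublesome third-order products 2*f1 - f2 and 2*f2 - f1
# fall close to the carriers themselves:
print(intermod_products(900.0, 910.0))
```

    It is exactly because these products land near the carriers that intermodulation analysis needs dense frequency grids, which drives the computing time and memory the paper sets out to reduce.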

  9. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to mathematical modeling of physical engineering systems. In this paper, connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework.

  10. Safety and reliability of plant technology: Application of knowledge based systems technology in analysis and management of power and process plant components and systems. Vol. 3. SPRINT/KBS dissemination workshops of MPA seminar

    International Nuclear Information System (INIS)

    The third volume of this report covers the following topics: 1) SPRINT SP249 and RA230 dissemination workshop: Implementation of power plant component life assessment technology using a knowledge-based system, 2) SPRINT RA230 and SP249 dissemination workshop: Methodologies, tools and systems, 3) SPRINT SP249 and RA230 dissemination workshop: KBS-dissemination and case studies, 4) SPRINT SP249 and RA230 dissemination workshop: KBS-dissemination and data, 5) SPRINT SP249 and RA230 dissemination workshop: KBS-dissemination and plant operating. (orig./GL)

  11. Proceedings of the 1st Space Plasma Computer Analysis Network (SCAN) Workshop. [space plasma computer networks

    Science.gov (United States)

    Green, J. L.; Waite, J. H.; Johnson, J. F. E.; Doupnik, J. R.; Heelis, R. A.

    1983-01-01

    The purpose of the workshop was to identify specific cooperative scientific study topics within the discipline of ionosphere-magnetosphere coupling processes and to develop methods and procedures to accomplish this cooperative research using SCAN facilities. Cooperative scientific research was initiated in the areas of polar cusp composition, O+ polar outflow, and magnetospheric boundary morphology studies, and an approach using a common metafile structure was adopted to facilitate the exchange of data and plots between the various workshop participants. The advantages of in-person versus remote workshops were also discussed.

  12. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
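
    The unreachable-code check such an analyser performs reduces, at its core, to reachability over the control-flow graph of each module. A minimal sketch (block names invented):

```python
from collections import deque

# Report basic blocks that can never be reached from the entry block,
# by breadth-first search over the control-flow graph.

def unreachable_blocks(cfg, entry):
    # cfg: {block: [successor, ...]}
    seen, work = {entry}, deque([entry])
    while work:
        for succ in cfg.get(work.popleft(), []):
            if succ not in seen:
                seen.add(succ)
                work.append(succ)
    return sorted(set(cfg) - seen)

cfg = {"start": ["loop"], "loop": ["loop", "end"], "end": [], "dead": ["end"]}
print(unreachable_blocks(cfg, "start"))  # ['dead']
```

    The same graph also supports the loop checks mentioned above: a syntactically infinite loop is a cycle with no edge leaving it.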

  13. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high-speed milling of titanium alloys, the high rate of tool failure is the main reason for the high manufacturing cost. In this study, fractured tools from a titanium alloy 5-axis milling process were observed both at the macro scale, using a PG-1000 light microscope, and at the micro scale, using a Scanning Electron Microscope (SEM). These observations indicate that most of these tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed, and future analysis to determine the underlying causes of the fatigue phenomenon is then outlined.

  14. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  16. An Integrated Traverse Planner and Analysis Tool for Planetary Exploration

    OpenAIRE

    Johnson, Aaron William; Hoffman, Jeffrey A.; Newman, Dava; Mazarico, Erwan Matias; Zuber, Maria

    2010-01-01

    Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. The Massachusetts Institute of Technology is currently developing such a system, called the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT). The goal of this system is twofold: to allow for realistic simulations of traverses in order to assist with har...

  17. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions, for a given system. (author) 43 refs., 8 figs., 8 tabs.
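
    The idea of wrapping deterministic tools in a probabilistic framework can be sketched as Monte Carlo sampling around a deterministic margin calculation. The margin model and all figures below are invented stand-ins, not any tool from the paper:

```python
import random

# Estimate the probability that the voltage stability margin falls below
# a limit, by sampling an uncertain load and calling a deterministic
# margin calculation for each sample.

def margin(load_mw, collapse_mw=1200.0):
    # Stand-in for a deterministic stability tool: distance to collapse.
    return collapse_mw - load_mw

def risk_of_low_margin(mean, sd, limit, trials=10_000, seed=1):
    rng = random.Random(seed)  # fixed seed for reproducibility
    bad = sum(margin(rng.gauss(mean, sd)) < limit for _ in range(trials))
    return bad / trials

# With load ~ N(1000, 100) and a 100 MW margin requirement, the risk is
# the chance that load exceeds 1100 MW (roughly 16% for a normal load).
print(risk_of_low_margin(mean=1000.0, sd=100.0, limit=100.0))
```

    In practice the deterministic "tool" would be a full power-flow or continuation method, and the sampled quantities would include generation and contingency states as well as load.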

  18. A multi-criteria decision analysis tool to support electricity

    OpenAIRE

    Ribeiro, Fernando; Ferreira, Paula Varandas; Araújo, Maria Madalena Teixeira de

    2012-01-01

    A Multi-Criteria Decision Analysis (MCDA) tool was designed to support the evaluation of different electricity production scenarios. The MCDA tool is implemented in an Excel worksheet and uses information obtained from a mixed integer optimization model. Given this input, the MCDA allows ranking of different scenarios based on their performance on 13 criteria covering economic, job market, quality of life of local populations, technical and environmental issues. The criteria were weighte...
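
    A weighted-sum ranking of the kind such an MCDA worksheet performs can be sketched as follows; the scenarios, criteria and weights below are invented, and the real tool uses 13 criteria rather than three:

```python
# Rank scenarios by a weighted sum of normalized criterion scores.

def rank_scenarios(scores, weights):
    # scores: {scenario: {criterion: value in [0, 1]}}, higher is better
    totals = {s: sum(weights[c] * v for c, v in crit.items())
              for s, crit in scores.items()}
    return sorted(totals, key=totals.get, reverse=True), totals

scores = {
    "renewables": {"cost": 0.4, "jobs": 0.9, "emissions": 0.9},
    "gas":        {"cost": 0.8, "jobs": 0.5, "emissions": 0.3},
}
weights = {"cost": 0.5, "jobs": 0.2, "emissions": 0.3}
order, totals = rank_scenarios(scores, weights)
print(order[0])  # renewables
```

    The interesting part of an MCDA study is usually the sensitivity of this ranking to the weights, which an interactive worksheet makes easy to explore.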

  19. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  20. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association, used to check the software against simulated data, which is an important part of the development of the physics analysis tools.

  1. PREFACE: European Microbeam Analysis Society's 14th European Workshop on Modern Developments and Applications in Microbeam Analysis (EMAS 2015), Portorož, Slovenia, 3-7 May 2015

    Science.gov (United States)

    Llovet, Xavier; Matthews, Michael B.; Čeh, Miran; Langer, Enrico; Žagar, Kristina

    2016-02-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 14th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from the 3rd to the 7th of May 2015 in the Grand Hotel Bernardin, Portorož, Slovenia. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a unique format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with the Jožef Stefan Institute and SDM - Slovene Society for Microscopy. The technical programme included the following topics: electron probe microanalysis, STEM and EELS, materials applications, cathodoluminescence and electron backscatter diffraction (EBSD), and their applications. As at previous workshops, there was also a special oral session for young scientists. The best presentation by a young scientist was awarded with an invitation to attend the 2016 Microscopy and Microanalysis meeting at Columbus, Ohio. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Electron channelling contrast reconstruction with electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 71 posters from 16 countries were on display at the meeting and that the participants came from as far away as Japan, Canada, USA, and Australia. A selection of participants with posters was invited

  2. ITERA: IDL Tool for Emission-line Ratio Analysis

    CERN Document Server

    Groves, Brent

    2010-01-01

    We present a new software tool to enable astronomers to easily compare observations of emission line ratios with those determined by photoionization and shock models: ITERA, the IDL Tool for Emission-line Ratio Analysis. This tool can plot ratios of emission lines predicted by models and allows for comparison of observed line ratios against grids of these models selected from model libraries associated with the tool. We provide details of the libraries of standard photoionization and shock models available with ITERA and, in addition, present three example emission line ratio diagrams covering a range of wavelengths to demonstrate the capabilities of ITERA. ITERA and its associated libraries are available from http://www.brentgroves.net/itera.html
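
    The comparison at the heart of a line-ratio tool, finding the model grid point nearest to an observed pair of ratios, can be sketched as below; the grid values and model names are made up for illustration, not drawn from ITERA's libraries:

```python
import math

# Pick the model whose pair of (log) line ratios lies closest to an
# observed pair, using Euclidean distance in the ratio-ratio plane.

def closest_model(grid, obs):
    # grid: {model: (ratio1, ratio2)} in log10; obs: (ratio1, ratio2)
    def dist(m):
        r1, r2 = grid[m]
        return math.hypot(r1 - obs[0], r2 - obs[1])
    return min(grid, key=dist)

grid = {"shock_v100": (0.5, -0.3), "photoion_logU-3": (-0.2, 0.1)}
print(closest_model(grid, obs=(-0.1, 0.0)))  # photoion_logU-3
```

    A full tool plots whole model grids on the diagram so that the user can judge degeneracies by eye rather than relying on a single nearest point.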

  3. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.
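
    At the propositional level, Horn-clause reasoning is a textbook fixed-point computation. The sketch below performs forward chaining without the theory constraints that make real CHC problems hard:

```python
# Decide whether a goal atom is derivable from propositional Horn
# clauses by forward chaining to a fixed point.

def derivable(clauses, goal):
    # clauses: (head, [body atoms]); facts have an empty body
    known = set()
    changed = True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in known and all(b in known for b in body):
                known.add(head)
                changed = True
    return goal in known

clauses = [("p", []), ("q", ["p"]), ("r", ["p", "q"]), ("s", ["t"])]
print(derivable(clauses, "r"), derivable(clauses, "s"))  # True False
```

    CHC verification replaces atoms with predicates over theory variables, so the fixed point must be computed over abstract domains rather than finite sets, which is where the abstract interpretation and specialisation components come in.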

  4. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  5. 3rd International Workshop on Intelligent Data Analysis and Management (IDAM)

    CERN Document Server

    Wang, Leon; Hong, Tzung-Pei; Yang, Hsin-Chang; Ting, I-Hsien

    2013-01-01

    These papers on Intelligent Data Analysis and Management (IDAM) examine issues related to the research and applications of Artificial Intelligence techniques in data analysis and management across a variety of disciplines. The papers derive from the 2013 IDAM conference in Kaohsiung, Taiwan. It is an interdisciplinary research field involving academic researchers in information technologies, computer science, public policy, bioinformatics, medical informatics, and social and behavior studies, etc. The techniques studied include (but are not limited to): data visualization, data pre-processing, data engineering, database mining techniques, tools and applications, evolutionary algorithms, machine learning, neural nets, fuzzy logic, statistical pattern recognition, knowledge filtering, and post-processing, etc.

  6. Economics of Tobacco Toolkit, Tool 2. Data for Economic Analysis

    OpenAIRE

    Czart, Christina; Chaloupka, Frank

    2013-01-01

    This tool provides a general introduction to 'the art' of building databases. It addresses a number of issues pertaining to the search, identification and preparation of data for meaningful economic analysis. It can best be thought of as a reference mechanism that provides support for the occasionally frustrated but endlessly hungry researcher working through the adventures of tobacco cont...

  7. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collection at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state yields a large amount of data per sale site. Sale summaries for an individual sale…

  8. Shape Analysis for Complex Systems Using Information Geometry Tools.

    OpenAIRE

    Sanctis, Angela De

    2012-01-01

    In this paper we use Information Geometry tools to model statistically patterns arising in complex systems and describe their evolution in time. In particular, we focus on the analysis of images with medical applications and propose an index that can estimate the level of self-organization and predict future problems that may occur in these systems.

  9. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a web-based geospatial application, similar to Google Maps, for adversaries: it helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and simple to use without much training.

  10. A Spreadsheet Teaching Tool For Analysis Of Pipe Networks

    OpenAIRE

    El Bahrawy, Aly N.

    1997-01-01

    Spreadsheets are widely used in engineering to perform a variety of analysis and design calculations. They are also attractive as educational tools due to their flexibility and efficiency. This paper demonstrates the use of spreadsheets in teaching the analysis of water pipe networks, which involves calculating pipe flows or nodal heads given the network layout, pipe characteristics (diameter, length, and roughness), and external flows. The network performance is better und...
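The abstract does not state which solution method the spreadsheets implement, but the classic textbook approach for balancing flows in looped water networks is the Hardy Cross method. A minimal sketch for the simplest case, two parallel pipes sharing a fixed total flow (coefficients and flows are made-up illustrative values):

```python
# Hardy Cross loop balancing for two parallel pipes sharing a fixed
# total flow. Head loss per pipe is modeled as h = r * Q * |Q| (n = 2).
# This is a hedged sketch of the standard method, not the paper's
# actual spreadsheet formulation.

def hardy_cross_parallel(r1, r2, total, iterations=20):
    """Split `total` flow between two parallel pipes with head-loss
    coefficients r1, r2 until the loop head losses balance."""
    q1 = q2 = total / 2.0          # initial guess: even split
    for _ in range(iterations):
        # Net head loss around the loop (pipe 1 clockwise positive).
        h_sum = r1 * q1 * abs(q1) - r2 * q2 * abs(q2)
        # Hardy Cross correction: dQ = -sum(h) / sum(|dh/dQ|).
        dq = -h_sum / (2.0 * (r1 * abs(q1) + r2 * abs(q2)))
        q1 += dq
        q2 -= dq
    return q1, q2

# With r1 = 1, r2 = 4 and 10 units of total flow, balancing
# r1*q1^2 = r2*q2^2 with q1 + q2 = 10 gives q1 = 20/3.
q1, q2 = hardy_cross_parallel(r1=1.0, r2=4.0, total=10.0)
```

Each iteration is exactly the kind of cell-formula update a spreadsheet row can carry, which is why the method maps so naturally onto spreadsheet teaching.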

  11. DFTCalc: a tool for efficient fault tree analysis (extended version)

    OpenAIRE

    Arnold, Florian; Belinfante, Axel; van de Berg, Mark; Guck, Dennis; Stoelinga, Mariëlle

    2013-01-01

    Effective risk management is key to ensuring that our nuclear power plants, medical equipment, and power grids are dependable, and it is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ana...

  12. DFTCalc: a tool for efficient fault tree analysis

    OpenAIRE

    Arnold F.; Belinfante A.; Van Der Berg F.; Guck D.; Stoelinga M.

    2013-01-01

    Effective risk management is key to ensuring that our nuclear power plants, medical equipment, and power grids are dependable, and it is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ...

  13. json2run: a tool for experiment design & analysis

    OpenAIRE

    Urli, Tommaso

    2013-01-01

    json2run is a tool that automates the running, storage and analysis of experiments. Its main advantage is that it allows a set of experiments to be described concisely as a JSON-formatted parameter tree. It also supports parallel execution of experiments, automatic parameter tuning through the F-Race framework, and storage and analysis of experiments with MongoDB and R.
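To make the parameter-tree idea concrete, here is a sketch of how a JSON tree can expand into concrete experiment configurations. The node types ("and" for Cartesian product, "or" for alternatives, "discrete" for a leaf parameter) mimic the spirit of json2run's trees but are not its real schema:

```python
import itertools, json

# Expand a simplified JSON parameter tree into experiment configs.
# NOTE: node names ("and", "or", "discrete") are an illustrative
# stand-in, not json2run's actual input format.

def expand(node):
    """Return a list of dicts, one per experiment configuration."""
    if "discrete" in node:                       # leaf: one parameter
        return [{node["discrete"]: v} for v in node["values"]]
    if "and" in node:                            # Cartesian product
        combos = [{}]
        for child in node["and"]:
            combos = [dict(c, **extra)
                      for c in combos for extra in expand(child)]
        return combos
    if "or" in node:                             # union of alternatives
        return [c for child in node["or"] for c in expand(child)]
    raise ValueError("unknown node type")

tree = json.loads("""
{"and": [{"discrete": "alg", "values": ["tabu", "sa"]},
         {"discrete": "seed", "values": [1, 2, 3]}]}
""")
configs = expand(tree)   # 2 algorithms x 3 seeds = 6 configurations
```

Each dict in `configs` is one command line's worth of parameters, which is what makes the tree form so much more concise than listing runs by hand.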

  14. Second international tsunami workshop on the technical aspects of tsunami warning systems, tsunami analysis, preparedness, observation and instrumentation

    International Nuclear Information System (INIS)

    The Second Workshop on the Technical Aspects of Tsunami Warning Systems, Tsunami Analysis, Preparedness, Observation, and Instrumentation, sponsored and convened by the Intergovernmental Oceanographic Commission (IOC), was held on 1-2 August 1989 in the modern and attractive research town of Academgorodok, located 20 km south of downtown Novosibirsk, the capital of Siberia, USSR. The program was arranged in eight major areas of interest: opening and introduction; survey of existing tsunami warning centers - present status, results of work, plans for future development; survey of some existing seismic data processing systems and future projects; methods for fast evaluation of tsunami potential and perspectives of their implementation; tsunami data bases; tsunami instrumentation and observations; tsunami preparedness; and finally, a general discussion and adoption of recommendations. The Workshop presentations not only addressed the conceptual improvements that have been made, but also focused on the inner workings of the Tsunami Warning System, including computer applications, on-line processing and numerical modelling. Furthermore, presentations reported on progress made in the last few years on data telemetry, instrumentation and communications. Emphasis was placed on new concepts and their application in operational techniques that can result in improvements in data collection, rapid processing of the data, analysis and prediction. A Summary Report on the Second International Tsunami Workshop, containing abstracted and annotated proceedings, has been published as a separate report. The present report is a supplement to the Summary Report and contains the full text of the papers presented at this Workshop. Refs, figs and tabs

  15. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, it considers the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal images, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.

  16. Discovery and New Frontiers Project Budget Analysis Tool

    Science.gov (United States)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses and compare mission scenarios to the current program budget, and to rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted mission lines) could easily be adapted to other applications.
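The inflation-and-rollup step described above is easy to sketch. The rates, years and cost profiles below are illustrative assumptions, not the D&NF tool's actual data or algorithm:

```python
# Roll up multi-mission program costs in real-year dollars by applying
# a flat inflation rate to each mission's fixed-year cost profile.
# All numbers are made up for illustration.

def real_year_costs(profile, base_year, rate=0.03):
    """profile: {year: cost in base-year (fixed-year) dollars}.
    Returns {year: cost inflated to that year's dollars}."""
    return {year: cost * (1.0 + rate) ** (year - base_year)
            for year, cost in profile.items()}

def program_total(missions, rate=0.03):
    """Sum several missions' inflated profiles into one program profile.
    missions: list of (base_year, profile) pairs."""
    total = {}
    for base_year, profile in missions:
        for year, cost in real_year_costs(profile, base_year, rate).items():
            total[year] = total.get(year, 0.0) + cost
    return total

missions = [
    (2011, {2011: 100.0, 2012: 150.0}),   # hypothetical mission A, $M
    (2012, {2012: 80.0, 2013: 120.0}),    # hypothetical mission B, $M
]
totals = program_total(missions)  # per-year program cost, real-year $
```

Comparing `totals` year by year against the available budget line is then the "what-if" check the abstract describes.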

  17. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

    SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high performance computing. A performance tool, which collects and analyzes performance data, is an effective way of improving performance. This paper explores both the collection and the analysis of performance data, and proposes two innovations: first, both types of runtime performance data, concerning system load and application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; second, the performance analysis should be Web-oriented, which can exploit the excellent portability and usability brought by the Internet. The paper presents a Web-oriented HPC (high performance computing) performance tool that collects information about both resource utilization, including the utilization ratio of CPU and memory, and program behavior during runtime, including statuses such as sending and computing, and visualizes the information in the user's browser window with Java applets in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependency between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program progress and system load, find the performance bottleneck, and ultimately improve the performance of the application.

  18. MICCAI Workshops

    CERN Document Server

    Nedjati-Gilani, Gemma; Venkataraman, Archana; O'Donnell, Lauren; Panagiotaki, Eleftheria

    2014-01-01

    This volume contains the proceedings from two closely related workshops: Computational Diffusion MRI (CDMRI’13) and Mathematical Methods from Brain Connectivity (MMBC’13), held under the auspices of the 16th International Conference on Medical Image Computing and Computer Assisted Intervention, which took place in Nagoya, Japan, September 2013. Inside, readers will find contributions ranging from mathematical foundations and novel methods for the validation of inferring large-scale connectivity from neuroimaging data to the statistical analysis of the data, accelerated methods for data acquisition, and the most recent developments on mathematical diffusion modeling. This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity as well as offers new perspectives and insights on current research challenges for those currently in the field. It will be of interest to researchers and practitioners in computer science, ...

  19. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing;

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model physics, in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools already exist for the study of low energy supersymmetry and the MSSM in particular, this workshop will instead focus on tools for alternative TeV-scale physics models. The main goals of the workshop are: to survey what is available; to provide feedback on user experiences with Monte Carlo tools for BSM...

  20. Comparing gene set analysis methods on single-nucleotide polymorphism data from Genetic Analysis Workshop 16

    OpenAIRE

    Tintle, Nathan L; Borchers, Bryce; Brown, Marshall; Bekmetjev, Airat

    2009-01-01

    Recently, gene set analysis (GSA) has been extended from use on gene expression data to use on single-nucleotide polymorphism (SNP) data in genome-wide association studies. When GSA has been demonstrated on SNP data, two popular statistics from gene expression data analysis (gene set enrichment analysis [GSEA] and Fisher's exact test [FET]) have been used. However, GSEA and FET have shown a lack of power and robustness in the analysis of gene expression data. The purpose of this work is to in...
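Fisher's exact test (FET), one of the two statistics the abstract names, reduces in the one-sided enrichment case to a hypergeometric tail probability, which the standard library can compute directly. A minimal sketch with made-up counts:

```python
from math import comb

# One-sided Fisher's exact test (hypergeometric upper tail) for gene
# set enrichment: is a gene set over-represented among significant
# genes? A generic illustration of FET, not the paper's pipeline.

def fisher_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n):
    N genes total, K significant genome-wide, n genes in the set,
    k significant genes observed inside the set."""
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / denom

# Toy example: 20 genes, 5 significant genome-wide, a gene set of 5
# containing 3 of the significant genes.
p = fisher_enrichment_p(N=20, K=5, n=5, k=3)
```

A small p here says the overlap between the gene set and the significant genes is unlikely under random assignment, which is exactly the enrichment question GSA asks of each set.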

  1. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy, pedagogical semiology, which is concerned with students' acquisition of the culture encoded in symbols and with the way the sign consciousness formed in the context of learning affects students' world cognition and interpersonal communication. The article introduces a set of tools that enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy, and with the formation of the younger generation's language picture of the world.

  2. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  3. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  4. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
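One of the performance bounds such graph analysis determines is the critical path: the longest weighted path through the dataflow graph, which no number of parallel processors can beat. A sketch with hypothetical task names and times (the paper's own algorithms are not given in the abstract):

```python
# Critical-path analysis of a dataflow graph. Node weights are task
# execution times; the longest weighted path is a lower bound on the
# schedule length for any processor count. Task names and times below
# are illustrative.

def critical_path(tasks, deps):
    """tasks: {name: time}; deps: {name: [predecessor names]}.
    Returns the critical-path length (DFS with memoization)."""
    finish = {}
    def earliest_finish(t):
        if t not in finish:
            preds = deps.get(t, [])
            finish[t] = tasks[t] + max(
                (earliest_finish(p) for p in preds), default=0)
        return finish[t]
    return max(earliest_finish(t) for t in tasks)

tasks = {"read": 2, "fft": 5, "filter": 3, "ctrl": 4, "out": 1}
deps = {"fft": ["read"], "filter": ["read"],
        "ctrl": ["fft", "filter"], "out": ["ctrl"]}
bound = critical_path(tasks, deps)   # path: read -> fft -> ctrl -> out
```

For repetitive execution, as in signal processing and control loops, this bound (together with total work divided by processor count) brackets the achievable iteration period.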

  5. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
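The "n-factor combinatorial parameter variations" mentioned above can be illustrated with a toy greedy pairwise (2-factor) generator: instead of running the full Cartesian product, pick configurations until every pair of parameter values has appeared together at least once. Parameter names and values are made up; this is not the paper's actual generator:

```python
import itertools

# Greedy pairwise test selection: choose configurations from the full
# Cartesian product until every 2-way combination of parameter values
# is covered. Illustrative sketch of n-factor combinatorial testing
# with n = 2.

def pairs_of(config, names):
    """All (param, value) pairs jointly exercised by one config."""
    return {((a, config[a]), (b, config[b]))
            for a, b in itertools.combinations(names, 2)}

def greedy_pairwise(params):
    names = sorted(params)
    full = [dict(zip(names, vs))
            for vs in itertools.product(*(params[n] for n in names))]
    uncovered = set().union(*(pairs_of(c, names) for c in full))
    suite = []
    while uncovered:
        # Pick the config covering the most still-uncovered pairs.
        best = max(full, key=lambda c: len(pairs_of(c, names) & uncovered))
        suite.append(best)
        uncovered -= pairs_of(best, names)
    return suite, full

params = {"mass": [1, 2, 3], "thrust": ["lo", "mid", "hi"],
          "mode": ["a", "b", "c"]}
suite, full = greedy_pairwise(params)   # far fewer cases than 27
```

The payoff grows with dimensionality: pairwise suites scale roughly with the square of the largest value-set size, not with the product of all of them, which is what makes exploring a large parameter space tractable.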

  6. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  7. SMART: Statistical Metabolomics Analysis-An R Tool.

    Science.gov (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^-4 in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^-4 in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm . PMID:27248514

  8. Virtual tool mark generation for efficient striation analysis.

    Science.gov (United States)

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-07-01

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners. PMID:24502818
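The core comparison step, sliding a virtual mark profile along a measured cross-section to find the best agreement, can be sketched simply. The profiles below are synthetic, and a best-shift RMS search is a toy stand-in for the statistical likelihood algorithm of Chumbley et al. cited in the abstract:

```python
# Align a "virtual" tool-mark profile against a measured cross-section
# by searching lateral shifts for the lowest root-mean-square misfit.
# Synthetic 1D height profiles; illustrative only.

def rms(a, b):
    """Root-mean-square difference between two equal-length profiles."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

def best_alignment(virtual, measured, max_shift=3):
    """Slide `virtual` along `measured`; return (best_shift, best_rms)."""
    best = None
    for s in range(max_shift + 1):
        window = measured[s:s + len(virtual)]
        if len(window) < len(virtual):
            break
        score = rms(virtual, window)
        if best is None or score < best[1]:
            best = (s, score)
    return best

virtual = [0.0, 1.0, 0.5, 2.0, 0.5]                   # projected tip edge
measured = [9.9, 9.8, 0.1, 1.1, 0.4, 2.1, 0.6, 9.7]   # mark starts at 2
shift, score = best_alignment(virtual, measured, max_shift=3)
```

A low best-fit score at some shift supports a match; repeating the search for virtual marks generated at different marking angles is what lets the method also estimate the angle, as the study reports.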

  9. Virtual Tool Mark Generation for Efficient Striation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.

  10. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy
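The heart of ROI segmentation in images like these is finding connected regions in a thresholded image. A stdlib-only sketch of 4-connected component counting (a generic illustration, not MIST's actual algorithm):

```python
from collections import deque

# Count small regions of interest (ROIs) in a thresholded binary image
# via 4-connected component labeling with BFS flood fill. Generic
# sketch; MIST's segmentation is more sophisticated.

def count_rois(grid):
    """grid: list of rows of 0/1. Returns the number of 4-connected
    foreground regions."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                     # new ROI found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                   # flood-fill the ROI
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

image = [[1, 1, 0, 0],
         [0, 1, 0, 1],
         [0, 0, 0, 1],
         [1, 0, 0, 0]]
n = count_rois(image)   # three separate foreground regions
```

Once each region is labeled, per-ROI measurements (area, perimeter, shape statistics) follow by iterating over the labeled pixels, which is the kind of analysis capability described above.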

  11. PREFACE: 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3)

    Science.gov (United States)

    Sakka, Yoshio; Hirota, Noriyuki; Horii, Shigeru; Ando, Tsutomu

    2009-07-01

    The 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3) was held on 14-16 May 2008 at the University of Tokyo, Japan. The first was held in March 2004 at the National High Magnetic Field Laboratory in Tallahassee, USA; two years later the second took place in Grenoble, France. MAP3 was held as a University of Tokyo International Symposium, jointly with the MANA Workshop on Materials Processing by External Stimulation and the JSPS CORE Program of Construction of the World Center on Electromagnetic Processing of Materials. At the end of MAP3 it was decided that the next MAP4 will be held in Atlanta, USA in 2010. Processing in magnetic fields is a rapidly expanding research area with a wide range of promising applications in materials science. MAP3 focused on the magnetic field interactions involved in the study and processing of materials in all disciplines ranging from physics to chemistry and biology:
    Magnetic field effects on chemical, physical, and biological phenomena
    Magnetic field effects on electrochemical phenomena
    Magnetic field effects on thermodynamic phenomena
    Magnetic field effects on hydrodynamic phenomena
    Magnetic field effects on crystal growth
    Magnetic processing of materials
    Diamagnetic levitation
    Magneto-Archimedes effect
    Spin chemistry
    Application of magnetic fields to analytical chemistry
    Magnetic orientation
    Control of structure by magnetic fields
    Magnetic separation and purification
    Magnetic field-induced phase transitions
    Materials properties in high magnetic fields
    Development of NMR and MRI
    Medical application of magnetic fields
    Novel magnetic phenomena
    Physical property measurement by magnetic fields
    High magnetic field generation
    MAP3 consisted of 84 presentations including 16 invited talks. This volume of Journal of Physics: Conference Series contains the proceedings of MAP3, with 34 papers that provide a scientific record of the topics covered by the conference, with the special topics (13 papers) in

  12. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened the way both for new features in the existing tools and for new tools for specific applications, like thermodynamic and economic optimization, prediction of remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts generated by the heat and mass balance programs can be accomplished using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed with tools like condition monitoring systems and artificial neural networks. The growing number of tools and their varied construction and application areas make the choice of the most adequate tool for a given application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also covers programming techniques and calculation methods for part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author

  13. Tools for Search, Analysis and Management of Patent Portfolios

    Directory of Open Access Journals (Sweden)

    Muqbil Burhan

    2012-05-01

    Patents have been acknowledged worldwide as rich sources of information for technology forecasting, competitive analysis and management of patent portfolios. Because of the high potential of patents as an important indicator of various technology measurements and as an econometric measure, patent analysis has become vital for the corporate world and of interest to academic research. Retrieving relevant prior art concerning the technology of interest has been vital for managers and consultants dealing with intellectual property rights. Tremendous recent progress in the field of electronic search tools has led to specialised and less time-consuming search capabilities, even in fields where search is mostly based on formulas, drawings and flowcharts. Online patent databases and various other analytical tools have given patent analysis an important edge, which otherwise required extensive and time-consuming data collection and calculations. Patents provide valuable information which can be used for various purposes by industry, academia, and policy analysts. This article explores the various options and tools available for patent search, analysis and management of patent portfolios, for efficiently identifying the relevant prior art, managing patent clusters and/or competitive intelligence.

  14. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detecting lung nodules and providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications for management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software packages. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive statistics as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as seen in the LIDC (Lung Image Database Consortium) study.
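One simple way to express inter-tool variability of the kind compared above is the coefficient of variation (CV = standard deviation / mean) of a nodule's measurements across tools. The numbers below are made up for illustration, not the study's data:

```python
from statistics import mean, stdev

# Compare inter-tool measurement variability for one nodule using the
# coefficient of variation. Illustrative values only.

def cv(values):
    """Coefficient of variation: sample std dev divided by mean."""
    return stdev(values) / mean(values)

# One hypothetical nodule measured by three vendor tools:
diameters = [8.1, 9.4, 7.6]        # mm   -- max-diameter estimates
volumes = [260.0, 255.0, 265.0]    # mm^3 -- volume estimates

d_cv, v_cv = cv(diameters), cv(volumes)
# With these numbers, the volume estimates agree far more closely
# (lower CV) than the diameter estimates, mirroring the study's
# qualitative finding.
```

A lower CV for volume than for diameter across many nodules would be the per-nodule signature of the ANOVA result reported above.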

  15. Applying Instructional Design Theories to Bioinformatics Education in Microarray Analysis and Primer Design Workshops

    Science.gov (United States)

    Shachak, Aviv; Ophir, Ron; Rubin, Eitan

    2005-01-01

    The need to support bioinformatics training has been widely recognized by scientists, industry, and government institutions. However, the discussion of instructional methods for teaching bioinformatics is only beginning. Here we report on a systematic attempt to design two bioinformatics workshops for graduate biology students on the basis of…

  16. EMERGING MODALITIES FOR SOIL CARBON ANALYSIS: SAMPLING STATISTICS AND ECONOMICS WORKSHOP.

    Energy Technology Data Exchange (ETDEWEB)

    WIELOPOLSKI, L.

    2006-04-01

    The workshop's main objectives are (1) to present the emerging modalities for analyzing carbon in soil, (2) to assess their error propagation, (3) to recommend new protocols and sampling strategies for the new instrumentation, and (4) to compare the costs of the new methods with those of traditional chemical ones.

  17. Multi-Spacecraft Analysis with Generic Visualization Tools

    Science.gov (United States)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  18. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis required a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the speed with which deformed areas were identified. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
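
    The core idea, flagging anomalous gradients along a longitudinal stream profile in the spirit of Hack's (1973) stream-gradient (SL) index, can be sketched as follows. This is an illustrative reconstruction, not the Knickpoint Finder source code; the profile data and the anomaly threshold are invented.

```python
# Illustrative SL-index sketch (after Hack, 1973): segments of a stream
# profile whose stream-gradient index is anomalously high are flagged as
# candidate knickpoints. Not the actual Knickpoint Finder implementation.

def find_knickpoints(profile, factor=2.0):
    """profile: (distance_from_source_m, elevation_m) pairs in downstream order.
    Returns midpoint distances of segments whose SL index exceeds
    `factor` times the profile-mean SL."""
    sl = []
    for (d1, h1), (d2, h2) in zip(profile, profile[1:]):
        slope = (h1 - h2) / (d2 - d1)   # channel gradient of the segment
        midpoint = (d1 + d2) / 2        # distance from source to segment midpoint
        sl.append((midpoint, slope * midpoint))
    mean_sl = sum(v for _, v in sl) / len(sl)
    return [mid for mid, v in sl if v > factor * mean_sl]

# Synthetic profile with an over-steepened reach between 300 m and 400 m
profile = [(100, 500), (200, 480), (300, 462), (400, 400), (500, 390)]
print(find_knickpoints(profile))  # → [350.0]
```

    In a real DEM workflow the profiles would be extracted along flow paths before this per-profile analysis is applied.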

  19. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Science.gov (United States)

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were (1) a checklist without risk estimation (Tool A), (2) a checklist with a risk scale (Tool B), (3) a risk calculation without a formal hazard identification stage (Tool C), and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D used more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350
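
    A questionnaire-plus-risk-matrix tool in the style of Tool D can be sketched as a severity-by-likelihood lookup. The scales, scores, and band boundaries below are illustrative assumptions, not those of the tool evaluated in the paper.

```python
# Illustrative risk matrix for a confined space entry: risk is ranked by
# the product of ordinal severity and likelihood scores. The 1-4 scales
# and the band cut-offs are invented for this sketch.

SEVERITY = {"minor": 1, "serious": 2, "major": 3, "fatal": 4}
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}

def risk_level(severity, likelihood):
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 12:
        return "high"    # entry prohibited until the risk is reduced
    if score >= 6:
        return "medium"  # additional controls required before entry
    return "low"         # proceed under the standard entry procedure

# Example: oxygen deficiency judged "fatal" in severity, "likely" in likelihood
print(risk_level("fatal", "likely"))  # → high
```

    A checklist-style tool (A or B) would stop at hazard identification; the matrix step is what lets Tool D feed risk-reduction decisions from the analysis itself.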

  20. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines…

  1. Validating and Verifying a New Thermal-Hydraulic Analysis Tool

    International Nuclear Information System (INIS)

    The Idaho National Engineering and Environmental Laboratory (INEEL) has developed a new analysis tool by coupling the Fluent computational fluid dynamics (CFD) code to the RELAP5-3DC/ATHENA advanced thermal-hydraulic analysis code. This tool enables researchers to perform detailed, three-dimensional analyses using Fluent's CFD capability while the boundary conditions required by the Fluent calculation are provided by the balance-of-system model created using RELAP5-3DC/ATHENA. Both steady-state and transient calculations can be performed, using many working fluids and either point or three-dimensional neutronics. A general description of the techniques used to couple the codes is given. The validation and verification (V and V) matrix is outlined. V and V is presently ongoing. (authors)

  2. Space mission scenario development and performance analysis tool

    Science.gov (United States)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  3. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    Directory of Open Access Journals (Sweden)

    Chun-Hsiao Wu

    2016-05-01

    In this research, we propose a new concept for social media analysis called Social Sensor, an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media, with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and the appropriate social sensors for analysis. By adopting parameter templates, one can quickly apply the experience of other experts at the beginning of a new case or even create one's own templates. We have also modularized the analysis tools into two social sensors: a Language Sensor and a Text Sensor. A user evaluation was conducted, and the results showed that the usefulness, modularity, reusability, and manageability of the system were all rated very positively. The results also show that this tool can greatly reduce the time needed to perform data analysis, solve the problems encountered in the traditional analysis process, and obtain useful results. The experimental results reveal that the concept of the social sensor and the proposed system design are useful for big data analysis of social media.

  4. COMPARISON OF MALAYSIA MANUFACTURING COMPANIES BY FINANCIAL STATEMENT ANALYSIS TOOLS

    OpenAIRE

    MALEK, Afagh; Mohammadi, Maryam; NASSIRI, Fardokht

    2012-01-01

    One of the best ways to get the expected results from trading in the stock market is to acquire a good evaluation of companies’ performance. Similarly, this study aims at comparing the financial performance of Lb Aluminium Berhad and Seal Incorporated Berhad manufacturing companies, which are listed in the main market of Malaysian stock exchange. The data were gathered from the annual reports of companies during last three years and analysed by financial statement analysis tools, which are ...

  5. Keel A Data Mining Tool: Analysis With Genetic

    Directory of Open Access Journals (Sweden)

    Ms. Pooja Mittal

    2012-06-01

    This work is related to the KEEL (Knowledge Extraction based on Evolutionary Learning) tool, an open source software package that supports data management and provides a platform for the analysis of evolutionary learning for data mining problems of different kinds, including regression, classification, and unsupervised learning. It includes a large collection of evolutionary learning algorithms based on different approaches: Pittsburgh and Michigan. It empowers the user to perform a complete analysis of any genetic fuzzy system in comparison to existing ones, with a statistical test module for comparison.

  6. Analysis of assessment tools used in engineering degree programs

    OpenAIRE

    Martínez Martínez, María del Rosario; Olmedo Torre, Noelia; Amante García, Beatriz; Farrerons Vidal, Óscar; Cadenato Matia, Ana María

    2014-01-01

    This work presents an analysis of the assessment tools used by professors at the Universitat Politècnica de Catalunya to assess the generic competencies introduced in the Bachelor's Degrees in Engineering. In order to conduct this study, a survey was designed and administered anonymously to a sample of the professors most receptive to educational innovation at their own university. In total, 80 professors responded to this survey, of whom 26% turned out to be members of the un...

  7. Ethics Auditing and Conflict Analysis as Management Tools

    OpenAIRE

    Anu Virovere; Merle Rihma

    2008-01-01

    This paper deals with management tools like conflict analysis and ethics auditing. Ethics auditing is understood as an opportunity and agreement to devise a system to inform on ethical corporate behaviour. This system essentially aims to increase the transparency and credibility of a company's commitment to ethics. At the same time, the process of elaborating this system allows us to introduce the moral dimension into the company's actions and decisions, thereby completing a key dimension of ...

  8. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  9. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottlenecks of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  10. Validation of retrofit analysis simulation tool: Lessons learned

    OpenAIRE

    Trcka, Marija; Pasini, Jose Miguel; Oggianu, Stella Maris

    2014-01-01

    It is well known that residential and commercial buildings account for about 40% of the overall energy consumed in the United States, and about the same percentage of CO2 emissions. Retrofitting existing old buildings, which account for 99% of the building stock, represents the best opportunity of achieving challenging energy and emission targets. United Technologies Research Center (UTC) has developed a methodology and tool that provides computational support for analysis and decision-making...

  11. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  12. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Science.gov (United States)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PCs), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  13. WALLTURB International Workshop

    CERN Document Server

    Jimenez, Javier; Marusic, Ivan

    2011-01-01

    This book brings together selected contributions from the WALLTURB workshop on “Understanding and modelling of wall turbulence” held in Lille, France, on April 21st to 23rd, 2009. This workshop was organized by the WALLTURB consortium in order to present to the relevant scientific community the main results of the project and to stimulate scientific discussions around the subject of wall turbulence. The workshop reviewed the recent progress in theoretical, experimental and numerical approaches to wall turbulence. The problems of zero pressure gradient, adverse pressure gradient and separating turbulent boundary layers were addressed in detail with the three approaches, using the most advanced tools. This book is a milestone in the research field, thanks to the high level of the invited speakers and the involvement of the contributors, and a testimony to the achievement of the WALLTURB project.

  14. TTCScope - A scope-based TTC analysis tool

    CERN Document Server

    Moosavi, P

    2013-01-01

    This document describes a scope-based analysis tool for the TTC system. The software, ttcscope, was designed to sample the encoded TTC signal from a TTCex module with the aid of a LeCroy WaveRunner oscilloscope. From the sampled signal, the bunch crossing clock is recovered, along with the signal contents: level-1 accepts, TTC commands, and trigger types. Two use-cases are addressed: analysis of TTC signals and calibration of TTC crates. The latter includes calibration schemes for two signal phase shifts, one related to level-1 accepts, and the other to TTC commands.

  15. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs arise. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  16. Judo match analysis,a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

    In this second paper on match analysis, we analyze in depth the competition steps, showing the evolution of this tool at the national federation level on the basis of our first classification. Furthermore, it is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, usefully supporting the difficult task of the coach, and especially of national or Olympic coaching staffs. This paper presents a deeper study of judo competitions at a high level, from both the male and female points of view, explaining in light of biomechanics not only the evolution of throws over time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new and interesting ways are opened by this powerful coac...

  17. CTBTO international cooperation workshop

    International Nuclear Information System (INIS)

    The International Cooperation Workshop took place in Vienna, Austria, on 16 and 17 November 1998, with the participation of 104 policy/decision makers, research and development managers and diplomatic representatives from 58 States Signatories to the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The Workshop explored the Treaty's stipulations to: promote cooperation to facilitate and participate in the fullest possible exchange relating to technologies used in the verification of the Treaty; enable member states to strengthen national implementation of verification measures; and benefit from the application of such technologies for peaceful purposes. The potential benefits arising from the CTBT monitoring, analysis and data communication systems are multifaceted, and as yet unknown. This Workshop provided the opportunity to examine some of these possibilities. An overview of the CTBT verification regime covering the general aspects of the four monitoring technologies (seismic, hydroacoustic, infrasound and radionuclide), including some of the elements that are the subject of international cooperation, was presented and discussed. Questions were raised on the potential benefits that can be derived by participating in the CTBT regime, and broad-based discussions took place. Several concrete proposals on ways and means to facilitate and promote cooperation among States Signatories were suggested.
    The main points discussed by the participants can be summarized as follows: the purpose of the CTBT Organization is to assist member states in monitoring Treaty compliance; the CTBT can be a highly effective technological tool which can generate wide-ranging data that can be used for peaceful purposes; and there are differences in the levels of technology development among the member states, which is why peaceful applications should be supported by the Prep Com for the benefit of all member states, whether developed or developing, training being a key element to optimize the CTBT

  18. The Astronomy Workshop

    Science.gov (United States)

    Hamilton, D. P.; Asbury, M. L.; Proctor, A.

    2001-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is an interactive online astronomy resource developed and maintained at the University of Maryland for use by students, educators and the general public. The Astronomy Workshop has been extensively tested and used successfully at many different levels, including High School and Junior High School science classes, University introductory astronomy courses, and University intermediate and advanced astronomy courses. Some topics currently covered in the Astronomy Workshop are: Animated Orbits of Planets and Moons: The orbits of the nine planets and 91 known planetary satellites are shown in animated, to-scale drawings. The orbiting bodies move at their correct relative speeds about their parent, which is rendered as an attractive, to-scale gif image. Solar System Collisions: This most popular of our applications shows what happens when an asteroid or comet with user-defined size and speed impacts a given planet. The program calculates many effects, including the country impacted (if Earth is the target), the energy of the explosion, the crater size, and the magnitude of the planetquake generated. It also displays a relevant image (e.g. terrestrial crater, lunar crater, etc.). Planetary and Satellite Data Calculators: These tools allow the user to easily calculate physical data for all of the planets or satellites simultaneously, making comparison very easy. Orbital Simulations: These tools allow the student to investigate different aspects of the three-body problem of celestial mechanics. Astronomy Workshop Bulletin Board: Get innovative teaching ideas and read about in-class experiences with the Astronomy Workshop. Share your ideas with other educators by posting on the Bulletin Board. Funding for the Astronomy Workshop is provided by the National Science Foundation.
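
    The energy calculation behind a Solar System Collisions-style tool can be sketched from first principles: the impactor's kinetic energy follows from its size, density, and speed. The function below is an illustrative reconstruction under those assumptions, not the Astronomy Workshop's actual code.

```python
import math

# Illustrative impact-energy estimate in the spirit of the Solar System
# Collisions tool: kinetic energy of a spherical impactor, expressed in
# megatons of TNT. Not the Astronomy Workshop's actual implementation.

def impact_energy_megatons(diameter_m, density_kg_m3, speed_m_s):
    radius = diameter_m / 2
    mass = density_kg_m3 * (4 / 3) * math.pi * radius ** 3  # spherical impactor
    energy_joules = 0.5 * mass * speed_m_s ** 2             # kinetic energy
    return energy_joules / 4.184e15                         # J per megaton TNT

# A 100 m rocky asteroid (3000 kg/m^3) striking at 20 km/s
print(f"{impact_energy_megatons(100, 3000, 20e3):.0f} Mt")  # → 75 Mt
```

    Downstream effects such as crater size and planetquake magnitude would then be estimated from this energy via empirical scaling laws.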

  19. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model, and interpret its results.
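
    The central idea that network meta-analysis adds to pairwise meta-analysis, combining direct and indirect evidence, can be illustrated with the simplest case: an indirect comparison through a common comparator (the Bucher method). The effect sizes below are invented, and the paper's STATA routines implement far more general models; this is only a conceptual sketch.

```python
import math

# Bucher-style indirect comparison: if trials give effect estimates for
# A vs B and B vs C (e.g. log odds ratios), the indirect A vs C estimate
# is their sum, and the variances add. The inputs below are invented.

def indirect_comparison(d_ab, se_ab, d_bc, se_bc):
    d_ac = d_ab + d_bc                          # indirect effect, A vs C
    se_ac = math.sqrt(se_ab ** 2 + se_bc ** 2)  # SEs add in quadrature
    return d_ac, se_ac

d_ac, se_ac = indirect_comparison(-0.50, 0.20, -0.30, 0.25)
print(f"A vs C: {d_ac:.2f} (SE {se_ac:.2f})")  # → A vs C: -0.80 (SE 0.32)
```

    A full network model also checks the consistency assumption: where both direct and indirect estimates of the same comparison exist, they should agree within sampling error.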

  20. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of developments in theoretical computer science have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  1. Comparative analysis of marine ecosystems: workshop on predator-prey interactions

    DEFF Research Database (Denmark)

    Bailey, Kevin M.; Ciannelli, Lorenzo; Hunsicker, Mary;

    2010-01-01

    Climate and human influences on marine ecosystems are largely manifested by changes in predator–prey interactions. It follows that ecosystem-based management of the world's oceans requires a better understanding of food web relationships. An international workshop on predator–prey interactions in marine ecosystems was held at Oregon State University, Corvallis, OR, USA on 16–18 March 2010. The meeting brought together scientists from diverse fields of expertise including theoretical ecology, animal behaviour, fish and seabird ecology, statistics, fisheries science and ecosystem modelling. The goals of the workshop were to critically examine the methods of scaling-up predator–prey interactions from local observations to systems, the role of shifting ecological processes with scale changes, and the complexity and organizational structure in trophic interactions.

  2. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  3. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.
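
    The abstract describes rolling up mass, power, and volume estimates over a user-defined EVAS architecture. A minimal sketch of that kind of sizing roll-up is below; EVAS_SAT itself runs in Excel/Visual Basic and its algorithms are not public here, so the component names and values are invented for illustration.

```python
# Hypothetical sketch of an EVAS_SAT-style architecture roll-up; the
# components and their numbers are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    mass_kg: float      # estimated mass
    power_w: float      # average power draw
    volume_m3: float    # stowed volume

def size_architecture(components):
    """Sum mass, power and volume totals for an EVAS architecture."""
    return {
        "mass_kg": sum(c.mass_kg for c in components),
        "power_w": sum(c.power_w for c in components),
        "volume_m3": sum(c.volume_m3 for c in components),
    }

arch = [
    Component("suit system", 55.0, 90.0, 0.30),
    Component("PLSS", 40.0, 120.0, 0.15),
    Component("tools and translation aids", 12.0, 0.0, 0.05),
]
totals = size_architecture(arch)
print(totals["mass_kg"])  # 107.0
```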

  4. Conference Report: 3rd Workshop "Qualitative Psychology: Research Questions and Matching Methods of Analysis"

    OpenAIRE

    Gürtler, Leo

    2003-01-01

    This conference report covers the third workshop on Qualitative Psychology, which took place from 25-27 October 2002 in Perlora (Spain) and was organized by the Center for Qualitative Psychology (Tübingen). The main theme of the conference was research questions and matching methods of analysis. Researchers (both early-career and experienced experts in the field of qualitative methodology) from different countries presented their work and the results arising from it. A particular...

  5. PREFACE: Proceedings of the 8th Gravitational Wave Data Analysis Workshop, Milwaukee, WI, USA, 17-20 December 2003

    Science.gov (United States)

    Allen, Bruce

    2004-10-01

    It is now almost two decades since Bernard Schutz organized a landmark meeting on data analysis for gravitational wave detectors at the University of Cardiff, UK [1]. The proceedings of that meeting make interesting reading. Among the issues discussed were optimal ways to carry out searches for binary inspiral signals, and ways in which the projected growth in computer speed, memory and networking bandwidth would influence searches for gravitational wave signals. The Gravitational Wave Data Analysis Workshop traces its history to the mid-1990s. With the construction of the US LIGO detectors and the European GEO and VIRGO detectors already underway, Kip Thorne and Sam Finn realized that it was important for the world-wide data analysis community to start discussing some of the big unsettled issues in analysis. What was the optimal way to perform a pulsar search? To ensure confident detection, how accurately did binary inspiral waveforms have to be calculated? It was largely Kip and Sam's initiative that got the GWDAW started. The first (official) GWDAW was hosted by Rai Weiss at Massachusetts Institute of Technology, USA in 1996, as a follow-on to an informal meeting organized in the previous year by Sam Finn. I have pleasant memories of this first MIT GWDAW. I was new to the field and remember my excitement at learning that I had many colleagues interested in (and working on) the important issues. I also remember how refreshing it was to hear a pair of talks by Pia Astone and Marialessandra Papa who were not only studying methods but had actually carried out serious pulsar and burst searches using data from the Rome resonant bar detectors. A lot has changed since then. This issue is the Proceedings of the 8th Annual Gravitational Wave Data Analysis Workshop, held on 17-20 December 2003 at the University of Wisconsin-Milwaukee, USA. Many of the contributions concern technical details of the analysis of real data from resonant mass and interferometric detectors

  6. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed, open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET/C# version is used for development. It leverages code samples shared by the World Wind community, and the direction of the COAST 2.0 enhancements is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  7. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for a program that predicts atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients, which applies sparse-matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity-coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity-coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity-coefficient equations, and their Jacobian matrix is presented. Various FORTRAN packages with which the program runs in conjunction, such as SLODE, modified MA28 and the Gear package, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
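
    The direct method described above couples the model ODE to auxiliary sensitivity-coefficient equations and integrates them together. A toy sketch for a single first-order reaction dC/dt = -kC, whose sensitivity s = dC/dk obeys ds/dt = -C - ks; a simple explicit Euler loop stands in for the stiff Gear-type solver the paper uses.

```python
# Toy forward-sensitivity integration for dC/dt = -k*C; the coupled
# sensitivity s = dC/dk satisfies ds/dt = -C - k*s. Explicit Euler is
# used here purely for illustration (the paper uses a Gear-type solver).
import math

def integrate(k=0.5, c0=1.0, t_end=1.0, n=10000):
    dt = t_end / n
    c, s = c0, 0.0
    for _ in range(n):
        dc = -k * c          # model equation
        ds = -c - k * s      # auxiliary sensitivity equation
        c += dt * dc
        s += dt * ds
    return c, s

c, s = integrate()
# Analytic check: C(t) = exp(-k t), dC/dk = -t exp(-k t) at t = 1, k = 0.5
print(abs(c - math.exp(-0.5)) < 1e-3)       # True
print(abs(s + 1.0 * math.exp(-0.5)) < 1e-3) # True
```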

  8. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have in recent years seen an increase in complexity due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis becomes possible: fraud can be detected and knowledge hidden in data discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but all come down to the same idea: a process that extracts new information from large data collections, information that would be very difficult to obtain without data mining tools. Information obtained through the data mining process has the advantage that it not only answers the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.
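
    The multidimensional analysis the paper refers to can be pictured as roll-ups over a fact table of transactions. A minimal pure-Python sketch with invented accounting records follows; a production system would use an OLAP engine or data mining library rather than this hand-rolled aggregation.

```python
# Illustrative OLAP-style roll-up over invented accounting records.
from collections import defaultdict

transactions = [
    {"year": 2010, "branch": "North", "amount": 120.0},
    {"year": 2010, "branch": "South", "amount": 80.0},
    {"year": 2011, "branch": "North", "amount": 150.0},
    {"year": 2011, "branch": "South", "amount": 90.0},
]

def roll_up(rows, *dims):
    """Aggregate 'amount' along the requested dimensions (one cube view)."""
    cube = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dims)
        cube[key] += row["amount"]
    return dict(cube)

by_year = roll_up(transactions, "year")           # coarse roll-up
by_year_branch = roll_up(transactions, "year", "branch")  # drill-down
print(by_year[(2010,)])  # 200.0
```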

  9. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
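
    Conventional enrichment analysis, which GOMA builds on, scores a gene set against a GO term with the hypergeometric tail probability. A sketch with invented gene counts is below; GOMA's module-level optimization itself is more involved than this per-term test.

```python
# Hypergeometric enrichment test (the per-term score underlying most
# enrichment tools); the gene counts are invented for illustration.
from math import comb

def enrichment_pvalue(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N population, K annotated, n drawn)."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# 20000 genes in the genome, 100 annotated with a GO term,
# 50 genes in the experimental hit list, 5 of them annotated
p = enrichment_pvalue(20000, 100, 50, 5)
print(p < 1e-4)  # True: far more overlap than expected by chance
```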

  10. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented as an extension to latent semantic indexing. ... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.

  11. Analysis and processing tools for nuclear trade related data

    International Nuclear Information System (INIS)

    This paper describes the development of a system used by the Nuclear Trade Analysis Unit of the Department of Safeguards for handling, processing, analyzing, reporting and storing nuclear trade related data. The data handling and analysis part of the system is already functional, but several additional features are being added to optimize its use. The aim is to develop the system in a manner that actively contributes to the management of the Department's overall knowledge and supports the departmental State evaluation process. Much of the data originates from primary sources and comes in many different formats and languages. It also comes with diverse security needs. The design of the system has to meet the special challenges set by the large volume and different types of data that needs to be handled in a secure and reliable environment. Data is stored in a form appropriate for access and analysis in both structured and unstructured formats. The structured data is entered into a database (knowledge base) called the Procurement Tracking System (PTS). PTS allows effective linking, visualization and analysis of new data with that already included in the system. The unstructured data is stored in text searchable folders (information base) equipped with indexing and search capabilities. Several other tools are linked to the system including a visual analysis tool for structured information and a system for visualizing unstructured data. All of which are designed to help the analyst locate the specific information required amongst a myriad of unrelated information. This paper describes the system's concept, design and evolution - highlighting its special features and capabilities, which include the need to standardize the data collection, entry and analysis processes. 
All this enables the analyst to approach tasks consistently and in a manner that both enhances teamwork and leads to the development of an institutional memory related to covert trade activities that can be

  12. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis. PMID:20075480

  13. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project
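
    The loose-coupling pattern Anaphe describes — user code written against abstract interfaces, with interchangeable implementations behind them — can be sketched in Python. The class names below are illustrative stand-ins, not the real AIDA API, and the in-memory backend stands in for a SWIG-wrapped C++ library.

```python
# Sketch of programming against an abstract interface (AIDA-style), so a
# scripting front-end stays decoupled from the backend implementation.
# Names here are hypothetical, not the actual AIDA interfaces.
from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    @abstractmethod
    def fill(self, x, weight=1.0): ...
    @abstractmethod
    def entries(self): ...

class ListHistogram(IHistogram1D):
    """Trivial in-memory backend standing in for a shared-library one."""
    def __init__(self, nbins, lo, hi):
        self.lo, self.hi = lo, hi
        self.bins = [0.0] * nbins
        self._entries = 0
    def fill(self, x, weight=1.0):
        if self.lo <= x < self.hi:
            idx = int((x - self.lo) / (self.hi - self.lo) * len(self.bins))
            self.bins[idx] += weight
        self._entries += 1  # count every fill attempt, in range or not
    def entries(self):
        return self._entries

h = ListHistogram(10, 0.0, 1.0)
for x in (0.05, 0.15, 0.95, 1.5):  # last value falls outside the range
    h.fill(x)
print(h.entries(), h.bins[0])  # 4 1.0
```

Analysis scripts that accept any `IHistogram1D` keep working when the backend is swapped, which is the point of the abstract-interface design.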

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thomson valve, turboexpander and compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives a clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.
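
    The Linde-cycle analysis rests on an energy balance around the heat exchanger, JT valve and separator: the enthalpy of the entering high-pressure stream equals the enthalpy-weighted mix of the liquid product and the low-pressure return gas. A sketch of the resulting liquid-yield formula follows; the enthalpy values are placeholders, since in a real study they come from the simulator's property package.

```python
# Back-of-envelope Linde-Hampson liquid yield from an energy balance:
#   h_hp_in = y * h_liquid + (1 - y) * h_lp_out
# Enthalpy numbers below (kJ/kg) are illustrative placeholders, not
# property-package values.
def liquid_yield(h_lp_out, h_hp_in, h_liquid):
    """Fraction of compressed gas liquefied per pass.

    h_lp_out: enthalpy of low-pressure return gas leaving the exchanger
    h_hp_in:  enthalpy of high-pressure gas entering the exchanger
    h_liquid: enthalpy of the saturated liquid drawn off
    """
    return (h_lp_out - h_hp_in) / (h_lp_out - h_liquid)

y = liquid_yield(h_lp_out=426.0, h_hp_in=390.0, h_liquid=0.0)
print(round(y, 3))  # 0.085
```

The formula makes the optimization target visible: lowering the high-pressure inlet enthalpy (higher compression pressure, pre-cooling) raises the yield, which is what the HYSYS parameter sweeps in the paper explore.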

  15. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thomson valve, turboexpander and compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives a clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.

  16. Anaphe—OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O. Couet; B. Ferrero-Merlino; et al.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component, in combination with the use of shared libraries for their implementation, provides an easy integration of existing libraries into modern scripting languages, thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. This paper will give an overview of the architecture and design choices and will present the current status and future developments of the project.

  17. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program 2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and

  18. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quantitative phylogenetic and functional compositions of metagenomes, to compare compositions of multiple metagenomes and to produce intuitive visual representations of such analyses. AVAILABILITY: SmashCommunity is freely available at http://www.bork.embl.de/software/smash CONTACT: bork@embl.de

  19. A new bioinformatics analysis tools framework at EMBL–EBI

    OpenAIRE

    Goujon, Mickael; McWilliam, Hamish; Li, Weizhong; Valentin, Franck; Squizzato, Silvano; Paern, Juri; Lopez, Rodrigo

    2010-01-01

    The EMBL-EBI provides access to various mainstream sequence analysis applications. These include sequence similarity search services such as BLAST, FASTA, InterProScan and multiple sequence alignment tools such as ClustalW, T-Coffee and MUSCLE. Through the sequence similarity search services, the users can search mainstream sequence databases such as EMBL-Bank and UniProt, and more than 2000 completed genomes and proteomes. We present here a new framework aimed at both novice as well as exper...

  20. Modeling energy technology choices. Which investment analysis tools are appropriate?

    International Nuclear Information System (INIS)

    A variety of tools from modern investment theory appear to hold promise for unraveling observed energy technology investment behavior that often appears anomalous when analyzed using traditional investment analysis methods. This paper reviews the assumptions and important insights of the investment theories most commonly suggested as candidates for explaining the apparent "energy technology investment paradox". The applicability of each theory is considered in the light of important aspects of energy technology investment problems, such as sunk costs, uncertainty and imperfect information. The theories addressed include the capital asset pricing model, the arbitrage pricing theory, and the theory of irreversible investment. Enhanced net present value methods are also considered. (author)
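
    As a baseline for the comparisons the paper reviews, a plain net present value calculation — the traditional method whose verdicts the alternative theories (irreversible investment, option-based approaches) often overturn. The cash flows and discount rate below are invented for illustration.

```python
# Plain NPV of a project: invest 100 now, receive 25/year for 5 years,
# discounted at 10%. All figures are illustrative.
def npv(rate, cashflows):
    """Net present value; cashflows[t] is received at the end of year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

project = npv(0.10, [-100.0, 25.0, 25.0, 25.0, 25.0, 25.0])
print(round(project, 2))  # -5.23: traditional analysis rejects the project
```

Theories of irreversible investment would add an option value of waiting to this comparison, which is one way the "paradox" of apparently anomalous rejections or delays can be rationalized.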

  1. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
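
    A toy illustration of the coupled calendar- and cycling-fade bookkeeping that BLAST performs at much higher fidelity follows. The square-root-of-time calendar term, the Arrhenius-like temperature factor, and all constants are assumptions made for this sketch, not NREL's model.

```python
# Toy capacity-fade model: calendar fade grows with sqrt(time) and is
# accelerated by temperature; cycling wear grows linearly with throughput.
# Functional forms and constants are illustrative assumptions only.
import math

def capacity_fraction(days, temp_c, cycles_per_day,
                      a=0.0045, b=2.0e-5, ea_scale=0.07):
    """Remaining capacity fraction after calendar aging plus cycling wear."""
    arrhenius = math.exp(ea_scale * (temp_c - 25.0))  # faster fade when hot
    calendar = a * arrhenius * math.sqrt(days)
    cycling = b * cycles_per_day * days
    return max(0.0, 1.0 - calendar - cycling)

years5 = 365 * 5
hot = capacity_fraction(years5, temp_c=35.0, cycles_per_day=1.0)
mild = capacity_fraction(years5, temp_c=20.0, cycles_per_day=1.0)
print(hot < mild)  # True: a hotter climate degrades the pack faster
```

Even this toy shows why BLAST pairs the degradation model with climate data and usage patterns: the predicted lifetime depends strongly on where and how the battery is used.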

  2. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  3. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is
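
    Once two structures are superimposed, as the abstract describes, their agreement is conventionally scored by the RMSD over matched atom pairs. A minimal sketch of that score is below; Chimera additionally solves for the optimal rotation before scoring (omitted here), and the three-atom coordinates are invented for illustration.

```python
# RMSD between two already-superimposed, matched coordinate lists.
# Coordinates are invented toy data; real input would be matched CA atoms.
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between matched (x, y, z) tuples."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

a = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
b = [(0.0, 0.1, 0.0), (1.5, 0.1, 0.0), (3.0, 0.1, 0.0)]  # shifted 0.1 in y
print(round(rmsd(a, b), 3))  # 0.1
```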

  4. Comparison and analysis of collisional-radiative models at the NLTE-7 workshop

    International Nuclear Information System (INIS)

    We present the main results of the 7th Non-Local Thermodynamic Equilibrium (NLTE) Code Comparison Workshop, held in December 2011 in Vienna, Austria. More than twenty researchers from nine countries, who actively work on the development of collisional-radiative codes for plasma kinetics modeling, attended the meeting and submitted their results for a number of comparison cases. The cases included free-electron-laser-inspired time-dependent relaxation of photoexcited Ne-like Ar, ionization balance and spectra for highly charged tungsten, spectroscopic diagnostics of krypton L-shell spectra, and an investigation of Ne model convergence with principal quantum number. (authors)

  5. PREFACE: Proceedings of the 11th European Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis

    Science.gov (United States)

    2010-07-01

    This volume of IOP Conference Series: Materials Science and Engineering contains papers from the 11th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from 10-14 May 2009 in the Hotel Faltom, Gdynia, Poland. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on careers in microbeam analysis can meet and discuss with the established experts. The workshops have a very distinct format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. For this workshop EMAS invited speakers on the following topics: EPMA, EBSD, fast energy-dispersive X-ray spectroscopy, three-dimensional microanalysis, and micro-and nanoanalysis in the natural resources industry. The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 69 posters from 16 countries were on display at the meeting and that the participants came from as far away as Japan and the USA. A number of participants with posters were invited to give short oral presentations of their work in two dedicated sessions. As at previous workshops there was also a special oral session for young scientists. Small cash prizes were awarded for the three best posters and for the best oral presentation by a young scientist. The prize for the best poster went to the contribution by G Tylko, S Dubchak, Z Banach and K Turnau, entitled Monte Carlo simulation for an assessment of standard validity and quantitative X-ray microanalysis in plant. Joanna Wojewoda-Budka of the Institute of Metallurgy and Materials Science, Krakow, received the prize for the best oral presentation by a

  6. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  7. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
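
The breakpoint detection mentioned above can be illustrated with a toy greedy binary segmentation of log2 intensity ratios. This is a deliberately simplified sketch in the spirit of Circular Binary Segmentation, not CGHPRO's actual algorithm; the profile, the mean-difference score and the threshold are all invented for the example:

```python
# Toy breakpoint detection on a log2-ratio profile: recursively split the
# profile at the point that maximises the mean difference between the two
# resulting segments, stopping when no split exceeds the threshold.
def segment(ratios, min_len=3, threshold=0.2):
    def best_split(lo, hi):
        best, best_score = None, threshold
        for k in range(lo + min_len, hi - min_len + 1):
            left, right = ratios[lo:k], ratios[k:hi]
            score = abs(sum(left) / len(left) - sum(right) / len(right))
            if score > best_score:
                best, best_score = k, score
        return best

    breakpoints = []
    def recurse(lo, hi):
        k = best_split(lo, hi)
        if k is not None:
            recurse(lo, k)
            breakpoints.append(k)
            recurse(k, hi)
    recurse(0, len(ratios))
    return breakpoints

# A simulated profile: normal (ratio ~0), a single-copy gain (~0.58), normal.
profile = [0.0] * 10 + [0.58] * 10 + [0.0] * 10
print(segment(profile))  # -> [10, 20]
```

A real implementation assesses each candidate split with a statistical test (e.g. a permutation-based t-statistic) rather than a fixed mean-difference threshold.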

  8. The Lagrangian analysis tool LAGRANTO - version 2.0

    Science.gov (United States)

    Sprenger, M.; Wernli, H.

    2015-02-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.
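
The "GT:PV:2" criterion syntax quoted above can be mimicked in a few lines. This is a hypothetical sketch of the selection idea only, not LAGRANTO's implementation; the trajectory data structure is invented for the example:

```python
# Toy version of LAGRANTO-style selection criteria: "OP:VARIABLE:THRESHOLD".
OPS = {"GT": lambda v, t: v > t, "LT": lambda v, t: v < t}

def select(trajectories, criterion):
    """Keep trajectories for which the criterion holds at any time step."""
    op, var, thresh = criterion.split(":")
    thresh = float(thresh)
    return [tr for tr in trajectories
            if any(OPS[op](step[var], thresh) for step in tr)]

# Two toy trajectories, each a list of time steps with a PV value (in PVU).
trajs = [
    [{"PV": 0.5}, {"PV": 2.5}],   # crosses the 2 PVU threshold
    [{"PV": 0.3}, {"PV": 1.1}],   # stays below it
]
print(len(select(trajs, "GT:PV:2")))  # -> 1
```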

  9. The LAGRANTO Lagrangian analysis tool - version 2.0

    Science.gov (United States)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10^-6 K m^2 kg^-1 s^-1). Full versions of this new version of LAGRANTO are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.

  10. Topological Tools For The Analysis Of Active Region Filament Stability

    Science.gov (United States)

    DeLuca, Edward E.; Savcheva, A.; van Ballegooijen, A.; Pariat, E.; Aulanier, G.; Su, Y.

    2012-05-01

    The combination of accurate NLFFF models and high resolution MHD simulations allows us to study the changes in stability of an active region filament before a CME. Our analysis strongly supports the following sequence of events leading up to the CME: first there is a build-up of magnetic flux in the filament through flux cancellation beneath a developing flux rope; as the flux rope develops, a hyperbolic flux tube (HFT) forms beneath the flux rope; reconnection across the HFT raises the flux rope while adding additional flux to it; the eruption is triggered when the flux rope becomes torus-unstable. The work applies topological analysis tools that have been developed over the past decade and points the way for future work on the critical problem of CME initiation in solar active regions. We will present the uses of this approach, current limitations and future prospects.

  11. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) system of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
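
The filtering-plus-triggering idea described above can be sketched as follows. The rule structure, thresholds and messages are invented for illustration and do not reflect the Message Analyzer's actual domain specific language:

```python
# Minimal log-message correlation engine: each rule has a regex pattern, a
# match count that arms the trigger, and an action to run when it fires.
import re
from collections import Counter

class Rule:
    def __init__(self, pattern, count, action):
        self.pattern = re.compile(pattern)
        self.count = count          # number of matches that fires the action
        self.action = action

class Analyzer:
    def __init__(self, rules):
        self.rules = rules
        self.hits = Counter()

    def feed(self, message):
        for rule in self.rules:
            if rule.pattern.search(message):
                self.hits[rule.pattern.pattern] += 1
                if self.hits[rule.pattern.pattern] == rule.count:
                    rule.action(message)

alerts = []
engine = Analyzer([Rule(r"buffer overflow on node (\d+)", 2,
                        lambda m: alerts.append(m))])
for msg in ["buffer overflow on node 7",
            "heartbeat ok",
            "buffer overflow on node 9"]:
    engine.feed(msg)
print(alerts)  # -> ['buffer overflow on node 9']
```

The action fires on the second matching message, i.e. the rule correlates repeated occurrences rather than reacting to a single line.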

  12. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ system of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  13. An interactive tool for the analysis of nuclear emulsion data

    International Nuclear Information System (INIS)

    In this paper, we present an interactive tool developed to analyze and display events recorded in nuclear emulsion experiments. Besides providing an excellent way to complement automatic emulsion scanning techniques, this program implements a new approach for the complete three-dimensional (3D) analysis of the events. The program uses information recorded during the automatic emulsion scanning procedure, commonly called 'video images'. The main features implemented are: automatic full track reconstruction up to wide angles (∼0.8 rad) with efficiency greater than 95%, vertex finding, 'kink' finding, two-dimensional 'microscope' visualization, and 3D full view display and analysis. Performance was evaluated using a large set of video images containing manually checked neutrino interaction events and preliminary test beam exposure data

  14. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  15. PyRAT (python radiography analysis tool): overview

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) the hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method for the linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem - initial data analysis + material library → a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.
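
Feature (4), Tikhonov regularization of a linear inverse problem, is a standard method that can be sketched independently of PyRAT. A minimal example using the normal equations (the matrix and data below are invented, not radiographic quantities):

```python
# Tikhonov regularization for a linear inverse problem:
#   minimize ||A x - b||^2 + lam ||x||^2
# solved via the regularized normal equations (A^T A + lam I) x = A^T b.
import numpy as np

def tikhonov(A, b, lam):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# An ill-conditioned toy system: the unregularized solution is extremely
# sensitive to the data; a small lam stabilises it.
A = np.array([[1.0, 1.0], [1.0, 1.0001]])
b = np.array([2.0, 2.0001])
x = tikhonov(A, b, lam=1e-3)
print(x)  # close to [1, 1], the well-behaved solution
```

The regularization parameter `lam` trades data fit against solution size; in practice it is chosen by methods such as the L-curve or cross-validation.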

  16. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design phase of the product development process.

  17. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his own and others' studies in the field. The information supplied to the attendees will serve as a powerful tool for the energy planners considering their electric power options in the future.
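
The bookkeeping at the heart of net energy analysis reduces to a ratio of lifetime energy delivered to energy invested. A minimal sketch with invented numbers (not figures from the studies cited above):

```python
# Net energy ratio: lifetime energy delivered divided by the total energy
# invested in building, fuelling and operating the plant.
def net_energy_ratio(delivered, invested):
    return delivered / sum(invested.values())

invested = {            # energy inputs over the plant lifetime, in PJ (made up)
    "construction": 30.0,
    "fuel_supply": 90.0,
    "operation": 40.0,
}
ratio = net_energy_ratio(2400.0, invested)   # 2400 PJ delivered (made up)
print(round(ratio, 1))  # -> 15.0
```

A ratio well above 1 indicates the option returns substantially more energy than it consumes; real analyses also discount inputs and outputs over time.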

  18. A tool for finite element deflection analysis of wings

    Energy Technology Data Exchange (ETDEWEB)

    Carlen, Ingemar

    2005-03-01

    A first version (ver 0.1) of a new tool for finite element deflection analysis of wind turbine blades is presented. The software is called SOLDE (SOLid blaDE), and was developed as a Matlab shell around the free finite element codes CGX (GraphiX - pre-processor) and CCX (CrunchiX - solver). In the present report a brief description of SOLDE is given, followed by a basic user's guide. The main features of SOLDE are: - Deflection analysis of wind turbine blades, including 3D effects and warping. - Accurate prediction of eigenmodes and eigenfrequencies. - Derivation of 2-node slender elements for use in various aeroelastic analyses. The main differences between SOLDE and other similar tools can be summarised as: - SOLDE was developed without a graphical user interface or a traditional text file input deck. Instead the input is organised as Matlab data structures that have to be formed by a user-provided pre-processor. - SOLDE uses a solid representation of the geometry instead of a thin shell approximation. The benefit is that the bending-torsion couplings will automatically be correctly captured. However, a drawback with the current version is that the equivalent orthotropic shell idealisation violates the local bending characteristics, which makes the model useless for buckling analyses. - SOLDE includes the free finite element solver CCX, and thus no expensive commercial software (e.g. Ansys or Nastran) is required to produce results.

  19. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  20. Dynamic event tree analysis as a risk management tool

    International Nuclear Information System (INIS)

    This work demonstrates the use of dynamic event tree (DET) methods in the development of optimal severe accident management strategies. The ADAPT (Analysis of Dynamic Accident Progression Trees) software is a tool which can generate DETs using a system model and a user-specified set of branching rules. DETs explicitly account for time in the system analysis and can model stochastic processes occurring in a system more accurately than traditional event tree analysis. The kinds of information which can be extracted from a DET analysis (specifically, an ADAPT analysis) for risk management, and the advantages of such an approach, are illustrated using a sodium fast reactor (SFR). The scenario studied is that of an aircraft crash which eliminates most reactor vessel auxiliary cooling system (RVACS) heat removal capability (three of four RVACS towers). Probability distributions on worker response time were developed and used as branching rules. In order to demonstrate the abilities of dynamic methodologies in the context of severe accident mitigation strategies, several recovery strategies are considered. Probability distributions of fuel temperature and tower recovery times are generated to determine which strategy results in the lowest overall risk. (authors)
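
The branching idea behind a dynamic event tree can be sketched as successive probabilistic branchings whose path probabilities multiply. This toy enumeration, with invented branch points and probabilities, is not the ADAPT code:

```python
# Enumerate the end states of a tree defined by successive branchings.
# Each branching is a list of (outcome, probability) pairs; a path's
# probability is the product of the probabilities along it.
def expand(branchings):
    states = [((), 1.0)]
    for branching in branchings:
        states = [(path + (outcome,), p * q)
                  for path, p in states
                  for outcome, q in branching]
    return states

# Branch 1: a discretised worker response time; branch 2: does the
# cooling-tower restoration succeed? (numbers are made up)
response_time = [("fast", 0.6), ("slow", 0.4)]
recovery = [("recovered", 0.9), ("failed", 0.1)]
for path, prob in expand([response_time, recovery]):
    print(path, round(prob, 2))
# the probabilities over the four end states sum to 1.0
```

A real DET couples each branch to a time-dependent system simulation, so branchings occur at the simulated times when operator actions or component transitions can happen.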

  1. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    Science.gov (United States)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices’ ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. Field offices need a standardized, scientifically sound methodology for local climate analysis (such as trend, composite, principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice to the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. The LCAT focus is on the local scale (as opposed to the national and global scales of CPC products). The LCAT will: - improve the professional competency of local office staff and their expertise in providing local information to their users, thereby improving the quality of local climate services; - ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, allowing improvement of CPC climate products; - allow testing of local climate variables beyond temperature averages and precipitation totals, such as climatology of
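
One of the analyses named above, a local trend, can be sketched as an ordinary least-squares slope over a station series. The data below are synthetic and the function is an illustration, not LCAT code:

```python
# Ordinary least-squares trend of annual mean temperatures, reported per decade.
def trend_per_decade(years, values):
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    slope = (sum((y - my) * (v - mv) for y, v in zip(years, values))
             / sum((y - my) ** 2 for y in years))
    return slope * 10.0   # degrees per decade

years = list(range(1991, 2011))
temps = [10.0 + 0.02 * (y - 1991) for y in years]  # synthetic +0.02 degC/yr
print(round(trend_per_decade(years, temps), 2))  # -> 0.2
```

Operational trend analysis would additionally test the slope's statistical significance, e.g. against the residual variance of the fit.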

  2. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  3. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (that contains spatial data, socio-economic and environmental data, and analytic data), a middle layer (that handles data processing, model management, and GIS operations), and an application layer (that provides climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
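
The per-cell metric computation described above can be sketched as a weighted vulnerability score. The metrics and weights here are invented for illustration and are not Urban-CAT's actual model:

```python
# Score each grid cell's flood vulnerability as a weighted sum of metrics
# normalised to [0, 1] (hypothetical metrics and weights).
WEIGHTS = {"imperviousness": 0.5, "low_elevation": 0.3, "pop_density": 0.2}

def vulnerability(cell):
    return sum(WEIGHTS[m] * cell[m] for m in WEIGHTS)

cells = [
    {"id": "A1", "imperviousness": 0.9, "low_elevation": 0.8, "pop_density": 0.7},
    {"id": "B4", "imperviousness": 0.2, "low_elevation": 0.1, "pop_density": 0.3},
]
ranked = sorted(cells, key=vulnerability, reverse=True)
print([c["id"] for c in ranked])  # -> ['A1', 'B4']
```

In a system like the one described, the ranked scores would feed the application layer, e.g. to prioritise cells for adaptation measures.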

  4. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  5. PSIM: A TOOL FOR ANALYSIS OF DEVICE PAIRING METHODS

    Directory of Open Access Journals (Sweden)

    Yasir Arfat Malkani

    2009-10-01

    Full Text Available Wireless networks are commonplace nowadays and almost all of the modern devices support wireless communication in some form. These networks differ from more traditional computing systems due to the ad-hoc and spontaneous nature of interactions among devices. These systems are prone to security risks, such as eavesdropping, and require different techniques as compared to traditional security mechanisms. Recently, secure device pairing in wireless environments has received substantial attention from many researchers. As a result, a significant set of techniques and protocols have been proposed to deal with this issue. Some of these techniques consider devices equipped with infrared, laser, ultrasound transceivers or 802.11 network interface cards; while others require embedded accelerometers, cameras and/or LEDs, displays, microphones and/or speakers. However, many of the proposed techniques or protocols have not been implemented at all, while others are implemented and evaluated in a stand-alone manner without being compared with other related work [1]. We believe that this is because of the lack of specialized tools that provide a common platform to test the pairing methods. As a consequence, we designed such a tool. In this paper, we present the design and development of the Pairing Simulator (PSim), which can be used to perform the analysis of device pairing methods.

  6. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, since it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  7. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    International Nuclear Information System (INIS)

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, the FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of the FAST is that the code employs an MS Excel spreadsheet with the Visual Basic Application, allowing users to manipulate it with ease. The speed of the calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
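
One representative fuel-cycle calculation such a tool performs is the feed and separative work (SWU) required for enrichment. The formulas below are the standard textbook ones, not FAST's actual code, and the assay values are example inputs:

```python
# Feed and separative work for uranium enrichment, from mass balance and
# the standard separation-potential function V(x) = (2x - 1) ln(x / (1 - x)).
from math import log

def value(x):
    return (2 * x - 1) * log(x / (1 - x))

def enrichment(product, xp, xf=0.00711, xt=0.0025):
    """product in kg; xp, xf, xt = product, feed, tails U-235 fractions."""
    feed = product * (xp - xt) / (xf - xt)
    tails = feed - product
    swu = product * value(xp) + tails * value(xt) - feed * value(xf)
    return feed, swu

feed, swu = enrichment(1000.0, xp=0.045)   # 1 t of 4.5 w/o product
print(round(feed), round(swu))  # -> 9219 6871
```

So about 9.2 t of natural uranium feed and roughly 6900 SWU are needed per tonne of 4.5 w/o product at a 0.25 w/o tails assay; a scenario code repeats this balance for every reactor type and year.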

  8. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
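
The outlier-detection theme can be illustrated with the simplest robust approach, median/MAD z-scores. This univariate sketch is not taken from the package discussed above, whose estimators are multivariate:

```python
# Flag points whose robust z-score (based on the median and the median
# absolute deviation, MAD) exceeds a cutoff; unlike the mean and standard
# deviation, these estimates are not dragged toward the outliers themselves.
def mad_outliers(xs, cutoff=3.5):
    xs_sorted = sorted(xs)
    n = len(xs_sorted)
    median = (xs_sorted[n // 2] if n % 2 else
              (xs_sorted[n // 2 - 1] + xs_sorted[n // 2]) / 2)
    # middle element of the sorted absolute deviations (keeps the sketch short)
    mad = sorted(abs(x - median) for x in xs)[n // 2]
    # 0.6745 makes the MAD consistent with the standard deviation for
    # normally distributed data
    return [x for x in xs if abs(0.6745 * (x - median) / mad) > cutoff]

data = [9.8, 10.1, 10.0, 9.9, 10.2, 25.0]   # one gross outlier
print(mad_outliers(data))  # -> [25.0]
```

A classical z-score on the same data would be inflated by the outlier's own effect on the mean and standard deviation, which is exactly what robust estimators avoid.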

  9. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid input. It is very important to note that the tests below must be performed for both the Graphical User Interface and the script. The examples are illustrated from a scripting perspective because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.
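
    The four types of information in the plan map naturally onto a table-driven test harness: a table of fields, candidate inputs, allowed values, and the expected response. The sketch below is hypothetical; the field names and predicates are invented for illustration and are not GMAT's actual object fields.

```python
# Hypothetical table-driven input validator in the spirit of the GMAT test
# plan.  Each entry pairs a field name with a predicate defining its
# allowable values; unknown fields must be denied.
FIELDS = {
    "Spacecraft.DryMass": (lambda v: isinstance(v, (int, float)) and v > 0,
                           "positive real number"),
    "Spacecraft.Epoch":   (lambda v: isinstance(v, str) and len(v) > 0,
                           "non-empty epoch string"),
}

def check_input(field, value):
    """Return 'accept' or 'reject', mirroring how the tool should respond
    to valid and invalid input for each object field."""
    if field not in FIELDS:
        return "reject"                   # unknown fields are denied
    predicate, _description = FIELDS[field]
    return "accept" if predicate(value) else "reject"

results = [check_input("Spacecraft.DryMass", 850.0),   # valid value
           check_input("Spacecraft.DryMass", -1.0),    # out of range
           check_input("Spacecraft.Color", "red")]     # unknown field
```

Running the same table against both the GUI and the script interface is then a matter of swapping the function that feeds each (field, value) pair to the tool.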

  10. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel… screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and… tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data.

  11. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. Calculation is also quick enough to compare different options in a considerably short time. This user-friendly simulation code is expected to benefit further studies of the nuclear fuel cycle aimed at finding the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
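
    The per-scenario quantities FAST computes follow from standard fuel cycle material balances. As an illustration of the enrichment stage only, the textbook feed and separative-work relations can be sketched as below; this is a generic material balance, not KAERI's actual code.

```python
import math

def value_fn(x):
    """Separative potential V(x) = (2x - 1) * ln(x / (1 - x))."""
    return (2.0 * x - 1.0) * math.log(x / (1.0 - x))

def enrichment_balance(product_kg, xp, xf=0.00711, xt=0.0025):
    """Natural-uranium feed and separative work (SWU) needed to produce
    `product_kg` of uranium at U-235 weight fraction `xp`, given feed
    assay `xf` and tails assay `xt` (standard material balance)."""
    feed_kg = product_kg * (xp - xt) / (xf - xt)
    tails_kg = feed_kg - product_kg
    swu = (product_kg * value_fn(xp) + tails_kg * value_fn(xt)
           - feed_kg * value_fn(xf))
    return feed_kg, swu

feed, swu = enrichment_balance(1.0, 0.045)   # 1 kg of 4.5 % enriched uranium
```

With 0.25 % tails, each kilogram of 4.5 % enriched product requires roughly 9.2 kg of natural uranium feed and about 6.9 SWU, which is the kind of per-component quantity FAST rolls up across its 11 scenario groups.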

  12. The Astronomy Workshop

    Science.gov (United States)

    Hamilton, Douglas P.

    2012-05-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive online educational tools developed for use by students, educators, professional astronomers, and the general public. The more than 20 tools in the Astronomy Workshop are rated for ease of use, and have been extensively tested in large university survey courses as well as more specialized classes for undergraduate majors and graduate students. Here we briefly describe a few of the available tools. Solar Systems Visualizer: The orbital motions of planets, moons, and asteroids in the Solar System, as well as many of the planets in exoplanetary systems, are animated at their correct relative speeds in accurate to-scale drawings. Zoom in from the chaotic outer satellite systems of the giant planets all the way to their innermost ring systems. Solar System Calculators: These tools calculate a user-defined mathematical expression simultaneously for all of the Solar System's planets (Planetary Calculator) or moons (Satellite Calculator). Key physical and orbital data are automatically accessed as needed. Stellar Evolution: The "Life of the Sun" tool animates the history of the Sun as a movie, showing students how the size and color of our star has evolved and will evolve over billions of years. In "Star Race," the user selects two stars of different masses and watches their evolution in a split-screen format that emphasizes the great differences in stellar lifetimes and fates.
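
    In the spirit of the Planetary Calculator, a single expression can be evaluated across every planet at once. The sketch below computes orbital periods from Kepler's third law; the semi-major axes are rounded reference values, and the calculator's actual data tables and expression parser are of course richer than this.

```python
import math

GM_SUN = 1.32712440018e20    # heliocentric gravitational parameter, m^3/s^2
AU = 1.495978707e11          # astronomical unit, m

# Approximate semi-major axes in AU (rounded reference values).
PLANETS = {"Mercury": 0.387, "Venus": 0.723, "Earth": 1.000,
           "Mars": 1.524, "Jupiter": 5.203, "Saturn": 9.537,
           "Uranus": 19.19, "Neptune": 30.07}

def orbital_period_days(a_au):
    """Kepler's third law: T = 2*pi*sqrt(a^3 / GM)."""
    a = a_au * AU
    return 2.0 * math.pi * math.sqrt(a**3 / GM_SUN) / 86400.0

# Evaluate the same expression simultaneously for all planets.
periods = {name: orbital_period_days(a) for name, a in PLANETS.items()}
```

Earth comes out at about 365 days and Jupiter at about 4330 days, matching the familiar 11.9-year Jovian period.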

  13. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
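
    The "index of relative programmatic value" can be pictured as a weighted rollup over technology attributes. The sketch below is purely illustrative: the attribute names, weights, and normalization are invented, not ATLAS's actual figure of merit.

```python
# Hypothetical portfolio-index rollup in the spirit of ATLAS: each
# technology is scored on a few programmatic attributes in [0, 1], and the
# portfolio index is the average weighted score.
WEIGHTS = {"readiness": 0.3, "mass_saving": 0.4, "cost_saving": 0.3}

def portfolio_index(technologies):
    """Average weighted score over a portfolio; each technology is a
    mapping from attribute name to a score in [0, 1]."""
    def score(tech):
        return sum(WEIGHTS[k] * tech[k] for k in WEIGHTS)
    return sum(score(t) for t in technologies) / len(technologies)

portfolio = [
    {"readiness": 0.9, "mass_saving": 0.2, "cost_saving": 0.4},  # mature, modest gain
    {"readiness": 0.4, "mass_saving": 0.8, "cost_saving": 0.6},  # immature, big gain
]
index = portfolio_index(portfolio)
```

Because the weights sum to one and each score lies in [0, 1], the index is itself bounded in [0, 1], which makes portfolios directly comparable.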

  14. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools, exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms in either an IT or a marketing branch. The paper contributes by highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  15. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K
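
    TARGET's decomposition of tasks into action kernels maps naturally onto a tree whose indented text rendering is the hierarchical report. A minimal sketch follows; the task content is invented for illustration, and TARGET itself is a C/Windows application, not Python.

```python
# Minimal sketch of TARGET-style task decomposition: a task tree whose
# indented text rendering is the hierarchical report.
class Task:
    def __init__(self, label, subtasks=()):
        self.label = label
        self.subtasks = list(subtasks)

    def report(self, depth=0):
        """Render the task hierarchy as an indented outline."""
        lines = ["  " * depth + self.label]
        for sub in self.subtasks:
            lines.extend(sub.report(depth + 1))
        return lines

job = Task("Replace pump seal", [
    Task("Isolate pump", [Task("Close inlet valve"), Task("Lock out power")]),
    Task("Remove housing"),
])
report = "\n".join(job.report())
```

Switching between the "graphics" and "text" views described above amounts to rendering the same tree two different ways, which is why continual refinement of the decomposition stays cheap.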

  16. Best-practice checklists for tephra collection, analysis and reporting - a draft consensus from the Tephra 2014 workshop

    Science.gov (United States)

    Wallace, K.; Bursik, M. I.; Kuehn, S. C.

    2015-12-01

    The Tephra 2014 Workshop (3-7 August 2014) discussed major developments, best practices, future directions, and critical needs in tephra studies from both volcanological and tephrochronological perspectives. In a consensus-seeking session held at the end of the workshop, the international group of over 70 tephra scientists focused on two complementary themes: (A) the need for common best practices in tephra data collection and reporting among different scientific disciplines, and (B) the need to establish common, accessible mechanisms for tephra data archiving and retrieval. Tephra is the focus of a wide range of research in volcanology, petrology, tephrochronology and tephrostratigraphy (with applications in studies of environmental/climate change, surface processes, paleolimnology, etc.), ash dispersion and fallout modeling, and archaeology, paleoanthropology, and human origins. Researchers in each field have specific objectives that may or may not overlap. The focus on best practices is a first step towards standardized protocols for the collection, analysis and reporting of tephra data across and within disciplines. Such uniformity will facilitate the development and population of useful tephra databases. Current initiatives include the development of best practice checklists as a starting point for ensuring uniformity and completeness. The goals of the checklists are to: 1) ensure consistency among tephra scientists, regardless of research focus, 2) provide basic, comprehensible metadata requirements, especially for those who collect tephra as a peripheral part of their research, 3) help train students, and 4) help journal editors to know which essential metadata should be included in published works. Consistency in tephra sample collection, analysis, and reporting attained by use of these checklists should ultimately aid in improving correlation of tephras across geographically large areas, and facilitate collaborative tephra research. Current and future
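
    A metadata checklist of this kind lends itself directly to machine validation, which is also what makes the envisioned databases populatable. The field names below are examples only, not the actual Tephra 2014 checklist items.

```python
# Illustrative metadata checklist validator: the required fields are
# placeholders standing in for the workshop's checklist items.
REQUIRED_FIELDS = ["sample_id", "latitude", "longitude", "collection_date",
                   "stratigraphic_context", "analytical_method"]

def missing_metadata(record):
    """Return the required fields that are absent or empty in a record."""
    return [f for f in REQUIRED_FIELDS
            if f not in record or record[f] in (None, "")]

record = {"sample_id": "TV-2014-07", "latitude": 61.30, "longitude": -152.25,
          "collection_date": "2014-08-05"}
gaps = missing_metadata(record)
```

Run at submission time, such a check tells both the collector and a journal editor exactly which essential metadata are still missing.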

  17. H3ABioNet computational metagenomics workshop in Mauritius: training to analyse microbial diversity for Africa

    OpenAIRE

    Baichoo, Shakuntala; Botha, Gerrit; Jaufeerally-Fakim, Yasmina; Mungloo-Dilmohamud, Zahra; Lundin, Daniel; Mulder, Nicola; Promponas, Vasilis J.; Ouzounis, Christos A.

    2015-01-01

    In the context of recent international initiatives to bolster genomics research for Africa, and more specifically to develop bioinformatics expertise and networks across the continent, a workshop on computational metagenomics was organized during the end of 2014 at the University of Mauritius. The workshop offered background on various aspects of computational biology, including databases and algorithms, sequence analysis fundamentals, metagenomics concepts and tools, practical exercises, jou...

  18. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
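
    The core idea, integrating a conservation equation across gauge records, can be sketched numerically. With Lagrangian coordinate h and compressive stress taken positive, conservation of momentum for plane motion reads dσ/dh = -ρ₀ ∂u/∂t, so the stress difference between two gauge planes follows from the particle-velocity records alone. The sign convention and the synthetic acoustic wave below are illustrative assumptions, not the chapter's worked examples.

```python
import numpy as np

# Two particle-velocity gauge records for a traveling acoustic wave
# u(h, t) = A*sin(omega*(t - h/c)); for such a wave sigma = rho0*c*u,
# which gives us an exact answer to check the integration against.
rho0 = 2700.0                        # initial density, kg/m^3
c = 5000.0                           # wave speed, m/s
h1, h2 = 0.0, 0.001                  # gauge positions, 1 mm apart (m)
t = np.linspace(0.0, 5e-5, 5001)     # time base, s
u = lambda h: 10.0 * np.sin(2 * np.pi * 1e5 * (t - h / c))
u1, u2 = u(h1), u(h2)

# Integrate momentum conservation between the gauges (midpoint rule):
dudt_mid = 0.5 * (np.gradient(u1, t) + np.gradient(u2, t))
dsigma = -rho0 * (h2 - h1) * dudt_mid     # sigma(h2,t) - sigma(h1,t), Pa

dsigma_exact = rho0 * c * (u2 - u1)       # acoustic relation, for checking
```

With closely spaced gauges the reconstructed stress difference tracks the exact acoustic answer to a fraction of a percent, which is the essence of recovering stress histories from velocity gauges (or vice versa) in plane motion.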

  19. Emerging methods and tools for environmental risk assessment, decision-making, and policy for nanomaterials: summary of NATO Advanced Research Workshop

    International Nuclear Information System (INIS)

    Nanomaterials and their associated technologies hold promising opportunities for the development of new materials and applications in a wide variety of disciplines, including medicine, environmental remediation, waste treatment, and energy conservation. However, current information regarding the environmental effects and health risks associated with nanomaterials is limited and sometimes contradictory. This article summarizes the conclusions of a 2008 NATO workshop designed to evaluate the wide-scale implications (e.g., benefits, risks, and costs) of the use of nanomaterials on human health and the environment. A unique feature of this workshop was its interdisciplinary nature and focus on the practical needs of policy decision makers. Workshop presentations and discussion panels were structured along four main themes: technology and benefits, human health risk, environmental risk, and policy implications. Four corresponding working groups (WGs) were formed to develop detailed summaries of the state-of-the-science in their respective areas and to discuss emerging gaps and research needs. The WGs identified gaps between the rapid advances in the types and applications of nanomaterials and the slower pace of human health and environmental risk science, along with strategies to reduce the uncertainties associated with calculating these risks.

  20. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1], but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful, as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but appears limited to clusters of four spacecraft. Moreover, the barycentric approach makes it possible to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. The weights given to the spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. Estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 
307-322, ISSI
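
    The four-spacecraft barycentric estimator underlying this abstract can be sketched numerically. For a non-coplanar tetrahedron, the reciprocal vectors satisfy Σₐ kₐ = 0 and Σₐ kₐ rₐᵀ = I, so the estimator Σₐ kₐ fₐ reproduces the gradient of any linear field exactly. The construction below follows the standard tetrahedron formulation (not the paper's weighted generalization).

```python
import numpy as np

def reciprocal_vectors(r):
    """Reciprocal vectors k_a of a tetrahedron of 4 positions r (4x3):
    k_a = (r_bc x r_bd) / (r_ba . (r_bc x r_bd)), with b, c, d the other
    three spacecraft.  They satisfy sum_a k_a = 0 and sum_a k_a r_a^T = I."""
    k = np.zeros((4, 3))
    for a in range(4):
        b, c, d = [i for i in range(4) if i != a]
        n = np.cross(r[c] - r[b], r[d] - r[b])
        k[a] = n / np.dot(r[a] - r[b], n)
    return k

def gradient_estimate(r, f):
    """Barycentric gradient estimator grad f ~= sum_a k_a f_a,
    exact for a linear field."""
    return reciprocal_vectors(r).T @ f

# Check on a synthetic linear field f(r) = g . r + const.
r = np.array([[0.0, 0.0, 0.0], [1.0, 0.2, 0.0],
              [0.1, 1.0, 0.3], [0.2, 0.1, 1.0]])   # non-coplanar positions
g = np.array([2.0, -1.0, 0.5])
f = r @ g + 3.0
grad = gradient_estimate(r, f)
```

The constant offset drops out because the reciprocal vectors sum to zero, which is exactly why the estimator isolates the gradient.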

  1. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    OpenAIRE

    Drechsel Marion; Wilkening Stefan; Chen Bowang; Hemminki Kari

    2009-01-01

    Abstract Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have a facile access to the results which may require conversions between data formats. First hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools p...

  2. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
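
    One of the simplest nonlinear-distortion measurements the paper alludes to is total harmonic distortion (THD) from the FFT of a sine response. The sketch below uses tanh as a stand-in soft-clipping nonlinearity; it is an illustration of the measurement idea, not the toolkit's actual algorithms.

```python
import numpy as np

def thd(system, f0=64.0, fs=8192, n=8192, amp=0.9, n_harm=10):
    """Total harmonic distortion of a memoryless system: drive it with a
    sine containing an integer number of periods (so there is no spectral
    leakage), then compare harmonic magnitudes to the fundamental."""
    t = np.arange(n) / fs
    y = system(amp * np.sin(2 * np.pi * f0 * t))
    spec = np.abs(np.fft.rfft(y))
    k = int(round(f0 * n / fs))                 # fundamental bin
    harmonics = [spec[m * k] for m in range(2, n_harm + 1)]
    return np.sqrt(np.sum(np.square(harmonics))) / spec[k]

thd_linear = thd(lambda x: 2.0 * x)             # ideal (distortion-free) gain
thd_tube = thd(np.tanh)                         # soft-clipping stand-in
```

A linear gain stage measures essentially zero THD, while the tanh stage driven at 0.9 amplitude produces several percent of mostly odd-harmonic distortion, the kind of signature one would compare between a real tube amplifier and its software simulations.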

  3. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperature and pressure in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes in any initial geometric out-of-straightness, thus the modelling of the as-laid state of the pipeline is an important part of the design process. Recently some dedicated finite element programs have been created, making modelling of the offshore environment more convenient than has been the case with general-purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general-purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement of the buckling results in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)
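
    The quantity driving lateral buckling in such analyses is the effective axial force built up in a restrained pipeline by temperature and pressure. A hedged sketch of the standard fully-restrained textbook expression follows; the pipe dimensions and loads below are illustrative, not the paper's case data.

```python
import math

def effective_axial_force(H, dP, dT, OD, wt, E=207e9, alpha=11.7e-6, nu=0.3):
    """Fully restrained effective axial force in a pipeline
    (compression negative):
        S = H - dP * Ai * (1 - 2*nu) - As * E * alpha * dT
    H: residual lay tension (N), dP: internal overpressure (Pa),
    dT: temperature rise above ambient (K), OD/wt: outer diameter and
    wall thickness (m).  Standard textbook form; inputs are illustrative."""
    ID = OD - 2.0 * wt
    Ai = math.pi / 4.0 * ID**2             # internal bore area
    As = math.pi / 4.0 * (OD**2 - ID**2)   # steel cross-section
    return H - dP * Ai * (1.0 - 2.0 * nu) - As * E * alpha * dT

# 12.75" pipe, 15.9 mm wall, 10 MPa overpressure, 80 K above ambient:
S = effective_axial_force(H=0.0, dP=10e6, dT=80.0, OD=0.3239, wt=0.0159)
```

For these numbers the compressive effective force is around 3.2 MN, dominated by the thermal term, and lateral buckling initiates once this force exceeds the (imperfection-sensitive) critical force of the as-laid pipe, which is why the out-of-straightness modelling matters so much.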

  4. Spectral Analysis Tool 6.2 for Windows

    Science.gov (United States)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.
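
    The kind of spectrum calculation SAT performs can be illustrated with the simplest digital modulation it handles: the periodogram of a rectangular-pulse (NRZ) BPSK signal follows a sinc²-shaped spectrum with exact nulls at multiples of the symbol rate. This is a generic demonstration, not SAT's own algorithm.

```python
import numpy as np

# Periodogram of a random NRZ/BPSK signal: the rectangular pulse shape
# forces exact spectral nulls at multiples of the symbol rate Rs = fs/sps.
rng = np.random.default_rng(42)
sps = 8                                  # samples per symbol
n_sym = 512
symbols = rng.choice([-1.0, 1.0], size=n_sym)
signal = np.repeat(symbols, sps)         # rectangular (NRZ) pulse shaping

psd = np.abs(np.fft.rfft(signal))**2 / signal.size
n = signal.size
null_bin = n // sps                      # bin at f = Rs (first sinc^2 null)
peak = psd.max()
```

Knowing where such nulls and main lobes fall is precisely what bandwidth-occupancy and interference-to-signal calculations build on: an interferer landing in a spectral null of the desired signal's band does far less damage than one landing on the main lobe.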

  5. Tool for Sizing Analysis of the Advanced Life Support System

    Science.gov (United States)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems whether they are operated dynamically or at steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
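
    The trade metric named in the abstract, Equivalent System Mass (ESM), converts a subsystem's volume, power, cooling, and crew-time demands into mass-equivalent penalties so that alternative technology combinations can be compared on one axis. The sketch below uses the standard form of the metric, but the equivalency factors are illustrative placeholders, not mission values.

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crewtime_hr_per_yr, duration_yr,
                           v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=0.465):
    """Equivalent System Mass, the standard ALS trade metric:
    infrastructure penalties convert volume (kg/m^3), power and cooling
    (kg/kW), and crew time (kg per crew-hour) into mass-equivalent terms.
    The default equivalency factors are illustrative placeholders."""
    return (mass_kg
            + volume_m3 * v_eq
            + power_kw * p_eq
            + cooling_kw * c_eq
            + crewtime_hr_per_yr * duration_yr * ct_eq)

# A hypothetical water-recovery subsystem on a 2-year mission:
esm = equivalent_system_mass(mass_kg=1500.0, volume_m3=10.0, power_kw=4.0,
                             cooling_kw=4.0, crewtime_hr_per_yr=100.0,
                             duration_yr=2.0)
```

A technology that is heavier but uses less power can thus win a trade against a lighter, power-hungry alternative, which is exactly the comparison ALSSAT's multiple-case trade studies automate.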

  6. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  7. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.;

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models the… number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material is…
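
    The crack-growth portion of such a life prediction typically integrates a Paris-type law from the initiated flaw size to a critical size. A hedged sketch follows: the Paris constants and crack sizes below are illustrative, not the measured die-steel data of the paper, and the numerical integration is checked against the closed-form solution.

```python
import math

def cycles_to_failure(a0, ac, dsigma, C, m, Y=1.0, steps=20000):
    """Number of cycles to grow a crack from a0 to ac under the Paris law
        da/dN = C * (dK)^m,  dK = Y * dsigma * sqrt(pi * a),
    with dsigma in MPa and dK in MPa*sqrt(m).  Integrated numerically with
    the midpoint rule; material constants here are illustrative."""
    da = (ac - a0) / steps
    n = 0.0
    for i in range(steps):
        a = a0 + (i + 0.5) * da
        dk = Y * dsigma * math.sqrt(math.pi * a)
        n += da / (C * dk**m)
    return n

# Illustrative numbers: a 0.1 mm initiated crack grown to 5 mm under a
# 600 MPa stress range with C = 1e-11, m = 3.
N = cycles_to_failure(a0=1e-4, ac=5e-3, dsigma=600.0, C=1e-11, m=3.0)
```

For m = 3 the integral has the closed form N = 2(a0^(-1/2) - ac^(-1/2)) / (C (Y Δσ √π)³), and the numeric result (on the order of 10⁴ cycles, typical of heavily loaded die inserts) matches it to well under a tenth of a percent.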

  8. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
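
    The processing chain listed above (remove mean and trend, taper, filter) is the standard preparation applied to a trace before spectral work. The sketch below illustrates that chain with a zero-phase brick-wall FFT band-pass; it is a generic illustration, not SeismicCanvas's actual (C++/Qt) filter implementations.

```python
import numpy as np

def preprocess(trace, fs, f_lo, f_hi):
    """Typical waveform preparation: remove mean and linear trend, apply a
    cosine taper at each end, then a zero-phase FFT band-pass
    (illustrative brick-wall filter)."""
    n = trace.size
    t = np.arange(n)
    # Detrend: subtract the least-squares line (removes mean and trend).
    coeffs = np.polyfit(t, trace, 1)
    x = trace - np.polyval(coeffs, t)
    # 5 % cosine taper at each end to suppress edge effects.
    taper = np.ones(n)
    m = max(1, n // 20)
    ramp = 0.5 * (1.0 - np.cos(np.pi * np.arange(m) / m))
    taper[:m], taper[-m:] = ramp, ramp[::-1]
    x = x * taper
    # Zero-phase band-pass via FFT masking.
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec = np.fft.rfft(x)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n)

fs = 100.0                                   # samples per second
t = np.arange(2048) / fs
raw = 0.02 * t + np.sin(2*np.pi*5.0*t) + 0.5*np.sin(2*np.pi*30.0*t)
clean = preprocess(raw, fs, f_lo=2.0, f_hi=10.0)
```

After processing, the 5 Hz signal of interest survives while the drift and the 30 Hz contamination are gone, which is the state a trace needs to be in before picking, stacking, or spectral measurement.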

  9. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students finally registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies; the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences); the mark obtained in the entrance examination to the university; and in which of the two sittings per year the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn.
This map can be used as a reference system, as it includes variables as the distance from the student home to the College, that can be used as a tool to calculate the probability of success or
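The distance variable mentioned above can be computed from geocoded coordinates. A hypothetical sketch using the haversine great-circle formula (the study used UTM coordinates, where a planar Euclidean distance would also serve); the coordinates below are illustrative:

```python
# Illustrative computation of the home-to-college distance variable.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two latitude/longitude points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

Each student record would then carry this distance alongside the academic variables for the correlation analysis.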

  10. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization, and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence ("BI") is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a business intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is
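A minimal sketch of the OLAP-style "group and aggregate" operation at the core of the query-and-reporting activities listed above; the record and field names are hypothetical:

```python
# Illustrative roll-up: sum a measure over records grouped by a dimension.
from collections import defaultdict

def rollup(records, group_by, measure):
    """Sum `measure` over records grouped by the `group_by` field."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[group_by]] += rec[measure]
    return dict(totals)

# Hypothetical fact records, as a BI tool might extract from a warehouse.
sales = [
    {"region": "North", "revenue": 120.0},
    {"region": "South", "revenue": 80.0},
    {"region": "North", "revenue": 50.0},
]
```

Real OLAP engines generalize this to many dimensions and aggregation functions, but the underlying operation is the same.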

  11. Analysis tool for predicting the transient hydrodynamics resulting from the rapid filling of voided piping systems

    International Nuclear Information System (INIS)

    An analysis tool is constructed for the purpose of predicting the transient hydrodynamics resulting from the rapid filling of voided piping systems. The basic requirements of such an analysis tool are established, and documentation is presented for several fluid transient computer codes which were considered for the tool. The code modifications necessary to meet the analysis tool requirements are described. To test the computational capability of the analysis tool a verification problem is considered and the results reported. These results serve only to demonstrate the applicability of the analysis tool to this type of problem; the code has not been validated by comparison with experiment. Documentation is provided for a brief sensitivity study involving the sample problem. Additional analysis tool information, as well as detailed sample problem results are included in the form of appendices

  12. SisRadiologia: a new software tool for analysis of radiological accidents and incidents in industrial radiography

    International Nuclear Information System (INIS)

    According to the International Atomic Energy Agency (IAEA), many efforts have been made by Member States aiming at better control of radioactive sources. Accidents have mostly happened in practices regarded as of high radiological risk, classified by the IAEA in categories 1 and 2, notably radiotherapy, large irradiators and industrial radiography. Worldwide, more than 40 radiological accidents have been recorded in the industrial radiography area, involving 37 workers, 110 members of the public and 12 fatalities. Records show 5 severe radiological accidents in industrial radiography activities in Brazil, in which 7 workers and 19 members of the public were involved. Such events led to radiodermatitis of hands and fingers, but to no deaths. The purpose of this study is to present a computational program that allows data acquisition and recording in the company, so as to ease further detailed analysis of a radiological event, besides providing the learning cornerstones for avoiding future occurrences. After one year of application of the 'Industrial SisRadiologia' computational program - largely based on the workshop on Analysis and Dose Calculation of Radiological Accidents in Industrial Radiography (Workshop sobre Analise e Calculo de Dose de Acidentes Radiologicos em Radiografia Industrial - IRD 2012), in which several radiation protection officers took part - it can be concluded that the program is a powerful tool for data acquisition and for recording and surveying accident and incident events in industrial radiography. The program proved to be efficient in preparing reports to the Brazilian Regulatory Authority, and very useful in training workers to fix the lessons learned from radiological events.

  13. SisRadiologia: a new software tool for analysis of radiological accidents and incidents in industrial radiography

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da, E-mail: araujocamila@yahoo.com.br, E-mail: dasilva@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Araujo, Rilton A., E-mail: consultoria@maximindustrial.com.br [Maxim Industrial Assessoria TI, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    According to the International Atomic Energy Agency (IAEA), many efforts have been made by Member States aiming at better control of radioactive sources. Accidents have mostly happened in practices regarded as of high radiological risk, classified by the IAEA in categories 1 and 2, notably radiotherapy, large irradiators and industrial radiography. Worldwide, more than 40 radiological accidents have been recorded in the industrial radiography area, involving 37 workers, 110 members of the public and 12 fatalities. Records show 5 severe radiological accidents in industrial radiography activities in Brazil, in which 7 workers and 19 members of the public were involved. Such events led to radiodermatitis of hands and fingers, but to no deaths. The purpose of this study is to present a computational program that allows data acquisition and recording in the company, so as to ease further detailed analysis of a radiological event, besides providing the learning cornerstones for avoiding future occurrences. After one year of application of the 'Industrial SisRadiologia' computational program - largely based on the workshop on Analysis and Dose Calculation of Radiological Accidents in Industrial Radiography (Workshop sobre Analise e Calculo de Dose de Acidentes Radiologicos em Radiografia Industrial - IRD 2012), in which several radiation protection officers took part - it can be concluded that the program is a powerful tool for data acquisition and for recording and surveying accident and incident events in industrial radiography. The program proved to be efficient in preparing reports to the Brazilian Regulatory Authority, and very useful in training workers to fix the lessons learned from radiological events.

  14. FULLPROF as a new tool for flipping ratio analysis: further improvements

    International Nuclear Information System (INIS)

    At the international workshop on polarized neutrons for condensed matter investigation (Juelich, September 2002), we presented the implementations made in FULLPROF to introduce the ability to perform flipping ratio analysis. During this year we have modified the program to extend the initial features. We have tested these new implementations by re-analyzing flipping ratio data on the Metrz-Nit (C10H16N5O2) compound

  15. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, geochemistry and so on. Given the advantages of NAA, samples available only in small amounts, or precious samples, are the most suitable, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to obtain a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, while carbonaceous materials and heavy metals are concentrated in PM in urban areas because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air vary with season. For example, crustal elements increase in the dry season, and sea-salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during the winter season, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples, and source apportionment techniques are useful. (author)
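One common source-assignment aid consistent with the concentration-ratio techniques mentioned above is the crustal enrichment factor. The sketch below assumes Al as the crustal reference element (Sc or Fe are also commonly used in NAA work), with illustrative numbers:

```python
# Illustrative crustal enrichment factor:
#   EF = (C_X / C_ref)_sample / (C_X / C_ref)_crust
# Values near 1 suggest a crustal origin; values >> 1 suggest an
# anthropogenic (non-crustal) source for element X.

def enrichment_factor(c_x, c_ref, crust_x, crust_ref):
    """Enrichment factor of element X relative to a crustal reference."""
    return (c_x / c_ref) / (crust_x / crust_ref)
```

For instance, a heavy metal measured at twice its crust-normalized ratio would yield EF = 2, hinting at a contribution beyond resuspended soil.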

  16. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool,...

  17. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.
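As an illustration of the kind of quasi-steady-state component model such blocksets contain (this is not the actual Simulink code), here is a counterflow heat exchanger evaluated with the standard effectiveness-NTU method:

```python
# Illustrative counterflow heat-exchanger block (effectiveness-NTU method).
import math

def counterflow_q(ua, c_hot, c_cold, t_hot_in, t_cold_in):
    """Heat transferred (W) given UA (W/K), stream capacity rates
    m_dot*cp (W/K), and inlet temperatures (degC or K, consistent)."""
    c_min = min(c_hot, c_cold)
    c_max = max(c_hot, c_cold)
    cr = c_min / c_max
    ntu = ua / c_min
    if abs(cr - 1.0) < 1e-12:       # balanced-capacity special case
        eff = ntu / (1.0 + ntu)
    else:
        e = math.exp(-ntu * (1.0 - cr))
        eff = (1.0 - e) / (1.0 - cr * e)
    return eff * c_min * (t_hot_in - t_cold_in)
```

A trade-study harness would sweep UA or flow rates through such blocks to size the thermal management system against mission heat loads.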

  18. A measuring tool for tree-rings analysis

    Science.gov (United States)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gain, the possibility of archiving the measurement results at any stage of the processing, and operator comfort. New software has been developed, allowing the processing of samples of different types (cores, saw cuts), including those that are difficult to process because of complex wood structure (inhomogeneous growth in different directions; missing, light and false rings, etc.). This software can analyze pictures made with optical scanners or with analog or digital cameras. The software was written in C++ and is compatible with modern operating systems like Windows X. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of year is displayed on screen during the analysis, and it can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The created complex is universal in application, which will allow its use for solving different problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed on the basis of samples collected on the Kola Peninsula (northwestern Russia).
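The numerical cross-dating mentioned above can be sketched by sliding a ring-width series along a master chronology and scoring each overlap with Pearson correlation; the data shapes below are illustrative, not the tool's file format:

```python
# Illustrative numerical cross-dating against a master chronology.

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def best_offset(series, master):
    """Offset into `master` where `series` correlates best."""
    scores = [(pearson(series, master[i:i + len(series)]), i)
              for i in range(len(master) - len(series) + 1)]
    return max(scores)[1]
```

The best-scoring offset dates the undated series against the master; in practice the series are first detrended and the significance of the correlation is checked.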

  19. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, where present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better target educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods; in particular, a large event in 1987 affected a large portion of the valley and left 58 dead. The actual emergency plan for the study area is composed of a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  20. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  1. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  2. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    Science.gov (United States)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.
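The dual-use idea, a single abstract interface so the same tool runs under either framework, can be illustrated conceptually. This Python sketch is not the ATLAS C++ API; all class and method names are hypothetical:

```python
# Conceptual sketch of a dual-use tool: tools implement one
# framework-neutral interface, and any driver can run them.
from abc import ABC, abstractmethod

class IAnalysisTool(ABC):
    """Framework-neutral tool interface (illustrative, not ATLAS code)."""
    @abstractmethod
    def initialize(self):
        ...
    @abstractmethod
    def execute(self, event):
        ...

class JetCalibrationTool(IAnalysisTool):
    """Toy tool; a systematic variation could shift `scale`."""
    def __init__(self, scale):
        self.scale = scale
    def initialize(self):
        return True
    def execute(self, event):
        return [pt * self.scale for pt in event["jet_pt"]]

def run_in_framework(tool, events):
    """Either framework drives any IAnalysisTool the same way."""
    assert tool.initialize()
    return [tool.execute(ev) for ev in events]
```

The systematics interface follows the same pattern: the driver asks each tool which variations it supports and re-runs it with each variation applied.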

  3. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning; Mullins, Michael

    2006-01-01

    possible approaches for working with digital tectonics by means of acoustics: The architects, the architect-engineer or hybrid practitioner and finally a prototype for a possible digital tectonic tool. For the third approach in the case study a prototype digital tectonic tool is tested on the design...

  4. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this wor...

  5. The Utility of a Geostationary Doppler radar applied to the hurricane analysis and prediction problem: A Report on the 1st Nexrad in Space Workshop

    Science.gov (United States)

    Tripoli, G. J.; Chandrasekar, V.; Chen, S. S.; Holland, G. J.; Im, E.; Kakar, R.; Lewis, W. E.; Marks, F. D.; Smith, E. A.; Tanelli, S.

    2007-12-01

    Last April the first Nexrad in Space (NIS) workshop was held in Miami, Florida to discuss the value and requirements for a possible satellite mission featuring a Doppler radar in geostationary orbit capable of measuring the internal structure of tropical cyclones over a circular scan area 50 degrees of latitude in diameter. The proposed NIS technology, based on the PR2 radar design developed at JPL and an innovative deployable antenna design developed at UCLA, would be capable of 3D volume sampling with 12 km horizontal and 300 m vertical resolution and a 1-hour scan period. The workshop participants consisted of the JPL and UCLA design teams and a cross section of tropical cyclone forecasters, researchers and modelers who could potentially benefit from this technology. The consensus of the workshop included: (a) the NIS technology would provide observations to benefit hurricane forecasters, real-time weather prediction models and model researchers; (b) the most important feature of NIS was its high-frequency coverage together with its 3D observation capability. These features were found to fill a data gap, now developing within cloud-resolving analysis and prediction systems, for which there is no other proposed solution, particularly over the oceans where TCs form. Closing this data gap is important to the improvement of TC intensity prediction. A complete description of the potential benefits and recommended goals for this technology, as concluded by the workshop participants, will be given at the oral presentation.

  6. SawjaCard: A Static Analysis Tool for Certifying Java Card Applications

    OpenAIRE

    Besson, Frédéric; Jensen, Thomas; Vittet, Pierre

    2014-01-01

    This paper describes the design and implementation of a static analysis tool for certifying Java Card applications, according to security rules defined by the smart card industry. Java Card is a dialect of Java designed for programming multi-application smart cards and the tool, called SawjaCard, has been specialised for the particular Java Card programming patterns. The tool is built around a static analysis engine which uses a combination of numeric and heap analysis. It includes a model of...

  7. Proceedings of the workshop on applications of synchrotron radiation to trace impurity analysis for advanced silicon processing

    Energy Technology Data Exchange (ETDEWEB)

    Laderman, S [Integrated Circuits Business Div., Hewlett Packard Co., Palo Alto, CA (United States); Pianetta, P [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1993-03-01

    Wafer surface trace impurity analysis is essential for development of competitive Si circuit technologies. Today's grazing incidence x-ray fluorescence techniques with rotating anodes fall short of requirements for the future. Hewlett Packard/Toshiba experiments indicate that with second-generation synchrotron sources such as SSRL, the techniques can be extended sufficiently to meet important needs of the leading-edge Si circuit industry through nearly all of the 1990s. This workshop was held to identify people interested in the use of synchrotron radiation-based methods and to document needs and concerns for further development. Viewgraphs are included for the following presentations: microcontamination needs in silicon technology (M. Liehr), analytical methods for wafer surface contamination (A. Schimazaki), trace impurity analysis of liquid drops using synchrotron radiation (D. Wherry), TRXRF using synchrotron sources (S. Laderman), potential role of synchrotron radiation TRXRF in Si process R&D (M. Scott), potential development of synchrotron radiation facilities (S. Brennan), and identification of goals, needs and concerns (M. Garner).

  8. Highlights of the Workshop

    Science.gov (United States)

    Noor, Ahmed K.

    1997-01-01

    Economic stresses are forcing many industries to reduce cost and time-to-market, and to insert emerging technologies into their products. Engineers are asked to design faster, ever more complex systems. Hence, there is a need for novel design paradigms and effective design tools to reduce the design and development times. Several computational tools and facilities have been developed to support the design process. Some of these are described in subsequent presentations. The focus of the workshop is on the computational tools and facilities which have high potential for use in future design environment for aerospace systems. The outline for the introductory remarks is given. First, the characteristics and design drivers for future aerospace systems are outlined; second, simulation-based design environment, and some of its key modules are described; third, the vision for the next-generation design environment being planned by NASA, the UVA ACT Center and JPL is presented. The anticipated major benefits of the planned environment are listed; fourth, some of the government-supported programs related to simulation-based design are listed; and fifth, the objectives and format of the workshop are presented.

  9. In Silico Analysis of Crop Science: Report on the First China-UK Workshop on Chips, Computers and Crops

    Institute of Scientific and Technical Information of China (English)

    Ming Chen; Andrew Harrison

    2008-01-01

    A workshop on "Chips, Computers and Crops" was held in Hangzhou, China during September 26-27, 2008. The main objective of the workshop was to bring together Chinese and UK scientists from the mathematics, bioinformatics and plant molecular biology communities to exchange ideas, enhance awareness of each other's fields, explore synergisms and make recommendations on fruitful future directions in crop science. Here we describe the contributions to the workshop, and examine some conceptual issues that lie at the foundations and future of crop systems biology.

  10. CLARINET workshop 2001

    Energy Technology Data Exchange (ETDEWEB)

    Wensem, J. van [Soil Protection Technical Commitee, The Hague (Netherlands)

    2003-07-01

    In spring 2001, the CLARINET workshop (CLARINET, 2001) on ecological risk assessment agreed on an outline of an EU framework on site-specific ecological risk assessment (SS-ERA). The main final conclusion of this workshop was: 'On the one hand there is agreement on the outline of an EU framework on ERA. On the other hand many details have not been filled in or discussed yet. From these two facts it can be concluded that there is a good basis for filling in the ERA in future, and ongoing discussion is recommended.' The following were identified as important common elements for a European framework for SS-ERA: - Generic values in the first tier; - Bioassays; - Bioavailability; - Land-use specific; - Negotiable with stakeholders. Although the workshop agreed on a tiered approach, no final conclusion was drawn about the elements of each tier. In this special session of ConSoil we will continue the discussion about the use of SS-ERA and the possibilities for a European framework. We will start with introductory presentations focusing on four main topics for SS-ERA: - Implementation of site-specific ecological risk assessment as a regulatory tool: what to take into consideration; - The feasibility of bioassays in site-specific ecological risk assessment; - Bioavailability; - Higher-tier field research in ecological risk assessment: a case study. (orig.)

  11. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
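The PEST calibration step described above minimizes squared differences between simulated and measured flows and drawdowns. A minimal pure-Python sketch of that objective function (not PEST itself; observation weighting is a standard PEST feature):

```python
# Illustrative weighted sum-of-squared-residuals objective, the quantity
# a parameter-estimation loop drives toward its minimum.

def sum_squared_error(simulated, measured, weights=None):
    """Weighted sum of squared differences between model and data."""
    if weights is None:
        weights = [1.0] * len(measured)
    return sum(w * (s - m) ** 2
               for w, s, m in zip(weights, simulated, measured))
```

Each candidate hydraulic-conductivity distribution would be run through the flow model, and this score compared across iterations until the fit converges.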

  12. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.
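As an example of the type of question a financial analysis scenario tool answers, here is a net-present-value calculation over a cash-flow series; the rate and numbers are hypothetical, not H2FAST output:

```python
# Illustrative discounted-cash-flow calculation: net present value of a
# yearly cash-flow series, with cash_flows[0] occurring at time zero.

def npv(rate, cash_flows):
    """Net present value at the given annual discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))
```

A station scenario might combine an up-front capital cost (negative first entry) with annual net revenues; a positive NPV at the chosen discount rate indicates the scenario clears that hurdle rate.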

  13. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  14. SIMS applications workshop. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    The first ANSTO/AINSE SIMS Workshop drew together a mixture of Surface Analysis experts and Surface Analysis users with the concept that SIMS analysis has to be enfolded within the spectrum of surface analysis techniques and that the user should select the technique most applicable to the problem. With this concept in mind, the program was structured as sessions on SIMS Facilities; Applications to Mineral Surfaces; Applications to Biological Systems; Applications to Surfaces as Semiconductors, Catalysts and Surface Coatings; and Applications to Ceramics.

  15. SIMS applications workshop. Proceedings

    International Nuclear Information System (INIS)

    The first ANSTO/AINSE SIMS Workshop drew together a mixture of Surface Analysis experts and Surface Analysis users with the concept that SIMS analysis has to be enfolded within the spectrum of surface analysis techniques and that the user should select the technique most applicable to the problem. With this concept in mind, the program was structured as sessions on SIMS Facilities; Applications to Mineral Surfaces; Applications to Biological Systems; Applications to Surfaces as Semiconductors, Catalysts and Surface Coatings; and Applications to Ceramics.

  16. Hybrid modular tooling design methodology, based on manufacturability analysis

    OpenAIRE

    Kerbrat, Olivier

    2009-01-01

    In order to stay competitive, manufacturers have to develop new products in a very short time and at reduced cost, while customers require ever more quality and flexibility. The field of tooling (dies and molds) is not exempt from these constraints, and one way to improve competitiveness is to design and manufacture tools from modular and hybrid points of view. Instead of a one-piece tool, the tool is seen as a 3-D puzzle with modules realized separately and subsequently assembled. With the...

  17. Online Analysis of Wind and Solar Part II: Transmission Tool

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, and to address concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  18. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, and to address concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
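
    The core calculation, predicting the extra ramping capability needed because of forecast uncertainty, can be illustrated with a percentile-envelope sketch. Everything below (the error statistics, horizon, and confidence level) is hypothetical and far simpler than the operational CAISO tool; it only shows the shape of the idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical history of net-load forecast errors (MW), sampled every 5 min.
forecast_error = rng.normal(0.0, 150.0, size=2000)

def ramp_requirement(errors, horizon_steps=6, confidence=0.95):
    """Extra down/up ramping capability (MW) needed over the horizon so that
    the realized net-load ramp stays inside the forecast band with the given
    confidence -- a simplified percentile-envelope approach."""
    ramp_error = errors[horizon_steps:] - errors[:-horizon_steps]
    lo, hi = np.quantile(ramp_error, [(1 - confidence) / 2,
                                      1 - (1 - confidence) / 2])
    return lo, hi   # (additional down-ramp, additional up-ramp)

down, up = ramp_requirement(forecast_error)
print(f"extra ramping needed over 30 min: {down:.0f} MW down, {up:.0f} MW up")
```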

  19. Applied climate-change analysis: the climate wizard tool.

    Directory of Open Access Journals (Sweden)

    Evan H Girvetz

    Full Text Available BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0-20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally
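
    The quantile ensemble analysis mentioned above can be sketched in a few lines. The 16 "GCM" anomaly values below are synthetic stand-ins, not real model output; the point is only how a multi-model projection is summarized by its median and inter-model spread.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical projected temperature change (deg C, 2070-2099 minus baseline)
# for one country from 16 GCMs -- synthetic values, not real model output.
gcm_anomalies = rng.normal(3.0, 0.8, size=16)

# Quantile ensemble summary: the median plus an inter-model spread band.
median, q25, q75 = np.quantile(gcm_anomalies, [0.5, 0.25, 0.75])
print(f"median projected warming: {median:.1f} C "
      f"(inter-model IQR {q25:.1f}-{q75:.1f} C)")
```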

  20. Creating Fantastic PI Workshops

    Energy Technology Data Exchange (ETDEWEB)

    Biedermann, Laura B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clark, Blythe G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Colbert, Rachel S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dagel, Amber Lynn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gupta, Vipin P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hibbs, Michael R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perkins, David Nikolaus [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); West, Roger Derek [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The goal of this SAND report is to provide guidance for other groups hosting workshops and peer-to-peer learning events at Sandia. To that end, this report provides detail about our team structure and how we brainstormed workshop topics and developed the workshop structure. A Workshop “Nuts and Bolts” section provides our timeline and check-list for workshop activities. The survey section provides examples of the questions we asked and how we adapted the workshop in response to the feedback.

  1. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    International Nuclear Information System (INIS)

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  2. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    Energy Technology Data Exchange (ETDEWEB)

    Jodoin, Vincent J [ORNL; Lee, Ronald W [ORNL; Peplow, Douglas E. [ORNL; Lefebvre, Jordan P [ORNL

    2011-01-01

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  3. Workshop; Goals and objectives

    International Nuclear Information System (INIS)

    One of the objectives of the workshop was to improve awareness of the international illicit nuclear trafficking information exchange mechanism, i.e. IAEA illicit nuclear trafficking database programme, its goals and objectives, reporting requirements and participation. Other objectives are: to enhance awareness of global and regional risk trends and patterns, share knowledge and assessment of regional developments, share methodology of the illicit nuclear trafficking information analysis and discuss ways for improving national and international illicit trafficking information management and coordination

  4. Analysis of Marketing Mix Tools in a Chosen Company

    OpenAIRE

    Havlíková, Žaneta

    2008-01-01

    This thesis analyses the marketing mix of the Velteko s.r.o. company. Its main objectives are to characterise the company, analyse the individual tools of the marketing mix, and evaluate a client survey. The conclusion focuses on rating the level and efficiency of the utilization of the tools used. Relevant recommendations are added.

  5. Jitterbug and TrueTime: Analysis Tools for Real-Time Control Systems

    OpenAIRE

    Cervin, Anton; Henriksson, Dan; Lincoln, Bo; Årzén, Karl-Erik

    2002-01-01

    The paper presents two recently developed, Matlab-based analysis tools for real-time control systems. The first tool, called Jitterbug, is used to compute a performance criterion for a control loop under various timing conditions. The tool makes it easy to quickly judge how sensitive a controller is to implementation effects such as slow sampling, delays, jitter, etc. The second tool, called TrueTime, allows detailed co-simulation of process dynamics, control task execution, and network commun...
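
    The kind of question Jitterbug answers analytically (how much does timing jitter degrade a control loop's performance criterion) can be illustrated by brute-force simulation. The scalar plant, controller gain, and quadratic cost below are toy choices for illustration, not part of the actual tool.

```python
import numpy as np

rng = np.random.default_rng(2)

def quadratic_cost(jitter_std, n_steps=4000, h=0.01):
    """Average quadratic state cost of a proportional controller on the
    scalar plant dx/dt = x + u when the nominal sampling interval h is
    perturbed by Gaussian jitter -- a toy version of the sensitivity
    analysis Jitterbug performs analytically."""
    x, cost = 1.0, 0.0
    for _ in range(n_steps):
        u = -3.0 * x                  # control computed at the sample instant
        dt = max(1e-4, h + rng.normal(0.0, jitter_std))
        x = x + dt * (x + u)          # Euler step of dx/dt = x + u
        cost += dt * x * x
    return cost / (n_steps * h)

for j in (0.0, 0.005):
    print(f"jitter std {j:.3f} s -> cost {quadratic_cost(j):.4f}")
```

    Sweeping the jitter level in this way gives a crude empirical version of the sensitivity curve that Jitterbug computes directly from the timing model.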

  6. Collaborative authoring workshop

    NARCIS (Netherlands)

    Klemke, Roland; Schmitz, Birgit

    2009-01-01

    Klemke, R., & Schmitz, B. (2009). Collaborative authoring workshop. Workshop presentation at the Joint Technology Enhanced Learning Summerschool (JTELSS 2009). May, 30-June, 6, 2009, Terchova, Slovakia.

  7. A Tale Of 160 Scientists, Three Applications, A Workshop and A Cloud

    CERN Document Server

    Berriman, G Bruce; Gelino, Dawn; Wittman, Dennis K; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Kinney, Jamie

    2012-01-01

    The NASA Exoplanet Science Institute (NExScI) hosts the annual Sagan Workshops, thematic meetings aimed at introducing researchers to the latest tools and methodologies in exoplanet research. The theme of the Summer 2012 workshop, held from July 23 to July 27 at Caltech, was to explore the use of exoplanet light curves to study planetary system architectures and atmospheres. A major part of the workshop was to use hands-on sessions to instruct attendees in the use of three open source tools for the analysis of light curves, especially from the Kepler mission. Each hands-on session involved the 160 attendees using their laptops to follow step-by-step tutorials given by experts. We describe how we used the Amazon Elastic Compute Cloud (EC2) to run these applications.

  8. TPC Workshop

    International Nuclear Information System (INIS)

    The first workshop to focus on time projection chambers was held at TRIUMF (Canada) this summer. Some 75 participants came from groups in Europe and North America using TPCs in a variety of applications in experimental physics. Reports included several general descriptions of existing detectors as well as some proposals for new instruments. A time projection chamber (TPC) is the name given to a class of large volume drift chambers which operate generally with parallel electric and magnetic fields. Applications span energies from a few MeV in double beta decay searches, through intermediate energies in muon decay studies to large high energy arrays planned for LEP at CERN

  9. Workshop experience

    Directory of Open Access Journals (Sweden)

    Georgina Holt

    2007-04-01

    Full Text Available The setting for the workshop was a heady mix of history, multiculturalism and picturesque riverscapes. Within the group there was, as in many food studies, a preponderance of female scientists (or ethnographers), but the group interacted on lively, non-gendered terms - focusing instead on an appreciation of local food, an enthusiasm for research shared by all, and points of theoretical variance within that. The food provided by our hosts was of the very highest eating and local food qualities...

  10. Network workshop

    DEFF Research Database (Denmark)

    Bruun, Jesper; Evans, Robert Harry

    2014-01-01

    This paper describes the background for, realisation of and author reflections on a network workshop held at ESERA2013. As a new research area in science education, networks offer a unique opportunity to visualise and find patterns and relationships in complicated social or academic network data...... and analysed networks using the commonly used and freely available software Gephi (gephi.org). Reflecting upon why science education researchers might be hesitant to adopt network methodology we identify a key problem for networks in science education research: The cost in resources of learning how to include...

  11. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

    Full Text Available This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each of which typically consists of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form, applying the concepts and tools introduced.

  12. 77 FR 48107 - Workshop on Performance Assessments of Near-Surface Disposal Facilities: FEPs Analysis, Scenario...

    Science.gov (United States)

    2012-08-13

    ...-Surface Disposal Facilities: FEPs Analysis, Scenario and Conceptual Model Development, and Code Selection... Radioactive Waste.'' These regulations were published in the Federal Register on December 27, 1982 (47 FR... on three aspects of a performance assessment: (1) Features, Events, and Processes (FEPs) analysis,...

  13. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement.

    Science.gov (United States)

    Shatkin, J A; Ong, Kimberly J; Beaudrie, Christian; Clippinger, Amy J; Hendren, Christine Ogilvie; Haber, Lynne T; Hill, Myriam; Holden, Patricia; Kennedy, Alan J; Kim, Baram; MacDonell, Margaret; Powers, Christina M; Sharma, Monita; Sheremeta, Lorraine; Stone, Vicki; Sultan, Yasir; Turley, Audrey; White, Ronald H

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article. PMID:27510619

  14. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed
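
    The photometry skill at the heart of these tutorials, comparing the brightness of stars, reduces to the instrumental magnitude formula m = -2.5 log10(F). A minimal differential-photometry sketch with made-up aperture counts (the values are invented for illustration, not taken from the study):

```python
import math

# Hypothetical sky-subtracted aperture sums (counts) from two frames:
# a target variable star and a constant comparison star.
target_counts = [15200.0, 13900.0]
comparison_counts = [30500.0, 30450.0]

def differential_magnitude(target, comparison):
    """Instrumental magnitude difference m_t - m_c = -2.5 log10(F_t / F_c);
    ratioing against a comparison star cancels airmass and transparency
    changes between frames."""
    return -2.5 * math.log10(target / comparison)

dmags = [differential_magnitude(t, c)
         for t, c in zip(target_counts, comparison_counts)]
print("delta-mag per frame:", [round(m, 3) for m in dmags])
print(f"target faded by {dmags[1] - dmags[0]:+.3f} mag between frames")
```

    Tools such as AstroImageJ automate exactly this ratioing across many apertures and frames to build a light curve.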

  15. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the number of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  16. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth’s magnetic field’s past strength from volcanic rocks or archeological materials. By reducing the number of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
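
    One of the reliability criteria above, whether the regression line intersects the y axis within a prescribed range, pairs naturally with the bootstrap statistics the tool uses. The sketch below applies a pairs bootstrap to hypothetical specimen data; the variable names, values, and acceptance logic are invented for illustration and are not MSP-Tool's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical MSP data: applied lab field H_lab (uT) and a normalized
# ratio Q for six specimens (toy values, not real measurements).
H_lab = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
Q = np.array([-0.52, -0.23, 0.02, 0.24, 0.55, 0.71])

def bootstrap_intercept_ci(x, y, n_boot=2000, level=0.95):
    """Pairs-bootstrap confidence interval for the y-intercept of the
    linear fit y = a*x + b; an MSP-style check then asks whether the
    interval lies inside the theoretically prescribed range."""
    intercepts = np.empty(n_boot)
    n = len(x)
    for i in range(n_boot):
        idx = rng.integers(0, n, size=n)        # resample specimens
        a, b = np.polyfit(x[idx], y[idx], 1)    # slope, intercept
        intercepts[i] = b
    lo, hi = np.quantile(intercepts, [(1 - level) / 2, 1 - (1 - level) / 2])
    return lo, hi

lo, hi = bootstrap_intercept_ci(H_lab, Q)
print(f"95% bootstrap CI for intercept: [{lo:.2f}, {hi:.2f}]")
```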

  17. Fatigue analysis-based numerical design of stamping tools made of cast iron

    OpenAIRE

    Ben Slima, Khalil; Penazzi, Luc; Mabru, Catherine; Ronde-Oustau, François

    2013-01-01

    This work concerns stress and fatigue analysis of stamping tools made of cast iron with an essentially pearlitic matrix and containing foundry defects. Our approach consists, first, in coupling stamping process simulations and structural analysis in order to improve the tool geometry and stiffness, minimizing the stress state and optimizing the fatigue lifetime. The method consists in simulating the stamping process by considering the tool as a per...

  18. WOPANets: a tool for WOrst case performance analysis of embedded networks

    OpenAIRE

    Mifdaoui, Ahlem; Ayed, Hamdi

    2010-01-01

    WOPANets (WOrst case Performance Analysis of embedded Networks) is a design decision-aid tool for embedded networks. This tool offers an ergonomic interface for the designer to describe the network and the circulating traffic, and embodies a static performance evaluation technique based on Network Calculus theory, combined with optimization analysis, to support early system-level design exploration for embedded networks. In this paper, we describe the feature set of the WOPANets tool and we p...

  19. Designing an Exploratory Text Analysis Tool for Humanities and Social Sciences Research

    OpenAIRE

    Shrikumar, Aditi

    2013-01-01

    This dissertation presents a new tool for exploratory text analysis that attempts to improve the experience of navigating and exploring text and its metadata. The design of the tool was motivated by the unmet need for text analysis tools in the humanities and social sciences. In these fields, it is common for scholars to have hundreds or thousands of text-based source documents of interest from which they extract evidence for complex arguments about society and culture. These collections are...

  20. Recent Workshops

    CERN Multimedia

    Wickens, F. J.

    Since the previous edition of ATLAS e-news, the NIKHEF Institute in Amsterdam has hosted not just one but two workshops related to ATLAS TDAQ activities. The first in October was dedicated to the Detector Control System (DCS). Just three institutes, CERN, NIKHEF and St Petersburg, provide the effort for the central DCS services, but each ATLAS sub-detector provides effort for their own controls. Some 30 people attended, including representatives for all of the ATLAS sub-detectors, representatives of the institutes working on the central services and the project leader of JCOP, which brings together common aspects of detector controls across the LHC experiments. During the three-day workshop the common components were discussed, and each sub-detector described their experiences and plans for their future systems. Whilst many of the components to be used are standard commercial components, a key custom item for ATLAS is the ELMB (Embedded Local Monitor Board). Prototypes for this have now been extensively test...

  1. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting, are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors insist that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
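
    For plane one-dimensional motion, the conservation equations that Lagrangian analysis integrates are commonly written in the Lagrangian coordinate h (initial position) as follows; sign conventions vary between authors, so this is a representative form rather than the chapter's exact notation:

```latex
\begin{aligned}
\rho_0\,\frac{\partial u}{\partial t} &= \frac{\partial \sigma}{\partial h} &&\text{(conservation of momentum)}\\[2pt]
\frac{\partial \varepsilon}{\partial t} &= \frac{\partial u}{\partial h} &&\text{(kinematic compatibility / mass)}\\[2pt]
\rho_0\,\frac{\partial E}{\partial t} &= \sigma\,\frac{\partial u}{\partial h} &&\text{(conservation of energy)}
\end{aligned}
```

    Here ρ0 is the initial density, σ the stress, u the particle velocity, ε the strain, and E the specific internal energy; gauge records of σ(h_i, t) or u(h_i, t) at several Lagrangian stations h_i supply the time-resolved data needed to integrate these relations across the wave.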

  2. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. 
In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  3. Workshop "Risk and multicriteria Analysis. An application to natural resources management"

    OpenAIRE

    Bushenkov, Vladimir; Oliveira, Manuela

    2011-01-01

    Program Alexander Lotov (CCRAS, Russia) "Computer visualization of production possibility set in Data Envelopment Analysis" Roman Efremov (URJC, Spain) “Methodology for modeling processes of participatory decision-making in the environmental field with examples from the Water Debate in Catalonia” Maria João Batista (LNEG, Portugal) “Data analysis applied to mineral resources management: Exploration and environmental diagnosis” Susete Marques (ISA, Portugal) “Assessing wildfire ris...

  4. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning

    2014-01-01

    This paper introduces an interactive tool developed to integrate structural analysis in the architectural design environment from the early conceptual design stage. The tool improves exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot Struct...

  5. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis of...

  6. “DRYPACK” - a calculation and analysis tool

    DEFF Research Database (Denmark)

    Andreasen, M.B.; Toftegaard, R.; Schneider, P.;

    2013-01-01

    “DryPack” is a calculation tool that visualises the energy consumption of airbased and superheated steam drying processes. With “DryPack”, it is possible to add different components to a simple drying process, and thereby increase the flexibility, which makes it possible to analyse the most common...... energy consumption reductions by using “DryPack” are calculated. With the “DryPack” calculation tool, it is possible to calculate four different unit operations with moist air (dehumidification of air, humidification of air, mixing of two air streams, and heating of air). In addition, a Mollier diagram...... with temperatures above 100°C may be generated....

  7. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    Science.gov (United States)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
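To illustrate the NURBS ingredient of the method (a generic sketch, not the authors' implementation): a rational quadratic Bézier segment, the simplest NURBS, represents a circular arc exactly, which is why isogeometric models can reproduce arch geometries without geometric discretization error. The control net and weights below are the standard quarter-circle construction.

```python
import math

def rational_bezier2(t, pts, wts):
    """Evaluate a quadratic rational Bezier curve (a degree-2 NURBS
    segment) at parameter t, given 3 control points and 3 weights."""
    # Bernstein basis of degree 2
    b = [(1 - t) ** 2, 2 * t * (1 - t), t ** 2]
    wsum = sum(bi * wi for bi, wi in zip(b, wts))
    x = sum(bi * wi * p[0] for bi, wi, p in zip(b, wts, pts)) / wsum
    y = sum(bi * wi * p[1] for bi, wi, p in zip(b, wts, pts)) / wsum
    return x, y

# Quarter of the unit circle: standard control net and weights
pts = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
wts = [1.0, math.sqrt(2) / 2, 1.0]

# Every sampled point lies exactly on the unit circle (up to rounding)
for t in [0.0, 0.25, 0.5, 0.75, 1.0]:
    x, y = rational_bezier2(t, pts, wts)
    assert abs(x * x + y * y - 1.0) < 1e-12
```

A polynomial (non-rational) quadratic can only approximate the arc, which is the point of using rational B-splines for masonry geometries.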

  8. Pathway-PDT: a flexible pathway analysis tool for nuclear families

    OpenAIRE

    Park, Yo Son; Schmidt, Michael; Martin, Eden R.; Pericak-Vance, Margaret A.; Chung, Ren-Hua

    2013-01-01

    Background Pathway analysis based on Genome-Wide Association Studies (GWAS) data has become popular as a secondary analysis strategy. Although many pathway analysis tools have been developed for case–control studies, there is no tool that can use all information from raw genotypes in general nuclear families. We developed Pathway-PDT, which uses the framework of Pedigree Disequilibrium Test (PDT) for general family data, to perform pathway analysis based on raw genotypes in family-based GWAS....

  9. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. PMID:26679001
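As an illustration of one of the listed multivariate methods (not the tool's own code), principal component analysis of a multichannel recording can be sketched with an SVD; NumPy and the toy signal below are our assumptions:

```python
import numpy as np

def pca(signals, n_components=2):
    """Principal component analysis of a (samples x channels) signal
    matrix via SVD of the mean-centered data."""
    X = signals - signals.mean(axis=0)                 # center each channel
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]    # projected signals
    explained = S ** 2 / np.sum(S ** 2)                # variance fractions
    return scores, Vt[:n_components], explained

# Toy example: two strongly correlated "channels" plus noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
base = np.sin(2 * np.pi * 5 * t)
X = np.column_stack([base, 0.8 * base]) + 0.05 * rng.standard_normal((500, 2))
scores, components, explained = pca(X, n_components=2)
# The shared sinusoid dominates: the first component carries most variance
assert explained[0] > 0.95
```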

  10. 21st Century Kinematics : The 2012 NSF Workshop

    CERN Document Server

    2013-01-01

    21st Century Kinematics focuses on algebraic problems in the analysis and synthesis of mechanisms and robots, compliant mechanisms, cable-driven systems and protein kinematics. The specialist contributors provide the background for a series of presentations at the 2012 NSF Workshop. The text shows how the analysis and design of innovative mechanical systems yield increasingly complex systems of polynomials, characteristic of those systems. In doing so, it takes advantage of increasingly sophisticated computational tools developed for numerical algebraic geometry and demonstrates the now routine derivation of polynomial systems dwarfing the landmark problems of even the recent past. The 21st Century Kinematics workshop echoes the NSF-supported 1963 Yale Mechanisms Teachers Conference that taught a generation of university educators the fundamental principles of kinematic theory. As such these proceedings will provide admirable supporting theory for a graduate course in modern kinematics and should be of consid...

  11. The Astronomy Workshop

    Science.gov (United States)

    Hamilton, Douglas P.

    2013-05-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive online educational tools developed for use by students, educators, professional astronomers, and the general public. The more than 20 tools in the Astronomy Workshop are rated for ease-of-use, and have been extensively tested in large university survey courses as well as more specialized classes for undergraduate majors and graduate students. Here we briefly describe the tools most relevant for the Professional Dynamical Astronomer. Solar Systems Visualizer: The orbital motions of planets, moons, and asteroids in the Solar System as well as many of the planets in exoplanetary systems are animated at their correct relative speeds in accurate to-scale drawings. Zoom in from the chaotic outer satellite systems of the giant planets all the way to their innermost ring systems. Orbital Integrators: Determine the orbital evolution of your initial conditions for a number of different scenarios including motions subject to general central forces, the classic three-body problem, and satellites of planets and exoplanets. Zero velocity curves are calculated and automatically included on relevant plots. Orbital Elements: Convert quickly and easily between state vectors and orbital elements with Changing the Elements. Use other routines to visualize your three-dimensional orbit and to convert between the different commonly used sets of orbital elements including the true, mean, and eccentric anomalies. Solar System Calculators: These tools calculate a user-defined mathematical expression simultaneously for all of the Solar System's planets (Planetary Calculator) or moons (Satellite Calculator). Key physical and orbital data are automatically accessed as needed.
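The state-vector conversions mentioned for Changing the Elements follow standard two-body relations; a minimal sketch of our own (in normalized units with gravitational parameter mu = 1) recovers the semi-major axis from the vis-viva equation and the eccentricity from the eccentricity vector:

```python
import math

def elements_from_state(r, v, mu=1.0):
    """Semi-major axis and eccentricity from a position/velocity state
    vector, via the vis-viva equation and the eccentricity vector."""
    rmag = math.sqrt(sum(c * c for c in r))
    vmag2 = sum(c * c for c in v)
    a = 1.0 / (2.0 / rmag - vmag2 / mu)           # vis-viva
    rv = sum(rc * vc for rc, vc in zip(r, v))     # r . v
    # eccentricity vector: e = ((v^2 - mu/r) r - (r.v) v) / mu
    evec = [((vmag2 - mu / rmag) * rc - rv * vc) / mu for rc, vc in zip(r, v)]
    e = math.sqrt(sum(c * c for c in evec))
    return a, e

# Circular orbit of radius 1 with mu = 1: a = 1, e = 0
a, e = elements_from_state((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
assert abs(a - 1.0) < 1e-12 and e < 1e-12
```

The remaining angular elements (inclination, node, anomalies) follow from the angular momentum and eccentricity vectors in the same way.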

  12. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings

  13. N Reactor Lessons Learned workshop

    International Nuclear Information System (INIS)

    This report describes a workshop designed to introduce participants to a process, or model, for adapting LWR Safety Standards and Analysis Methods for use on reactor designs significantly different from LWRs. The focus of the workshop is on the ''Lessons Learned'' from the multi-year experience in the operation of N Reactor and the efforts to adapt the safety standards developed for commercial light water reactors to a graphite-moderated, water-cooled, channel-type reactor. It must be recognized that the objective of the workshop is to introduce the participants to the operation of a non-LWR in a LWR regulatory world. The total scope of this topic would take weeks to cover thoroughly. The objective of this workshop is to provide an introduction and hopefully establish a means to develop a longer-term dialogue for technical exchange. This report provides an outline of the workshop, a proposed schedule, and a description of the tasks that will be required to achieve successful completion of the project

  14. PIXE and μ-PIXE analysis of glazes from terracotta sculptures of the della Robbia workshop

    International Nuclear Information System (INIS)

    A series of PIXE analyses has been performed on glazes from terracotta sculptures of the Italian Renaissance and on reference standards. The problems related to the investigation of such heterogeneous materials are discussed and the experimental uncertainties are evaluated, for each element, from the PIXE analysis of standard glasses. Some examples from artefacts coming from Italian collections are given. This research has been conducted in the framework of the COST-G1 European action

  15. Proceedings of the workshop on world oil supply-demand analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, K.C. (ed.)

    1977-01-01

    Twelve papers and four panel discussions are included. A separate abstract was prepared for each paper. The panel discussions were on: technical and physical policy elements affecting world oil supply and demand; financial, tax, and tariff issues in world oil supply and demand; the world economy as influenced by world oil prices and availability; the use of models and analysis in the policy process. (DLC)

  16. MACD - Analysis of weaknesses of the most powerful technical analysis tool

    Directory of Open Access Journals (Sweden)

    Sanel Halilbegovic

    2016-05-01

    Full Text Available Due to the huge popularization of stock trading among youth, in recent years more and more trading and brokerage houses are trying to find a single ‘easy to understand’ tool for novice traders. Moving average convergence divergence (MACD) seems to be the main pick, and unfortunately inexperienced traders rely on this one tool for the analysis and trading of various securities. In this paper, I investigate the validity of MACD as a ‘magic wand’ when used on its own in investment trading decision making. The main limitation of this study is its scope; the analysis could be extended across industries and various sizes of companies, funds, and other trading instruments.
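For context on what the indicator computes (the standard definition, not taken from this paper): the MACD line is the 12-period EMA of price minus the 26-period EMA, and the signal line is a 9-period EMA of the MACD line. A minimal sketch:

```python
def ema(values, period):
    """Exponential moving average with smoothing factor 2/(period+1)."""
    alpha = 2.0 / (period + 1)
    out = [values[0]]
    for x in values[1:]:
        out.append(alpha * x + (1 - alpha) * out[-1])
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """Return (macd_line, signal_line, histogram) for a price series."""
    fast_ema, slow_ema = ema(prices, fast), ema(prices, slow)
    macd_line = [f - s for f, s in zip(fast_ema, slow_ema)]
    signal_line = ema(macd_line, signal)
    histogram = [m - s for m, s in zip(macd_line, signal_line)]
    return macd_line, signal_line, histogram

# On a steadily rising series the MACD line turns positive, illustrating
# the trend-following (and hence lagging) nature of the indicator
prices = [100 + 0.5 * i for i in range(60)]
macd_line, signal_line, hist = macd(prices)
assert macd_line[-1] > 0
```

The lag visible here is exactly the weakness the paper examines: the indicator confirms a trend only after it is well underway.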

  17. Analysis of online quizzes as a teaching and assessment tool

    Directory of Open Access Journals (Sweden)

    Lorenzo Salas-Morera

    2012-03-01

    Full Text Available This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the effectiveness of quizzes on student performance when used not only as an isolated assessment tool, but also when integrated into a combined strategy that supports the overall programming of the subject. The results obtained during the five years of experimentation with online quizzes show that such quizzes have a proven positive influence on students' academic performance. Furthermore, surveys conducted at the end of each course revealed the high value students accord to the use of online quizzes in course instruction.

  18. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  19. Storybuilder-A tool for the analysis of accident reports

    International Nuclear Information System (INIS)

    As part of an ongoing effort by the ministry of Social Affairs and Employment of The Netherlands a research project is being undertaken to construct a causal model for the most commonly occurring scenarios related to occupational risk. This model should provide quantitative insight into the causes and consequences of occupational accidents. The results should be used to help select optimal strategies to reduce these risks, taking the costs of accidents and of measures into account. The research is undertaken by an international consortium under the name of Workgroup Occupational Risk Model. One of the components of the model is a tool to systematically classify and analyse past accidents. This tool, 'Storybuilder', and its place in the Occupational Risk Model (ORM) are described in the paper. The paper gives some illustrations of the application of Storybuilder, drawn from the study of ladder accidents, which form one of the biggest single accident categories in the Dutch data

  20. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of ''what-if'' options in building design--a limited search for an optimal solution, or ''optimization''. Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  1. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    Efficient process monitoring and analysis tools provide the means for automated supervision and control of manufacturing plants and therefore play an important role in plant safety, process control and assurance of end product quality. The availability of a large number of different process monitoring and analysis tools for a wide range of operations has made their selection a difficult, time-consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools. On the one hand, it facilitates the selection of proper monitoring and analysis tools for a given application or process. On the other hand, it permits the identification of potential applications for a given monitoring technique or tool. An efficient inference system based on forward as well as reverse search...

  2. Evaluation of a Surface Exploration Traverse Analysis and Navigation Tool

    OpenAIRE

    Gilkey, Andrea L.; Galvan, Raquel Christine; Johnson, Aaron William; Kobrick, Ryan L.; Hoffman, Jeffrey A.; Melo, Paulo L.; Newman, Dava

    2011-01-01

    SEXTANT is an extravehicular activity (EVA) mission planner tool developed in MATLAB, which computes the most efficient path between waypoints across a planetary surface. The traverse efficiency can be optimized around path distance, time, or explorer energy consumption. The user can select waypoints and the time spent at each, and can visualize a 3D map of the optimal path. Once the optimal path is generated, the thermal load on suited astronauts or solar power generation of rovers is displa...

  3. Integration of management control tools. Analysis of a case study

    OpenAIRE

    Raúl Comas Rodríguez; Dianelys Nogueira Rivera; Félix Romero Bartutis; Marisdany Lumpuy Rodríguez

    2015-01-01

    The objective of this article is to design and implement a procedure that integrates management control tools with a focus on processes, in order to improve efficiency and efficacy. An experimental study was carried out in which a procedure is defined, based on the Balanced Scorecard, that integrates process management into strategic planning and its evaluation. As a result of this work, we define the key success factors associated with the four perspectives of the Balanced Scorecard...

  4. Monitoring SOA Applications with SOOM Tools: A Competitive Analysis

    OpenAIRE

    Ivan Zoraja; Goran Trlin; Marko Matijević

    2013-01-01

    Background: Monitoring systems decouple monitoring functionality from the application and infrastructure layers and provide a set of tools that can invoke operations on the application to be monitored. Objectives: Our monitoring system is a powerful yet agile solution that is able to observe and manipulate SOA (Service-Oriented Architecture) applications online. The basic monitoring functionality is implemented via lightweight components inserted into SOA frameworks, thereby keeping the monitoring...

  5. The Mission Planning Lab: A Visualization and Analysis Tool

    Science.gov (United States)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

  6. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    OpenAIRE

    Tousif ur Rehman; Muhammad Naeem Ahmed Khan; Naveed Riaz

    2013-01-01

    Requirement engineering is an integral part of the software development lifecycle since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The stu...

  7. SNMP Traffic Analysis: Approaches, Tools, and First Results

    OpenAIRE

    Schönwälder, Jürgen; Pras, Aiko; Harvan, Matúš; Schippers, Jorrit; van de Meent, Remco

    2007-01-01

    The Simple Network Management Protocol (SNMP) is widely deployed to monitor, control, and configure network elements. Even though the SNMP technology is well documented and understood, it remains relatively unclear how SNMP is used in practice and what the typical SNMP usage patterns are. This paper discusses how to perform large-scale SNMP traffic measurements in order to develop a better understanding of how SNMP is used in production networks. The tools described in this paper have been ap...

  8. CeTA - A Tool for Certified Termination Analysis

    OpenAIRE

    Sternagel, Christian; Thiemann, René; Winkler, Sarah; Zankl, Harald

    2012-01-01

    Since the first termination competition in 2004 it has been of great interest whether a proof that has been automatically generated by a termination tool is indeed correct. The increasing number of termination proving techniques as well as the increasing complexity of generated proofs (e.g., combinations of several techniques, exhaustive labelings, tree automata, etc.) make certifying (i.e., checking the correctness of) such proofs more and more tedious for humans. Hence the interest in automate...

  9. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    OpenAIRE

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verificat...

  10. Risk Assessment and Life Cycle Assessment, Environmental Strategies, Nordic Workshop, Vedbæk 1999

    DEFF Research Database (Denmark)

    Poll, Christian

    At a Nordic workshop on Product-oriented Environmental Strategies the roles of risk and hazard assessment and life cycle assessment of products in the future regulation of chemicals were discussed by participants representing administration, academia and industry from the Nordic countries. This...... summaries of the workshop discussions, including comparisons between tools and strategies, as these are presently applied in administrative practice. The report also contains an overview of the most used databases within these tools. In its conclusions, the report emphasises the need for administrators...... Analyses, Input/output analysis, Environmental Audits and Performance evaluations and Cost Accounting they constitute the toolbox of analysis and management tools required for a full product-related strategy towards a sustainable development....

  11. Matlab symbolic circuit analysis and simulation tool using PSpice netlist for circuits optimization

    OpenAIRE

    Ushie, OJ; Abbod, M; Ashigwuike, E

    2015-01-01

    This paper presents a new Matlab symbolic circuit analysis and simulation (MSCAM) tool that makes use of netlists from PSpice to generate matrices. These matrices can be used to calculate circuit parameters or for optimization. The tool can handle active and passive components such as resistors, capacitors, inductors, operational amplifiers, and transistors. The transistors are converted into small-signal models, and the operational amplifiers make use of the small-signal analysis, which can easily ...

  12. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    Directory of Open Access Journals (Sweden)

    Drechsel Marion

    2009-10-01

    Full Text Available Abstract Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge and basic statistical analysis requirements. Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and converts them into other formats. It is free software.

  13. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan are creating optical networking technologies that allow for new, powerful communication services.
Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  14. Multiscale Analysis of Surface Topography from Single Point Incremental Forming using an Acetal Tool

    International Nuclear Information System (INIS)

    Single point incremental forming (SPIF) is a sheet metal manufacturing process that forms a part by incrementally applying point loads to the material to achieve the desired deformations and final part geometry. This paper investigates the differences in surface topography between a carbide tool and an acetal-tipped tool. Area-scale analysis is performed on the confocal areal surface measurements per ASME B46. The objective of this paper is to determine at which scales surfaces formed by two different tool materials can be differentiated. It is found that the surfaces in contact with the acetal forming tool have greater relative areas at all scales greater than 5 × 10⁴ μm² than the surfaces in contact with the carbide tools. The surfaces not in contact with the tools during forming, also referred to as the free surface, are unaffected by the tool material

  15. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.

    2002-01-01

    using new technology developed by Böhler. All three steels have the same nominal composition of alloying elements. The failure in both types of material occurs as crack formation at a notch inside of the tool. Generally the cold forging dies constructed in third generation steels have a longer lifetime than the ones constructed in traditional steel, which is connected to differences in micro-structure. Focus has been put on differences in the size and distribution of carbides. It is found that the third generation steel contains smaller and more finely dispersed carbides and has an increased...

  16. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.
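The clustering step described above can be illustrated with a generic tolerance-based matcher; the tolerances, greedy strategy, and (mass, normalized elution time) layout here are illustrative assumptions of ours, not MultiAlign's actual algorithm or parameters:

```python
def match_features(run_a, run_b, mass_ppm=10.0, net_tol=0.02):
    """Greedily pair (mass, normalized elution time) features from two
    runs when the masses agree within a ppm tolerance and the elution
    times within net_tol."""
    matches, used = [], set()
    for i, (ma, ta) in enumerate(run_a):
        for j, (mb, tb) in enumerate(run_b):
            if j in used:
                continue
            if abs(ma - mb) / ma * 1e6 <= mass_ppm and abs(ta - tb) <= net_tol:
                matches.append((i, j))
                used.add(j)
                break
    return matches

run_a = [(500.2501, 0.30), (1200.6002, 0.55), (800.4000, 0.80)]
run_b = [(500.2500, 0.31), (1200.6300, 0.55), (800.4001, 0.95)]
matches = match_features(run_a, run_b)
# Only the first pair matches: the second differs by ~25 ppm in mass,
# the third by 0.15 in elution time
assert matches == [(0, 0)]
```

A production aligner would additionally warp elution times between runs before matching; this sketch only shows the tolerance-clustering idea.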

  17. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    OpenAIRE

    Wenjie Tian; Weiguo Gao; Wenfen Chang; Yingxin Nie

    2014-01-01

    Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy im...

  18. The Gender Analysis Tools Applied in Natural Disasters Management: A Systematic Literature Review

    OpenAIRE

    Sohrabizadeh, Sanaz; Tourani, Sogand; Khankeh, Hamid Reza

    2014-01-01

    Background: Although natural disasters have caused considerable damages around the world, and gender analysis can improve community disaster preparedness or mitigation, there is little research about the gendered analytical tools and methods in communities exposed to natural disasters and hazards. These tools evaluate gender vulnerability and capacity in pre-disaster and post-disaster phases of the disaster management cycle. Objectives: Identifying the analytical gender tools and the strength...

  19. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  20. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for complex fluid/structure interaction phenomena is increasing as proven numerical and visualization...

  1. Workshop objectives and structure

    International Nuclear Information System (INIS)

    The overall aim of the Workshop was to create a platform for better understanding the different approaches to managing uncertainty in post-closure safety cases and the regulatory approaches in different national waste management programmes. The principal objectives of the Workshop were: - to identify common elements in different approaches for managing uncertainty; - to facilitate information exchange and to promote discussion on different technical approaches to the management and characterisation of uncertainty and on the role of risk; - to explore the merits of alternative approaches to risk-informed decision making; - to identify the potential for further developments of methods or strategies to support the management of uncertainties. The workshop was organised into plenary sessions and working group discussions. The first plenary session focused on establishing a framework for understanding the management of uncertainties and the use of risk. It comprised oral presentations drawing on a range of experience from active participants in the development and assessment of safety cases, and keynote presentations by external participants involved in risk management in other sectors. The working group discussions covered three technical themes: risk management and decision making; regulatory requirements and review of uncertainty and risk in safety cases; and practical approaches and tools for the management of uncertainties. The assignment of probabilities, the use of expert judgements, and the presentation of information on uncertainties and risk were examined. The aim of the working groups was to develop an understanding of the specific issues and to identify any further activities that would support the development and/or evaluation of safety cases. The round-up plenary session brought together information and conclusions from each of the working groups. Common elements in the different approaches to treating uncertainty and risk were identified, along with

  2. Statement by Ms Ana Maria Cetto at the Workshop on IAEA Tools for Nuclear Energy System Assessment (NESA) for Long-Term Planning and Development Vienna, 23 July 2009

    International Nuclear Information System (INIS)

    We are all aware that energy is central to sustainable development and poverty reduction efforts. A 2006 report by the Task Force for the UN Millennium Project, 'Energy Services for the Millennium Development Goals', warns that without increased investment in the energy sector, and major improvements in the quality and quantity of energy services in developing countries, it will not be possible to meet any of the Millennium Development Goals. Demand for energy continues to grow worldwide, as countries seek to improve living standards for their populations. The bulk of this growth in demand is coming from less economically advanced countries. Currently, conventional cooperation approaches are being used by Member States and the Agency to achieve the main goal of phase I of the 'milestone book', namely getting ready to decide to launch a nuclear power programme and make an informed commitment. Most of the countries planning to introduce a nuclear programme are currently in phase I. The Agency is open to consider, for the future TC programme cycle, national projects to apply NESA tools and INPRO methodologies in an integrated approach and help Member States in the preparatory work for the call for bids and construction of their first NPP. Ladies and gentlemen, workshops such as this one are an important means of sharing experiences and learning from each other. These days you have had the opportunity to learn more about the tools and methods that the Agency offers to support long term energy planning and nuclear energy system assessments, and today you will be providing us with feedback on applying these tools. By sharing your experiences, the lessons you have learned and the constraints you have faced, you will strengthen the Agency's ability to respond to your needs. Your comments will help us to further develop and refine the Agency's support to the sustainable development of nuclear energy

  3. PREFACE: Collapse Calderas Workshop

    Science.gov (United States)

    Gottsmann, Jo; Aguirre-Diaz, Gerardo

    2008-10-01

    Caldera-formation is one of the most awe-inspiring and powerful displays of nature's force. Resultant deposits may cover vast areas and significantly alter the immediate topography. Post-collapse activity may include resurgence, unrest, intra-caldera volcanism and potentially the start of a new magmatic cycle, perhaps eventually leading to renewed collapse. Since volcanoes and their eruptions are the surface manifestation of magmatic processes, calderas provide key insights into the generation and evolution of large-volume silicic magma bodies in the Earth's crust. Despite their potentially ferocious nature, calderas play a crucial role in modern society. Collapse calderas host essential economic deposits and supply power for many via the exploitation of geothermal reservoirs, and thus receive considerable scientific, economic and industrial attention. Calderas also attract millions of visitors world-wide with their spectacular scenic displays. To build on the outcomes of the 2005 calderas workshop in Tenerife (Spain) and to assess the most recent advances on caldera research, a follow-up meeting was proposed to be held in Mexico in 2008. This abstract volume presents contributions to the 2nd Calderas Workshop held at Hotel Misión La Muralla, Querétaro, Mexico, 19-25 October 2008. The title of the workshop, 'Reconstructing the evolution of collapse calderas: Magma storage, mobilisation and eruption', set the theme for five days of presentations and discussions, both at the venue as well as during visits to the surrounding calderas of Amealco, Amazcala and Huichapan. The multi-disciplinary workshop was attended by more than 40 scientists from North, Central and South America, Europe, Australia and Asia. Contributions covered five thematic topics: geology, geochemistry/petrology, structural analysis/modelling, geophysics, and hazards. The workshop was generously supported by the International Association of Volcanology and Chemistry of the Earth's Interior

  4. Django girls workshop at IdeaSquare

    CERN Document Server

    2016-01-01

    Short video highlights of the Django Girls coding workshop organized at IdeaSquare on Feb 26-27, 2016 by the Rosehipsters non-profit organization, supported by the CERN diversity team and the IT department, attracting 39 women from 15 countries. The aim of the workshop was to introduce participants to the world of computer programming and technology by teaching them how to successfully create a blog application and deploy it to the internet. Most of the 16 volunteer mentors were female. Django Girls is a non-profit organization and a community that empowers and helps women to organize free, one-day programming workshops by providing tools, resources and support.

  5. Workshop on PSA applications, Sofia, Bulgaria, 7-11 October 1996. Lecturing materials

    International Nuclear Information System (INIS)

    The objective of this workshop was to present detailed, systematic and useful information about PSA-based tools and PSA applications. The first presentation of the workshop was titled 'The role of PSA in safety management'. This topic served to introduce the workshop and to highlight several concepts that were afterwards stressed during the week, i.e. the defence in depth principle and the use of deterministic and probabilistic approaches in a complementary way. This presentation provided a basis for the discussion of 'PSA applications'. As a complement to the theoretical lectures, there was a practical session during which three different exercises were run in parallel. For two of these, computer-based PSA tools were used. One of them was focused on the analysis of design modifications and the other on demonstrating configuration control strategies. The objective of the third exercise was to obtain Allowed Outage Times using different PSA-based approaches and to discuss the differences observed and the insights obtained. To conclude the workshop, stress was put on the importance of the quality of the PSA (the development of a high quality Living PSA should be the first step), the necessity to be cautious (before taking decisions both the qualitative and numerical results should be carefully analyzed), and the logical order for the implementation of PSA applications. Refs, figs, tabs

  6. Workshop in economics - the problem of climate change benefit-cost analysis

    International Nuclear Information System (INIS)

    Could benefit-cost analysis play a larger role in the discussion of policies to deal with the greenhouse effect? The paper investigates the causes of its limited influence to date. Selected forms of benefit-cost research are probed, particularly the critical discussions raised by this type of research, in an effort to suggest where the chances of greater acceptance lie. The paper begins by discussing the search for an appropriate policy: optimal, targeted, or incremental. It then describes the work being done in specifying and estimating climate change damage relationships. A consideration of the work being done in specifying and estimating abatement (both mitigation and adaptation) cost relationships follows. Finally, the paper ends with an examination of the search for the appropriate policy instrument. International and methodological concerns cut across these areas and are discussed in each section. This paper concludes that there seem to be a number of reasons that benefit-cost results play only a limited role in policy development. There is some evidence that the growing interest in market-based approaches to climate change policy and to other environmental control matters is a sign of increased acceptance. Suggestions about research directions are made throughout this paper

  7. Integrated structural analysis tool using the Linear Matching Method part 2 – Application and verification

    International Nuclear Information System (INIS)

    In an accompanying paper, a new integrated structural analysis tool using the Linear Matching Method framework for the assessment of design limits in plasticity including load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures was developed using Abaqus CAE plug-ins with graphical user interfaces. In the present paper, a demonstration of the use of this new Linear Matching Method analysis tool is provided. A header branch pipe in a typical advanced gas-cooled reactor power plant is analysed as a worked example of the current demonstration and verification of the Linear Matching Method tool within the context of an R5 assessment. The detailed shakedown analysis, steady state cycle and ratchet analysis are carried out for the chosen header branch pipe. The comparisons of the Linear Matching Method solutions with results based on the R5 procedure and step-by-step elastic–plastic finite element analysis verify the accuracy, convenience and efficiency of this new integrated Linear Matching Method structural analysis tool. - Highlights: • The demonstration of the use of a new LMM software tool is provided. • A header branch pipe in a typical AGR power plant is analysed as a worked example. • The verification of LMM software tool is within the context of an R5 assessment. • We include the R5 procedure and step-by-step elastic–plastic FEA for comparison. • We verify the accuracy, convenience and efficiency of this new integrated LMM tool

  8. The Photoplethysmographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method; and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and for estimating the signal's stochastic components
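As an illustration of one of the listed techniques, Higuchi's time-domain method estimates a fractal dimension from how the measured curve length scales with the sampling interval. The implementation below is a generic sketch; the `kmax` choice and the sine test signal are assumptions for the example, not details taken from the record:

```python
import math

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a time series using
    Higuchi's time-domain method (illustrative implementation)."""
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):  # one subseries per starting offset m
            num = (n - 1 - m) // k  # number of steps in this subseries
            if num < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, num + 1))
            # normalization rescales the coarse curve to full series length
            lengths.append(dist * (n - 1) / (num * k * k))
        log_k.append(math.log(1.0 / k))
        log_l.append(math.log(sum(lengths) / len(lengths)))
    # least-squares slope of log L(k) vs log(1/k) is the dimension estimate
    mk = sum(log_k) / len(log_k)
    ml = sum(log_l) / len(log_l)
    return (sum((a - mk) * (b - ml) for a, b in zip(log_k, log_l))
            / sum((a - mk) ** 2 for a in log_k))

# Sanity check: a smooth sine wave should have dimension close to 1.
sine = [math.sin(2 * math.pi * t / 100.0) for t in range(1000)]
d_sine = higuchi_fd(sine)
```

A noisier signal (such as a raw PPG trace) would yield a dimension between 1 and 2, which is what makes the measure useful as a complexity index.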

  9. Graphical Models for Security : Second International Workshop

    OpenAIRE

    Kordy, Barbara; Mauw, Sjouke; Jajodia, Sushil

    2016-01-01

    This volume constitutes the thoroughly refereed post-conference proceedings of the Second International Workshop on Graphical Models for Security, GraMSec 2015, held in Verona, Italy, in July 2015. The 5 revised full papers presented together with one short tool paper and one invited article were carefully reviewed and selected from 13 submissions. The GraMSec workshop contributes to the development of well-founded graphical security models, efficient algorithms for t...

  10. Workshop on Accelerator Operation (WAO 2001)

    International Nuclear Information System (INIS)

    The 3rd Workshop on Accelerator Operation (WAO 2001) followed earlier workshops in 1996 and 1998. Most topics relevant for the efficient and effective operation of accelerators were covered. These included the tools and utilities necessary in the control rooms; the organization of accelerator operation (process monitoring, shift work, stress); the monitoring of beam quality; safety issues and standards; and questions particularly relevant for superconducting accelerators, in particular cryogenics. (author)

  11. The Future Workshop: Democratic problem solving

    Directory of Open Access Journals (Sweden)

    Rene Victor Valqui Vidal

    2006-03-01

    Full Text Available The origins, principles and practice of a very popular method known as the Future Workshop are presented. The fundamental theory and principles of this method are introduced. In addition, practical guidelines for carrying out such a workshop are outlined and several types of applications are briefly described. The crucial importance of both the facilitation process and the use of creative tools in teamwork is emphasized.

  12. HAWCStab2 with super element foundations: A new tool for frequency analysis of offshore wind turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Hansen, Anders Melchior; Kragh, Knud Abildgaard; Yde, Anders

    HAWCStab2 is a linear frequency domain aero-elastic tool, developed by DTU Wind Energy, suitable for frequency and stability analysis of horizontal axis 3 bladed wind turbines [1]. This tool has now been extended to also handle complex offshore foundation types, such as jacket structures and...

  13. Technical analysis as a tool of market timing

    International Nuclear Information System (INIS)

    Risk management teams carry a heavy burden because businesses have to compete on a global scale. Once a decision to hedge in the futures market has been made, timing is everything. Determining when it is a good time to buy or sell, and being certain that this information has not already been factored into the price, is tricky. There are two schools: fundamental analysis and technical analysis. Fundamental analysis examines statistics and supply-and-demand data to determine why prices move, while technical analysis interprets charts to determine when. With the help of charts that were displayed, the author examined the price of crude oil in 1990 and 1991 to demonstrate technical analysis. Technical analysis is based on mathematics, but it is more art than science. It looks at market patterns that repeat themselves endlessly. Markets almost always enter a period of consolidation or distribution when they break out. Additional charts displaying the price of crude oil for various periods from 1986 to 1991 were presented and analysed. The author concluded by stating that technical analysis provides visual guidelines to hedgers and traders to assist them in making intelligent forecasts about price and risk. figs
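One of the simplest chart-based timing signals of the kind technical analysts use is a moving-average crossover. The sketch below is purely illustrative (the price series and window lengths are invented, and the record does not describe this specific indicator):

```python
def moving_average(prices, window):
    # element j of the result is the average ending at price index j + window - 1
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def crossover_signals(prices, short=3, long=5):
    """Return (price_index, 'buy'/'sell') points: 'buy' when the short
    moving average crosses above the long one, 'sell' when it crosses below."""
    s = moving_average(prices, short)
    l = moving_average(prices, long)
    offset = long - short  # aligns s[j + offset] with l[j] on the same date
    signals = []
    for j in range(1, len(l)):
        prev = s[j - 1 + offset] - l[j - 1]
        curr = s[j + offset] - l[j]
        if prev <= 0 < curr:
            signals.append((j + long - 1, "buy"))
        elif prev >= 0 > curr:
            signals.append((j + long - 1, "sell"))
    return signals

prices = [20, 19, 18, 17, 16, 17, 19, 22, 24, 23, 21, 18, 16, 15]
signals = crossover_signals(prices)  # [(7, 'buy'), (11, 'sell')]
```

On this invented series the signal buys into the rally at index 7 and sells into the decline at index 11, which is the behaviour hedgers look for when timing entries and exits.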

  14. Limits, limits everywhere the tools of mathematical analysis

    CERN Document Server

    Applebaum, David

    2012-01-01

    A quantity can be made smaller and smaller without it ever vanishing. This fact has profound consequences for science, technology, and even the way we think about numbers. In this book, we will explore this idea by moving at an easy pace through an account of elementary real analysis and, in particular, will focus on numbers, sequences, and series.Almost all textbooks on introductory analysis assume some background in calculus. This book doesn't and, instead, the emphasis is on the application of analysis to number theory. The book is split into two parts. Part 1 follows a standard university

  15. ATLASWatchMan, a tool for automatized data analysis

    International Nuclear Information System (INIS)

    The ATLAS detector will soon start taking data, and many New Physics phenomena are expected. The ATLASWatchMan package has been developed following the principles of CASE (Computer Aided Software Engineering); it helps the user set up an analysis by automatically generating the actual analysis code and data files from user settings. ATLASWatchMan provides a light and transparent framework to plug in user-defined cuts and algorithms to look at as many channels as the user wants, running the analysis both locally and on the Grid. Examples of analyses run with the package using the latest release of the ATLAS software are shown

  16. GammaWorkshops Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Ramebaeck, H. (ed.) (Swedish Defence Research Agency (Sweden)); Straalberg, E. (Institute for Energy Technology, Kjeller (Norway)); Klemola, S. (Radiation and Nuclear Safety Authority, STUK (Finland)); Nielsen, Sven P. (Technical Univ. of Denmark. Risoe National Lab. for Sustainable Energy, Roskilde (Denmark)); Palsson, S.E. (Icelandic Radiation Safety Authority (Iceland))

    2012-01-15

    Due to sparse interaction in recent years between practitioners in gamma-ray spectrometry in the Nordic countries, an NKS activity was started in 2009. This GammaSem was focused on seminars relevant to gamma spectrometry. A follow-up seminar was held in 2010. As an outcome of these activities it was suggested that the 2011 meeting should focus on practical issues, e.g. the different corrections needed in gamma spectrometric measurements. This three-day meeting, GammaWorkshops, was held in September at Risoe-DTU. Experts on different topics relevant for gamma spectrometric measurements were invited to the GammaWorkshops. The topics included efficiency transfer, true coincidence summing corrections, self-attenuation corrections, measurement of natural radionuclides (natural decay series), combined measurement uncertainty calculations, and detection limits. These topics were covered in both lectures and practical sessions. The practical sessions included demonstrations of tools for corrections and calculations on the above-mentioned topics. (Author)
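For the detection-limit topic, the quantities most commonly computed in gamma spectrometry are Currie's critical level and detection limit. The sketch below uses Currie's widely quoted approximations for a well-known paired blank at 5% false-positive/false-negative risk; the 400-count background is an invented example, and real workflows add efficiency, branching-ratio and counting-time factors:

```python
import math

def currie_limits(background_counts):
    """Currie's (1968) approximate critical level L_C and detection limit
    L_D, in counts, for a paired blank at 5%/5% error risks."""
    b = background_counts
    l_c = 2.33 * math.sqrt(b)          # decision threshold: "is a signal present?"
    l_d = 2.71 + 4.65 * math.sqrt(b)   # a priori detectable net count level
    return l_c, l_d

# e.g. 400 background counts under a peak region of interest
l_c, l_d = currie_limits(400)
```

With 400 background counts this gives L_C ≈ 46.6 counts and L_D ≈ 95.7 counts; a net peak area must exceed L_C to be reported as detected.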

  17. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays the internet acts as a huge database in which many websites, information sources and search engines are available. However, because data in webpages are unstructured or semi-structured, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because it consists of many discovered patterns and contains a great deal of noise and uncertainty. In this paper, web usage mining is analyzed with the help of web log data, using the web log analyzer tool “Deep Log Analyzer” to extract summary information from a particular server, to characterize user behavior, and to develop an ontology describing the relations among the components of web usage mining.
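The kind of summary such a log analyzer derives can be sketched in a few lines: parse Common Log Format entries and count successful requests per client host. This is a generic illustration, not Deep Log Analyzer's implementation, and the log lines are invented:

```python
import re
from collections import Counter

# Common Log Format: host ident user [timestamp] "method path protocol" status size
LOG_RE = re.compile(r'(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) \S+" (\d{3}) \S+')

def requests_per_host(log_lines):
    """Count successful (2xx) requests per client host."""
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and m.group(4).startswith("2"):
            counts[m.group(1)] += 1
    return counts

logs = [
    '10.0.0.1 - - [10/Oct/2013:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
    '10.0.0.1 - - [10/Oct/2013:13:56:01 -0700] "GET /about.html HTTP/1.0" 200 1024',
    '10.0.0.2 - - [10/Oct/2013:13:57:12 -0700] "GET /missing.html HTTP/1.0" 404 512',
]
counts = requests_per_host(logs)
```

Per-host counts like these are the raw material for the session reconstruction and user-behavior profiles that web usage mining builds on.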

  18. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we review the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and shows how to use a blend of requirement engineering techniques as an effective methodology for successfully conducting the requirement engineering task. Finally, the study highlights the importance of security requirements, which, although part of the non-functional requirements, are fundamental to secure software development.

  19. Integration of management control tools. Analysis of a case study

    Directory of Open Access Journals (Sweden)

    Raúl Comas Rodríguez

    2015-09-01

    Full Text Available The objective of this article is to design and implement a procedure that integrates management control tools with a focus on processes, in order to improve efficiency and efficacy. An experimental study was carried out in which a procedure based on the Balanced Scorecard was defined, integrating process management into strategic planning and its evaluation. As results of this work, we define the key success factors associated with the four perspectives of the Balanced Scorecard, linked through cause-effect relations to obtain the strategic map that allows the enterprise strategy to be visualized and communicated. The indicators evaluate the key success factors, integrating the processes with the assistance of software. The implementation of the procedure in a commercial enterprise helped integrate the process definitions into strategic planning. The alignment was evaluated, and the efficiency and efficacy indicators improved the company's performance.

  20. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  1. Silver Solid Amalgam Electrodes - Tools of Choice in DNA Analysis

    Czech Academy of Sciences Publication Activity Database

    Fadrná, Renata; Josypčuk, Bohdan; Fojta, Miroslav

    Galway: National University of Ireland, 2004. s. 119. [International Conference on Electroanalysis /10./. 06.06.2004-10.06.2004, Galway] Keywords: silver solid amalgam * electrodes * DNA analysis Subject RIV: CG - Electrochemistry

  2. Design and Analysis Tools for Deployable Solar Array Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  3. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    OpenAIRE

    Chun-Hsiao Wu; Tsai-Yen Li

    2016-01-01

    In this research, we propose a new concept for social media analysis called Social Sensor, which is an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and appropriate social sensors for analysis. By adopting paramet...

  4. Practical survival analysis tools for heterogeneous cohorts and informative censoring

    OpenAIRE

    Rowley, M; Garmo, H; Van Hemelrijck, M; Wulaningsih, W.; Grundmark, B.; Zethelius, B.; Hammar, N.; Walldius, G; M. Inoue; Holmberg, L; Coolen, A. C. C.

    2015-01-01

    In heterogeneous cohorts and those where censoring by non-primary risks is informative many conventional survival analysis methods are not applicable; the proportional hazards assumption is usually violated at population level and the observed crude hazard rates are no longer estimators of what they would have been in the absence of other risks. In this paper, we develop a fully Bayesian survival analysis to determine the probabilistically optimal description of a heterogeneous cohort and we ...
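The conventional baseline the authors contrast with is the Kaplan-Meier estimator of the survival function, which treats censoring as non-informative. The implementation below is an illustrative sketch of that standard estimator (not the paper's Bayesian method), with invented follow-up data:

```python
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns [(t, S(t))] at each distinct event time, assuming
    non-informative censoring."""
    s = 1.0
    curve = []
    for t in sorted(set(ti for ti, e in zip(times, events) if e)):
        at_risk = sum(1 for ti in times if ti >= t)          # still under observation
        died = sum(1 for ti, e in zip(times, events) if ti == t and e)
        s *= 1 - died / at_risk                              # product-limit update
        curve.append((t, s))
    return curve

times = [2, 3, 3, 5, 8, 8, 9, 12]
events = [1, 1, 0, 1, 1, 0, 0, 1]
curve = kaplan_meier(times, events)
```

When censoring is informative, as the paper argues, this crude estimate is biased, which motivates the fully Bayesian treatment the authors develop.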

  5. Markov Chains as Tools for Jazz Improvisation Analysis

    OpenAIRE

    Franz, David Matthew

    1998-01-01

    This thesis describes an exploratory application of a statistical analysis and modeling technique (Markov chains) for the modeling of jazz improvisation with the intended subobjective of providing increased insight into an improviser's style and creativity through the postulation of quantitative measures of style and creativity based on the constructed Markovian analysis techniques. Using Visual Basic programming language, Markov chains of orders one to three are created using transcriptio...
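A first-order version of such a model can be sketched compactly: estimate transition probabilities between pitches from a transcription, then generate a new phrase by sampling. The note sequence and helper names below are hypothetical illustrations, not taken from the thesis (which used Visual Basic and orders up to three):

```python
import random
from collections import defaultdict

def transition_table(notes):
    """First-order Markov model: P(next pitch | current pitch)."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(notes, notes[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(row.values()) for b, n in row.items()}
            for a, row in counts.items()}

def generate(table, start, length, rng):
    """Sample a phrase of the given length by walking the chain."""
    out = [start]
    while len(out) < length:
        row = table[out[-1]]
        out.append(rng.choices(list(row), weights=list(row.values()))[0])
    return out

solo = ["C", "E", "G", "E", "C", "E", "G", "A", "G", "E", "C"]
table = transition_table(solo)
phrase = generate(table, "C", 8, random.Random(0))
```

Every step of the generated phrase is a transition that actually occurred in the source solo, which is the sense in which the chain captures (a first-order slice of) the improviser's style.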

  6. Interfacing interactive data analysis tools with the GRID: the PPDG CS-11 activity

    International Nuclear Information System (INIS)

    For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's 'remote access' technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics Data Grid project (www.ppdg.net) has recently embarked on an effort to 'Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services'. The initial activities are to collect known, and identify new, requirements for grid services and analysis tools from a range of current and future experiments (ALICE, ATLAS, BaBar, D0, CMS, JLab, STAR, others welcome), and to determine whether existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end-user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed

  7. Welcome from the Workshop Chairs

    OpenAIRE

    Berntsson Svensson, Richard (Ed.); Daneva, Maya; Marczak, Sabrina; Ernst, Neil; Madhavji, Nazim

    2015-01-01

    Welcome to the fifth International Workshop on Empirical Requirements Engineering (EmpiRE 2015) at RE’15! In the past few years, some important developments in the Information Technology Services marketplace as well as in the software industry in particular fueled the debate on the evaluation of Requirements Engineering (RE) approaches, techniques and tools and the comparison of their usefulness, effectiveness and utility in specific practical contexts. Examples of such market trends include,...

  8. Applying observations of work activity in designing prototype data analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts that reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments for specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but also initiated in-depth discussions about users' work, tools, technology, and requirements.

  9. Evaluation of fatigue damage in nuclear power plants: evolution and new tools of analysis

    International Nuclear Information System (INIS)

    This paper presents new fatigue mechanisms requiring analysis, the tools developed for their evaluation, and the latest trends and studies currently under way in the nuclear field, which allow the facilities concerned to properly manage this degradation mechanism.

  10. Transportation routing analysis geographic information system - tragis, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental United States. This paper outlines some of the features available in this model. (authors)

  11. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model

  12. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    Science.gov (United States)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  13. Effectiveness of Workshop on Evaluation Methodology for Medical Teachers

    OpenAIRE

    Chinmay Shah; P. A. Gokhale; Mehta, H. B.

    2011-01-01

    A workshop on evaluation methodology was designed at Government Medical College, Bhavnagar. The workshop comprised six modules: Mechanics of Paper Setting, MCQ Formulation & Item Analysis, Mini-CEX, OSPE, OSCE, and Structured Viva. The study was carried out with the aim of assessing the effectiveness of the workshop in changing knowledge of, and attitudes towards, different evaluation methodologies. Method: Instruction was provided during a one-day workshop wit...

  14. ROOT User Workshop 2013

    CERN Document Server

    2013-01-01

    For almost two decades, ROOT has established itself as the framework for HENP data processing and analysis. The LHC upgrade program and the new experiments being designed at CERN and elsewhere will pose even more formidable challenges in terms of data complexity and size. The new parallel and heterogeneous computing architectures that are either announced or already available will call for a deep rethinking of the code and the data structures to be exploited efficiently. This workshop, following on from a successful series of such events, will allow you to learn in detail about the new ROOT 6 and will help shape the future evolution of ROOT.

  15. Purpose of the workshop

    International Nuclear Information System (INIS)

    The main purpose of the Workshop is to share experience on emergency data management and to review various conceptual, technical, organisational and operational aspects and problems. The problems posed by hardware and software, the interplay of software developers and users/operators, and the positive and negative experiences from both development and operation of data management systems are discussed. Emergency data management systems and their demonstrations are divided into four classes of possible applications: video games; training and simulation systems; 'history writing', i.e. post-event analysis and documentation systems; and real-time operational systems. (author)

  16. SICOMAT : a system for SImulation and COntrol analysis of MAchine Tools

    OpenAIRE

    Gautier, Maxime; Pham, Minh Tu; Khalil, Wisama; Lemoine, Philippe; Poignet, Philippe

    2001-01-01

    This paper presents a software package for the simulation and control analysis of machine tool axes. This package, called SICOMAT (SImulation and COntrol analysis of MAchine Tools), provides a large variety of toolboxes to analyze the behavior and the control of the machine. The software takes into account several elements such as the flexibility of bodies, the interaction between several axes, the effect of numerical control, and the ability to reduce models.

  17. Development of a task analysis tool to facilitate user interface design

    Science.gov (United States)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  18. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    This paper presents instrumental research on a powerful multivariate data analysis method which researchers can use to obtain valuable information for decision makers who need to solve a marketing problem a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and the strength of Principal Component Analysis (PCA) in reducing a number of variables that could be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables is used.
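
    The step-by-step workflow this abstract describes (standardize, build the correlation matrix, eigendecompose, project onto uncorrelated components) can be sketched as follows. The synthetic "survey" data and all names are illustrative, not from the paper:

```python
import numpy as np

def pca(X, n_components=2):
    """Step-by-step PCA: standardize, build the correlation matrix,
    eigendecompose, and project onto uncorrelated principal components."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable
    R = np.corrcoef(X, rowvar=False)           # correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]          # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    scores = Z @ eigvecs[:, :n_components]     # uncorrelated component scores
    explained = eigvals[:n_components] / eigvals.sum()
    return scores, explained

# Three deliberately collinear variables built from one common factor,
# mimicking correlated survey items
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(200, 1)) for _ in range(3)])
scores, explained = pca(X, n_components=1)
```

    With strongly collinear inputs, a single component captures nearly all of the variance, which is exactly the dimensionality reduction the paper exploits.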

  19. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  20. Uraninite chemistry as forensic tool for provenance analysis

    International Nuclear Information System (INIS)

    Highlights: • Uraninite chemistry can be used as fingerprint and provenance tool. • U/Th ratio and total REE contents are good indicators of crystallisation temperature. • REE fractionation is strongly dependent on uraninite genesis. • Application to uraninite from the Witwatesrand Basin highlights its detrital nature. • Witwatersrand uraninite is derived from a variety of magmatic sources. - Abstract: Electron microprobe and laser ablation-inductively coupled plasma mass spectrometric (LA-ICPMS) analyses were carried out on individual uraninite grains from several localities worldwide, representing a variety of different U-deposit types ranging in age from Mesoarchaean to the Mesozoic. For the first time, concentration data on a comprehensive set of minor/trace elements in uraninite are presented, i.e. LA-ICPMS concentration data for Th, Si, Al, Fe, Mn, Ca, Mg, P, Ti, V, Cr, Co, Ni, Pb, Zn, As, rare earth elements (REE), Y, Zr, Nb, Mo, Ag, Ta, W, Bi, and Au. Most of these elements could be detected in significant quantities in many of the studied examples. The results obtained in this study, supplemented by previously published data on major element and REE concentrations, reveal systematic differences in uraninite composition between genetically different deposit types and also, for a given genetic type, between different locations. Low-temperature hydrothermal uraninite is marked by U/Th >1000, whereas high-temperature metamorphic and magmatic (granitic, pegmatitic) uraninite has U/Th <100. Our new data also confirm previous observations that low-temperature, hydrothermal uraninite has low total REE contents (<1 wt%) whereas higher temperature uraninite can contain as much as several percent total REE. Genetically different uraninite types can be further identified by means of different REE fractionation patterns. Systematic differences between primary uraninite from different localities could be also noted with respect to the abundances of especially
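
    The two numeric proxies quoted above (U/Th ratio and total REE content) can be sketched as a toy screening function. The thresholds come from the abstract; the function name, return labels, and example values are illustrative, and real provenance work also uses REE fractionation patterns:

```python
def classify_uraninite(u_th_ratio, total_ree_wt_pct):
    """Toy provenance screen based on the thresholds in the text:
    U/Th > 1000 with total REE < 1 wt% -> low-temperature hydrothermal;
    U/Th < 100 -> high-temperature metamorphic/magmatic uraninite."""
    if u_th_ratio > 1000 and total_ree_wt_pct < 1.0:
        return "low-temperature hydrothermal"
    if u_th_ratio < 100:
        return "high-temperature metamorphic/magmatic"
    return "indeterminate from these two proxies"

# A hypothetical grain matching the hydrothermal signature
print(classify_uraninite(5000, 0.3))
```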

  1. 77 FR 31371 - Public Workshop: Privacy Compliance Workshop

    Science.gov (United States)

    2012-05-25

    ... SECURITY Office of the Secretary Public Workshop: Privacy Compliance Workshop AGENCY: Privacy Office, DHS. ACTION: Notice Announcing Public Workshop. SUMMARY: The Department of Homeland Security Privacy Office will host a public workshop, ``Privacy Compliance Workshop.'' DATES: The workshop will be held on...

  2. RASTA: A generalized tool for radiation source term analysis

    International Nuclear Information System (INIS)

    A FORTRAN computer code has been written for generalized radiation source term preparation. The RASTA (Radiation Source Term Analysis) code calculates the neutron and photon sources for any input isotopic combination and collapses to a user-selected multigroup format. The code is very easy to use, requiring minimal input. It provides extensive output edits suitable for data analysis or direct input into radiation transport codes. RASTA runs on the SRS RS6000 workstation cluster, but it should be easily portable to other computers

  3. SCALE 5: Powerful new criticality safety analysis tools

    International Nuclear Information System (INIS)

    Version 5 of the SCALE computer software system developed at Oak Ridge National Laboratory, scheduled for release in December 2003, contains several significant new modules and sequences for criticality safety analysis and marks the most important update to SCALE in more than a decade. This paper highlights the capabilities of these new modules and sequences, including continuous energy flux spectra for processing multigroup problem-dependent cross sections; one- and three-dimensional sensitivity and uncertainty analyses for criticality safety evaluations; two-dimensional flexible mesh discrete ordinates code; automated burnup-credit analysis sequence; and one-dimensional material distribution optimization for criticality safety. (author)

  4. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
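
    What makes benchmarks like plane Poiseuille flow useful for verification is that they have closed-form solutions a numerical scheme can be checked against. A minimal sketch (the parameter values and discretization are illustrative, not taken from the paper):

```python
import numpy as np

# Plane Poiseuille flow: steady laminar flow between parallel plates driven
# by a constant pressure gradient. The momentum equation reduces to
#   mu * d2u/dy2 = dp/dx,  with u(0) = u(h) = 0.
mu, dpdx, h, n = 1.0e-3, -1.0, 1.0, 101   # illustrative values
y = np.linspace(0.0, h, n)
dy = y[1] - y[0]

# Central-difference Laplacian on interior points, solved as a linear system
A = np.zeros((n - 2, n - 2))
np.fill_diagonal(A, -2.0)
np.fill_diagonal(A[1:], 1.0)        # subdiagonal
np.fill_diagonal(A[:, 1:], 1.0)     # superdiagonal
b = np.full(n - 2, dpdx / mu * dy**2)
u = np.zeros(n)
u[1:-1] = np.linalg.solve(A, b)

# Closed-form solution: u(y) = (1/(2 mu)) (dp/dx) y (y - h)
u_exact = dpdx / (2.0 * mu) * y * (y - h)
```

    Because the exact profile is quadratic, the central-difference solution reproduces it to machine precision, which is the sense in which such classical flows verify a scheme.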

  5. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models

    Science.gov (United States)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  6. Decoding astrocyte heterogeneity: New tools for clonal analysis.

    Science.gov (United States)

    Bribián, A; Figueres-Oñate, M; Martín-López, E; López-Mascaraque, L

    2016-05-26

    The importance of astrocyte heterogeneity has emerged as a hot topic in neuroscience over the last decades, as the development of new methodologies made it possible to demonstrate large differences in morphological, neurochemical and physiological features between astrocytes. However, although knowledge about the biology of astrocytes is increasing rapidly, an important characteristic that remained unexplored until recent years is the relationship between astrocyte lineages and cell heterogeneity. To fill this gap, a new method called StarTrack was recently developed: a powerful genetic tool that allows tracking of astrocyte lineages forming cell clones. Using StarTrack, a single astrocyte progenitor and its progeny can be specifically labeled from its generation, during embryonic development, to its final fate in the adult brain. Because of this specific labeling, astrocyte clones, exhibiting heterogeneous morphologies and features, can be easily analyzed in relation to their ontogenetic origin. This review summarizes how astrocyte heterogeneity can be decoded by studying the embryonic development of astrocyte lineages and their clonal relationships. Finally, we discuss some of the challenges and opportunities emerging in this exciting area of investigation. PMID:25917835

  7. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  8. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which are reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of

  9. Measures of radioactivity: a tool for understanding statistical data analysis

    OpenAIRE

    Montalbano, Vera; Quattrini, Sonia

    2012-01-01

    A learning path on radioactivity in the last class of high school is presented. An introduction to radioactivity and nuclear phenomenology is followed by measurements of natural radioactivity. Background and weak sources are monitored for days or weeks. The data are analyzed in order to understand the importance of statistical analysis in modern physics.
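
    The statistical point of this learning path, that repeated counts of a radioactive source are Poisson-distributed so the variance of the counts approaches their mean (sigma of a single count N is roughly sqrt(N)), can be illustrated with a short simulation. The rate and trial counts below are illustrative, not from the article:

```python
import random

random.seed(42)

# Each trial approximates one fixed counting interval: many short time
# slices, each with a small decay probability, so the registered count
# is approximately Poisson with the chosen mean.
mean_rate = 25.0    # illustrative mean counts per interval
trials = 500
counts = [sum(1 for _ in range(5000) if random.random() < mean_rate / 5000)
          for _ in range(trials)]

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / (trials - 1)
# For a Poisson process the sample variance tracks the sample mean,
# i.e. mean and var should agree within statistical noise.
```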

  10. GATA: a graphic alignment tool for comparative sequence analysis

    OpenAIRE

    Nix David A; Eisen Michael B

    2005-01-01

    Abstract Background Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collineari...

  11. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end-user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open-source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  12. ProbFAST: Probabilistic Functional Analysis System Tool

    Directory of Open Access Journals (Sweden)

    Oliveira Thiago YK

    2010-03-01

    Full Text Available Abstract Background The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be performed by analyzing the Multiple Differential Expression (MDE of genes associated with normal and abnormal biological processes. Currently MDE analyses are limited to usual methods of differential expression initially designed for paired analysis. Results We proposed a web platform named ProbFAST for MDE analysis which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives a better performance when compared to other approaches and when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions ProbFAST is a free accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off related to functional information during the evolution of a tumor or tissue differentiation. ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  13. Workshop introduction

    International Nuclear Information System (INIS)

    The Department of Energy's National Nuclear Security Administration's Global Threat Reduction Initiative (GTRI) has three subprograms that directly reduce the nuclear/radiological threat; Convert (Highly Enriched Uranium), Protect (Facilities), and Remove (Materials). The primary mission of the Off-Site Source Recovery Project (OSRP) falls under the 'Remove' subset. The purpose of this workshop is to provide a venue for joint-technical collaboration between the OSRP and the Nuclear Radiation Safety Service (NRSS). Eisenhower's Atoms for Peace initiative and the Soviet equivalent both promoted the spread of the paradoxical (peaceful and harmful) properties of the atom. The focus of nonproliferation efforts has been rightly dedicated to fissile materials and the threat they pose. Continued emphasis on radioactive materials must also be encouraged. An unquantifiable threat still exists in the prolific quantity of sealed radioactive sources (sources) spread worldwide. It does not appear that the momentum of the evolution in the numerous beneficial applications of radioactive sources will subside in the near future. Numerous expert studies have demonstrated the potentially devastating economic and psychological impacts of terrorist use of a radiological dispersal or emitting device. The development of such a weapon, from the acquisition of the material to the technical knowledge needed to develop and use it, is straightforward. There are many documented accounts worldwide of accidental and purposeful diversions of radioactive materials from regulatory control. The burden of securing sealed sources often falls upon the source owner, who may not have a disposal pathway once the source reaches the end of its useful life. This disposal problem is exacerbated by some source owners not having the resources to safely and compliantly store them. US Nuclear Regulatory Commission (NRC) data suggests that, in the US alone, there are tens of thousands of high-activity (IAEA

  14. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions of strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
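
    The descriptor idea can be sketched in a few lines: treat the measured and predicted strain fields as images, reduce each to a short vector of Fourier-magnitude descriptors, and compare the vectors instead of the raw pixels. Everything below (field shapes, noise level, descriptor count) is illustrative, not taken from the paper, which uses Zernike moments as well:

```python
import numpy as np

def fourier_descriptors(field, n_keep=32):
    """Reduce a full-field map (treated as an image) to the n_keep
    largest Fourier-coefficient magnitudes, a compact shape descriptor."""
    mags = np.abs(np.fft.fft2(field)).ravel()
    return np.sort(mags)[::-1][:n_keep]

# Hypothetical "predicted" and "measured" strain fields: a smooth bump,
# with small noise standing in for experimental scatter
rng = np.random.default_rng(1)
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
model = np.exp(-4 * (x**2 + y**2))
experiment = model + 0.01 * rng.normal(size=model.shape)

d_model = fourier_descriptors(model)
d_exp = fourier_descriptors(experiment)
# Normalized descriptor distance: one simple quantitative validation metric
distance = np.linalg.norm(d_model - d_exp) / np.linalg.norm(d_model)
```

    The comparison now involves a few dozen numbers rather than thousands of pixels, which is what makes a statistical acceptance criterion practical.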

  15. Design and Analysis Tools for Concurrent Blackboard Systems

    Science.gov (United States)

    McManus, John W.

    1991-01-01

    A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them. " The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.
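
    The architecture described above can be sketched minimally: a shared blackboard data structure, independent knowledge sources that each check a precondition against the blackboard, and an opportunistic control loop that activates whichever source can currently contribute. All names and the toy problem are illustrative:

```python
class Blackboard:
    """Shared global data structure visible to all knowledge sources."""
    def __init__(self):
        self.data = {}

def ks_tokenize(bb):
    # Fires when raw input is present but not yet tokenized
    if "raw" in bb.data and "tokens" not in bb.data:
        bb.data["tokens"] = bb.data["raw"].split()
        return True
    return False

def ks_count(bb):
    # Fires once tokens exist, contributing a word count
    if "tokens" in bb.data and "count" not in bb.data:
        bb.data["count"] = len(bb.data["tokens"])
        return True
    return False

def control_loop(bb, knowledge_sources):
    # Opportunistic control: keep activating any source whose
    # precondition holds until no source can contribute further
    progress = True
    while progress:
        progress = any(ks(bb) for ks in knowledge_sources)

bb = Blackboard()
bb.data["raw"] = "a blackboard system in miniature"
# Source order does not matter: data on the blackboard, not a fixed
# schedule, determines which knowledge source runs next
control_loop(bb, [ks_count, ks_tokenize])
```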

  16. Automated simultaneous analysis phylogenetics (ASAP): an enabling tool for phylogenomics

    Directory of Open Access Journals (Sweden)

    Lee Ernest K

    2008-02-01

    Abstract Background The availability of sequences from whole genomes to reconstruct the tree of life has the potential to enable the development of phylogenomic hypotheses in ways that have not been possible before. A significant bottleneck in the analysis of genomic-scale views of the tree of life is the time required for manual curation of genomic data into multi-gene phylogenetic matrices. Results To keep pace with the exponentially growing volume of molecular data in the genomic era, we have developed an automated technique, ASAP (Automated Simultaneous Analysis Phylogenetics), to assemble these multi-gene/multi-species matrices and to evaluate the significance of individual genes within the context of a given phylogenetic hypothesis. Conclusion Applications of ASAP may enable scientists to re-evaluate species relationships and to develop new phylogenomic hypotheses based on genome-scale data.

  17. GAVO Tools for the Analysis of Stars and Nebulae

    CERN Document Server

    Rauch, Thomas

    2007-01-01

    Within the framework of the German Astrophysical Virtual Observatory (GAVO), we provide synthetic spectra, simulation software for the calculation of NLTE model atmospheres, as well as the necessary atomic data. This will enable a VO user to directly compare observed and model-atmosphere spectra on three levels: the easiest and fastest way is to use our pre-calculated flux-table grid, in which one may inter- and extrapolate. For a more precise analysis of an observation, the VO user may improve the fit to the observation by calculating individual model atmospheres with fine-tuned photospheric parameters via the WWW interface TMAW. The more experienced VO user may create their own atomic-data files for a more detailed analysis and calculate model atmospheres and flux tables with these.

  18. SYSTID - A flexible tool for the analysis of communication systems.

    Science.gov (United States)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  19. Application of Multivariate Analysis Tools to Industrial Scale Fermentation Data

    OpenAIRE

    Mears, Lisa; Nørregård, Rasmus; Stocks, Stuart M.; Albæk, Mads O.; Sin, Gürkan; Gernaey, Krist; Villez, Kris

    2015-01-01

    The analysis of batch process data can provide insight into the process operation, and there is a vast amount of historical data available for data mining. Empirical modelling utilising this data is desirable where there is a lack of understanding regarding the underlying process (Formenti et al. 2014). This may be the case for fed-batch fermentation processes, where mechanistic modelling is challenging due to non-linear dynamics, and non-steady state operation. There is also a lack of sensor...

  20. BiologicalNetworks: visualization and analysis tool for systems biology

    OpenAIRE

    Baitaluk, Michael; Sedova, Mayya; Ray, Animesh; Gupta, Amarnath

    2006-01-01

    Systems level investigation of genomic scale information requires the development of truly integrated databases dealing with heterogeneous data, which can be queried for simple properties of genes or other database objects as well as for complex network level properties, for the analysis and modelling of complex biological processes. Towards that goal, we recently constructed PathSys, a data integration platform for systems biology, which provides dynamic integration over a diverse set of dat...

  1. Soldier Station: A Tool for Dismounted Infantry Analysis

    OpenAIRE

    Pratt, Shirley; Ohman, David; Brown, Steve; Galloway, John; Pratt, David

    1997-01-01

    Soldier Station is a networked, human-in-the-loop, virtual dismounted infantryman (DI) simulator with underlying constructive model algorithms for movement, detection, engagement, and damage assessment. It is being developed by TRADOC Analysis Center - White Sands Missile Range, New Mexico, to analyze DI issues pertaining to situational awareness, command and control, and tactics techniques and procedures. It is unique in its design to integrate virtual and constructive simulation...

  2. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
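
    A levelized production cost of the kind the model reports can be sketched as discounted lifetime costs divided by discounted lifetime generation; the function and all input numbers below are illustrative assumptions, not Power L-CAT data.

    ```python
    # Minimal levelized cost of electricity (LCOE) sketch:
    # LCOE = (capital + discounted O&M and fuel) / discounted MWh generated.
    def lcoe(capital, annual_om, annual_fuel, annual_mwh, rate, years):
        costs = capital + sum(
            (annual_om + annual_fuel) / (1 + rate) ** y for y in range(1, years + 1)
        )
        energy = sum(annual_mwh / (1 + rate) ** y for y in range(1, years + 1))
        return costs / energy  # $/MWh

    # Illustrative plant: $1B capital, $20M/yr O&M, $30M/yr fuel,
    # 7 TWh/yr output, 7% discount rate, 30-year life.
    print(lcoe(1e9, 2e7, 3e7, 7e6, 0.07, 30))
    ```

    Sensitivity analysis of the kind the abstract mentions amounts to re-running such a function while varying one input at a time.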

  3. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    Science.gov (United States)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  4. DYNAMICS ANALYSIS OF SPECIAL STRUCTURE OF MILLING-HEAD MACHINE TOOL

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The milling-head machine tool is a sophisticated, high-quality machine tool whose spindle system is built from a special multi-element structure. Two special mechanical configurations degrade the cutting performance of the machine tool. One is the milling-head spindle, which is supported on two sets of complex bearings. The dynamic rigidity of the milling-head structure is studied on a digital prototype using finite element analysis (FEA) and modal synthesis analysis (MSA) to identify the weak structures. The other is the ram structure from which the milling head hangs. This structure is studied to determine its dynamic performance when cutting at different ram extension positions. The analysis results for the spindle and ram are used to improve the mechanical configurations and structure in the design. The machine tool built with the modified structure achieves better dynamic rigidity than before.

  5. IPHE Infrastructure Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    2010-02-01

    These proceedings contain information from the IPHE Infrastructure Workshop, a two-day interactive workshop held on February 25-26, 2010, to explore the market implementation needs for hydrogen fueling station development.

  6. Time-frequency tools of signal processing for EISCAT data analysis

    Directory of Open Access Journals (Sweden)

    J. Lilensten

    Full Text Available We demonstrate the usefulness of some signal-processing tools for EISCAT data analysis. These tools are somewhat less classical than the familiar periodogram (the squared modulus of the Fourier transform) and are therefore not as commonly used in our community. The first is a stationary analysis, Thomson's estimate of the power spectrum. The other two belong to time-frequency analysis: the short-time Fourier transform with the spectrogram, and wavelet analysis via the scalogram. Because of the highly non-stationary character of our geophysical signals, the latter two tools are better suited to this analysis. Their results are compared using both a synthetic signal and EISCAT ion-velocity measurements. We show that they help to discriminate patterns such as gravity waves from noise.
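
    The short-time Fourier analysis described above can be sketched on a synthetic non-stationary signal (a linear chirp standing in for the geophysical data; no EISCAT measurements are used):

    ```python
    import numpy as np

    # Synthetic non-stationary test signal: a linear chirp whose frequency
    # rises from 50 Hz to 250 Hz over 2 s, sampled at 1 kHz.
    fs = 1000.0
    t = np.arange(0, 2.0, 1 / fs)
    f0, f1, T = 50.0, 250.0, 2.0
    x = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * T) * t**2))

    # Short-time Fourier transform: Hann-windowed 256-sample segments.
    nperseg = 256
    window = np.hanning(nperseg)
    freqs = np.fft.rfftfreq(nperseg, d=1 / fs)
    segments = [x[i:i + nperseg] * window
                for i in range(0, len(x) - nperseg + 1, nperseg)]
    spec = np.abs(np.fft.rfft(segments, axis=1)) ** 2  # spectrogram rows

    # For a non-stationary signal the dominant frequency drifts with time,
    # which a single stationary periodogram would smear out.
    peaks = freqs[spec.argmax(axis=1)]
    print(peaks[0], peaks[-1])
    ```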

  7. AAAI 2002 Workshops

    OpenAIRE

    Blake, Brian; Haigh, Karen; Hexmoor, Henry; Falcone, Rino; Soh, Leen-Kiat; Baral, Chitta; McIlraith, Sheila; Gmytrasiewicz, Piotr; Parsons, Simon; Malaka, Rainer; Krueger, Antonio; Bouquet, Paolo; Smart, Bill; Kurumantani, Koichi; Pease, Adam

    2002-01-01

    The Association for the Advancement of Artificial Intelligence (AAAI) presented the AAAI-02 Workshop Program on Sunday and Monday, 28-29 July 2002 at the Shaw Convention Center in Edmonton, Alberta, Canada. The AAAI-02 workshop program included 18 workshops covering a wide range of topics in AI. The workshops were Agent-Based Technologies for B2B Electronic-Commerce; Automation as a Caregiver: The Role of Intelligent Technology in Elder Care; Autonomy, Delegation, and Control: From Interagent...

  8. The Temporary City Workshop

    OpenAIRE

    Moore, Niamh; McCarthy, Linda

    2014-01-01

    The Temporary City Workshop was hosted by Dr Niamh Moore-Cherry on Tuesday 21 October in Nova UCD. The workshop is part of the Greening as Spatial Politics project funded by the IRC New Foundations scheme 2013 and is a collaboration between geographers at University College Dublin and the University of Wisconsin-Milwaukee. The goal of the workshop was to facilitate networking across a diversity of stakeholders and initiate discussion on temporary urban interventions in Dublin. The workshop wa...

  9. Materiality, Description and Comparison as Tools for Cultural Difference Analysis

    OpenAIRE

    Zimmermann, Basile

    2013-01-01

    Working in a Chinese studies department based in Europe, I am often confronted with the challenges not only of working with cultural difference, but also of working with the concept of “culture” in itself – one of the most famously difficult concepts in the social sciences and humanities. Further, recent socioeconomic changes in China—and the new media dynamics of the “Chinese Internet”—have produced new situations requiring socio-cultural analysis, but lacking a clear theoretical or methodol...

  10. Stakeholder analysis: a useful tool for biobank planning.

    Science.gov (United States)

    Bjugn, Roger; Casati, Bettina

    2012-06-01

    Stakeholders are individuals, groups, or organizations that are affected by or can affect a particular action undertaken by others. Biobanks relate to a number of donors, researchers, research institutions, regulatory bodies, funders, and others. These stakeholders can potentially have a strong influence upon the organization and operation of a biobank. A sound strategy for stakeholder engagement is considered essential in project management and organization theory. In this article, we review relevant stakeholder theory and demonstrate how a stakeholder analysis was undertaken in the early stage of a planned research biobank at a public hospital in Norway. PMID:24835062

  11. SCit: web tools for protein side chain conformation analysis.

    Science.gov (United States)

    Gautier, R; Camproux, A-C; Tufféry, P

    2004-07-01

    SCit is a web server providing services for protein side chain conformation analysis and side chain positioning. Specific services use the dependence of side chain conformations on the local backbone conformation, which is described using a structural alphabet that encodes the conformation of four-residue fragments in a limited library of structural prototypes. Based on this concept, SCit uses sets of rotameric conformations that depend on the local backbone conformation of each protein for side chain positioning and for identifying side chains with unlikely conformations. The SCit web server is accessible at http://bioserv.rpbs.jussieu.fr/SCit. PMID:15215438

  12. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  13. Neutron activation analysis: a powerful tool in provenance investigations

    International Nuclear Information System (INIS)

    It is well known that neutron activation analysis (NAA), both instrumental and destructive, allows the simultaneous determination of a number of elements, mostly trace elements, with high levels of precision and accuracy. These properties of NAA are very useful when applied to provenance studies, i.e. to identifying the origin of the raw materials from which artifacts were manufactured in ancient times. Data reduction by statistical procedures, especially multivariate analysis techniques, provides a statistical 'fingerprint' of the investigated materials, both raw materials and archaeological artifacts, which, upon comparison, allows the provenance of the raw materials used in artifact manufacturing to be identified. Information can thus be obtained on the exploitation of quarries and deposits in antiquity, on the processing of raw materials, on trade routes, and on the circulation of fakes. In the present paper two case studies are reported. The first deals with identifying the provenance of the clay used to make ceramic materials, mostly bricks and tiles, recovered from the excavation of a Roman villa in Lomello (Roman name Laumellum) and of Roman settlements in Casteggio (Roman name Clastidium). Both sites are located in the Province of Pavia, in areas called Lomellina and Oltrepo respectively. The second investigates the origin of the white marble used to build the medieval arks, of Carolingian age, located in the church of San Felice, now property of the University of Pavia. The experimental set-up, analytical results and data reduction procedures are presented and discussed. (author)
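
    The multivariate 'fingerprinting' step can be sketched with a principal component analysis of synthetic elemental concentrations (the two invented 'quarries' below are not the Pavia data):

    ```python
    import numpy as np

    # Synthetic trace-element concentrations (ppm of 3 elements) for samples
    # from two hypothetical sources; values are invented for illustration.
    rng = np.random.default_rng(0)
    quarry_a = rng.normal([50.0, 5.0, 1.0], 0.5, size=(10, 3))
    quarry_b = rng.normal([30.0, 8.0, 2.0], 0.5, size=(10, 3))
    X = np.vstack([quarry_a, quarry_b])

    # PCA via SVD of the mean-centred concentration matrix.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt.T

    # Samples from the two sources separate along the first principal
    # component: this separation is the statistical 'fingerprint'.
    print(scores[:10, 0].mean(), scores[10:, 0].mean())
    ```

    Assigning an artifact to a source then amounts to checking which cluster its scores fall into.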

  14. Nested sampling as a tool for LISA data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gair, Jonathan R [Institute of Astronomy, Madingley Road, CB3 0HA, Cambridge (United Kingdom); Feroz, Farhan; Graff, Philip; Hobson, Michael P [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Babak, Stanislav; Petiteau, Antoine [Max-Planck-Institut fuer Gravitationsphysik, Am Muehlenberg 1, 14476, Potsdam (Germany); Porter, Edward K, E-mail: jgair@ast.cam.ac.u [APC, UMR 7164, Universite Paris 7 Denis Diderot, 10, rue Alice Domon et Leonie Duquet, 75205 Paris Cedex 13 (France)

    2010-05-01

    Nested sampling is a technique for efficiently computing the probability of a data set under a particular hypothesis, also called the Bayesian Evidence or Marginal Likelihood, and for evaluating the posterior. MULTINEST is a multi-modal nested sampling algorithm which has been designed to efficiently explore and characterize posterior probability surfaces containing multiple secondary solutions. We have applied the MULTINEST algorithm to a number of problems in gravitational wave data analysis. In this article, we describe the algorithm and present results for several applications of the algorithm to analysis of mock LISA data. We summarise recently published results for a test case in which we searched for two non-spinning black hole binary merger signals in simulated LISA data. We also describe results obtained with MULTINEST in the most recent round of the Mock LISA Data Challenge (MLDC), in which the algorithm was used to search for and characterise both spinning supermassive black hole binary inspirals and bursts from cosmic string cusps. In all these applications, the algorithm found the correct number of signals and efficiently recovered the posterior probability distribution. Moreover, in most cases the waveform corresponding to the best a-posteriori parameters had an overlap in excess of 99% with the true signal.
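
    A minimal nested-sampling loop for a toy one-dimensional problem might look as follows; it illustrates the technique only, is not the MULTINEST algorithm, and uses naive rejection sampling to draw new live points.

    ```python
    import math
    import random

    # Estimate the Bayesian evidence Z for a unit-Gaussian likelihood under
    # a uniform prior on [-5, 5]; the exact answer is close to 0.1.
    random.seed(1)

    def loglike(theta):
        return -0.5 * theta**2 - 0.5 * math.log(2 * math.pi)

    n_live = 50
    live = [random.uniform(-5.0, 5.0) for _ in range(n_live)]
    z, x_prev = 0.0, 1.0  # accumulated evidence, remaining prior volume

    for i in range(1, 401):
        worst = min(live, key=loglike)
        x_i = math.exp(-i / n_live)  # expected prior-volume shrinkage per step
        z += math.exp(loglike(worst)) * (x_prev - x_i)
        x_prev = x_i
        # Replace the worst live point by a prior draw above the likelihood
        # threshold (naive rejection sampling; real samplers are far smarter).
        while True:
            cand = random.uniform(-5.0, 5.0)
            if loglike(cand) > loglike(worst):
                live[live.index(worst)] = cand
                break

    print(z)  # should land near 0.1
    ```

    MULTINEST's contribution is an efficient, multi-modal way of drawing those constrained replacement points, which is exactly where this naive sketch is weakest.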

  15. Nested sampling as a tool for LISA data analysis

    International Nuclear Information System (INIS)

    Nested sampling is a technique for efficiently computing the probability of a data set under a particular hypothesis, also called the Bayesian Evidence or Marginal Likelihood, and for evaluating the posterior. MULTINEST is a multi-modal nested sampling algorithm which has been designed to efficiently explore and characterize posterior probability surfaces containing multiple secondary solutions. We have applied the MULTINEST algorithm to a number of problems in gravitational wave data analysis. In this article, we describe the algorithm and present results for several applications of the algorithm to analysis of mock LISA data. We summarise recently published results for a test case in which we searched for two non-spinning black hole binary merger signals in simulated LISA data. We also describe results obtained with MULTINEST in the most recent round of the Mock LISA Data Challenge (MLDC), in which the algorithm was used to search for and characterise both spinning supermassive black hole binary inspirals and bursts from cosmic string cusps. In all these applications, the algorithm found the correct number of signals and efficiently recovered the posterior probability distribution. Moreover, in most cases the waveform corresponding to the best a-posteriori parameters had an overlap in excess of 99% with the true signal.

  16. MATING DESIGNS: HELPFUL TOOL FOR QUANTITATIVE PLANT BREEDING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Athanase Nduwumuremyi

    2013-12-01

    Full Text Available The selection of parental materials and of good mating designs in conventional plant breeding is key to a successful plant breeding programme. There are, however, several factors affecting the choice of mating design. A mating design is the procedure by which progenies are produced; in plant breeding and genetics, breeders use different forms of mating designs and arrangements, both theoretically and practically, for targeted purposes. The choice of a mating design for estimating genetic variances should be dictated by the objectives of the study, time, space, cost and other biological limitations. In all mating designs, individuals are taken randomly and crossed to produce progenies that are related to each other as half-sibs or full-sibs. A form of multivariate analysis or the analysis of variance can be adopted to estimate the components of variance. This review therefore aims to highlight the mating designs most used in plant breeding and genetics studies. It provides an easy and quick insight into the different forms of mating designs and some statistical components relevant to successful plant breeding.

  17. IBIXFIT: A Tool For The Analysis Of Microcalorimeter PIXE Spectra

    International Nuclear Information System (INIS)

    PIXE analysis software has long been tuned mainly to the needs of Si(Li)-detector-based spectrum analysis and to quantification methods based on Kα or Lα X-ray lines. Still, recent evidence related to the study of relative line intensities, together with new developments in detection equipment, namely the emergence of commercial microcalorimeter-based X-ray detectors, has raised the possibility that in the near future PIXE will become more than just major-line quantification. A main issue that became evident as a consequence is the need to fit PIXE spectra without prior knowledge of relative line intensities. Considering these developments, it may be necessary to generalize PIXE to a wider notion of ion-beam-induced X-ray (IBIX) emission, to include the quantification of processes such as Radiative Auger Emission. To answer this need, the IBIXFIT code was created, based largely on the Bayesian inference and simulated annealing routines implemented in the DataFurnace code [1]. In this presentation, IBIXFIT is used to fit a microcalorimeter spectrum of a BaxSr(1-x)TiO3 thin-film sample, and the possibility of selecting between fixed and free line ratios, combined with other specificities of the IBIXFIT algorithm, is shown to be essential to overcome the problems faced.
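
    A simulated-annealing fit of the kind mentioned above can be sketched on a toy one-parameter problem; the cost function, the 'observed' line ratio and the cooling schedule are invented, and this is not the IBIXFIT code.

    ```python
    import math
    import random

    # Toy fit: find the free line ratio that minimises the squared error
    # against an "observed" value of 0.37 (illustrative number).
    random.seed(0)

    def cost(ratio, observed=0.37):
        return (ratio - observed) ** 2

    ratio, temp = 0.9, 1.0
    best = ratio
    for step in range(5000):
        candidate = ratio + random.gauss(0.0, 0.05)
        delta = cost(candidate) - cost(ratio)
        # Accept improvements always; accept worse moves with Boltzmann
        # probability so the search can escape local minima.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            ratio = candidate
            if cost(ratio) < cost(best):
                best = ratio
        temp *= 0.999  # geometric cooling schedule

    print(best)
    ```

    In a real spectrum fit the single parameter becomes a vector of line intensities and the cost compares a full model spectrum to the measured one, but the accept/cool loop is the same.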

  18. Tools for the analysis and characterization of therapeutic protein species

    Directory of Open Access Journals (Sweden)

    Fuh MM

    2016-05-01

    Full Text Available Marceline Manka Fuh, Pascal Steffen, Hartmut Schlüter Mass Spectrometric Proteomics, Institute for Clinical Chemistry and Laboratory Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany Abstract: A continuously increasing number of therapeutic proteins are being released onto the market, including biosimilars. In contrast to small organic drugs, therapeutic proteins require, because of their complexity, an extensive analysis of their exact chemical composition and proof of the absence of contaminants, such as host cell proteins and nucleic acids. Especially challenging is the detection of low-abundance species of therapeutic proteins, because these species are usually very similar to the target therapeutic protein. Detecting these species is nevertheless very important for patient safety, because even a very small change in the exact chemical composition may cause serious side effects. In this review, we give a brief overview of the most important analytical approaches for characterizing therapeutic protein species and their contaminants, focusing on progress in this field during the past 3 years. Top-down mass spectrometry of intact therapeutic proteins may in the future solve many of the current problems in their analysis. Keywords: therapeutic protein species, biosimilars, liquid chromatography, mass spectrometry, capillary electrophoresis

  19. Model for nuclear proliferation resistance analysis using decision making tools

    International Nuclear Information System (INIS)

    The nuclear proliferation risk of nuclear fuel cycles is considered one of the most important factors in assessing advanced and innovative nuclear systems in the GEN IV and INPRO programs. These programs have been trying to find an appropriate and reasonable method to evaluate several nuclear energy system alternatives quantitatively. No satisfactory methodology for an integrated analysis of proliferation resistance, however, has yet emerged. In this study, several decision making methods that have been used in multi-objective situations are described, in order to see whether they can be applied appropriately to proliferation resistance evaluation. In particular, the AHP model for quantitatively evaluating proliferation resistance is dealt with in more detail. The theoretical principle of the method and some examples for the proliferation resistance problem are described. For more efficient application, a simple computer program for the AHP model is developed, and the usage of the program is introduced here in detail. We hope that the program developed in this study will be useful for the quantitative analysis of proliferation resistance involving multiple conflicting criteria.
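
    The core AHP step can be sketched as deriving criterion weights from a pairwise-comparison matrix via its principal eigenvector, with a consistency check; the 3x3 judgments below are illustrative, not values from the study.

    ```python
    import numpy as np

    # Pairwise comparisons on Saaty's 1-9 scale: entry (i, j) says how much
    # more important criterion i is than criterion j (illustrative values).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # Weights = principal eigenvector, normalised to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = eigvals.real.argmax()
    weights = eigvecs[:, k].real
    weights /= weights.sum()

    # Consistency ratio CR = ((lambda_max - n)/(n - 1)) / RI, with random
    # index RI = 0.58 for n = 3; CR < 0.1 is the usual acceptance rule.
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / 0.58

    print(weights, cr)
    ```

    Scoring each fuel-cycle alternative against these weighted criteria then gives the quantitative proliferation resistance ranking the abstract describes.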

  20. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    OpenAIRE

    ASLANTAŞ, Kubilay

    2003-01-01

    Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improved cutting economics and cutting-tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, the low friction coefficient reduces damage in cutt...