WorldWideScience

Sample records for analysis tools workshop

  1. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan, on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following topics: Event Selection with the TAG, Event Selection Using the Athena-Aware NTuple, Event Display, Interactive Analysis within ATHENA, Distributed Analysis, Monte Carlo Truth Tools, Trigger-Aware Analysis, and Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  2. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well-organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  3. Video Analysis and Modeling Tool for Physics Education: A workshop for Redesigning Pedagogy

    CERN Document Server

    Wee, Loo Kang

    2012-01-01

    This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables and empowers teachers to be learners so that we can be leaders in our teaching practice. Through this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experiences with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.
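
    As a rough illustration of the kind of analysis such a workshop walks through, the short Python sketch below fits video-tracked (time, height) positions of a falling ball to the constant-acceleration model y(t) = y0 + v0*t + 0.5*a*t^2. The sample data points and variable names are invented for this example and are not taken from the Tracker tool itself.

        # Hedged sketch: fit video-tracked positions of a falling ball to
        # y(t) = y0 + v0*t + 0.5*a*t^2. The (t, y) samples below are invented.
        import numpy as np

        t = np.array([0.00, 0.10, 0.20, 0.30, 0.40, 0.50])   # s, frame times
        y = np.array([1.20, 1.15, 1.01, 0.77, 0.43, 0.00])   # m, tracked heights

        # Least-squares parabola; np.polyfit returns [0.5*a, v0, y0]
        half_a, v0, y0 = np.polyfit(t, y, 2)
        print(f"a  = {2 * half_a:.2f} m/s^2 (about -9.8 for free fall)")
        print(f"v0 = {v0:.2f} m/s, y0 = {y0:.2f} m")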

  4. 6th International Parallel Tools Workshop

    CERN Document Server

    Brinkmann, Steffen; Gracia, José; Resch, Michael; Nagel, Wolfgang

    2013-01-01

    The latest advances in High Performance Computing hardware have significantly raised the level of available compute performance. At the same time, the growing hardware capabilities of modern supercomputing architectures have caused an increasing complexity of parallel application development. Despite numerous efforts to improve and simplify parallel programming, there is still a lot of manual debugging and tuning work required. This process is supported by special software tools, facilitating debugging, performance analysis, and optimization and thus making a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools, which were presented and discussed at the 6th International Parallel Tools Workshop, held in Stuttgart, Germany, 25-26 September 2012.

  5. Earth Exploration Toolbook Workshops: Web-Conferencing and Teleconferencing Professional Development Bringing Earth Science Data Analysis and Visualization Tools to K-12 Teachers and Students

    Science.gov (United States)

    McAuliffe, C.; Ledley, T.

    2008-12-01

    The Earth Exploration Toolbook (EET) Workshops Project provides a mechanism for teachers and students to have successful data-using educational experiences. In this professional development project, teachers learn to use the National Science Digital Library (NSDL), the Digital Library for Earth System Education (DLESE), and an Earth Exploration Toolbook (EET) chapter. In an EET Data Analysis Workshop, participants walk through an Earth Exploration Toolbook (EET) chapter, learning basic data analysis techniques and discussing ways to use Earth science datasets and analysis tools with their students. We have offered twenty-eight Data Analysis Workshops since the project began. The total number of participants in the twenty-eight workshops to date is three hundred eleven, which reflects one hundred eighty different teachers participating in one or more workshops. Our workshops reach middle and high school teachers across the United States at schools with lower socioeconomic levels and at schools with large numbers of minority students. Our participants come from thirty-eight different states including Alaska, Maine, Florida, Montana, and many others. Eighty-six percent of our participants are classroom teachers. The remaining fourteen percent are staff development specialists, university faculty, or outreach educators working with teachers. Of the classroom teachers, one third are middle school teachers (grades 6 to 8) and two thirds are high school teachers (grades 9 to 12). Thirty-four percent of our participants come from schools where minority populations make up the majority of the school. Twenty-five percent of our participants are at schools where the majority of the students receive free or reduced-cost lunches. Our professional development workshops are helping to raise teachers' awareness of both the Digital Library for Earth System Education (DLESE) and the National Science Digital Library (NSDL). Prior to taking one of our workshops, forty-two percent of

  6. Workshop Physics and Related Curricula: A 25-Year History of Collaborative Learning Enhanced by Computer Tools for Observation and Analysis

    Science.gov (United States)

    Laws, Priscilla W.; Willis, Maxine C.; Sokoloff, David R.

    2015-10-01

    This article describes the 25-year history of development of the activity-based Workshop Physics (WP) at Dickinson College, its adaptation for use at Gettysburg Area High School, and its synergistic influence on curricular materials developed at the University of Oregon and Tufts University and vice versa. WP and these related curricula: 1) are based on Physics Education Research (PER) findings and are PER-validated; 2) feature active, collaborative learning; and 3) use computer-based tools that enable students to learn by making predictions and then collecting, displaying, and analyzing data from their experiments.

  7. UVI Cyber-security Workshop Workshop Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kuykendall, Tommie G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allsop, Jacob Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Benjamin Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boumedine, Marc [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carter, Cedric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Galvin, Seanmichael Yurko [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Oscar [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lee, Wellington K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Han Wei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morris, Tyler Jake [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nauer, Kevin S.; Potts, Beth A.; Ta, Kim Thanh; Trasti, Jennifer; White, David R.

    2015-07-08

    The cybersecurity consortium, which was established by DOE/NNSA’s Minority Serving Institutions Partnerships Program (MSIPP), allows students from any of the partner schools (13 HBCUs, two national laboratories, and a public school district) to have all consortium options available to them, to create career paths, and to open doors to DOE sites and facilities for student members of the consortium. As part of this year's consortium activities, Sandia National Laboratories and the University of the Virgin Islands conducted a week-long cyber workshop that consisted of three courses: Digital Forensics and Malware Analysis, Python Programming, and ThunderBird Cup. These courses are designed to enhance cyber defense skills and promote learning within STEM-related fields.

  8. Workshop Physics and Related Curricula: "A 25-Year History of Collaborative Learning Enhanced by Computer Tools for Observation and Analysis"

    Science.gov (United States)

    Laws, Priscilla W.; Willis, Maxine C.; Sokoloff, David R.

    2015-01-01

    This article describes the 25-year history of development of the activity-based Workshop Physics (WP) at Dickinson College, its adaptation for use at Gettysburg Area High School, and its synergistic influence on curricular materials developed at the University of Oregon and Tufts University and vice versa. WP and these related curricula: 1) are…

  9. Collaboration tools for the global accelerator network: Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Olson, Gary [Univ. of Michigan, Ann Arbor, MI (United States); Olson, Judy [Univ. of Michigan, Ann Arbor, MI (United States)

    2002-09-15

    The concept of a "Global Accelerator Network" (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists who study scientific collaboration.

  10. Collaboration tools for the global accelerator network: Workshop Report

    International Nuclear Information System (INIS)

    The concept of a "Global Accelerator Network" (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists who study scientific collaboration.

  11. Workshop One : Risk Analysis

    NARCIS (Netherlands)

    Carlson, T.J.; Jong, C.A.F. de; Dekeling, R.P.A.

    2012-01-01

    The workshop looked at the assessment of risk to aquatic animals exposed to anthropogenic sound. The discussion focused on marine mammals given the worldwide attention being paid to them at the present time, particularly in relation to oil and gas exploration, ocean power, and increases in ship

  12. Ninth Thermal and Fluids Analysis Workshop Proceedings

    Science.gov (United States)

    Sakowski, Barbara (Compiler)

    1999-01-01

    The Ninth Thermal and Fluids Analysis Workshop (TFAWS 98) was held at the Ohio Aerospace Institute in Cleveland, Ohio from August 31 to September 4, 1998. The theme for the hands-on training workshop and conference was "Integrating Computational Fluid Dynamics and Heat Transfer into the Design Process." Highlights of the workshop (in addition to the papers published herein) included an address by the NASA Chief Engineer, Dr. Daniel Mulville; a CFD short course by Dr. John D. Anderson of the University of Maryland; and a short course by Dr. Robert Cochran of Sandia National Laboratories. In addition, lectures and hands-on training were offered in the use of several cutting-edge engineering design and analysis-oriented CFD and Heat Transfer tools. The workshop resulted in international participation of over 125 persons representing aerospace and automotive industries, academia, software providers, government agencies, and private corporations. The papers published herein address issues and solutions related to the integration of computational fluid dynamics and heat transfer into the engineering design process. Although the primary focus is aerospace, the topics and ideas presented are applicable to many other areas where these and other disciplines are interdependent.

  13. Applications of ion beam analysis workshop. Workshop handbook

    International Nuclear Information System (INIS)

    A workshop on applications of ion beam analysis was held at ANSTO, immediately prior to the IBMM-95 Conference in Canberra. Its aim was to review developments and the current status of the use of ion beams for analysis, emphasizing the following aspects: fundamental ion beam research and secondary effects of ion beams; material sciences, geological, life sciences, environmental and industrial applications; computing codes for use in accelerator research; high energy heavy ion scattering and recoil; and recent technological developments using ion beams. The handbook contains the workshop's program, 29 abstracts and a list of participants.

  14. Sawja: Static Analysis Workshop for Java

    Science.gov (United States)

    Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine

    Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. Efficiency and precision of such a tool rely partly on low level components which only depend on the syntactic structure of the language and therefore should not be redesigned for each implementation of a new static analysis. This paper describes the Sawja library: a static analysis workshop fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including i) efficient functional data-structures for representing a program with implicit sharing and lazy parsing, ii) an intermediate stack-less representation, and iii) fast computation and manipulation of complete programs. We provide experimental evaluations of the different features with respect to time, memory and precision.

  15. The EADGENE Microarray Data Analysis Workshop

    DEFF Research Database (Denmark)

    de Koning, Dirk-Jan; Jaffrézic, Florence; Lund, Mogens Sandø

    2007-01-01

    Microarray analyses have become an important tool in animal genomics. While their use is becoming widespread, there is still a lot of ongoing research regarding the analysis of microarray data. In the context of a European Network of Excellence, 31 researchers representing 14 research groups from 10 countries performed and discussed the statistical analyses of real and simulated 2-colour microarray data that were distributed among participants. The real data consisted of 48 microarrays from a disease challenge experiment in dairy cattle, while the simulated data consisted of 10 microarrays ... statistical weights, to omitting a large number of spots or omitting entire slides. Surprisingly, these very different approaches gave quite similar results when applied to the simulated data, although not all participating groups analysed both real and simulated data. The workshop was very successful...

  16. Proceedings of pollution prevention and waste minimization tools workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    Pollution Prevention (P2) has evolved into one of DOE's prime strategies to meet environmental, fiscal, and worker safety obligations. P2 program planning, opportunity identification, and implementation tools were developed under the direction of the Waste Minimization Division (EM-334). Forty experts from EM, DP, ER and DOE subcontractors attended this 2-day workshop to formulate the incentives to drive utilization of these tools. Plenary and small working group sessions were held both days. Working Group 1 identified incentives for overcoming barriers in the area of P2 program planning and resource allocation. Working Group 2 identified mechanisms to drive the completion of P2 assessments and generation of opportunities. Working Group 3 compiled and documented a broad range of potential P2 incentives that address fundamental barriers to implementation of cost-effective opportunities.

  17. Proceedings of pollution prevention and waste minimization tools workshop

    International Nuclear Information System (INIS)

    Pollution Prevention (P2) has evolved into one of DOE's prime strategies to meet environmental, fiscal, and worker safety obligations. P2 program planning, opportunity identification, and implementation tools were developed under the direction of the Waste Minimization Division (EM-334). Forty experts from EM, DP, ER and DOE subcontractors attended this 2-day workshop to formulate the incentives to drive utilization of these tools. Plenary and small working group sessions were held both days. Working Group 1 identified incentives for overcoming barriers in the area of P2 program planning and resource allocation. Working Group 2 identified mechanisms to drive the completion of P2 assessments and generation of opportunities. Working Group 3 compiled and documented a broad range of potential P2 incentives that address fundamental barriers to implementation of cost-effective opportunities.

  18. 9th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  19. Seventh Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    This booklet contains the proceedings of the Seventh Workshop on Practical Use of Coloured Petri Nets and the CPN Tools, October 24-26, 2006.

  20. 2nd International Workshop on Isogeometric Analysis and Applications

    CERN Document Server

    Simeon, Bernd

    2015-01-01

    Isogeometric Analysis is a groundbreaking computational approach that promises the possibility of integrating the finite element method into conventional spline-based CAD design tools. It thus bridges the gap between numerical analysis and geometry, and moreover it makes it possible to tackle new cutting-edge applications at the frontiers of research in science and engineering. This proceedings volume contains a selection of outstanding research papers presented at the second International Workshop on Isogeometric Analysis and Applications, held at Annweiler, Germany, in April 2014.

  1. Ninth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    This booklet contains the proceedings of the Ninth Workshop on Practical Use of Coloured Petri Nets and CPN Tools, October 20-22, 2008. The workshop is organised by the CPN group at the Department of Computer Science, Aarhus University, Denmark. Coloured Petri Nets and the CPN Tools are now licensed to more than 7,200 users in 138 countries. The aim of the workshop is to bring together some of the users and in this way provide a forum for those who are interested in the practical use of Coloured Petri nets and their tools. The submitted papers were evaluated by a programme committee ... to practical use -- often in an industrial setting. The remaining papers deal with different extensions of tools and methodology. The papers from the first eight CPN Workshops can be found via web pages: http://www.daimi.au.dk/CPnets/. After an additional round of reviewing and revision, some of the papers...

  2. Proceedings Fifth Workshop on Formal Languages and Analysis of Contract-Oriented Software

    CERN Document Server

    Pimentel, Ernesto

    2011-01-01

    This volume consists of the proceedings of the 5th Workshop on Formal Languages and Analysis of Contract-Oriented Software (FLACOS'11). The FLACOS Workshops serve as annual meeting places to bring together researchers and practitioners working on language-based solutions to contract-oriented software development. High-level models of contracts are needed as a tool to negotiate contracts and provide services conforming to them. This Workshop provides language-based solutions to the above issues through formalization of contracts, design of appropriate abstraction mechanisms, and formal analysis of contract languages and software. The program of this edition consists of 5 regular papers and 3 invited presentations. Detailed information about the FLACOS 2011 Workshop can be found at http://flacos2011.lcc.uma.es/. The 5th edition of the FLACOS Workshop was organized by the University of Málaga. It took place in Málaga, Spain, during September 22-23, 2011.

  3. Workshop on IAEA Tools for Nuclear Energy System Assessment for Long-Term Planning and Development

    International Nuclear Information System (INIS)

    The purpose of the workshop is to present to Member States tools and methods that are available from the IAEA in support of long-term energy planning and nuclear energy system assessments, both focusing on the sustainable development of nuclear energy. This includes tools devoted to energy system planning, indicators for sustainable energy development, the INPRO methodology for Nuclear Energy System Assessment (NESA) and tools for analysing nuclear fuel cycle material balance. The workshop also intends to obtain feedback from Member States on applying the tools, share experiences and lessons learned, and identify needs for IAEA support.

  4. BENCHMARKING WORKSHOPS AS A TOOL TO RAISE BUSINESS EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Milos Jelic

    2011-03-01

    The annual competition for a national business excellence award is a good opportunity for participating organizations to demonstrate their practices, particularly those that enable them to excel. The national quality award competition in Serbia (and Montenegro), "OSKAR KVALITETA", started in 1995 but was limited to the competition cycle only. However, upon the establishment in 2002 of the Fund for Quality Culture and Excellence (FQCE), which took over the OSKAR KVALITETA model, several changes took place. OSKAR KVALITETA remained an annual competition in business excellence, but FQCE also began to offer a much wider portfolio of services, including levels-of-excellence programmes, assessment and self-assessment training courses, and benchmarking workshops. These benchmarking events have been hosted by award winners or other laureates of the OSKAR KVALITETA competition who demonstrated excellence against particular criteria and were thus in a position to share their practice with other organizations. In six years of organizing benchmarking workshops, FQCE has held 31 workshops covering a major part of the model's criteria. The increasing level of participation in the workshops and the distinctly positive trend in participants' expressed satisfaction may serve as a reliable indicator that the workshops have been effective in prompting people to think and move in the direction of business excellence.

  5. 100-KE REACTOR CORE REMOVAL PROJECT ALTERNATIVE ANALYSIS WORKSHOP REPORT

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON RA

    2010-01-15

    ... In brief, the Path Forward was developed to reconsider potential open-air demolition areas; characterize to determine whether any zircaloy exists; evaluate existing concrete data to determine additional characterization needs; size the new building to accommodate the human-machine interface and tooling; consider the bucket thumb and use of shape charges in the design; and, finally, utilize complex-wide and industry explosive demolition lessons learned in the design approach. Appendix B documents the results from the team's use of Value Engineering process tools entitled Weighted Analysis Alternative Matrix, Matrix Conclusions, Evaluation Criteria, and Alternative Advantages and Disadvantages. These results were further supported with the team's validation of parking-lot information sheets: memories (potential ideas to consider), issues/concerns, and assumptions, contained in Appendix C. Appendix C also includes the recorded workshop flipchart notes taken from the SAR Alternatives and Project Overview presentations. The SAR workshop presentations, including a 3-D graphic illustration demonstration video, have been retained in the CHPRC project file, and were not included in this report due to size limitations. The workshop concluded with a round-robin close-out where each member was asked for any last-minute items and for their view of the meeting's utility. In summary, the team felt the session was value-added and looked forward to proceeding with the recommended actions and conceptual design.

  6. Fourth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    Coloured Petri Nets and the CPN tools are now used by more than 750 organisations in 50 different countries all over the world (including 150 commercial companies). The purpose of this event is to bring together some of the users and in this way provide a forum for those who are interested in the practical use of Coloured Petri Nets and the CPN tools. This booklet contains the proceedings of the Fourth Workshop on Practical Use of Coloured Petri Nets and the CPN Tools, August 28-30, 2002. The workshop is organised by the CPN group at the Department of Computer Science, University of Aarhus, Denmark.

  7. Pollution prevention and waste minimization tools workshops: Proceedings. Part 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-31

    The purpose of the second workshop was to bring together representatives of DOE and DOE contractor organizations to discuss four topics: process waste assessments (PWAs), a continuation of one of the sessions held at the first workshop in Clearwater; waste minimization reporting requirements; procurement systems for waste minimization; and heating, ventilating, and air conditioning (HVAC) and replacements for chlorofluorocarbons (CFCs). The topics were discussed in four concurrent group sessions. Participants in each group were encouraged to work toward achieving two main objectives: establish a "clear vision" of the overall target for their session's program, focusing not just on where the program is now but on where it should go in the long term; and determine steps to be followed to carry out the target program.

  8. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
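
    The record above is a patent-style description of the system's architecture. A minimal, hypothetical Python sketch of that flow (component library, baseline model, energy conservation measure, recommendation) is given below; all class and function names, and the simplified energy accounting, are illustrative assumptions and not the patented implementation.

        # Hypothetical sketch of the described architecture; names are illustrative.
        from dataclasses import dataclass

        @dataclass
        class BuildingComponent:          # entry in the building component library
            name: str
            annual_kwh: float             # simplified annual energy use

        def baseline_energy(components):  # stand-in for the building analysis engine
            return sum(c.annual_kwh for c in components)

        def apply_measure(components, target, savings_fraction):
            """Apply one energy conservation measure to produce an optimized model."""
            return [BuildingComponent(c.name,
                        c.annual_kwh * (1 - savings_fraction) if c.name == target else c.annual_kwh)
                    for c in components]

        library = [BuildingComponent("lighting", 12000.0), BuildingComponent("HVAC", 30000.0)]
        optimized = apply_measure(library, "lighting", 0.40)      # e.g. an LED retrofit
        saving = baseline_energy(library) - baseline_energy(optimized)
        if saving > 0:                                            # stand-in for the recommendation tool
            print(f"Recommend lighting retrofit: saves {saving:.0f} kWh/yr")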

  9. Summary of the Third International Planetary Dunes Workshop: remote sensing and image analysis of planetary dunes

    Science.gov (United States)

    Fenton, Lori K.; Hayward, Rosalyn K.; Horgan, Briony H.N.; Rubin, David M.; Titus, Timothy N.; Bishop, Mark A.; Burr, Devon M.; Chojnacki, Matthew; Dinwiddie, Cynthia L.; Kerber, Laura; Gall, Alice Le; Michaels, Timothy I.; Neakrase, Lynn D.V.; Newman, Claire E.; Tirsch, Daniela; Yizhaq, Hezi; Zimbelman, James R.

    2013-01-01

    The Third International Planetary Dunes Workshop took place in Flagstaff, AZ, USA during June 12–15, 2012. This meeting brought together a diverse group of researchers to discuss recent advances in terrestrial and planetary research on aeolian bedforms. The workshop included two and a half days of oral and poster presentations, as well as one formal (and one informal) full-day field trip. Similar to its predecessors, the presented work provided new insight on the morphology, dynamics, composition, and origin of aeolian bedforms on Venus, Earth, Mars, and Titan, with some intriguing speculation about potential aeolian processes on Triton (a satellite of Neptune) and Pluto. Major advancements since the previous International Planetary Dunes Workshop include the introduction of several new data analysis and numerical tools and utilization of low-cost field instruments (most notably the time-lapse camera). Most presentations represented advancement towards research priorities identified in both of the prior two workshops, although some previously recommended research approaches were not discussed. In addition, this workshop provided a forum for participants to discuss the uncertain future of the Planetary Aeolian Laboratory; subsequent actions taken as a result of the decisions made during the workshop may lead to an expansion of funding opportunities to use the facilities, as well as other improvements. The interactions during this workshop contributed to the success of the Third International Planetary Dunes Workshop, further developing our understanding of aeolian processes on the aeolian worlds of the Solar System.

  10. Workshop

    DEFF Research Database (Denmark)

    Hess, Regitze; Lotz, Katrine

    2003-01-01

    Programme for an architecture workshop focusing on the Danish harbours. Presentation of the 57 younger Danish and international participating architects.

  11. Proceedings of the of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011)

    DEFF Research Database (Denmark)

    This volume contains the proceedings of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011), held in Saarbrücken, Germany on March 26 & 27, 2011. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) and organized in cooperation with ACM SIGPLAN. LDTA is an application and tool-oriented workshop focused on grammarware (software based on grammars in some form). Grammarware applications are typically language processing applications and traditional examples include parsers, program analyzers, optimizers and translators ... demonstration papers were selected. In addition to these 11 papers, the proceedings include the paper "Getting a Grip on Tasks that Coordinate Tasks" by Rinus Plasmeijer of Radboud University Nijmegen in The Netherlands. This paper accompanied his invited talk at the workshop. This year LDTA also puts theory...

  12. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
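
    To make the cascading-outage idea concrete, here is a deliberately simplified, generic Python sketch of an outage-cascade loop of the kind DCAT automates. It does not call the PSS/E API; the toy flow-redistribution step and the small dictionary network model are assumptions standing in for the hybrid dynamic/steady-state simulation described above.

        # Toy cascading-outage loop (illustration only; DCAT itself drives PSS/E).
        def redistribute_flows(lines):
            """Stand-in for a power-flow solve: move flow from tripped lines
            onto the remaining in-service lines, split evenly."""
            live = [l for l in lines.values() if l["in_service"]]
            for l in lines.values():
                if not l["in_service"] and l["flow"] > 0:
                    share = l["flow"] / max(len(live), 1)
                    for other in live:
                        other["flow"] += share
                    l["flow"] = 0.0

        def run_cascade(lines, initiating_outage):
            tripped = [initiating_outage]
            lines[initiating_outage]["in_service"] = False
            while True:
                redistribute_flows(lines)
                overloaded = [n for n, l in lines.items()
                              if l["in_service"] and l["flow"] > l["rating"]]
                if not overloaded:
                    return tripped                       # cascade has stopped
                worst = max(overloaded, key=lambda n: lines[n]["flow"] / lines[n]["rating"])
                lines[worst]["in_service"] = False       # protection trips the worst overload
                tripped.append(worst)

        grid = {"A": {"flow": 90.0, "rating": 100.0, "in_service": True},
                "B": {"flow": 80.0, "rating": 100.0, "in_service": True},
                "C": {"flow": 60.0, "rating": 100.0, "in_service": True}}
        print("outage sequence:", run_cascade(grid, "A"))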

  13. Tools for Project Management, Workshops and Consulting A Must-Have Compendium of Essential Tools and Techniques

    CERN Document Server

    Andler, Nicolai

    2012-01-01

    Typically today's tasks in management and consulting include project management, running workshops and strategic work - all complex activities, which require a multitude of skills and competencies. This standard work, which is also well accepted amongst consultants, gives you a reference or cookbook-style access to the most important tools, including a rating of each tool in terms of applicability, ease of use and effectiveness. In his book, Nicolai Andler presents about 120 such tools, grouped into task-specific categories entitled Define Situation, Gather Information, Information Consolida

  14. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
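
    As a rough, hedged illustration of what a frequency response measure (FRM) calculation looks like, the sketch below averages the measured frequency before the event (Value A) and over a settling window (Value B) and divides the MW loss by the frequency change expressed in 0.1 Hz steps. The window choices, function names, and example numbers are assumptions for this illustration, not the exact BAL-003-1 or FRAT definitions.

        # Simplified FRM-style illustration (not the FRAT implementation).
        import numpy as np

        def frequency_response(time_s, freq_hz, mw_loss, t_event,
                               pre_window=(-16.0, -2.0), post_window=(20.0, 52.0)):
            """Return an FRM-style value in MW per 0.1 Hz for one event."""
            t = np.asarray(time_s) - t_event
            f = np.asarray(freq_hz)
            value_a = f[(t >= pre_window[0]) & (t <= pre_window[1])].mean()   # pre-disturbance
            value_b = f[(t >= post_window[0]) & (t <= post_window[1])].mean() # settling period
            return mw_loss / ((value_a - value_b) / 0.1)

        # Example: a 1,200 MW loss with frequency settling 0.06 Hz lower -> about 2000 MW/0.1 Hz
        t = np.arange(-30.0, 60.0, 0.1)
        f = np.where(t < 0, 60.000, 59.940)
        print(f"FRM = {frequency_response(t, f, 1200.0, t_event=0.0):.0f} MW/0.1 Hz")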

  15. 77 FR 14814 - Tobacco Product Analysis; Scientific Workshop; Request for Comments

    Science.gov (United States)

    2012-03-13

    ... days before the workshop. Comments: Regardless of attendance at the public workshop, interested persons... within the same class of constituents into a single analysis. Particularly discuss the benefits...

  16. Neutron multiplicity analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Scott L [Los Alamos National Laboratory

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This
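
    Two of the pre-processing steps mentioned above, dead-time correction and background subtraction, can be illustrated with a small generic sketch. The non-paralyzable dead-time model, the default dead time, and the count rates used here are textbook-style assumptions for illustration only, not EXCOM's actual algorithm (which is a VBA tool within Excel).

        # Generic illustration only; not the EXCOM implementation.
        def deadtime_correct(measured_cps, deadtime_s):
            """Non-paralyzable model: true rate = m / (1 - m * tau)."""
            return measured_cps / (1.0 - measured_cps * deadtime_s)

        def net_rate(sample_cps, background_cps, deadtime_s=100e-9):
            """Dead-time correct both rates, then subtract the ambient background."""
            return deadtime_correct(sample_cps, deadtime_s) - \
                   deadtime_correct(background_cps, deadtime_s)

        print(f"net singles rate: {net_rate(45000.0, 300.0):.1f} counts/s")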

  17. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  18. Workshop in Moodle: a tool for peer critiquing

    OpenAIRE

    Brown, C.; Honeychurch, S.; Munro, J.

    2011-01-01

    This paper will begin with a brief discussion of the benefits of peer assessment and peer critiquing. In particular, it will examine how both can be beneficial in helping to introduce, and reinforce, valuable graduate attributes in students throughout their university careers. It will then examine the tools available at the University of Glasgow and evaluate them in terms of their strengths and weaknesses. In order to explain this in detail, a real life case study from a third year c...

  19. PREFACE: EMAS 2013 Workshop: 13th European Workshop on Modern Developments and Applications in Microbeam Analysis

    Science.gov (United States)

    Llovet, Xavier, Dr; Matthews, Mr Michael B.; Brisset, François, Dr; Guimarães, Fernanda, Dr; Vieira, Professor Joaquim M., Dr

    2014-03-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 13th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from the 12th to the 16th of May 2013 in the Centro de Congressos do Alfândega, Porto, Portugal. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a very specific format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with LNEG - Laboratório Nacional de Energia e Geologia and SPMICROS - Sociedade Portuguesa de Microscopia. The technical programme included the following topics: electron probe microanalysis, future technologies, electron backscatter diffraction (EBSD), particle analysis, and applications. As at previous workshops there was also a special oral session for young scientists. The best presentation by a young scientist was awarded with an invitation to attend the 2014 Microscopy and Microanalysis meeting at Hartford, Connecticut. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Plastic deformation studies with electron channelling contrast imaging and electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 74 posters from 21 countries were on display at the meeting and that the participants came from as far away as Japan, Canada and the USA. A

  20. Summary of Training Workshop on the Use of NASA tools for Coastal Resource Management in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Judd, Chaeli; Judd, Kathleen S.; Gulbransen, Thomas C.; Thom, Ronald M.

    2009-03-01

    A two-day training workshop was held in Xalapa, Mexico on March 10-11, 2009 with the goal of training end users from the southern Gulf of Mexico states of Campeche and Veracruz in the use of tools to support coastal resource management decision-making. The workshop was held at the computer laboratory of the Instituto de Ecología, A.C. (INECOL). This report summarizes the results of that workshop and is a deliverable to our NASA client.

  1. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software and data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. They are globally-merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convective systems, etc. Basic functions include selection of area of

  2. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  3. 9th Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Decreusefond, Laurent

    2012-01-01

    Since the early eighties, Ali Suleyman Ustunel has been one of the main contributors to the field of Malliavin calculus. In a workshop held in Paris in June 2010, several prominent researchers gave exciting talks in honor of his 60th birthday. The present volume includes scientific contributions from this workshop. Probability theory is first and foremost aimed at solving real-life problems containing randomness. Markov processes are one of the key modeling tools and play a vital part in such problems. Contributions on inventory control, mutation-selection in genetics and public-pri

  4. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany – which is a forum to discuss the latest advancements in the parallel tools.

  5. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  6. IFPA Meeting 2013 Workshop Report II: use of 'omics' in understanding placental development, bioinformatics tools for gene expression analysis, planning and coordination of a placenta research network, placental imaging, evolutionary approaches to understanding pre-eclampsia.

    Science.gov (United States)

    Ackerman, W E; Adamson, L; Carter, A M; Collins, S; Cox, B; Elliot, M G; Ermini, L; Gruslin, A; Hoodless, P A; Huang, J; Kniss, D A; McGowen, M R; Post, M; Rice, G; Robinson, W; Sadovsky, Y; Salafia, C; Salomon, C; Sled, J G; Todros, T; Wildman, D E; Zamudio, S; Lash, G E

    2014-02-01

    Workshops are an important part of the IFPA annual meeting as they allow for discussion of specialized topics. At the IFPA meeting 2013 twelve themed workshops were presented, five of which are summarized in this report. These workshops related to various aspects of placental biology but collectively covered areas of new technologies for placenta research: 1) use of 'omics' in understanding placental development and pathologies; 2) bioinformatics and use of omics technologies; 3) planning and coordination of a placenta research network; 4) clinical imaging and pathological outcomes; 5) placental evolution.

  7. The Astronomy Workshop: Computer Assisted Learning Tools with Instructor Support Materials and Student Activities

    Science.gov (United States)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2006-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes, as supplementary materials appropriate for grades 9-12, and by the general public. The philosophy of the website is to foster student and public interest in astronomy by capitalizing on their fascination with computers and the internet. Many of the tools were developed by graduate and undergraduate students at UMD. This website contains over 20 tools on topics including scientific notation, giant impacts, extrasolar planets, astronomical distances, planets, moons, comets, and asteroids. Educators around the country at universities, colleges, and secondary schools have used the Astronomy Workshop’s tools and activities as homework assignments, in-class demos, or extra credit. Since 2005, Grace Deming has assessed several of the Astronomy Workshop’s tools for clarity and effectiveness by interviewing students as they used tools on the website. Based on these interviews, Deming wrote student activities and instructor support materials and posted them to the website. Over the next three years, we will continue to interview students, develop web materials, and field-test activities. We are targeting classes in introductory undergraduate astronomy courses and grades 11-12 for our Spring 2007 field tests. We are interested in hearing your ideas on how we can make the Astronomy Workshop more appealing to educators, museum directors, specialty programs, and professors. This research is funded by NASA EPO grants NNG04GM18G and NNG06GGF99G.

  8. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel;

    2014-01-01

    As governments, citizens and organizations have moved online there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks ... and analyze web data in the process of investigating substantive questions...

  9. Second Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Ustunel, Ali

    1990-01-01

    The Second Silivri Workshop functioned as a short summer school and a working conference, producing lecture notes and research papers on recent developments of Stochastic Analysis on Wiener space. The topics of the lectures concern short time asymptotic problems and anticipative stochastic differential equations. Research papers are mostly extensions and applications of the techniques of anticipative stochastic calculus.

  10. NOAA's Inundation Analysis Tool

    Data.gov (United States)

National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  11. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. The 18 invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated further reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  12. Proceedings of the of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010)

    DEFF Research Database (Denmark)

    Brabrand, Claus

    2010-01-01

    This volume contains the proceedings of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010), held in Paphos, Cyprus on March 28--29, 2010. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) organized in cooperation......-Louis Giavitto ("A Domain Specific Language for Complex Natural & Artificial Systems Simulations") and the 11 contributed papers that were selected for presentation and the proceedings by the programme committee from 30 submissions (i.e., 37% acceptance rate). Every submission was reviewed by at least three...... of programming languages. Tools and techniques presented at LDTA are usually applicable in the context of "Language Workbenches" or "Meta Programming Systems" or simply as parts of advanced programming environments or IDEs. These proceedings include an extended abstract based on the invited talk by Jean...

  13. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting...... data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities....
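
    The CPN Tools facilities themselves operate on Coloured Petri Net models with built-in simulation monitors; purely as a language-agnostic illustration of the workflow the abstract describes (collect data during a run, repeat the run in independent replications, summarise the results), the following Python sketch simulates a toy retransmission protocol. All names and parameters are invented and do not reflect CPN Tools' actual interface.

    ```python
    import random
    import statistics

    def simulate_protocol(n_packets=100, loss_prob=0.1, seed=None):
        """One simulation run of a toy stop-and-wait protocol.

        Returns the average number of transmissions needed per packet,
        a simple performance measure collected during the run.
        """
        rng = random.Random(seed)
        transmissions = []
        for _ in range(n_packets):
            attempts = 1
            while rng.random() < loss_prob:   # packet lost, retransmit
                attempts += 1
            transmissions.append(attempts)
        return statistics.mean(transmissions)

    def run_replications(n_reps=30):
        """Run independent simulation replications and summarise the results."""
        results = [simulate_protocol(seed=rep) for rep in range(n_reps)]
        mean = statistics.mean(results)
        half_width = 1.96 * statistics.stdev(results) / (n_reps ** 0.5)
        return mean, (mean - half_width, mean + half_width)

    if __name__ == "__main__":
        mean, ci = run_replications()
        print(f"avg transmissions per packet: {mean:.3f}, 95% CI ~ {ci}")
    ```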

  14. Workshop tools and methodologies for evaluation of energy chains and for technology perspective

    Energy Technology Data Exchange (ETDEWEB)

    Appert, O. [Institut Francais du Petrole (IFP), 92 - Rueil-Malmaison (France); Maillard, D. [Energy and Raw Materials, 75 - Paris (France); Pumphrey, D. [Energy Cooperation, US Dept. of Energy (United States); Sverdrup, G.; Valdez, B. [National Renewable Energy Laboratory, Golden, CO (United States); Schindler, J. [LB-Systemtechnik (LBST), GmbH, Ottobrunn (Germany); His, St.; Rozakis, St. [Centre International de Recherche sur Environnement Developpement (CIRED), 94 - Nogent sur Marne (France); Sagisaka, M. [LCA Research Centre (Japan); Bjornstad, D. [Oak Ridge National Laboratory, Oak Ridge, Tennessee (United States); Madre, J.L. [Institut National de Recherche sur les Transports et leur Securite, 94 - Arcueil (France); Hourcade, J.Ch. [Centre International de Recherche sur l' Environnement le Developpement (CIRED), 94 - Nogent sur Marne (France); Ricci, A.; Criqui, P.; Chateau, B.; Bunger, U.; Jeeninga, H. [EU/DG-R (Italy); Chan, A. [National Research Council (Canada); Gielen, D. [IEA-International Energy Associates Ltd., Fairfax, VA (United States); Tosato, G.C. [Energy Technology Systems Analysis Programme (ETSAP), 75 - Paris (France); Akai, M. [Agency of Industrial Science and technology (Japan); Ziesing, H.J. [Deutsches Institut fur Wirtschaftsforschung, DIW Berlin (Germany); Leban, R. [Conservatoire National des Arts et Metiers (CNAM), 75 - Paris (France)

    2005-07-01

The aim of this workshop is to better characterize the future by integrating all the dynamic interactions between the economy, the environment and society. It offers presentations on hydrogen chain evaluation, micro-economic modelling for the evaluation of bio-fuel options, life cycle assessment evolution and potentialities, consumer valuation of energy technology attributes, perspectives for the evaluation of changing behaviour, incentive systems and barriers to social acceptability, the internalization of external costs, endogenous technical change in long-term energy models, ETSAP/technology dynamics in partial equilibrium energy models, very long-term energy environment modelling, ultra long-term energy technology perspectives, the socio-economic toolbox of the EU hydrogen road-map, a combined approach using technology-oriented optimization and evaluation of impacts of individual policy measures, and the application of a suite of basic research portfolio management tools. (A.L.B.)

  15. Development of Workshops on Biodiversity and Evaluation of the Educational Effect by Text Mining Analysis

    Science.gov (United States)

    Baba, R.; Iijima, A.

    2014-12-01

Conservation of biodiversity is one of the key issues in environmental studies. As a means to address this issue, education is becoming increasingly important. In previous work, we developed a course of workshops on the conservation of biodiversity. To disseminate the course as a tool for environmental education, determination of its educational effect is essential. Text mining enables analyses of the frequency and co-occurrence of words in freely described texts. This study is intended to evaluate the effect of the workshop by using a text mining technique. We hosted the originally developed workshop on the conservation of biodiversity for 22 college students. The aim of the workshop was to convey the definition of biodiversity. Generally, biodiversity refers to the diversity of ecosystems, diversity between species, and diversity within species. To facilitate discussion, supplementary materials were used. For instance, field guides of wildlife species were used to discuss the diversity of ecosystems. Moreover, a hierarchical framework in an ecological pyramid was shown to aid understanding of the role of diversity between species. In addition, we offered a document on the historical Potato Famine in Ireland to discuss diversity within species from the genetic viewpoint. Before and after the workshop, we asked students for free descriptions of the definition of biodiversity and analyzed them using Tiny Text Miner. This technique enables Japanese-language morphological analysis. Frequently used words were sorted into categories. Moreover, a principal component analysis was carried out. After the workshop, the frequency of words tagged to diversity between species and diversity within species had significantly increased. From the principal component analysis, the first component consists of words such as producer, consumer, decomposer, and food chain. This indicates that the students have comprehended the close relationship between
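
    The study itself used Tiny Text Miner with Japanese morphological analysis; the sketch below is only a hypothetical illustration of the same pipeline idea (term frequencies followed by a principal component analysis), using scikit-learn on invented English responses.

    ```python
    # Hypothetical illustration: term frequencies and a principal component
    # analysis of free-text answers collected before and after a workshop.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import PCA

    answers_before = ["biodiversity means many species", "many animals and plants"]
    answers_after = ["diversity of ecosystems, species and genes",
                     "producers, consumers and decomposers form a food chain"]

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(answers_before + answers_after).toarray()

    # Word frequencies per document, then a PCA to see which terms co-vary.
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)
    print(dict(zip(vectorizer.get_feature_names_out(), X.sum(axis=0))))
    print(scores)
    ```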

  16. Development of Student Exercises with Instructor Support at the Astronomy Workshop Solar System Collisions Web Tool

    Science.gov (United States)

    Deming, G. L.; Hamilton, D. P.

    2005-12-01

    During the spring 2005 semester, seven students taking ASTR101 General Astronomy for non-science majors at the University of Maryland were interviewed while completing an assignment using the Astronomy Workshop Solar System Collisions web tool (http://janus.umd.edu/astro/impact/). The Astronomy Workshop Solar System Collisions web tool can be used to investigate how different variables affect collisions in a fun, but informative manner. Based on the 2005 spring interviews, three web-based activities were developed as appropriate for homework or as enrichment to coursework. The first activity explores how the impactor's mass affects energy released, crater diameter, frequency of similar impacts, and magnitude of the earthquake generated by the impact. The second activity investigates the energy released and damage done when the impactor's density is changed. Collisions by icy bodies are compared to those of rocky and metallic materials. The third activity compares collisions on different planets. In addition to masses and densities, velocities vary in these collisions. The activities are written so that introductory astronomy students will interpret the differences observed in terms of kinetic energy. During the fall 2005 semester, ASTR101 students at the University of Maryland were interviewed and observed as they completed the three activities described above using the Solar System Collisions website. The twelve students in this study were selected based on pretest scores on the Astronomy Diagnostic Test. An effort was made to include students of diverse backgrounds and mathematical experiences. Based on these interviews, final revisions have been made. Student exercises on the website and the directions on how instructors can use these materials in their courses are ready for field-testing at other institutions. Faculty interested in participating in the field-test for this project during spring 2006 are encouraged to contact the authors. This research is funded

  17. Workshop on the applications of new computer tools to thermal engineering; Applications a la thermique des nouveaux outils informatiques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

This workshop on the applications of new computer tools to thermal engineering was organized by the French society of thermal engineers. Seven papers were presented, of which two, dealing with thermal diffusivity measurements in materials and with the optimization of dryers, have been selected for ETDE. (J.S.)

  18. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning

    NARCIS (Netherlands)

    Retalis, Symeon; Sloep, Peter

    2009-01-01

    Retalis, S., & Sloep, P. B. (Eds.) (2009). Collection of 4 symposium papers at EC-TEL 2009. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning. September, 30, 2009, Nice, France. http://sunsite.informatik.rwt

  19. Proceedings of the CEC/USDOE workshop on uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Elderkin, C.E. (Pacific Northwest Lab., Richland, WA (USA)); Kelly, G.N. (eds.)(Commission of the European Communities, Brussels (Belgium))

    1990-09-01

In recent years it has become increasingly important to specify the uncertainty inherent in consequence assessments and in the models that trace radionuclides from their source, through the environment, to their impacts on human health. European and US scientists have been independently developing and applying methods for analyzing uncertainty. It recently became apparent that a scientific exchange on this subject would be beneficial as improvements are sought and as uncertainty methods find broader application. The Commission of the European Communities (CEC) and the Office of Health and Environmental Research of the US Department of Energy (OHER/DOE), through their continuing agreement for cooperation, decided to co-sponsor the CEC/USDOE Workshop on Uncertainty Analysis. CEC's Radiation Protection Research Programme and OHER's Atmospheric Studies in Complex Terrain Program collaborated in planning and organizing the workshop, which was held in Santa Fe, New Mexico, on November 13 through 16, 1989. As the workshop progressed, the perspectives of individual participants, each with their particular background and interests in some segment of consequence assessment and its uncertainties, contributed to a broader view of how uncertainties are introduced and handled. These proceedings contain, first, the editors' introduction to the problem of uncertainty analysis and their general summary and conclusions. These are followed by the results of the working groups and the abstracts of individual presentations.

  20. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year's motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work in making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program

  1. A Pluralistic, Longitudinal Method: Using Participatory Workshops, Interviews and Lexicographic Analysis to Investigate Relational Evolution

    DEFF Research Database (Denmark)

    Evers, Winie; Marroun, Sana; Young, Louise

    2016-01-01

    development facilitates the ability to cope with these changes. Network development was given impetus via two workshops where network members discussed the firm’s opportunities and challenges using a number of tools to facilitate brainstorming and general discussion. The impact of these workshops...

  2. Tools for income mobility analysis

    OpenAIRE

    Philippe Kerm

    2002-01-01

A set of Stata routines to help with the analysis of 'income mobility' is presented and illustrated. Income mobility is taken here as the pattern of income change from one time period to another within an income distribution. Multiple approaches have been advocated to assess the magnitude of income mobility. The macros presented provide tools for estimating several measures of income mobility, e.g. the Shorrocks (JET 1978) or King (Econometrica 1983) indices or summary statistics for transition matri...
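
    The Stata macros themselves are not reproduced in the abstract; as a minimal Python illustration of one of the measures mentioned, the sketch below computes the Shorrocks (1978) trace index, assuming a class-transition matrix has already been estimated from two income waves. The matrix values are invented.

    ```python
    import numpy as np

    def shorrocks_index(P):
        """Shorrocks (1978) trace mobility index for a transition matrix P.

        P[i, j] is the probability of moving from income class i to class j;
        rows must sum to one. The index is (n - trace(P)) / (n - 1): it is 0
        for the identity matrix (complete immobility) and grows as more
        probability mass moves off the diagonal.
        """
        P = np.asarray(P, dtype=float)
        n = P.shape[0]
        return (n - np.trace(P)) / (n - 1)

    # Example: a 3-class transition matrix between two income waves.
    P = [[0.7, 0.2, 0.1],
         [0.2, 0.6, 0.2],
         [0.1, 0.2, 0.7]]
    print(shorrocks_index(P))  # 0.5
    ```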

  3. Shot Planning and Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  4. Finite element analysis of degraded concrete structures - Workshop proceedings

    International Nuclear Information System (INIS)

This workshop addressed the finite element analysis of degraded concrete structures. It was composed of three sessions. The first session (title: the use of finite element analysis in safety assessments) comprises six papers, whose titles are: Historical Development of Concrete Finite Element Modeling for Safety Evaluation of Accident-Challenged and Aging Concrete Structures; Experience with Finite Element Methods for Safety Assessments in Switzerland; Stress State Analysis of the Ignalina NPP Confinement System; Prestressed Containment: Behaviour when Concrete Cracking is Modelled; Application of FEA for Design and Support of NPP Containment in Russia; Verification Problems of Nuclear Installations Safety Software of Strength Analysis (NISS SA). The second session (title: concrete containment structures under accident loads) comprises seven papers, whose titles are: Two Application Examples of Concrete Containment Structures under Accident Load Conditions Using Finite Element Analysis; What Kind of Prediction for Leak rates for Nuclear Power Plant Containments in Accidental Conditions; Influence of Different Hypotheses Used in Numerical Models for Concrete At Elevated Temperatures on the Predicted Behaviour of NPP Core Catchers Under Severe Accident Conditions; Observations on the Constitutive Modeling of Concrete Under Multi-Axial States at Elevated Temperatures; Analyses of a Reinforced Concrete Containment with Liner Corrosion Damage; Program of Containment Concrete Control During Operation for the Temelin Nuclear Power Plant; Static Limit Load of a Deteriorated Hyperbolic Cooling Tower. The third session (title: concrete structures under extreme environmental load) comprises five papers, whose titles are: Shear Transfer Mechanism of RC Plates After Cracking; Seismic Back Calculation of an Auxiliary Building of the Nuclear Power Plant Muehleberg, Switzerland; Seismic Behaviour of Slightly Reinforced Shear Wall Structures; FE Analysis of Degraded Concrete

  5. Dialogue and Roles in a Strategy Workshop: Discovering Patterns through Discourse Analysis

    OpenAIRE

    Duffy, Martin

    2010-01-01

    Strategy workshops are frequently used by Executive management teams to discuss and formulate strategy but are under-researched and under-reported in the academic literature. This study uses Discourse Analysis to discover participant roles and dialogic patterns in an Executive management team’s strategy workshop, together with their effect on the workshop’s operation and outcome. The study shows how the workshop participants adopt different roles through their language and content. It then...

  6. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

This paper presents the evaluation, from a Decision Analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA) methodology is applied to clarify the decision and develop a decision model. One of the key advantages of IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, this work proposes an addition to it: using the strategic Means-Ends Objective Network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives, and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification), CNC (installation of an automatic lathe) and CF (installation of an automatic milling machine). The results are presented as a graph showing zones in which a particular alternative should be selected, and they show the potential of IDA to tackle technical decisions that are otherwise approached without due care.

  7. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  8. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  9. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.

  10. Assessing the interactivity and prescriptiveness of faculty professional development workshops: The Real-Time Professional Development Observation Tool (R-PDOT)

    CERN Document Server

    Olmstead, Alice

    2016-01-01

Professional development workshops are one of the primary mechanisms used to help faculty improve their teaching, and they draw in many STEM instructors every year. Although workshops serve a critical role in changing instructional practices within our community, we rarely assess workshops through careful consideration of how they engage faculty. Initial evidence suggests that workshop leaders often overlook central tenets of education research that are well established in classroom contexts, such as the role of interactivity in enabling student learning. As such, there is a need to develop more robust, evidence-based models of how best to support faculty learning in professional development contexts, and to actively support workshop leaders in relating their design decisions to familiar ideas from other educational contexts. In response to these needs, we have developed an observation tool, the Real-Time Professional Development Observation Tool (R-PDOT), to document the form and focus of faculty's engagement dur...

  11. Haplotype sharing analysis with SNPs in candidate genes : The genetic analysis workshop 12 example

    NARCIS (Netherlands)

    Fischer, C; Beckmann, L; Majoram, P; Meerman, GT; Chang-Claude, J

    2003-01-01

    Haplotype sharing analysis was used to investigate the association of affection status with single nucleotide polymorphism (SNP) haplotypes within candidate gene 1 in one sample each from the isolated and the general population of Genetic Analysis Workshop (GAW) 12 simulated data. Gene 1 has direct
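
    As a hypothetical illustration of the basic quantity behind haplotype sharing analysis (the GAW12 analysis itself is not reproduced here), the sketch below measures how far two SNP haplotypes remain identical around a core marker; allele data are invented.

    ```python
    # Length of the interval over which two haplotypes carry identical SNP
    # alleles around a core marker position.
    def shared_length(hap_a, hap_b, core):
        """Number of consecutive matching SNPs around position `core`."""
        assert len(hap_a) == len(hap_b)
        if hap_a[core] != hap_b[core]:
            return 0
        left = core
        while left > 0 and hap_a[left - 1] == hap_b[left - 1]:
            left -= 1
        right = core
        while right < len(hap_a) - 1 and hap_a[right + 1] == hap_b[right + 1]:
            right += 1
        return right - left + 1

    case_1 = [1, 0, 1, 1, 0, 1]
    case_2 = [0, 0, 1, 1, 0, 0]
    print(shared_length(case_1, case_2, core=3))  # shares positions 1..4 -> 4
    ```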

  12. Exploration tools in formal concept analysis

    OpenAIRE

    Stumme, Gerd

    1996-01-01

The development of conceptual knowledge systems specifically requires knowledge acquisition tools within the framework of formal concept analysis. In this paper, the existing tools are presented, and further developments are discussed.
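
    As a minimal illustration of the formal concept analysis underlying such tools, the following Python sketch derives the formal concepts (extent and intent pairs) of a small, invented object-attribute context.

    ```python
    # Enumerate the formal concepts of a tiny context: every concept's extent
    # is the closure of some subset of objects under the derivation operators.
    from itertools import combinations

    objects = ["duck", "dog", "carp"]
    attributes = ["can_swim", "has_fur", "lays_eggs"]
    incidence = {("duck", "can_swim"), ("duck", "lays_eggs"),
                 ("dog", "has_fur"),
                 ("carp", "can_swim"), ("carp", "lays_eggs")}

    def common_attributes(objs):
        return {a for a in attributes if all((o, a) in incidence for o in objs)}

    def common_objects(attrs):
        return {o for o in objects if all((o, a) in incidence for a in attrs)}

    concepts = set()
    for r in range(len(objects) + 1):
        for objs in combinations(objects, r):
            intent = common_attributes(set(objs))
            extent = common_objects(intent)      # closure of the object set
            concepts.add((frozenset(extent), frozenset(intent)))

    for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(extent), sorted(intent))
    ```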

  13. Proceedings: Workshop on Advanced Mathematics and Computer Science for Power Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1991-08-01

    EPRI's Office of Exploratory Research sponsors a series of workshops that explore how to apply recent advances in mathematics and computer science to the problems of the electric utility industry. In this workshop, participants identified research objectives that may significantly improve the mathematical methods and computer architecture currently used for power system analysis.

  14. Chronic wasting disease risk analysis workshop: An integrative approach

    Science.gov (United States)

    Gillette, Shana; Dein, Joshua; Salman, Mo; Richards, Bryan; Duarte, Paulo

    2004-01-01

    Risk analysis tools have been successfully used to determine the potential hazard associated with disease introductions and have facilitated management decisions designed to limit the potential for disease introduction. Chronic Wasting Disease (CWD) poses significant challenges for resource managers due to an incomplete understanding of disease etiology and epidemiology and the complexity of management and political jurisdictions. Tools designed specifically to assess the risk of CWD introduction would be of great value to policy makers in areas where CWD has not been detected.

  15. Workshop on Thermal Emission Spectroscopy and Analysis of Dust, Disk, and Regoliths

    Science.gov (United States)

    Sprague, Ann L. (Editor); Lynch, David K. (Editor); Sitko, Michael (Editor)

    1999-01-01

This volume contains abstracts that have been accepted for presentation at the Workshop on Thermal Emission Spectroscopy and Analysis of Dust, Disks, and Regoliths, held April 28-30, 1999, in Houston, Texas.

  16. Multi-mission telecom analysis tool

    Science.gov (United States)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.
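
    MMTAT's internals are not described in the abstract; purely as a hypothetical illustration of the kind of calculation a telecom analysis tool evaluates during a trade study, the sketch below computes a free-space link budget. The function name and all parameter values are invented.

    ```python
    import math

    def received_power_dbw(tx_power_dbw, tx_gain_dbi, rx_gain_dbi,
                           distance_m, frequency_hz, losses_db=0.0):
        """Received power = EIRP + Rx gain - free-space path loss - other losses."""
        wavelength = 299792458.0 / frequency_hz
        fspl_db = 20.0 * math.log10(4.0 * math.pi * distance_m / wavelength)
        return tx_power_dbw + tx_gain_dbi + rx_gain_dbi - fspl_db - losses_db

    # Example: 10 W (10 dBW) X-band downlink from roughly 2 AU.
    print(received_power_dbw(tx_power_dbw=10.0, tx_gain_dbi=42.0, rx_gain_dbi=68.0,
                             distance_m=3.0e11, frequency_hz=8.4e9,
                             losses_db=3.0))
    ```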

  17. TECHNICAL NOTES Modelica - Language, Libraries, Tools, Workshop and EU-Project

    OpenAIRE

    Otter, Martin; Elmqvist, Hilding

    2000-01-01

Modelica is a new language for convenient modeling of physical systems. This article gives an overview of the language features, the organisation behind the language development, available Modelica libraries and Modelica simulation environments, the recent first workshop on Modelica, and the EU project RealSim, which aims to enhance hardware-in-the-loop simulation and design optimization techniques on the basis of Modelica.

  18. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  19. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  20. High-Speed Research: 1994 Sonic Boom Workshop. Configuration, Design, Analysis and Testing

    Science.gov (United States)

    McCurdy, David A. (Editor)

    1999-01-01

The third High-Speed Research Sonic Boom Workshop was held at NASA Langley Research Center on June 1-3, 1994. The purpose of this workshop was to provide a forum for Government, industry, and university participants to present and discuss progress in their research. The workshop was organized into sessions dealing with atmospheric propagation; acceptability studies; and configuration design, analysis, and testing. Attendance at the workshop was by invitation only. The workshop proceedings include papers on design, analysis, and testing of low-boom high-speed civil transport configurations and experimental techniques for measuring sonic booms. Significant progress is noted in these areas in the time since the previous workshop a year earlier. The papers include preliminary results of sonic boom wind tunnel tests conducted during 1993 and 1994 on several low-boom designs. Results of a mission performance analysis of all low-boom designs are also included. Two experimental methods for measuring near-field signatures of airplanes in flight are reported.

  1. Turbine Aerodynamics Design Tool Development

    Science.gov (United States)

    Huber, Frank W.; Turner, James E. (Technical Monitor)

    2001-01-01

This paper presents the Marshall Space Flight Center Fluids Workshop on Turbine Aerodynamic design tool development. The topics include: (1) Meanline Design/Off-design Analysis; and (2) Airfoil Contour Generation and Analysis. This paper is in viewgraph form.

  2. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    International Nuclear Information System (INIS)

This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by the Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, the ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination, fracture mechanics, and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis includes NDE and materials degradation as well. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to the applications of probabilistic modelling to nuclear components; and Session 4 was a concluding panel discussion
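
    As a minimal sketch of the probabilistic fracture mechanics idea discussed in the sessions (not any specific code presented at the workshop), the following Monte Carlo example samples a defect size, an applied stress and a fracture toughness, and counts how often the stress intensity factor exceeds the toughness. All distributions and numerical values are invented.

    ```python
    import math
    import random

    # Failure occurs when the stress intensity factor K = Y * sigma * sqrt(pi * a)
    # of a randomly sampled defect exceeds a randomly sampled toughness K_Ic.
    def failure_probability(n_samples=100_000, seed=1):
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_samples):
            a = rng.expovariate(1.0 / 5.0e-3)          # crack depth [m], mean 5 mm
            sigma = rng.gauss(300.0e6, 30.0e6)         # applied stress [Pa]
            k_ic = rng.gauss(80.0e6, 8.0e6)            # toughness [Pa*sqrt(m)]
            k = 1.12 * sigma * math.sqrt(math.pi * a)  # K for a surface crack
            if k > k_ic:
                failures += 1
        return failures / n_samples

    print(failure_probability())
    ```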

  3. Abstract Interfaces for Data Analysis —Component Architecture for Data Analysis Tools

    Institute of Scientific and Technical Information of China (English)

G. Barrand; P. Binko; et al.

    2001-01-01

The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of each component individually. The interfaces have been defined in Java and C++, and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their abstract interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
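
    The following Python sketch is not the AIDA interfaces themselves; it only illustrates the design idea of decoupling components through abstract interfaces, so that, for example, a plotter depends on a histogram abstraction rather than on any concrete implementation. All class names here are invented.

    ```python
    from abc import ABC, abstractmethod

    class IHistogram1D(ABC):
        """Abstract histogram component: implementations can vary freely."""
        @abstractmethod
        def fill(self, value: float, weight: float = 1.0) -> None: ...
        @abstractmethod
        def bin_height(self, index: int) -> float: ...
        @abstractmethod
        def bins(self) -> int: ...

    class SimpleHistogram(IHistogram1D):
        def __init__(self, nbins: int, low: float, high: float):
            self._heights = [0.0] * nbins
            self._low, self._width = low, (high - low) / nbins
        def fill(self, value, weight=1.0):
            i = int((value - self._low) / self._width)
            if 0 <= i < len(self._heights):
                self._heights[i] += weight
        def bin_height(self, index):
            return self._heights[index]
        def bins(self):
            return len(self._heights)

    class IPlotter(ABC):
        """Abstract plotter component: depends only on IHistogram1D."""
        @abstractmethod
        def plot(self, histogram: IHistogram1D) -> None: ...

    class TextPlotter(IPlotter):
        def plot(self, histogram):
            for i in range(histogram.bins()):
                print("#" * int(histogram.bin_height(i)))

    h = SimpleHistogram(5, 0.0, 5.0)
    for x in (0.5, 1.2, 1.7, 3.3, 3.4, 3.9):
        h.fill(x)
    TextPlotter().plot(h)
    ```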

  4. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. The circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.
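
    As a toy illustration of hierarchical delay analysis (not PREDICT itself), the sketch below abstracts each block to pin-to-pin delays and computes the critical-path arrival time at the top level, so the flat netlist never has to be re-analysed; all names and numbers are invented.

    ```python
    from functools import lru_cache

    # Abstracted blocks: (input_pin, output_pin) -> worst-case delay in ns.
    ADDER = {("a", "sum"): 1.8, ("b", "sum"): 1.6}
    REGISTER = {("d", "q"): 0.4}

    # Top-level netlist: edges (source, sink, delay); block instances contribute
    # their abstracted pin-to-pin delays instead of their internal gates.
    EDGES = [
        ("in", "u1.a", 0.2), ("in", "u1.b", 0.2),
        ("u1.a", "u1.sum", ADDER[("a", "sum")]),
        ("u1.b", "u1.sum", ADDER[("b", "sum")]),
        ("u1.sum", "r1.d", 0.3),
        ("r1.d", "r1.q", REGISTER[("d", "q")]),
        ("r1.q", "out", 0.1),
    ]

    @lru_cache(maxsize=None)
    def arrival(node):
        """Longest-path arrival time at a node (critical-path delay)."""
        fan_in = [(src, d) for src, dst, d in EDGES if dst == node]
        if not fan_in:
            return 0.0
        return max(arrival(src) + d for src, d in fan_in)

    print(f"critical path delay to 'out': {arrival('out'):.1f} ns")  # 2.8 ns
    ```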

  5. Virtual Workshop

    DEFF Research Database (Denmark)

    Buus, Lillian; Bygholm, Ann

experiences for the learning design of MVU courses. The workshop intended to give the participants the possibility to draw on their own experiences with issues of computer supported collaboration, group work in a virtual environment, synchronous and asynchronous communication media, and different perspectives......In relation to the Tutor course in the Mediterranean Virtual University (MVU) project, a virtual workshop "Getting experiences with different synchronous communication media, collaboration, and group work" was held with all partner institutions in January 2006. More than 25 key-tutors within MVU...... participated from different institutions in the workshop. The result of the workshop was experience with different communication tools and media. The difficulties and possibilities of collaborating virtually centred on group work and the development of a shared presentation. All based on getting...

  6. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  7. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  8. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used; when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
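
    One standard way to realize the frame merging in step 4 is to estimate the rigid transform between the same cube points measured in two frames and then chain such transforms into a single global frame. The Python sketch below uses the Kabsch/Procrustes solution and is a hypothetical illustration, not NASA's code; the point coordinates are invented.

    ```python
    import numpy as np

    def rigid_transform(points_b, points_a):
        """Return R, t such that points_a ~= points_b @ R.T + t (Kabsch)."""
        B, A = np.asarray(points_b, float), np.asarray(points_a, float)
        cb, ca = B.mean(axis=0), A.mean(axis=0)
        H = (B - cb).T @ (A - ca)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = ca - R @ cb
        return R, t

    # The same three cube corners measured in two camera setups (metres).
    frame_b = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
    frame_a = [[2, 1, 0], [2, 2, 0], [1, 1, 0]]      # frame_b rotated and shifted
    R, t = rigid_transform(frame_b, frame_a)
    print(np.round(np.asarray(frame_b) @ R.T + t, 3))  # reproduces frame_a
    ```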

  9. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  10. Proceedings: Workshop on advanced mathematics and computer science for power systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Esselman, W.H.; Iveson, R.H. (Electric Power Research Inst., Palo Alto, CA (United States))

    1991-08-01

    The Mathematics and Computer Workshop on Power System Analysis was held February 21--22, 1989, in Palo Alto, California. The workshop was the first in a series sponsored by EPRI's Office of Exploratory Research as part of its effort to develop ways in which recent advances in mathematics and computer science can be applied to the problems of the electric utility industry. The purpose of this workshop was to identify research objectives in the field of advanced computational algorithms needed for the application of advanced parallel processing architecture to problems of power system control and operation. Approximately 35 participants heard six presentations on power flow problems, transient stability, power system control, electromagnetic transients, user-machine interfaces, and database management. In the discussions that followed, participants identified five areas warranting further investigation: system load flow analysis, transient power and voltage analysis, structural instability and bifurcation, control systems design, and proximity to instability. 63 refs.

  11. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
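
    The sketch below is a small illustration of the digraph idea behind FEAT, not FEAT itself: with edges pointing from a failure to the failures it causes, the effects of an event are its descendants and the candidate causes of an observed failure are its ancestors. The node names are invented.

    ```python
    from collections import defaultdict

    edges = [("pump_A_fails", "coolant_flow_lost"),
             ("valve_3_stuck", "coolant_flow_lost"),
             ("coolant_flow_lost", "core_temp_high"),
             ("core_temp_high", "auto_shutdown")]

    forward, backward = defaultdict(set), defaultdict(set)
    for cause, effect in edges:
        forward[cause].add(effect)
        backward[effect].add(cause)

    def reachable(start, graph):
        """All nodes reachable from `start` by following `graph` edges."""
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            for nxt in graph[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    print("effects of pump_A_fails:", reachable("pump_A_fails", forward))
    print("possible causes of core_temp_high:", reachable("core_temp_high", backward))
    ```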

  12. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  13. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  14. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  15. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.
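
    As one hypothetical way to score a degree of association between two striation profiles (not the statistical method developed in this project), the sketch below takes the maximum normalized cross-correlation over relative shifts of the two profiles; the data are synthetic.

    ```python
    import numpy as np

    def max_normalized_correlation(profile_a, profile_b, max_shift=50):
        """Best normalized cross-correlation of two 1D profiles over shifts."""
        a = np.asarray(profile_a, float)
        b = np.asarray(profile_b, float)
        best = -1.0
        for shift in range(-max_shift, max_shift + 1):
            if shift >= 0:
                x, y = a[shift:], b[:len(b) - shift]
            else:
                x, y = a[:len(a) + shift], b[-shift:]
            n = min(len(x), len(y))
            if n < 2:
                continue
            x, y = x[:n] - x[:n].mean(), y[:n] - y[:n].mean()
            denom = np.linalg.norm(x) * np.linalg.norm(y)
            if denom > 0:
                best = max(best, float(x @ y / denom))
        return best

    rng = np.random.default_rng(0)
    mark = rng.normal(size=400)
    replica = np.roll(mark, 7) + 0.1 * rng.normal(size=400)  # same tool, shifted
    print(max_normalized_correlation(mark, replica))          # close to 1.0
    ```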

  16. From sensor networks to connected analysis tools

    Science.gov (United States)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  17. Proceedings Fourth International Workshop on Testing, Analysis and Verification of Web Software

    CERN Document Server

    Salaün, Gwen; Hallé, Sylvain; 10.4204/EPTCS.35

    2010-01-01

    This volume contains the papers presented at the fourth international workshop on Testing, Analysis and Verification of Software, which was associated with the 25th IEEE/ACM International Conference on Automated Software Engineering (ASE 2010). The collection of papers includes research on formal specification, model-checking, testing, and debugging of Web software.

  18. PREFACE: EMAS 2011: 12th European Workshop on Modern Developments in Microbeam Analysis

    Science.gov (United States)

    Brisset, François; Dugne, Olivier; Robaut, Florence; Lábár, János L.; Walker, Clive T.

    2012-03-01

    This volume of IOP Conference Series: Materials Science and Engineering contains papers from the 12th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis, which took place from the 15-19 May 2011 in the Angers Congress Centre, Angers, France. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a very specific format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with GN-MEBA - Groupement National de Microscopie Electronique à Balayage et de microAnalysis, France. The technical programme included the following topics: the limits of EPMA, new techniques, developments and concepts in microanalysis, microanalysis in the SEM, and new and less common applications of micro- and nanoanalysis. As at previous workshops there was also a special oral session for young scientists. The best presentation by a young scientist was awarded with an invitation to attend the 2012 Microscopy and Microanalysis meeting at Phoenix, Arizona. The prize went to Pierre Burdet, of the Federal Institute of Technology of Lausanne (EPFL), for his talk entitled '3D EDS microanalysis by FIB-SEM: enhancement of elemental quantification'. The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 74 posters from 18 countries were on display at the meeting, and that the participants came from as far away as Japan, Canada and the USA. A selection of participants with posters were invited to give a short oral

  19. Energy demand analysis in the workshop on alternative energy strategies

    Energy Technology Data Exchange (ETDEWEB)

    Carhart, S C

    1978-04-01

    The Workshop on Alternative Energy Strategies, conducted from 1974 through 1977, was an international study group formed to develop consistent national energy alternatives within a common analytical framework and global assumptions. A major component of this activity was the demand program, which involved preparation of highly disaggregated demand estimates based upon estimates of energy-consuming activities and energy requirements per unit of activity reported on a consistent basis for North America, Europe, and Japan. Comparison of the results of these studies reveals that North America requires more energy per unit of activity in many consumption categories, that major improvements in efficiency will move North America close to current European and Japanese efficiencies, and that further improvements in European and Japanese efficiencies may be anticipated as well. When contrasted with expected availabilities of fuels, major shortfalls of oil relative to projected demands emerge in the eighties and nineties. Some approaches to investment in efficiency improvements which will offset these difficulties are discussed.

  20. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March, 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information: http://homer.ou.nl/lsa-workshop0

  1. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of uncontrolled spacecraft may be broken into many pieces of debris at an altitude in the range of 75-85 km. The surviving fragments could pose great hazard and risk to ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and ground risk assessment have been studied and developed in the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and parameters affecting survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. The past object-oriented tools include objects of only simple shapes. For more realistic simulation, here we present an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the set of object shapes to 15 types, as well as 51 predefined motions and relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
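    A minimal sketch of the Monte Carlo uncertainty idea mentioned above, assuming toy fragment parameter distributions and a crude ballistic-coefficient survival criterion in place of DRAPS's aerodynamic and aerothermal models:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Sample uncertain fragment parameters (illustrative distributions only)
mass = rng.lognormal(mean=0.5, sigma=0.8, size=n)      # fragment mass, kg
area = rng.lognormal(mean=-2.5, sigma=0.5, size=n)     # reference area, m^2
cd = rng.uniform(1.0, 2.2, size=n)                     # drag coefficient

# Toy demise criterion: fragments with a high ballistic coefficient are assumed
# to reach the ground; the threshold stands in for the aerothermal models.
beta = mass / (cd * area)                              # kg/m^2
survives = beta > 300.0
print(f"estimated fraction of surviving fragments: {survives.mean():.3f}")
```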

  2. Data Base Directions: Information Resource Management - Strategies and Tools. Proceedings of the Workshop of the National Bureau of Standards and the Association for Computing Machinery (Ft. Lauderdale, Florida, October 20-22, 1980).

    Science.gov (United States)

    Goldfine, Alan H., Ed.

    This workshop investigated how managers can evaluate, select, and effectively use information resource management (IRM) tools, especially data dictionary systems (DDS). An executive summary, which provides a definition of IRM as developed by workshop participants, precedes the keynote address, "Data: The Raw Material of a Paper Factory," by John…

  3. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest are specified using the temporal logic Probabilistic Computation Tree Logic (PCTL) and we employ stochastic model checking, by means of the model checker PRISM, to compute their exact values. We present a simplified example of a distributed stochastic system where we determine a reachability property...

  4. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
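    MATRIXx itself is proprietary, but the same kind of frequency-response and closed-loop-eigenvalue evaluation can be sketched with SciPy; the second-order plant below is an assumed example, not one from the paper.

```python
import numpy as np
from scipy import signal

# Assumed second-order plant: G(s) = 1 / (s^2 + 0.5 s + 1)
plant = signal.TransferFunction([1.0], [1.0, 0.5, 1.0])

# Bode frequency response (one of the five evaluations listed above)
w = np.logspace(-2, 2, 400)                      # rad/s
w, mag_db, phase_deg = signal.bode(plant, w)
print(f"peak gain: {mag_db.max():.1f} dB at {w[mag_db.argmax()]:.2f} rad/s")

# Closed-loop eigenvalues for unity negative feedback (another listed evaluation)
closed_loop_den = np.polyadd(plant.den, plant.num)
print("closed-loop poles:", np.roots(closed_loop_den))
```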

  5. A tool for subjective analysis of TTOs

    OpenAIRE

    Resende, David Nunes; Gibson, David V.; Jarrett, James

    2011-01-01

    The objective of this article is to present a proposal (working paper) for a quantitative analysis tool to help technology transfer offices (TTOs) improve their structures, processes and procedures. Our research started from the study of internal practices and structures that facilitate the interaction between R&D institutions, their TTOs and regional surroundings. We wanted to identify “bottlenecks” in those processes, procedures, and structures. We mapped the bottlenecks in a set of “...

  6. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
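    The fitted removal model is not reproduced in the abstract; a common starting point for this kind of prediction is a Preston-type relation, sketched below with assumed coefficients and parameters.

```python
import numpy as np

def material_removed(preston_k, pressure, velocity, dwell_time):
    """Preston-type removal: depth = k * p * v * t. The coefficient k must be fitted
    per abrasive/raw-material combination, as the abstract describes."""
    return preston_k * pressure * velocity * dwell_time

# Assumed polishing parameters for illustration only
k = 1.5e-13                  # m^2/N, fitted Preston coefficient (assumed)
p = 2.0e4                    # contact pressure, Pa
v = 1.2                      # relative tool-surface speed, m/s
t = np.linspace(0, 60, 7)    # dwell time, s

for dwell, depth in zip(t, material_removed(k, p, v, t)):
    print(f"t = {dwell:4.0f} s -> removed depth = {depth * 1e6:.2f} um")
```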

  7. A content analysis of advertisements for psychotherapy workshops: implications for disseminating empirically supported treatments.

    Science.gov (United States)

    Cook, Joan M; Weingardt, Kenneth R; Jaszka, Jacqueline; Wiesner, Michael

    2008-03-01

    This study involved a content analysis of 261 unique advertisements for psychotherapy workshops that appeared in two bimonthly clinical magazines, Psychotherapy Networker and Counselor, during a 2-year period. Two independent judges coded each advertisement and documented the type and prevalence of advertising appeals used. Rogers' (2003) five perceived characteristics of innovations from the seminal diffusion of innovations model, found to influence adoption in diverse fields, were not well represented in these workshop appeals, each appearing in less than 10% of advertisements. Few advertisements cited specific empirically supported treatments or presented any evidence of treatment effectiveness beyond expert testimonials. The most frequently noted appeals were benefits to the clinician (e.g., earning education credit or developing skills), characteristics that enhance the credibility of the workshop (e.g., reference to a storied history or mention of faculty), and features of the advertisement itself (e.g., use of superlatives and exclamation points). Promotional strategies used to advertise psychotherapy workshops can inform the dissemination of empirically supported treatments. PMID:18271002

  8. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on the application of data and scientific techniques and in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
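    As an illustration of the trend-analysis capability described above (not LCAT code; the station data are synthetic), a least-squares trend on an annual temperature series might look like this:

```python
import numpy as np
from scipy import stats

# Assumed example: 30 years of annual-mean station temperatures (synthetic data)
years = np.arange(1985, 2015)
rng = np.random.default_rng(1)
temps = 11.0 + 0.03 * (years - years[0]) + rng.normal(scale=0.4, size=years.size)

# Trend analysis: least-squares slope with a significance test
result = stats.linregress(years, temps)
print(f"trend: {result.slope * 10:.2f} deg C / decade  (p = {result.pvalue:.3f})")
```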

  9. Materials characterization center workshop on compositional and microstructural analysis of nuclear waste materials. Summary report

    International Nuclear Information System (INIS)

    The purpose of the Workshop on Compositional and Microstructural Analysis of Nuclear Waste Materials, conducted November 11 and 12, 1980, was to critically examine and evaluate the various methods currently used to study non-radioactive, simulated, nuclear waste-form performance. Workshop participants recognized that most of the Materials Characterization Center (MCC) test data for inclusion in the Nuclear Waste Materials Handbook will result from application of appropriate analytical procedures to waste-package materials or to the products of performance tests. Therefore, the analytical methods must be reliable and of known accuracy and precision, and results must be directly comparable with those from other laboratories and from other nuclear waste materials. The 41 participants representing 18 laboratories in the United States and Canada were organized into three working groups: Analysis of Liquids and Solutions, Quantitative Analysis of Solids, and Phase and Microstructure Analysis. Each group identified the analytical methods favored by their respective laboratories, discussed areas needing attention, listed standards and reference materials currently used, and recommended means of verifying interlaboratory comparability of data. The major conclusions from this workshop are presented

  10. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte Carlo simulations (MC). Several tools are considered: a ROOT files filter, a JavaScript Multivariable Cross-Filter, the JavaScript ROOT browser and JavaScript Scatter-Matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  11. 16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT)

    CERN Document Server

    Lokajicek, M; Tumova, N

    2015-01-01

    16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT). The ACAT workshop series, formerly AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), was created back in 1990. Its main purpose is to gather researchers related with computing in physics research together, from both physics and computer science sides, and bring them a chance to communicate with each other. It has established bridges between physics and computer science research, facilitating the advances in our understanding of the Universe at its smallest and largest scales. With the Large Hadron Collider and many astronomy and astrophysics experiments collecting larger and larger amounts of data, such bridges are needed now more than ever. The 16th edition of ACAT aims to bring related researchers together, once more, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation technologies. It will create a forum for exchanging ideas among the fields an...

  12. DEVELOPING NEW TOOLS FOR POLICY ANALYSIS

    International Nuclear Information System (INIS)

    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U.S. Department of Energy (DOE) regulations and directives related to safeguards and security (S and S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire S and S directives set will be conducted using software tools to analyze the present directives with a view toward (1) identifying areas of positive synergism among topical areas, (2) identifying areas of unnecessary duplication within and among topical areas, and (3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.
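    The abstract does not name the software tools; one plausible building block for flagging unnecessary duplication among directives is pairwise text similarity, sketched below with scikit-learn on invented requirement snippets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented snippets standing in for requirements drawn from different directives
requirements = [
    "Protective force response times shall be documented and tested annually.",
    "Response times of the protective force must be tested and documented each year.",
    "Classified matter shall be stored in approved security containers.",
]

tfidf = TfidfVectorizer(stop_words="english")
sims = cosine_similarity(tfidf.fit_transform(requirements))

# Flag requirement pairs that look like unnecessary duplication
for i in range(len(requirements)):
    for j in range(i + 1, len(requirements)):
        if sims[i, j] > 0.6:
            print(f"possible duplication between requirement {i} and {j} "
                  f"(similarity {sims[i, j]:.2f})")
```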

  13. 11th International Workshop in Model-Oriented Design and Analysis

    CERN Document Server

    Müller, Christine; Atkinson, Anthony

    2016-01-01

    This volume contains pioneering contributions to both the theory and practice of optimal experimental design. Topics include the optimality of designs in linear and nonlinear models, as well as designs for correlated observations and for sequential experimentation. There is an emphasis on applications to medicine, in particular, to the design of clinical trials. Scientists from Europe, the US, Asia, Australia and Africa contributed to this volume of papers from the 11th Workshop on Model Oriented Design and Analysis.

  14. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...
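    SaTool's internals are not reproduced here, but structural analysis of this kind is commonly cast as matching in a bipartite graph of constraints and unknown variables; the sketch below, with an invented toy model, illustrates the idea using networkx.

```python
import networkx as nx
from networkx.algorithms import bipartite

# Toy structure graph (invented): constraints c1..c4 vs. the unknown variables they relate
g = nx.Graph([("c1", "x1"), ("c1", "x2"), ("c2", "x2"),
              ("c3", "x2"), ("c3", "x3"), ("c4", "x1")])
constraints = {"c1", "c2", "c3", "c4"}

# A maximum matching assigns to each unknown a constraint that can compute it;
# constraints left unmatched indicate analytic redundancy usable for diagnosis.
matching = bipartite.maximum_matching(g, top_nodes=constraints)
assignment = {c: v for c, v in matching.items() if c in constraints}
print("constraint -> variable:", assignment)
print("redundant (unmatched) constraints:", constraints - assignment.keys())
```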

  15. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    The purpose of this paper is to propose a methodology for setup analysis, which can be implemented mainly in small and medium enterprises that are not yet convinced to work on reducing setup times. The methodology was developed after research that identified the problem. Companies still struggle with long setup times, yet many of them do nothing to decrease them; a long setup is evidently not, by itself, a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED it is essential to carry out analyses of changeovers in order to discover problems. The proposed methodology can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; the goal is to convince management to begin actions concerning setup improvement. The last three steps are related to a specific setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA were used.
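    As a small illustration of the Pareto-analysis step (the changeover problem categories and counts are invented, not data from the study):

```python
# Illustrative Pareto analysis of changeover problems (categories and counts assumed)
problems = {"missing tools": 32, "waiting for crane": 21, "trial runs": 14,
            "fixture adjustment": 9, "cleaning": 5, "paperwork": 3}

total = sum(problems.values())
cumulative = 0
for cause, count in sorted(problems.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:20s} {count:3d}  {100 * cumulative / total:5.1f} % cumulative")
```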

  16. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  17. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures from each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  18. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  19. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment
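    As an illustration of the clustering step described above (the per-inclusion compositions are synthetic, not ASCAT output), inclusions might be sorted into types like this:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

# Assumed per-inclusion compositions (wt% Al, Ca, S) from an automated SEM scan
alumina   = rng.normal([55, 2, 1],  [5, 1, 0.5], size=(100, 3))
calcium_a = rng.normal([30, 25, 2], [5, 4, 1.0], size=(80, 3))
sulfides  = rng.normal([5, 20, 30], [2, 4, 5.0], size=(60, 3))
inclusions = np.clip(np.vstack([alumina, calcium_a, sulfides]), 0, None)

# Sort inclusions into clusters representing different inclusion types
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(inclusions)
for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k):3d} inclusions, "
          f"mean composition {inclusions[labels == k].mean(axis=0).round(1)}")
```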

  20. Scientific Ballooning Technologies Workshop STO-2 Thermal Design and Analysis

    Science.gov (United States)

    Ferguson, Doug

    2016-01-01

    The heritage thermal model for the full STO-2 (Stratospheric Terahertz Observatory II) vehicle has been updated to model the CSBF (Columbia Scientific Balloon Facility) SIP-14 (Scientific Instrument Package) in detail. Analysis of this model has been performed for the Antarctic FY2017 launch season. Model temperature predictions are compared to previous results from STO-2 review documents.

  1. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
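    A toy version of the weighted attack-graph idea can be sketched with networkx; the states, edges and effort weights below are invented, and the patented tool's graph generation from attack templates and configuration files is not reproduced.

```python
import networkx as nx

# Toy attack graph: nodes are attack states, edge weights model attacker effort (assumed)
g = nx.DiGraph()
g.add_weighted_edges_from([
    ("internet", "dmz_web_server", 2.0),
    ("dmz_web_server", "internal_host", 5.0),
    ("internet", "vpn_gateway", 8.0),
    ("vpn_gateway", "internal_host", 1.0),
    ("internal_host", "database", 3.0),
])

# High-risk (lowest-effort) attack path from the attacker's start state to the target
path = nx.shortest_path(g, "internet", "database", weight="weight")
effort = nx.shortest_path_length(g, "internet", "database", weight="weight")
print(" -> ".join(path), f"(total effort {effort})")
```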

  2. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today's markets. The authors address traditional machining topics, such as single and multiple point cutting processes, grinding, component accuracy and metrology, shear stress in cutting, cutting temperature and analysis, and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  3. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    The applications of ultrasonic vibrations for material removal processes exist predominantly in the area of vertical processing of hard and brittle materials, because the power generated by vertically vibrating oscillators gives the greatest direct penetration and thus enables material removal from workpieces by the abrasive grains. For milling processes, however, the vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to make use of ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for the design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to obtain the best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  4. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  5. Summary of the workshop on structural analysis needs for magnetic fusion energy superconducting magnets

    International Nuclear Information System (INIS)

    The technical portions of the meeting were divided into three major sessions as follows: (1) Review of methods being presently used by the MFE community for structural evaluation of current designs. (2) Future structural analysis needs. (3) Open discussions dealing with adequacy of present methods, the improvements needed for MFE magnet structural analysis, and the establishment of an MFE magnet structural advisory group. Summaries of the individual talks presented on Wednesday and Thursday (i.e., items 1 and 2 above) are included following the workshop schedule given later in this synopsis

  6. Proceedings of the International Workshop on: methods and tools for water-related adaptation to climate change and climate proofing

    NARCIS (Netherlands)

    Moerwanto, A.S.; Driel, van W.; Susandi, A.; Schrevel, A.; Meer, van der P.J.; Jacobs, C.

    2010-01-01

    The workshop fits within the National Water Plan of the Netherlands government, whose international chapter includes strengthening cooperation with other delta countries, including Indonesia, Vietnam and Bangladesh, and is part of the work plan of the Cooperative Programme on Water and Clim

  7. Cretaceous oceanic red bed deposition, a tool for paleoenvironmental changes--Workshop of IGCP 463 & 494

    Institute of Scientific and Technical Information of China (English)

    Mihaela Carmen Melinte; Robert Scott; Chengshan WANG; Xiumian HU

    2005-01-01

    Members of IGCP 463, Cretaceous Oceanic Red Beds (CORBs), held the third workshop in Romania. In addition to scientific sessions,discussions of results and future plans, the participants examined exposures of Upper Cretaceous Red Beds of the Romanian Carpathians characterized both by pelagic/hemipelagic and turbiditic facies.

  8. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  9. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  10. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  11. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    From 1983 to 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports to this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP)

  12. ISHM Decision Analysis Tool: Operations Concept

    Science.gov (United States)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. Both of these aspects are described.

  13. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  14. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  15. 1995 NASA High-Speed Research Program Sonic Boom Workshop. Volume 2; Configuration Design, Analysis, and Testing

    Science.gov (United States)

    Baize, Daniel G. (Editor)

    1999-01-01

    The High-Speed Research Program and NASA Langley Research Center sponsored the NASA High-Speed Research Program Sonic Boom Workshop on September 12-13, 1995. The workshop was designed to bring together NASA's scientists and engineers and their counterparts in industry, other Government agencies, and academia working together in the sonic boom element of NASA's High-Speed Research Program. Specific objectives of this workshop were to: (1) report the progress and status of research in sonic boom propagation, acceptability, and design; (2) promote and disseminate this technology within the appropriate technical communities; (3) help promote synergy among the scientists working in the Program; and (4) identify technology pacing the development of viable reduced-boom High-Speed Civil Transport concepts. The workshop was organized in four sessions: Session 1 - Sonic Boom Propagation (Theoretical); Session 2 - Sonic Boom Propagation (Experimental); Session 3 - Acceptability Studies (Human and Animal); and Session 4 - Configuration Design, Analysis, and Testing.

  16. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  17. Predictive tools and data needs for long term performance of in-situ stabilization and containment systems: DOE/OST stabilization workshop, June 26-27, Park City, Utah

    International Nuclear Information System (INIS)

    This paper summarizes the discussion within the Predictive Tools and Data Needs for Long Term Performance Assessment Subgroup. This subgroup formed at the DOE Office of Science and Technology workshop to address long-term performance of in situ stabilization and containment systems. The workshop was held in Park City, Utah, 26 and 27 June 1996. All projects, engineering and environmental, have built-in decision processes that involve varying risk/reward scenarios. Such decision processes may be awkward to describe but are utilized every day, following approaches that range from intuitive to advanced mathematical and numerical. Examples are the selection of the components of a home sound system, the members of a sports team, the investments in a portfolio, and the members of a committee. Inherent in the decision method are an understanding of the function or process of the system requiring a decision or prediction, and an understanding of the criteria on which decisions are made, such as cost, performance, durability and verifiability. Finally, this process requires a means to judge or predict how the objects, activities, people and processes being analyzed will perform relative to the operations and functions of the system and relative to the decision criteria posed for the problem. These risk and decision analyses are proactive and iterative throughout the life of a remediation project. Predictions inherent to the analyses are based on intuition, experience, trial and error, and system analysis, often using numerical approaches

  18. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code closely correlates to predictions on bearing internal load distributions, stiffness, deflection and stresses.

  19. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  20. Science for Managing Riverine Ecosystems: Actions for the USGS Identified in the Workshop "Analysis of Flow and Habitat for Instream Aquatic Communities"

    Science.gov (United States)

    Bencala, Kenneth E.; Hamilton, David B.; Petersen, James H.

    2006-01-01

    Federal and state agencies need improved scientific analysis to support riverine ecosystem management. The ability of the USGS to integrate geologic, hydrologic, chemical, geographic, and biological data into new tools and models provides unparalleled opportunities to translate the best riverine science into useful approaches and usable information to address issues faced by river managers. In addition to this capability to provide integrated science, the USGS has a long history of providing long-term and nationwide information about natural resources. The USGS is now in a position to advance its ability to provide the scientific support for the management of riverine ecosystems. To address this need, the USGS held a listening session in Fort Collins, Colorado in April 2006. Goals of the workshop were to: 1) learn about the key resource issues facing DOI, other Federal, and state resource management agencies; 2) discuss new approaches and information needs for addressing these issues; and 3) outline a strategy for the USGS role in supporting riverine ecosystem management. Workshop discussions focused on key components of a USGS strategy: Communications, Synthesis, and Research. The workshop identified 3 priority actions the USGS can initiate now to advance its capabilities to support integrated science for resource managers in partner government agencies and non-governmental organizations: 1) Synthesize the existing science of riverine ecosystem processes to produce broadly applicable conceptual models, 2) Enhance selected ongoing instream flow projects with complementary interdisciplinary studies, and 3) Design a long-term, watershed-scale research program that will substantively reinvent riverine ecosystem science. In addition, topical discussion groups on hydrology, geomorphology, aquatic habitat and populations, and socio-economic analysis and negotiation identified eleven important complementary actions required to advance the state of the science and to

  1. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary

    Science.gov (United States)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.

    2013-04-01

    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of directly observing foraging behavior, data from time-depth recorders and other types of behavioral data recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. Often the definitions of actual behavioral units and analytical approaches may vary substantially which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of various methods and approaches that are used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries from a diversity of backgrounds including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases when temporal autocorrelation structure is useful for identifying behaviors of interest. Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques in order to ensure that results are comparable across studies and taxonomic groups.

  2. Interactive Graphics Tools for Analysis of MOLA and Other Data

    Science.gov (United States)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

  3. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  4. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
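    The paper's solver uses spherical-coordinate FDTD equations derived from first principles; the sketch below is only the textbook 1-D Cartesian update loop with a Gaussian (UWB-style) soft source, included to show the structure of the method rather than the tool itself.

```python
import numpy as np

# Minimal 1-D FDTD in free space, normalized units (Courant number 0.5);
# illustrative only, not the spherical-coordinate formulation for the conical antenna.
nz, nt = 400, 800
ez = np.zeros(nz)
hy = np.zeros(nz)
src = nz // 2

for t in range(nt):
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])                # update H from the curl of E
    ez[1:] += 0.5 * (hy[1:] - hy[:-1])                 # update E from the curl of H
    ez[src] += np.exp(-0.5 * ((t - 40) / 12) ** 2)     # soft Gaussian (UWB-style) source

print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
```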

  5. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri;

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  6. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
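    As a sketch of a score-based likelihood-ratio comparison in the spirit described above (the similarity scores are synthetic and the normal score models are assumptions, not the thesis's actual procedure):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Assumed similarity scores from profilometry comparisons (synthetic for illustration):
# lab-vs-lab marks from the suspect tool ("same source") and a reference population of
# comparisons between marks from different tools ("different source").
same_source = rng.normal(loc=0.80, scale=0.06, size=40)
diff_source = rng.normal(loc=0.35, scale=0.10, size=200)

# Score observed when comparing the field mark with a lab mark from the suspect tool
field_score = 0.74

# Simple likelihood ratio using normal models fitted to each score sample
lr = (stats.norm.pdf(field_score, same_source.mean(), same_source.std(ddof=1)) /
      stats.norm.pdf(field_score, diff_source.mean(), diff_source.std(ddof=1)))
print(f"likelihood ratio (same vs. different source): {lr:.1f}")
```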

  7. Interactive Decision Analysis; Proceedings of an International Workshop on Interactive Decision Analysis and Interpretative Computer Intelligence, Laxenburg, Austria, September 20-23, 1983

    OpenAIRE

    Grauer, M.; A.P. Wierzbicki

    1984-01-01

    An International Workshop on Interactive Decision Analysis and Interpretative Computer Intelligence was held at IIASA in September 1983. The Workshop was motivated, firstly, by the realization that the rapid development of computers, especially microcomputers, will greatly increase the scope and capabilities of computerized decision-support systems. It is important to explore the potential of these systems for use in handling the complex technological, environmental, economic and social probl...

  8. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  9. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  10. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.

  11. Using Visual Tools for Analysis and Learning

    OpenAIRE

    Burton, Rob; Barlow, Nichola; Barker, Caroline

    2010-01-01

    This pack is intended as a resource for lecturers and students to facilitate the further development of their learning and teaching strategies. Visual tools were initially introduced within a module of the Year 3 nursing curriculum within the University of Huddersfield by Dr Rob Burton. Throughout the period of 2007-2008 a small team of lecturers with a keen interest in this teaching and learning strategy engaged in exploring and reviewing the literature. They also attended a series of loc...

  12. Dynamics of the 1054 UT March 22, 1979, substorm event - CDAW 6. [Coordinated Data Analysis Workshop

    Science.gov (United States)

    Mcpherron, R. L.; Manka, R. H.

    1985-01-01

    The Coordinated Data Analysis Workshop (CDAW 6) has as its primary objective to trace the flow of energy from the solar wind through the magnetosphere to its ultimate dissipation in the ionosphere. An essential role in this energy transfer is played by magnetospheric substorms; however, the details are not yet completely understood. The International Magnetospheric Study (IMS) has provided an ideal data base for the study conducted by CDAW 6. The present investigation is concerned with the 1054 UT March 22, 1979, substorm event, which had been selected for detailed examination in connection with the studies performed by the CDAW 6. The observations of this substorm are discussed, taking into account solar wind conditions, ground magnetic activity on March 22, 1979, observations at synchronous orbit, observations in the near geomagnetic tail, and the onset of the 1054 UT expansion phase. Substorm development and magnetospheric dynamics are discussed on the basis of a synthesis of the observations.


  13. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
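
    A bare-bones sketch of the surrogate idea (not the SAID implementation itself) is shown below: a log-linear regression relating hypothetical acoustic backscatter readings to measured suspended-sediment concentrations, then used for prediction; all numbers are invented.

      import numpy as np

      # Hypothetical paired observations: acoustic backscatter (dB) and measured
      # suspended-sediment concentration (mg/L); illustrative values only.
      backscatter = np.array([60.0, 65.0, 70.0, 75.0, 80.0])
      ssc = np.array([20.0, 45.0, 110.0, 260.0, 600.0])

      # A common surrogate model form: log10(SSC) as a linear function of backscatter.
      slope, intercept = np.polyfit(backscatter, np.log10(ssc), deg=1)

      def predict_ssc(db):
          """Estimate SSC (mg/L) from a backscatter reading (dB)."""
          return 10 ** (intercept + slope * db)

      print(f"SSC at 72 dB ~ {predict_ssc(72.0):.0f} mg/L")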

  14. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  15. Fully Parallel MHD Stability Analysis Tool

    Science.gov (United States)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is achieved by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Results of the MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  16. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  17. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  18. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient model checking, model-checking tools, symbolic model checking, visual tools, and verification of critical systems.

  19. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  20. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a database for storing system designs and results of analysis.

  1. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  2. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
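
    As a minimal sketch of the kind of calculation behind such a station analysis (not the H2FAST model itself), the snippet below discounts an assumed stream of cash flows to a net present value; the capital cost, cash flows and discount rate are illustrative assumptions.

      # Net present value of a hypothetical fueling-station investment.
      # All figures are invented and are not H2FAST defaults.
      discount_rate = 0.08
      capital_cost = 2_000_000.0                  # year-0 investment ($)
      yearly_cash_flow = [250_000.0] * 15         # net revenue for 15 years ($/yr)

      npv = -capital_cost + sum(
          cf / (1 + discount_rate) ** (year + 1)
          for year, cf in enumerate(yearly_cash_flow)
      )
      print(f"NPV = ${npv:,.0f}")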

  3. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type...

  4. A 3D image analysis tool for SPECT imaging

    Science.gov (United States)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
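
    A minimal sketch of the intensity-thresholding step (a stand-in for the interactive and fuzzy-connectedness methods described above) applied to a synthetic volume is given below; the voxel size is an assumption made purely for illustration.

      import numpy as np

      # Synthetic 3D SPECT-like volume: a bright blob on a dark background.
      z, y, x = np.mgrid[0:64, 0:64, 0:64]
      volume = np.exp(-((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) / (2 * 10.0 ** 2))

      # Simple intensity-based thresholding to delineate the structure.
      threshold = 0.5 * volume.max()
      mask = volume >= threshold

      voxel_volume_ml = 0.064          # assumed voxel size (ml), illustrative only
      print(f"segmented volume ~ {mask.sum() * voxel_volume_ml:.1f} ml")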

  5. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  6. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, judo competition is an intriguing complex nonlinear system, with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed beforehand, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team) like judo, supporting in a useful way the difficult task of the coach, especially for national or Olympic coaching staffs. In this paper a short summary of the most important methodological achievements in judo match analysis is presented. Also presented, in light of the latest technological improvements, is the first systematization toward new fiel...

  7. Proceedings of the 1st Space Plasma Computer Analysis Network (SCAN) Workshop. [space plasma computer networks

    Science.gov (United States)

    Green, J. L.; Waite, J. H.; Johnson, J. F. E.; Doupnik, J. R.; Heelis, R. A.

    1983-01-01

    The purpose of the workshop was to identify specific cooperative scientific study topics within the discipline of Ionosphere Magnetosphere Coupling processes and to develop methods and procedures to accomplish this cooperative research using SCAN facilities. Cooperative scientific research was initiated in the areas of polar cusp composition, O+ polar outflow, and magnetospheric boundary morphology studies, and an approach using a common metafile structure was adopted to facilitate the exchange of data and plots between the various workshop participants. The advantages of in-person versus remote workshops were also discussed.

  8. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is, just as for any other description technique, very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work not only with textual representations of Petri nets, but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area. It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools.

  9. Vulnerability assessment using two complementary analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weakness of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weakness, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  10. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  11. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to mathematical modeling of physical engineering systems. In this paper connections between Dirac structures and well known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  12. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and designers...

  13. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
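
    The unreachable-code check mentioned above boils down to a reachability traversal over the program's control-flow graph. The toy sketch below uses a made-up graph of basic blocks, not the tool set's actual intermediate representation.

      # Find basic blocks that can never be reached from the entry point.
      # The control-flow graph below is invented for illustration.
      cfg = {
          "entry": ["init", "error_handler"],
          "init": ["loop"],
          "loop": ["loop", "exit"],
          "exit": [],
          "error_handler": [],
          "dead_block": ["exit"],        # never referenced by any block -> unreachable
      }

      reachable, stack = set(), ["entry"]
      while stack:
          block = stack.pop()
          if block not in reachable:
              reachable.add(block)
              stack.extend(cfg[block])

      print("unreachable blocks:", sorted(set(cfg) - reachable))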

  14. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high-speed milling of titanium alloys, the high rate of tool failure is the main reason for its high manufacturing cost. In this study, fractured tools which were used in a titanium alloy 5-axis milling process have been observed both at the macro scale, using a PG-1000 light microscope, and at the micro scale, using a Scanning Electron Microscope (SEM). These observations indicate that most of these tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed. Future analysis aimed at determining the underlying causes of the fatigue phenomenon is then outlined.

  15. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  16. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  17. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning; Mullins, Michael

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and designers. The paper contains a case study of three possible approaches for working with digital tectonics by means of acoustics: the architects, the architect-engineer or hybrid practitioner, and finally a prototype for a possible digital tectonic tool. For the third approach in the case study a prototype digital tectonic tool is tested on the design...

  18. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  19. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  20. MEASUREMENT UNCERTAINTY ANALYSIS OF DIFFERENT CNC MACHINE TOOLS MEASUREMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    Leszek Semotiuk

    2013-09-01

    In this paper the results of measurement uncertainty tests conducted with a Heidenhain TS 649 probe on CNC machine tools are presented. In addition, identification and analysis of random and systematic errors of measurement were presented. Analyses were performed on the basis of measurements taken on two different CNC machine tools with Heidenhain control system. The evaluated errors were discussed and compensation procedures were proposed. The obtained results were described in tables and figures.

  1. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
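
    A minimal sketch of the underlying pass/fail comparison is shown below; the measurement points, expected values and tolerances are invented, and the real tool encodes far richer fault models and diagnostic rules.

      # Compare measured values entered by the commissioning agent against
      # expected values and flag deviations outside an assumed tolerance.
      expected = {"supply_air_temp_C": 13.0, "fan_power_kW": 7.5, "mixed_air_damper_pct": 30.0}
      tolerance = {"supply_air_temp_C": 1.0, "fan_power_kW": 0.8, "mixed_air_damper_pct": 10.0}
      measured = {"supply_air_temp_C": 15.2, "fan_power_kW": 7.9, "mixed_air_damper_pct": 32.0}

      for point, value in measured.items():
          deviation = value - expected[point]
          status = "FAIL" if abs(deviation) > tolerance[point] else "ok"
          print(f"{point:22s} measured={value:6.1f} expected={expected[point]:6.1f} {status}")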

  2. [SIGAPS, a tool for the analysis of scientific publications].

    Science.gov (United States)

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. It is based on the analysis of articles indexed in Medline and is calculated by taking into account the place of the author and the ranking of the journal according to the disciplinary field. It also offers tools for the bibliometric analysis of scientific production.

  3. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association, used to process simulation data as part of the software validation, which is an important part of the development of the physics analysis tools.

  4. PREFACE: European Microbeam Analysis Society's 14th European Workshop on Modern Developments and Applications in Microbeam Analysis (EMAS 2015), Portorož, Slovenia, 3-7 May 2015

    Science.gov (United States)

    Llovet, Xavier; Matthews, Michael B.; Čeh, Miran; Langer, Enrico; Žagar, Kristina

    2016-02-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 14th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from the 3rd to the 7th of May 2015 in the Grand Hotel Bernardin, Portorož, Slovenia. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a unique format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with the Jožef Stefan Institute and SDM - Slovene Society for Microscopy. The technical programme included the following topics: electron probe microanalysis, STEM and EELS, materials applications, cathodoluminescence and electron backscatter diffraction (EBSD), and their applications. As at previous workshops there was also a special oral session for young scientists. The best presentation by a young scientist was awarded with an invitation to attend the 2016 Microscopy and Microanalysis meeting at Columbus, Ohio. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Electron channelling contrast reconstruction with electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 71 posters from 16 countries were on display at the meeting and that the participants came from as far away as Japan, Canada, USA, and Australia. A selection of participants with posters was invited

  5. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a light-weight, user-friendly software, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
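
    As a toy illustration of the branching index (branch points per unit area), the sketch below counts pixels with more than two 4-connected neighbours in a synthetic one-pixel-wide vessel mask; AngioTool itself works on skeletonised vessel images, and the pixel size used here is an assumption.

      import numpy as np
      from scipy.ndimage import convolve

      # Synthetic one-pixel-wide "vessel" mask: two vessels crossing.
      mask = np.zeros((40, 40), dtype=int)
      mask[20, 5:35] = 1          # horizontal vessel
      mask[5:35, 20] = 1          # vertical vessel

      # A skeleton pixel with more than two 4-connected neighbours is a branch point.
      kernel = np.array([[0, 1, 0],
                         [1, 0, 1],
                         [0, 1, 0]])
      neighbours = convolve(mask, kernel, mode="constant")
      branch_points = (mask == 1) & (neighbours > 2)

      pixel_mm = 0.01                           # assumed pixel size of 10 micrometres
      area_mm2 = mask.size * pixel_mm ** 2
      print("branching index:", branch_points.sum() / area_mm2, "branch points per mm^2")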

  6. MICCAI Workshops

    CERN Document Server

    Nedjati-Gilani, Gemma; Venkataraman, Archana; O'Donnell, Lauren; Panagiotaki, Eleftheria

    2014-01-01

    This volume contains the proceedings from two closely related workshops: Computational Diffusion MRI (CDMRI’13) and Mathematical Methods from Brain Connectivity (MMBC’13), held under the auspices of the 16th International Conference on Medical Image Computing and Computer Assisted Intervention, which took place in Nagoya, Japan, September 2013. Inside, readers will find contributions ranging from mathematical foundations and novel methods for the validation of inferring large-scale connectivity from neuroimaging data to the statistical analysis of the data, accelerated methods for data acquisition, and the most recent developments on mathematical diffusion modeling. This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity as well as offers new perspectives and insights on current research challenges for those currently in the field. It will be of interest to researchers and practitioners in computer science, ...

  7. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  8. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of AUC, the density of events in probability bins and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
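
    For readers outside Dinamica EGO, the core ROC/AUC computation is easily reproduced with common Python libraries; the sketch below uses invented observed outcomes and predicted probabilities, and assumes scikit-learn is available (it is not part of the tool suite described above).

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      # Hypothetical model output: predicted probability of an event per cell
      # and the observed outcome (1 = event occurred); illustrative values only.
      observed = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 1])
      predicted = np.array([0.10, 0.30, 0.35, 0.40, 0.55, 0.60, 0.20, 0.80, 0.25, 0.90])

      fpr, tpr, thresholds = roc_curve(observed, predicted)   # points of the ROC curve
      print(f"AUC = {roc_auc_score(observed, predicted):.2f}")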

  9. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  10. Second international tsunami workshop on the technical aspects of tsunami warning systems, tsunami analysis, preparedness, observation and instrumentation

    International Nuclear Information System (INIS)

    The Second Workshop on the Technical Aspects of Tsunami Warning Systems, Tsunami Analysis, Preparedness, Observation, and Instrumentation, sponsored and convened by the Intergovernmental Oceanographic Commission (IOC), was held on 1-2 August 1989, in the modern and attractive research town of Academgorodok, which is located 20 km south of downtown Novosibirsk, the capital of Siberia, USSR. The Program was arranged in eight major areas of interest covering the following: Opening and Introduction; Survey of Existing Tsunami Warning Centers - present status, results of work, plans for future development; Survey of some existing seismic data processing systems and future projects; Methods for fast evaluation of Tsunami potential and perspectives of their implementation; Tsunami data bases; Tsunami instrumentation and observations; Tsunami preparedness; and finally, a general discussion and adoption of recommendations. The Workshop presentations not only addressed the conceptual improvements that have been made, but also focused on the inner workings of the Tsunami Warning System, including computer applications, on-line processing and numerical modelling. Furthermore, presentations reported on the progress that has been made in the last few years on data telemetry, instrumentation and communications. Emphasis was placed on new concepts and their application in operational techniques that can result in improvements in data collection, rapid processing of the data, and in analysis and prediction. A Summary Report on the Second International Tsunami Workshop, containing abstracted and annotated proceedings, has been published as a separate report. The present Report is a Supplement to the Summary Report and contains the full text of the papers presented at this Workshop. Refs, figs and tabs

  11. 3rd International Workshop on Intelligent Data Analysis and Management (IDAM)

    CERN Document Server

    Wang, Leon; Hong, Tzung-Pei; Yang, Hsin-Chang; Ting, I-Hsien

    2013-01-01

    These papers on Intelligent Data Analysis and Management (IDAM) examine issues related to the research and applications of Artificial Intelligence techniques in data analysis and management across a variety of disciplines. The papers derive from the 2013 IDAM conference in Kaohsiung ,Taiwan. It is an interdisciplinary research field involving academic researchers in information technologies, computer science, public policy, bioinformatics, medical informatics, and social and behavior studies, etc. The techniques studied include (but are not limited to): data visualization, data pre-processing, data engineering, database mining techniques, tools and applications, evolutionary algorithms, machine learning, neural nets, fuzzy logic, statistical pattern recognition, knowledge filtering, and post-processing, etc.

  12. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  13. Shape Analysis for Complex Systems Using Information Geometry Tools.

    OpenAIRE

    Sanctis, Angela De

    2012-01-01

    In this paper we use Information Geometry tools to statistically model patterns arising in complex systems and describe their evolution in time. In particular, we focus on the analysis of images with medical applications and propose an index that can estimate the level of self-organization and predict future problems that may occur in these systems.

  14. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  15. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of significant risk groups associated with logistics processes implementation, composition of integrated strategies of risk management, and composition of tools for risk analysis in logistics processes.

  16. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  17. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models which may be applied as tools for the analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. The study then characterises the best known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  18. A Spreadsheet Teaching Tool For Analysis Of Pipe Networks

    OpenAIRE

    El Bahrawy, Aly N.

    1997-01-01

    Spreadsheets are used widely in engineering to perform several analysis and design calculations. They are also very attractive as educational tools due to their flexibility and efficiency. This paper demonstrates the use of spreadsheets in teaching the analysis of water pipe networks, which involves the calculation of pipe flows or nodal heads given the network layout, pipe characteristics (diameter, length, and roughness), in addition to external flows. The network performance is better und...

  19. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic ... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.
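
    A minimal, self-contained illustration of ICA separation (on synthetic signals rather than chat-room term data) is sketched below using scikit-learn's FastICA; the signals and mixing weights are made up.

      import numpy as np
      from sklearn.decomposition import FastICA

      # Two synthetic independent "topic" signals, mixed linearly and then
      # unmixed with ICA; a chat-room application would use term-document data.
      t = np.linspace(0, 8, 1000)
      sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]
      mixing = np.array([[1.0, 0.5],
                         [0.4, 1.2]])
      mixed = sources @ mixing.T

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(mixed)    # estimated independent components
      print("recovered components shape:", recovered.shape)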

  20. Organisational Self - Evaluation as a Possible Tool of Organisational Analysis

    OpenAIRE

    Mariann Veresné Somosi

    2004-01-01

    The key to the enduring success of companies and institutes is the ability to recognise new challenges in good time and to react to them quickly and flexibly. The management, however, often does not have at its disposal the appropriate tools and methodological knowledge to map critical fields in cases of complex and complicated organisational forming. During this presentation, I examine one of the possible systems of goals and fields of organisational analysis with the help of the organisational analysis proc...

  1. DFTCalc: a tool for efficient fault tree analysis (extended version)

    OpenAIRE

    Arnold, Florian; Belinfante, Axel; Berg, de, MT Mark; Guck, Dennis; Stoelinga, Mariëlle

    2013-01-01

    Effective risk management is a key to ensure that our nuclear power plants, medical equipment, and power grids are dependable; and is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ana...

  2. DFTCalc: a tool for efficient fault tree analysis

    OpenAIRE

    Arnold F.; Belinfante A.; Van Der Berg F.; Guck D.; Stoelinga M.

    2013-01-01

    Effective risk management is a key to ensure that our nuclear power plants, medical equipment, and power grids are dependable; and it is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ...

  3. Discovery and New Frontiers Project Budget Analysis Tool

    Science.gov (United States)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses and compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.
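
    A stripped-down sketch of the roll-up idea (not the actual tool) follows: each mission's fixed-year cost is spread over its development years with a flat profile, inflated to real-year dollars, and summed by year. The missions, profiles and inflation rate below are invented for illustration.

      # Roll up multiple mission budget lines into program cost per year.
      INFLATION = 0.025          # assumed annual inflation rate
      BASE_YEAR = 2011

      missions = [
          # (name, total cost in base-year $M, start year, duration in years)
          ("Mission A", 450.0, 2011, 4),
          ("Mission B", 700.0, 2013, 5),
      ]

      program_by_year = {}
      for name, cost, start, years in missions:
          annual = cost / years                          # flat spend profile for simplicity
          for i in range(years):
              year = start + i
              real_year_cost = annual * (1 + INFLATION) ** (year - BASE_YEAR)
              program_by_year[year] = program_by_year.get(year, 0.0) + real_year_cost

      for year in sorted(program_by_year):
          print(year, f"{program_by_year[year]:.1f} $M (real-year)")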

  4. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

    SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high performance computing. A performance tool, which usually collects and analyzes performance data, is an effective means of improving performance. This paper explores both the collection and the analysis of performance data, and two innovative ideas are proposed: both types of runtime performance data, concerning system load as well as application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; and the performance analysis should be Web-oriented, exploiting the excellent portability and usability brought by the Internet. This paper presents a Web-oriented HPC (high performance computing) performance tool, which can collect information about both resource utilization, including the utilization ratios of CPU and memory, and program behavior during runtime, including statuses such as sending and computing, and visualize the information in the user's browser window with Java applets in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependencies between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program progress and system load, identify the performance bottleneck, and ultimately improve the performance of the application.
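
    The simultaneous collection of system load and application behavior can be illustrated with a minimal Python sketch using the psutil library; this is only a schematic of the idea, not the instrumentation used by THPTiii (which targets parallel HPC applications and Java applet visualization), and the field names are assumptions.

    # Minimal sketch of the dual data collection idea (system load plus application
    # state) using psutil; an illustration only, not the THPTiii instrumentation.
    import time
    import psutil

    def sample(app_state, samples, period=1.0):
        """Record system load together with the application's reported state."""
        samples.append({
            "time": time.time(),
            "cpu_percent": psutil.cpu_percent(interval=None),
            "mem_percent": psutil.virtual_memory().percent,
            "app_state": app_state,   # e.g. "sending" or "computing"
        })
        time.sleep(period)

    if __name__ == "__main__":
        samples = []
        for state in ["computing", "sending", "computing"]:
            sample(state, samples, period=0.1)
        for s in samples:
            print(s)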

  5. Comparative analysis of marine ecosystems: workshop on predator-prey interactions

    DEFF Research Database (Denmark)

    Bailey, Kevin M.; Ciannelli, Lorenzo; Hunsicker, Mary;

    2010-01-01

    Climate and human influences on marine ecosystems are largely manifested by changes in predator–prey interactions. It follows that ecosystem-based management of the world's oceans requires a better understanding of food web relationships. An international workshop on predator–prey interactions....... The goals of the workshop were to critically examine the methods of scaling-up predator–prey interactions from local observations to systems, the role of shifting ecological processes with scale changes, and the complexity and organizational structure in trophic interactions....

  6. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  7. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  8. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology, which is concerned with students' acquisition of culture encoded in symbols and with the way students' sign consciousness, formed in the context of learning, affects their world cognition and interpersonal communication. The article introduces a set of tools that would enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy and with the formation of the younger generation's language picture of the world.

  9. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
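
    The combination of combinatorial coverage with Monte Carlo filling can be sketched briefly. The Python fragment below guarantees two-factor (pairwise) coverage of discrete parameter levels and fills the remaining parameters randomly; it illustrates the general idea under assumed parameter names and is not the tool's actual test-case generator.

    # Hedged sketch of combining n-factor (here: pairwise) combinatorial coverage
    # with Monte Carlo filling of the remaining parameters.
    import itertools
    import random

    def pairwise_cases(levels, seed=0):
        """levels: dict mapping parameter name -> list of discrete levels.
        Returns test cases covering every pair of levels at least once."""
        rng = random.Random(seed)
        names = list(levels)
        cases = []
        for a, b in itertools.combinations(names, 2):
            for va, vb in itertools.product(levels[a], levels[b]):
                case = {n: rng.choice(levels[n]) for n in names}  # Monte Carlo fill
                case[a], case[b] = va, vb                         # enforce the pair
                cases.append(case)
        return cases

    if __name__ == "__main__":
        params = {"mass": [0.8, 1.0, 1.2], "thrust": [0.9, 1.1], "drag": [0.95, 1.05]}
        for case in pairwise_cases(params):
            print(case)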

  10. Comparing gene set analysis methods on single-nucleotide polymorphism data from Genetic Analysis Workshop 16

    OpenAIRE

    Tintle, Nathan L; Borchers, Bryce; Brown, Marshall; Bekmetjev, Airat

    2009-01-01

    Recently, gene set analysis (GSA) has been extended from use on gene expression data to use on single-nucleotide polymorphism (SNP) data in genome-wide association studies. When GSA has been demonstrated on SNP data, two popular statistics from gene expression data analysis (gene set enrichment analysis [GSEA] and Fisher's exact test [FET]) have been used. However, GSEA and FET have shown a lack of power and robustness in the analysis of gene expression data. The purpose of this work is to in...

  11. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically, CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with the local user environment, the CMS Data Management services and the Grid middleware.

  12. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically, CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with the local user environment, the CMS Data Management services and the Grid middleware.

  13. PREFACE: 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3)

    Science.gov (United States)

    Sakka, Yoshio; Hirota, Noriyuki; Horii, Shigeru; Ando, Tsutomu

    2009-07-01

    The 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3) was held on 14-16 May 2008 at the University of Tokyo, Japan. The first was held in March 2004 at the National High Magnetic Field Laboratory in Tallahassee, USA. Two years later the second took place in Grenoble, France. MAP3 was held as a University of Tokyo International Symposium, jointly with the MANA Workshop on Materials Processing by External Stimulation and the JSPS CORE Program of Construction of the World Center on Electromagnetic Processing of Materials. At the end of MAP3 it was decided that the next MAP4 will be held in Atlanta, USA in 2010. Processing in magnetic fields is a rapidly expanding research area with a wide range of promising applications in materials science. MAP3 focused on the magnetic field interactions involved in the study and processing of materials in all disciplines ranging from physics to chemistry and biology: magnetic field effects on chemical, physical, and biological phenomena; magnetic field effects on electrochemical phenomena; magnetic field effects on thermodynamic phenomena; magnetic field effects on hydrodynamic phenomena; magnetic field effects on crystal growth; magnetic processing of materials; diamagnetic levitation; the magneto-Archimedes effect; spin chemistry; application of magnetic fields to analytical chemistry; magnetic orientation; control of structure by magnetic fields; magnetic separation and purification; magnetic field-induced phase transitions; materials properties in high magnetic fields; development of NMR and MRI; medical applications of magnetic fields; novel magnetic phenomena; physical property measurement by magnetic fields; and high magnetic field generation. MAP3 consisted of 84 presentations including 16 invited talks. This volume of Journal of Physics: Conference Series contains the proceedings of MAP3 with 34 papers that provide a scientific record of the topics covered by the conference with the special topics (13 papers) in

  14. SMART: Statistical Metabolomics Analysis-An R Tool.

    Science.gov (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^-4 in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^-4 in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm . PMID:27248514
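
    An ANCOVA of the kind reported above can be written in a few lines. The sketch below uses Python with statsmodels on synthetic data; SMART itself is an R tool, and the variable names, covariate, and coefficients here are hypothetical.

    # Illustrative ANCOVA of a metabolite level against drug exposure with a
    # covariate adjustment; synthetic data, not SMART output.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 120
    df = pd.DataFrame({
        "drug": rng.integers(0, 2, n),   # 0 = untreated, 1 = ACE inhibitor (assumed coding)
        "age": rng.normal(55, 10, n),    # covariate standing in for latent structure
    })
    df["metabolite"] = 0.5 * df["drug"] + 0.02 * df["age"] + rng.normal(0, 1, n)

    model = smf.ols("metabolite ~ C(drug) + age", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))   # F-test and p-value for the drug term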

  15. Workshop on molecular animation.

    Science.gov (United States)

    Bromberg, Sarina; Chiu, Wah; Ferrin, Thomas E

    2010-10-13

    From February 25 to 26, 2010, in San Francisco, the Resource for Biocomputing, Visualization, and Informatics (RBVI) and the National Center for Macromolecular Imaging (NCMI) hosted a molecular animation workshop for 21 structural biologists, molecular animators, and creators of molecular visualization software. Molecular animation aims to visualize scientific understanding of biomolecular processes and structures. The primary goal of the workshop was to identify the necessary tools for producing high-quality molecular animations, understanding complex molecular and cellular structures, creating publication supplementary materials and conference presentations, and teaching science to students and the public. Another use of molecular animation emerged in the workshop: helping to focus scientific inquiry about the motions of molecules and enhancing informal communication within and between laboratories.

  16. WALLTURB International Workshop

    CERN Document Server

    Jimenez, Javier; Marusic, Ivan

    2011-01-01

    This book brings together selected contributions from the WALLTURB workshop on "Understanding and modelling of wall turbulence" held in Lille, France, on April 21st to 23rd 2009. This workshop was organized by the WALLTURB consortium, in order to present to the relevant scientific community the main results of the project and to stimulate scientific discussions around the subject of wall turbulence. The workshop reviewed the recent progress in theoretical, experimental and numerical approaches to wall turbulence. The problems of zero pressure gradient, adverse pressure gradient and separating turbulent boundary layers were addressed in detail with the three approaches, using the most advanced tools. This book is a milestone in the research field, thanks to the high level of the invited speakers and the involvement of the contributors and a testimony of the achievement of the WALLTURB project.

  17. EMERGING MODALITIES FOR SOIL CARBON ANALYSIS: SAMPLING STATISTICS AND ECONOMICS WORKSHOP.

    Energy Technology Data Exchange (ETDEWEB)

    WIELOPOLSKI, L.

    2006-04-01

    The workshop's main objectives are (1) to present the emerging modalities for analyzing carbon in soil, (2) to assess their error propagation, (3) to recommend new protocols and sampling strategies for the new instrumentation, and, (4) to compare the costs of the new methods with traditional chemical ones.

  18. Applying Instructional Design Theories to Bioinformatics Education in Microarray Analysis and Primer Design Workshops

    Science.gov (United States)

    Shachak, Aviv; Ophir, Ron; Rubin, Eitan

    2005-01-01

    The need to support bioinformatics training has been widely recognized by scientists, industry, and government institutions. However, the discussion of instructional methods for teaching bioinformatics is only beginning. Here we report on a systematic attempt to design two bioinformatics workshops for graduate biology students on the basis of…

  19. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened the way for implementing new features in the existing tools and also for developing new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.

  20. AstroStat - A VO Tool for Statistical Analysis

    CERN Document Server

    Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

    2015-01-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu driven interface. Behind the scenes, all analysis is done using the public domain statistical software - R and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features - as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing powerful statistical analysis on ...

  1. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
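
    The statistical comparison described (ANOVA across tools, paired tests between tools) can be sketched as follows; the Python example below uses synthetic nodule volumes and hypothetical vendor labels, not data from the study.

    # Sketch of comparing the same nodules measured by three automated tools.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    true_volume = rng.uniform(100, 500, size=20)        # mm^3, hypothetical nodules
    vendor_a = true_volume + rng.normal(0, 10, 20)
    vendor_b = true_volume + rng.normal(0, 10, 20)
    vendor_c = true_volume + rng.normal(0, 10, 20)

    f_stat, p_value = stats.f_oneway(vendor_a, vendor_b, vendor_c)
    print(f"ANOVA across vendors: F = {f_stat:.2f}, p = {p_value:.3f}")

    # pairwise check between two vendors on the same nodules
    t_stat, p_pair = stats.ttest_rel(vendor_a, vendor_b)
    print(f"Paired t-test A vs B: t = {t_stat:.2f}, p = {p_pair:.3f}")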

  2. CTBTO international cooperation workshop

    International Nuclear Information System (INIS)

    The International Cooperation Workshop took place in Vienna, Austria, on 16 and 17 November 1998, with the participation of 104 policy/decision makers, Research and Development managers and diplomatic representatives from 58 States Signatories to the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The Workshop addressed the Treaty's stipulations to: promote cooperation to facilitate and participate in the fullest possible exchange relating to technologies used in the verification of the Treaty; and enable member states to strengthen national implementation of verification measures, and to benefit from the application of such technologies for peaceful purposes. The potential benefits arising from the CTBT monitoring, analysis and data communication systems are multifaceted, and as yet unknown. This Workshop provided the opportunity to examine some of these possibilities. An overview of the CTBT verification regime, covering the general aspects of the four monitoring technologies (seismic, hydro-acoustic, infrasound and radionuclide), including some of the elements that are the subject of international cooperation, was presented and discussed. Questions were raised on the potential benefits that can be derived by participating in the CTBT regime, and broad-based discussions took place. Several concrete proposals on ways and means to facilitate and promote cooperation among States Signatories were suggested. The main points discussed by the participants can be summarized as follows: the purpose of the CTBT Organization is to assist member states to monitor Treaty compliance; the CTBT can be a highly effective technological tool which can generate wide-ranging data that can be used for peaceful purposes; and there are differences in the levels of technology development among the member states, which is why peaceful applications should be supported by the Prep Com for the benefit of all member states, whether developed or developing, training being a key element to optimize the CTBT

  3. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
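
    A much-simplified version of the knickpoint idea, flagging abrupt slope increases along a single distance-elevation profile, is sketched below in Python. Knickpoint Finder itself operates on a DEM within ArcGIS, so this is only a schematic of the underlying detection step, with an assumed threshold and synthetic profile.

    # Simplified sketch of flagging relief breakpoints (knickpoints) along a
    # distance-elevation drainage profile; not the Knickpoint Finder algorithm.
    import numpy as np

    def find_knickpoints(distance, elevation, slope_jump=2.0):
        """Flag points where the local channel slope increases abruptly downstream."""
        slope = -np.gradient(elevation, distance)      # positive where channel drops
        jumps = slope[1:] / np.maximum(slope[:-1], 1e-9)
        return np.where(jumps > slope_jump)[0] + 1     # indices of candidate knickpoints

    if __name__ == "__main__":
        d = np.linspace(0, 10_000, 101)                # metres downstream
        z = 500 - 0.02 * d                             # gentle background slope
        z[60:] -= 15                                   # an abrupt step in the profile
        print(find_knickpoints(d, z))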

  4. SABRE: A Tool for Stochastic Analysis of Biochemical Reaction Networks

    CERN Document Server

    Didier, Frederic; Mateescu, Maria; Wolf, Verena

    2010-01-01

    The importance of stochasticity within biological systems has been shown repeatedly during the last years and has raised the need for efficient stochastic tools. We present SABRE, a tool for stochastic analysis of biochemical reaction networks. SABRE implements fast adaptive uniformization (FAU), a direct numerical approximation algorithm for computing transient solutions of biochemical reaction networks. Biochemical reaction networks represent biological systems studied at a molecular level, and these reactions can be modeled as transitions of a Markov chain. SABRE accepts as input the formalism of guarded commands, which it interprets either as continuous-time or as discrete-time Markov chains. Besides operating in a stochastic mode, SABRE may also perform a deterministic analysis by directly computing a mean-field approximation of the system under study. We illustrate the different functionalities of SABRE by means of biological case studies.
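
    Transient analysis of a small continuous-time Markov chain can be illustrated with standard uniformization. The Python sketch below uses a toy three-state generator; it is not SABRE's fast adaptive uniformization and does not use its guarded-command input language.

    # Standard uniformization for the transient distribution of a small CTMC.
    import numpy as np
    from scipy.stats import poisson

    def transient(Q, p0, t, eps=1e-10):
        """Transient distribution p(t) of a CTMC with generator Q."""
        lam = np.max(-np.diag(Q))                  # uniformization rate
        P = np.eye(Q.shape[0]) + Q / lam           # embedded DTMC at rate lam
        k_max = int(poisson.ppf(1.0 - eps, lam * t)) + 1
        weights = poisson.pmf(np.arange(k_max + 1), lam * t)
        term, result = p0.copy(), weights[0] * p0
        for k in range(1, k_max + 1):
            term = term @ P
            result += weights[k] * term
        return result

    if __name__ == "__main__":
        # toy three-state birth-death chain (rates are illustrative)
        Q = np.array([[-1.0, 1.0, 0.0],
                      [0.5, -1.5, 1.0],
                      [0.0, 0.5, -0.5]])
        p0 = np.array([1.0, 0.0, 0.0])
        print(transient(Q, p0, t=2.0))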

  5. Physics Analysis Tools for the CMS experiment at LHC

    CERN Document Server

    Fabozzi, Francesco; Hegner, Benedikt; Lista, Luca

    2008-01-01

    The CMS experiment is expected to start data taking during 2008, and large data samples, of the Peta-bytes scale, will be produced each year. The CMS Physics Tools package provides the CMS physicist with a powerful and flexible software layer for analysis of these huge datasets that is well integrated in the CMS experiment software. A core part of this package is the Candidate Model providing a coherent interface to different types of data. Standard tasks such as combinatorial analyses, generic cuts, MC truth matching and constrained fitting are supported. Advanced template techniques enable the user to add missing features easily. We explain the underlying model, certain details of the implementation and present some use cases showing how the tools are currently used in generator and full simulation studies as preparation for analysis of real data.

  6. Validating and Verifying a New Thermal-Hydraulic Analysis Tool

    International Nuclear Information System (INIS)

    The Idaho National Engineering and Environmental Laboratory (INEEL) has developed a new analysis tool by coupling the Fluent computational fluid dynamics (CFD) code to the RELAP5-3DC/ATHENA advanced thermal-hydraulic analysis code. This tool enables researchers to perform detailed, three-dimensional analyses using Fluent's CFD capability while the boundary conditions required by the Fluent calculation are provided by the balance-of-system model created using RELAP5-3DC/ATHENA. Both steady-state and transient calculations can be performed, using many working fluids and point- to three-dimensional neutronics. A general description of the techniques used to couple the codes is given. The validation and verification (V and V) matrix is outlined. V and V is presently ongoing. (authors)

  7. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Science.gov (United States)

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis. PMID:26864350
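
    The final scoring step of a matrix-based tool such as Tool D can be illustrated with a few lines of Python; the scales, thresholds, and wording below are assumptions for illustration, not those of the tools studied.

    # Minimal sketch of a severity x likelihood risk matrix for a confined space entry.
    def risk_level(severity, likelihood):
        """severity and likelihood on 1-4 scales; returns a qualitative risk level."""
        score = severity * likelihood
        if score >= 12:
            return "high - entry forbidden until risk is reduced"
        if score >= 6:
            return "moderate - additional controls required"
        return "low - proceed with standard permit conditions"

    if __name__ == "__main__":
        # e.g. oxygen-deficient atmosphere: severe outcome, plausible occurrence
        print(risk_level(severity=4, likelihood=3))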

  8. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    Directory of Open Access Journals (Sweden)

    Chun-Hsiao Wu

    2016-05-01

    Full Text Available In this research, we propose a new concept for social media analysis called Social Sensor, which is an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media, with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and appropriate social sensors for analysis. By adopting parameter templates, one can quickly apply the experience of other experts at the beginning of a new case or even create one's own templates. We have also modularized the analysis tools into two social sensors: a Language Sensor and a Text Sensor. A user evaluation was conducted and the results showed that the usefulness, modularity, reusability, and manageability of the system were all rated very positively. The results also show that this tool can greatly reduce the time needed to perform data analysis, solve problems encountered in the traditional analysis process, and produce useful results. The experimental results reveal that the concept of the social sensor and the proposed system design are useful for big data analysis of social media.

  9. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  10. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, which is a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottleneck of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  11. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel s...... and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data....

  12. Validation of retrofit analysis simulation tool: Lessons learned

    OpenAIRE

    Trcka, Marija; Pasini, Jose Miguel; Oggianu, Stella Maris

    2014-01-01

    It is well known that residential and commercial buildings account for about 40% of the overall energy consumed in the United States, and about the same percentage of CO2 emissions. Retrofitting existing old buildings, which account for 99% of the building stock, represents the best opportunity of achieving challenging energy and emission targets. United Technologies Research Center (UTC) has developed a methodology and tool that provides computational support for analysis and decision-making...

  13. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  14. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  15. AstroStat-A VO tool for statistical analysis

    Science.gov (United States)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu driven interface. Behind the scenes, all analyses are done using the public domain statistical software-R and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features-as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing powerful statistical analysis on them.

  16. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Science.gov (United States)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle. The tool provides a user interface and operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  17. Judo match analysis, a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

    In this second paper on match analysis, we analyze in depth the competition steps, showing the evolution of this tool at National Federation level on the basis of our first classification. Furthermore, it is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo for supporting the difficult task of the coach, and especially of national or Olympic coaching teams. This paper presents a deeper study of judo competitions at high level, from both the male and female points of view, explaining in the light of biomechanics not only the evolution of throws in time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  18. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no "one-size-fits-all" solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  19. Shot planning and analysis tools on the NIF project

    Energy Technology Data Exchange (ETDEWEB)

    Beeler, R. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Casey, A., E-mail: casey20@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA (United States); Conder, A.; Fallejo, R.; Flegel, M.; Hutton, M.; Jancaitis, K.; Lakamsani, V.; Potter, D.; Reisdorf, S.; Tappero, J.; Whitman, P.; Carr, W.; Liao, Z. [Lawrence Livermore National Laboratory, Livermore, CA (United States)

    2012-12-15

    Highlights: • Target shots in NIF, dozens a month, vary widely in laser and target configuration. • A planning tool helps select shot sequences that optimize valuable facility time. • Fabrication and supply of targets, diagnostics, etc. are integrated into the plan. • Predictive modeling of aging parts (e.g., optics) aids maintenance decision support. • We describe the planning/analysis tool and its use in NIF experimental operations. - Abstract: Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Campaign Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics

  20. Graphical tools for network meta-analysis in STATA.

    Science.gov (United States)

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  1. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  2. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  3. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Full Text Available Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  4. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  5. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients using sparse matrix technology for chemical kinetics is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, a modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
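
    The direct method can be illustrated on a toy first-order reaction A -> B, integrating the model equation together with its sensitivity equation dS/dt = J S + df/dk. The Python sketch below uses SciPy rather than the FORTRAN/Gear machinery described above, and the rate constant is hypothetical.

    # Direct-method sensitivity sketch for A -> B with rate constant k.
    import numpy as np
    from scipy.integrate import solve_ivp

    k = 0.5  # rate constant (hypothetical)

    def rhs(t, u):
        yA, yB, sA, sB = u        # concentrations and their sensitivities d[.]/dk
        dyA = -k * yA
        dyB = k * yA
        # sensitivity equations: dS/dt = J @ S + df/dk
        dsA = -k * sA - yA
        dsB = k * sA + yA
        return [dyA, dyB, dsA, dsB]

    sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0, 0.0], rtol=1e-8)
    print("y_A(T) =", sol.y[0, -1], " d y_A/dk (T) =", sol.y[2, -1])
    # check against the analytic sensitivity d/dk exp(-k t) = -t exp(-k t)
    print("analytic:", -10.0 * np.exp(-k * 10.0))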

  6. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.), with the .NET and C# version used for development. It leverages code samples shared by the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  7. Basic statistical tools in research and data analysis

    Science.gov (United States)

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into a lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
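
    The parametric/non-parametric pairing mentioned above can be shown with a short Python example comparing the same two synthetic samples with an independent t-test and a Mann-Whitney U test; the data and group labels are illustrative only.

    # Parametric vs non-parametric comparison of two synthetic samples.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    group_a = rng.normal(loc=120, scale=15, size=30)   # e.g. systolic BP, treatment
    group_b = rng.normal(loc=128, scale=15, size=30)   # control

    t_stat, p_t = stats.ttest_ind(group_a, group_b)        # parametric
    u_stat, p_u = stats.mannwhitneyu(group_a, group_b)     # non-parametric
    print(f"t-test: t = {t_stat:.2f}, p = {p_t:.4f}")
    print(f"Mann-Whitney: U = {u_stat:.1f}, p = {p_u:.4f}")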

  8. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have in recent years seen an increase in complexity due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining one can perform multidimensional data analysis, detect fraud and discover knowledge hidden in data, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions for data mining, but all boil down to the same idea: the process of extracting new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  9. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
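
    The basic enrichment calculation that such tools build on can be sketched with a hypergeometric test; the Python fragment below uses hypothetical counts and does not reproduce GOMA's module construction or optimization model.

    # Hypergeometric over-representation test for one GO term in a gene list.
    from scipy.stats import hypergeom

    def enrichment_p(term_in_list, list_size, term_in_genome, genome_size):
        """P(X >= term_in_list) for genes annotated with the term in the list."""
        return hypergeom.sf(term_in_list - 1, genome_size, term_in_genome, list_size)

    if __name__ == "__main__":
        # hypothetical counts: 15 of 200 selected genes carry the term,
        # 300 of 20000 genes in the genome carry it
        print(f"p = {enrichment_p(15, 200, 300, 20000):.2e}")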

  10. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  11. Emerging methods and tools for environmental risk assessment, decision-making, and policy for nanomaterials: summary of NATO Advanced Research Workshop

    OpenAIRE

    Linkov, Igor; Steevens, Jeffery; Adlakha-Hutcheon, Gitanjali; Bennett, Erin; Chappell, Mark; Colvin, Vicki; Davis, J. Michael; Davis, Thomas; Elder, Alison; Hansen, Steffen Foss; Hakkinen, Pertti Bert; Hussain, Saber M.; Karkan, Delara; Korenstein, Rafi; Lynch, Iseult

    2008-01-01

    Nanomaterials and their associated technologies hold promising opportunities for the development of new materials and applications in a wide variety of disciplines, including medicine, environmental remediation, waste treatment, and energy conservation. However, current information regarding the environmental effects and health risks associated with nanomaterials is limited and sometimes contradictory. This article summarizes the conclusions of a 2008 NATO workshop designed to evaluate the wi...

  12. Analysis and processing tools for nuclear trade related data

    International Nuclear Information System (INIS)

    This paper describes the development of a system used by the Nuclear Trade Analysis Unit of the Department of Safeguards for handling, processing, analyzing, reporting and storing nuclear trade related data. The data handling and analysis part of the system is already functional, but several additional features are being added to optimize its use. The aim is to develop the system in a manner that actively contributes to the management of the Department's overall knowledge and supports the departmental State evaluation process. Much of the data originates from primary sources and comes in many different formats and languages. It also comes with diverse security needs. The design of the system has to meet the special challenges set by the large volume and different types of data that needs to be handled in a secure and reliable environment. Data is stored in a form appropriate for access and analysis in both structured and unstructured formats. The structured data is entered into a database (knowledge base) called the Procurement Tracking System (PTS). PTS allows effective linking, visualization and analysis of new data with that already included in the system. The unstructured data is stored in text searchable folders (information base) equipped with indexing and search capabilities. Several other tools are linked to the system including a visual analysis tool for structured information and a system for visualizing unstructured data. All of which are designed to help the analyst locate the specific information required amongst a myriad of unrelated information. This paper describes the system's concept, design and evolution - highlighting its special features and capabilities, which include the need to standardize the data collection, entry and analysis processes. All this enables the analyst to approach tasks consistently and in a manner that both enhances teamwork and leads to the development of an institutional memory related to cover trade activities that can be

  13. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thomson valve, turboexpander and compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis results give a clear idea of the parameter values to choose before the actual plant is implemented in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
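    Aspen HYSYS itself is commercial software and is not scripted here. As a first-principles illustration of the quantity such a simulation maximizes, the sketch below evaluates the ideal Linde-Hampson liquid yield from a steady-state energy balance, y = (h1 - h2)/(h1 - hf); the enthalpy values are illustrative placeholders, not data from the paper.

```python
# Ideal Linde-Hampson liquefaction yield from an energy balance (no heat leak).
# h1: low-pressure gas returning at ambient temperature
# h2: high-pressure gas leaving the aftercooler at the same temperature
# hf: saturated liquid

def linde_yield(h1: float, h2: float, hf: float) -> float:
    """Fraction of compressed gas liquefied per pass (ideal cycle)."""
    return (h1 - h2) / (h1 - hf)

h1 = 462.0   # kJ/kg, air near 300 K, 1 bar    (illustrative)
h2 = 432.0   # kJ/kg, air near 300 K, 200 bar  (illustrative)
hf = 29.0    # kJ/kg, saturated liquid air     (illustrative)

print(f"liquid yield y = {linde_yield(h1, h2, hf):.3f}")
```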

  15. Anaphe—OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O. Couet; B. Ferrero-Merlino; et al.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages, thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. This paper will give an overview of the architecture and design choices and will present the current status and future developments of the project.

  16. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thomson valve, turboexpander and compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis results give a clear idea of the parameter values to choose before the actual plant is implemented in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  17. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and
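    ANOPP2's own APIs are not shown here; the sketch below only illustrates one elementary building block behind community noise metrics, namely the energy summation of band sound pressure levels into an overall level. The band values are illustrative.

```python
# Energy summation of one-third-octave band SPLs into an overall level:
# OASPL = 10 * log10( sum_i 10^(L_i / 10) ). Band levels below are illustrative.
import math

band_levels_db = [62.0, 65.5, 68.0, 71.2, 69.8, 66.4, 60.1]  # dB re 20 uPa

oaspl = 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in band_levels_db))
print(f"overall SPL = {oaspl:.1f} dB")
```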

  18. 3D-Aided-Analysis Tool for Lunar Rover

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; LI Guo-peng; REN Xin; LIU Jian-jun; GAO Xing-ye; ZOU Xiao-duan

    2013-01-01

    The 3D-Aided-Analysis Tool (3DAAT), a virtual reality system, is built up in this paper. 3DAAT integrates kinematic and dynamic models of the rover as well as a real lunar surface terrain model. The modeling methods proposed in this paper include constructing the lunar surface, constructing 3D models of the lander and rover, and building up a kinematic model of the rover body. Photogrammetry techniques and remote sensing information are used to generate the terrain model of the lunar surface. According to the implementation results, 3DAAT is an effective assistance system for making exploration plans and analyzing the status of the rover.

  19. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U;

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quantitative phylogenetic and functional compositions of metagenomes, to compare compositions of multiple metagenomes and to produce intuitive visual representations of such analyses. AVAILABILITY: SmashCommunity is freely available at http://www.bork.embl.de/software/smash CONTACT: bork@embl.de

  20. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
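    NREL's degradation model is not public in this abstract, so the sketch below is a generic, hypothetical calendar-fade model of the kind BLAST-like tools evaluate: square-root-of-time capacity fade with an Arrhenius temperature dependence. All coefficients are invented.

```python
# Hypothetical calendar-fade model: capacity loss grows with sqrt(time) and an
# Arrhenius temperature factor. Coefficients are illustrative only.
import math

R = 8.314          # J/(mol*K), gas constant
EA = 50_000.0      # J/mol, assumed activation energy
A = 8.0e5          # pre-exponential factor, assumed

def relative_capacity(days: float, temp_c: float) -> float:
    """Fraction of initial capacity remaining after 'days' of storage at temp_c."""
    k = A * math.exp(-EA / (R * (temp_c + 273.15)))
    fade = k * math.sqrt(days)
    return max(0.0, 1.0 - fade)

for temp in (10, 25, 40):
    print(f"{temp:2d} C: {relative_capacity(8 * 365, temp):.3f} of capacity after 8 years")
```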

  1. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  2. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  3. Comparison and analysis of collisional-radiative models at the NLTE-7 workshop

    International Nuclear Information System (INIS)

    We present the main results of the 7th Non-Local Thermodynamic Equilibrium (NLTE) Code Comparison Workshop held in December 2011 in Vienna, Austria. More than twenty researchers from nine countries, who actively work on development of collisional-radiative codes for plasma kinetics modeling, attended the meeting and submitted their results for a number of comparison cases. The cases included free electron-laser-inspired time-dependent relaxation of photoexcited Ne-like Ar, ionization balance and spectra for highly charged tungsten, spectroscopic diagnostics of krypton L-shell spectra, and an investigation of Ne model convergence with principal quantum number. (authors)

  4. EDITORIAL: Proceedings of the 12th Gravitational Wave Data Analysis Workshop (GWDAW 12), Cambridge, MA, USA, 13-16 December 2007

    Science.gov (United States)

    Hughes, S.; Katsavounidis, E.

    2008-09-01

    It was a great pleasure and an honor for us to host the 12th Gravitational Wave Data Analysis Workshop (GWDAW) at MIT and the LIGO Laboratory in Cambridge, Massachusetts, the place where this workshop series started in 1996. This time the conference was held at the conference facilities of the Royal Sonesta Hotel in Cambridge from 13-16 December 2007. This 12th GWDAW found us with the ground interferometers having just completed their most sensitive search for gravitational waves and as they were starting their preparation to bring online and/or propose more sensitive instruments. Resonant mass detectors continued to observe the gravitational wave sky with instruments that have been operating now for many years. LISA, the Laser Interferometer Space Antenna, was recently reviewed by NASA's Beyond Einstein Program Assessment Committee (BEPAC) convened by the National Research Council (NRC), which found that 'on purely scientific grounds LISA is the mission that is the most promising and least scientifically risky…thus, the committee gave LISA its highest scientific ranking'. Even so, JDEM, the Joint Dark Energy Mission, was identified to go first, with LISA following a few years after. New methods, analysis ideas, results from the analysis of data collected by the instruments, as well as Mock Data Challenges for LISA were reported at this conference. While data from the most recent runs of the instruments are still being analyzed, the first upper limit results show how even non-detection statements can be interesting astrophysics. Beyond these traditional aspects of GWDAW though, for the first time in this workshop we tried to bring the non-gravitational wave physics and astronomy community on board in order to present, discuss and propose ways to work together as we pursue the first detection of gravitational waves and as we hope to transition to gravitational wave astronomy in the near future. Overview talks by colleagues leading observations in the electromagnetic

  5. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    Science.gov (United States)

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  6. Proceedings Ninth Workshop on Quantitative Aspects of Programming Languages

    CERN Document Server

    Massink, Mieke; 10.4204/EPTCS.57

    2011-01-01

    This volume contains the proceedings of the Ninth Workshop on Quantitative Aspects of Programming Languages (QAPL 2011), held in Saarbrucken, Germany, April 1--3, 2011. QAPL 2011 is a satellite event of the European Joint Conferences on Theory and Practice of Software (ETAPS 2011). The workshop theme is on quantitative aspects of computation. These aspects are related to the use of physical quantities (storage space, time, bandwidth, etc.) as well as mathematical quantities (e.g. probability and measures for reliability, security and trust), and play an important (sometimes essential) role in characterising the behavior and determining the properties of systems. Such quantities are central to the definition of both the model of systems (architecture, language design, semantics) and the methodologies and tools for the analysis and verification of the systems properties. The aim of this workshop is to discuss the explicit use of quantitative information such as time and probabilities either directly in the mode...

  7. Proceedings Eighth Workshop on Quantitative Aspects of Programming Languages

    CERN Document Server

    Di Pierro, Alessandra; 10.4204/EPTCS.28

    2010-01-01

    This volume contains the proceedings of the Eighth Workshop on Quantitative Aspects of Programming Languages (QAPL 2010), held in Paphos, Cyprus, on March 27-28, 2010. QAPL 2010 is a satellite event of the European Joint Conferences on Theory and Practice of Software (ETAPS 2010). The workshop theme is on quantitative aspects of computation. These aspects are related to the use of physical quantities (storage space, time, bandwidth, etc.) as well as mathematical quantities (e.g. probability and measures for reliability, security and trust), and play an important (sometimes essential) role in characterising the behavior and determining the properties of systems. Such quantities are central to the definition of both the model of systems (architecture, language design, semantics) and the methodologies and tools for the analysis and verification of the systems properties. The aim of this workshop is to discuss the explicit use of quantitative information such as time and probabilities either directly in the model...

  8. Network workshop

    DEFF Research Database (Denmark)

    Bruun, Jesper; Evans, Robert Harry

    2014-01-01

    This paper describes the background for, realisation of and author reflections on a network workshop held at ESERA2013. As a new research area in science education, networks offer a unique opportunity to visualise and find patterns and relationships in complicated social or academic network data. These include student relations and interactions and epistemic and linguistic networks of words, concepts and actions. Network methodology has already found use in science education research. However, while networks hold the potential for new insights, they have not yet found wide use in the science education research community. With this workshop, participants were offered a way into network science based on authentic educational research data. The workshop was constructed as an inquiry lesson with emphasis on user autonomy. Learning activities had participants choose to work with one of two cases of networks

  9. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone Java application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
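    CGHPRO's own code is not reproduced here; the sketch below only illustrates the first steps such a pipeline automates, computing log2 test/reference ratios and median-centering them, with invented spot intensities and a deliberately naive gain/loss call in place of CBS or HMM segmentation.

```python
# Minimal array-CGH preprocessing sketch: log2 ratios, global median normalization,
# and a naive threshold call per clone. Not CGHPRO code; intensities are invented.
import numpy as np

test_signal = np.array([1520.0, 1490.0, 3100.0, 1505.0, 760.0, 1470.0])
ref_signal  = np.array([1500.0, 1510.0, 1480.0, 1495.0, 1505.0, 1500.0])

log2_ratio = np.log2(test_signal / ref_signal)
normalized = log2_ratio - np.median(log2_ratio)   # center the bulk of clones on 0

# Naive gain/loss calls; real tools use segmentation (e.g. CBS or HMMs) instead.
calls = np.where(normalized > 0.3, "gain", np.where(normalized < -0.3, "loss", "normal"))
for r, c in zip(normalized, calls):
    print(f"{r:+.2f}  {c}")
```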

  10. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.
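    As a minimal illustration of the accounting idea behind net energy analysis, the sketch below computes an energy return on investment (EROI) and net energy for a few hypothetical plants; all figures are invented and are not the author's results.

```python
# Simple net-energy accounting over a plant lifetime (illustrative numbers only).

def eroi(lifetime_output_gj: float, lifetime_input_gj: float) -> float:
    """Energy delivered divided by energy invested over the plant lifetime."""
    return lifetime_output_gj / lifetime_input_gj

plants = {
    # name: (lifetime electricity delivered, lifetime energy invested), in GJ, assumed
    "coal":    (9.0e8, 1.0e8),
    "nuclear": (9.5e8, 0.9e8),
    "solar":   (2.0e8, 0.4e8),
}

for name, (out_gj, in_gj) in plants.items():
    ratio = eroi(out_gj, in_gj)
    net = out_gj - in_gj
    print(f"{name:8s} EROI = {ratio:4.1f}, net energy = {net:.2e} GJ")
```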

  11. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input stream for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation - the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design of product development process.

  12. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
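    The Message Analyzer and its rule language are implemented in C++ within the NOvA DAQ; the sketch below is only a Python analogue of one correlation rule, with an invented message format, showing the general idea of triggering once a pattern recurs within a time window.

```python
# Python analogue (not the NOvA C++ implementation) of a correlation rule:
# raise an alert when a pattern matches N times within a sliding time window.
import re
from collections import deque

PATTERN = re.compile(r"ERROR .*buffer overflow on node (?P<node>\w+)")  # hypothetical format
WINDOW_S = 60.0
THRESHOLD = 3

recent_hits = deque()   # timestamps of recent matches

def process(timestamp: float, line: str) -> None:
    match = PATTERN.search(line)
    if not match:
        return
    recent_hits.append(timestamp)
    while recent_hits and timestamp - recent_hits[0] > WINDOW_S:
        recent_hits.popleft()
    if len(recent_hits) >= THRESHOLD:
        print(f"ALERT at t={timestamp:.0f}s: repeated buffer overflows "
              f"(last on node {match.group('node')})")

# Example log stream (invented)
for t, msg in [(0, "INFO run started"),
               (10, "ERROR dcm02 buffer overflow on node dcm02"),
               (25, "ERROR dcm02 buffer overflow on node dcm02"),
               (40, "ERROR dcm02 buffer overflow on node dcm02")]:
    process(t, msg)
```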

  13. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY '93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  14. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic realtime correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  15. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  16. A tool for finite element deflection analysis of wings

    Energy Technology Data Exchange (ETDEWEB)

    Carlen, Ingemar

    2005-03-01

    A first version (ver 0.1) of a new tool for finite element deflection analysis of wind turbine blades is presented. The software is called SOLDE (SOLid blaDE), and was developed as a Matlab shell around the free finite element codes CGX (GraphiX - pre-processor), and CCX (CrunchiX - solver). In the present report a brief description of SOLDE is given, followed by a basic users guide. The main features of SOLDE are: - Deflection analysis of wind turbine blades, including 3D effects and warping. - Accurate prediction of eigenmodes and eigenfrequencies. - Derivation of 2-node slender elements for use in various aeroelastic analyses. The main differences between SOLDE and other similar tools can be summarised as: - SOLDE was developed without a graphical user interface or a traditional text file input deck. Instead the input is organised as Matlab data structures that have to be formed by a user provided pre-processor. - SOLDE uses a solid representation of the geometry instead of a thin shell approximation. The benefit is that the bending-torsion couplings will automatically be correctly captured. However, a drawback with the current version is that the equivalent orthotropic shell idealisation violates the local bending characteristics, which makes the model useless for buckling analyses. - SOLDE includes the free finite element solver CCX, and thus no expensive commercial software (e.g. Ansys, or Nastran) is required to produce results.

  17. Regional energy planning through SWOT analysis and strategic planning tools.

    Energy Technology Data Exchange (ETDEWEB)

    Terrados, J.; Almonacid, G.; Hontoria, L. [Research Group IDEA, Polytechnics School, Campus Las Lagunillas, Edificio A3, University of Jaen, 23071 Jaen (Spain)

    2007-08-15

    Strategic planning processes, which are commonly used as a tool for region development and territorial structuring, can be harnessed by politicians and public administrations, at the local level, to redesign the regional energy system and encourage renewable energy development and environmental preservation. In this sense, the province of Jaen, a southern Spanish region whose economy is mainly based on olive agriculture, has carried out its strategic plan aiming at a major socioeconomic development. Under the leadership of the provincial government and the University of Jaen, main provincial institutions joined to propose the elaboration of a participatory strategic plan for the whole province. Here, the elaboration of the energy part of the plan, which was directly focused on the exploitation of renewable resources, mainly solar and biomass energy, and which highlights the effectiveness of techniques from business management applied to a sustainable energy model design is presented. Renewable Energy development during the first years of plan execution is presented, and the impact of additional issues is discussed. It is concluded that, although multicriteria decision-making technologies (MCDA) are extensively used in energy planning, a different approach can be utilized to incorporate techniques from strategic analysis. Furthermore, SWOT (strengths, weaknesses, opportunities and threats) analysis has proved to be an effective tool and has constituted a suitable baseline to diagnose current problems and to sketch future action lines. (author)

  18. H3ABioNet computational metagenomics workshop in Mauritius: training to analyse microbial diversity for Africa

    OpenAIRE

    Baichoo, Shakuntala; Botha, Gerrit; Jaufeerally-Fakim, Yasmina; Mungloo-Dilmohamud, Zahra; Lundin, Daniel; Mulder, Nicola; Promponas, Vasilis J.; Ouzounis, Christos A.

    2015-01-01

    In the context of recent international initiatives to bolster genomics research for Africa, and more specifically to develop bioinformatics expertise and networks across the continent, a workshop on computational metagenomics was organized during the end of 2014 at the University of Mauritius. The workshop offered background on various aspects of computational biology, including databases and algorithms, sequence analysis fundamentals, metagenomics concepts and tools, practical exercises, jou...

  19. Emerging methods and tools for environmental risk assessment, decision-making, and policy for nanomaterials: summary of NATO Advanced Research Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Linkov, Igor, E-mail: Igor.Linkov@usace.army.mi [U.S. Army Corps of Engineers, Environmental Laboratory (United States); Steevens, Jeffery, E-mail: Jeffery.A.Steevens@us.army.mi [U.S. Army ERDC (United States); Adlakha-Hutcheon, Gitanjali, E-mail: Gitanjali.Adlakha-Hutcheon@drdc-rddc.gc.c [Defense Research and Development Canada (Canada); Bennett, Erin, E-mail: ebennett@bioengineering.co [Intertox Inc. and Bioengineering Group (United States); Chappell, Mark, E-mail: Mark.a.chappell@usace.army.mi [U.S. Army Corps of Engineers, Environmental Laboratory (United States); Colvin, Vicki, E-mail: colvin@rice.ed [Rice University, ICON (United States); Davis, J. Michael, E-mail: Davis.Jmichael@epa.go [Office of Research and Development, U.S. Environmental Protection Agency, National Center for Environmental Assessment (United States); Davis, Thomas, E-mail: ta.davis@umontreal.c [University of Montreal, Environment Canada and Department of Chemistry (Canada); Elder, Alison, E-mail: Alison_Elder@urmc.rochester.ed [University of Rochester, Department of Environmental Medicine (United States); Foss Hansen, Steffen, E-mail: sfh@er.dtu.d [Technical University of Denmark, Department of Environmental Engineering, NanoDTU (Denmark); Hakkinen, Pertti Bert, E-mail: berthakkinen@gmail.co [Toxicology Excellence for Risk Assessment (TERA) (United States); Hussain, Saber M., E-mail: Saber.Hussain@wpafb.af.mi [Air Force Research Laboratory (United States); Karkan, Delara, E-mail: Delara_karkan@hc-sc.gc.c [Health Canada (Canada); Korenstein, Rafi, E-mail: korens@post.tau.ac.i [Marian Gertner Institute for Medical Nanosystems, Tel Aviv University, Department of Physiology and Pharmacology, Faculty of Medicine (Israel); Lynch, Iseult, E-mail: iseult@fiachra.ucd.i [School of Chemistry and Chemical Biology, University College Dublin, Irish Centre for Colloid Science and Biomaterials (Ireland); Metcalfe, Chris, E-mail: cmetcalfe@trentu.c [Trent University (Canada)

    2009-04-15

    Nanomaterials and their associated technologies hold promising opportunities for the development of new materials and applications in a wide variety of disciplines, including medicine, environmental remediation, waste treatment, and energy conservation. However, current information regarding the environmental effects and health risks associated with nanomaterials is limited and sometimes contradictory. This article summarizes the conclusions of a 2008 NATO workshop designed to evaluate the wide-scale implications (e.g., benefits, risks, and costs) of the use of nanomaterials on human health and the environment. A unique feature of this workshop was its interdisciplinary nature and focus on the practical needs of policy decision makers. Workshop presentations and discussion panels were structured along four main themes: technology and benefits, human health risk, environmental risk, and policy implications. Four corresponding working groups (WGs) were formed to develop detailed summaries of the state-of-the-science in their respective areas and to discuss emerging gaps and research needs. The WGs identified gaps between the rapid advances in the types and applications of nanomaterials and the slower pace of human health and environmental risk science, along with strategies to reduce the uncertainties associated with calculating these risks.

  20. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2014-02-01

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggested that IPMP 2013 could be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology.
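    IPMP 2013 itself is a point-and-click tool; the sketch below only reproduces, with SciPy, the kind of fit it performs, using a re-parameterized (Zwietering-type) Gompertz growth model on invented log-count data.

```python
# Fit a re-parameterized Gompertz growth model,
#   y(t) = y0 + C * exp(-exp(mumax * e / C * (lag - t) + 1)),
# to invented log10 plate-count data with SciPy (not IPMP 2013 itself).
import numpy as np
from scipy.optimize import curve_fit

E = np.e

def gompertz(t, y0, C, mumax, lag):
    return y0 + C * np.exp(-np.exp(mumax * E / C * (lag - t) + 1.0))

t_h = np.array([0, 2, 4, 6, 8, 10, 12, 16, 20, 24], dtype=float)          # hours
y_obs = np.array([3.0, 3.05, 3.2, 3.9, 4.8, 5.7, 6.5, 7.4, 7.8, 7.9])     # log10 CFU/g

popt, _ = curve_fit(gompertz, t_h, y_obs, p0=[3.0, 5.0, 0.5, 3.0])
y0, C, mumax, lag = popt
print(f"y0={y0:.2f} log CFU/g, increase C={C:.2f}, "
      f"mu_max={mumax:.2f} log CFU/h, lag={lag:.2f} h")
```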

  1. Cellular barcoding tool for clonal analysis in the hematopoietic system.

    Science.gov (United States)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J C; de Haan, Gerald; Bystrykh, Leonid V

    2010-04-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting or polymerase chain reaction-based methods. Although these methods are useful in principle, they generally provide a low-resolution, biased, and incomplete assessment of clonality. To overcome those limitations, we labeled retroviral vectors with random sequence tags or "barcodes." On integration, each vector introduces a unique, identifiable, and heritable mark into the host cell genome, allowing the clonal progeny of each cell to be tracked over time. By coupling the barcoding method to a sequencing-based detection system, we could identify major and minor clones in 2 distinct cell culture systems in vitro and in a long-term transplantation setting. In addition, we demonstrate how clonal analysis can be complemented with transgene expression and integration site analysis. This cellular barcoding tool permits a simple, sensitive assessment of clonality and holds great promise for future gene therapy protocols in humans, and any other applications when clonal tracking is important.
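    The authors' sequencing pipeline is not reproduced here; the sketch below shows the core counting step of barcode-based clonal tracking, with invented reads and a hypothetical flanking vector sequence.

```python
# Minimal clonal-tracking sketch: extract a random barcode flanked by known
# vector sequence from each read and tally clone sizes. All sequences invented.
import re
from collections import Counter

# hypothetical constant vector sequence flanking the 8-bp random barcode
BARCODE_RE = re.compile(r"ACTGAC([ACGT]{8})GTCAGT")

reads = [
    "TTACTGACAAGGTCCAGTCAGTGG",
    "CCACTGACAAGGTCCAGTCAGTAA",
    "GGACTGACTTTACGGAGTCAGTCC",
    "ACACTGACAAGGTCCAGTCAGTTT",
    "NNACTGACGGGGGGGGGTCAGTAA",
]

clone_counts = Counter()
for read in reads:
    m = BARCODE_RE.search(read)
    if m:                       # unmatched reads (sequencing errors etc.) are skipped
        clone_counts[m.group(1)] += 1

for barcode, n in clone_counts.most_common():
    print(f"{barcode}: {n} reads")
```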

  2. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  3. Curriculum for Development: Analysis and Review of Processes, Products and Outcomes. Final Report: Sub-Regional Curriculum Workshop (Colombo, Sri Lanka, October 1-30, 1976).

    Science.gov (United States)

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and Oceania.

    Presenting proceedings and materials covered at an Asian curriculum workshop involving 15 participants from 7 countries (Afghanistan, Bangladesh, Indonesia, Malaysia, the Philippines, India, and Sri Lanka), this document includes: a discussion of criteria for curriculum analysis re: health education and nutrition instruction for grades 6-10; a…

  4. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  5. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute a vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
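    Urban-CAT's models are not reproduced here; the sketch below is a generic, hypothetical version of the per-cell scoring it describes, normalizing a few invented metrics across grid cells and combining them with weights into a vulnerability score.

```python
# Hypothetical per-cell vulnerability scoring: min-max normalize each metric
# across the grid and combine with weights. Metric names, values and weights
# are invented for illustration.
import numpy as np

# rows = grid cells; columns = (impervious surface %, population density,
# elevation above nearest stream in m -- lower is worse for flooding)
metrics = np.array([
    [80.0, 5200.0,  1.5],
    [35.0, 1200.0, 12.0],
    [60.0, 3400.0,  4.0],
    [10.0,  300.0, 25.0],
])
weights = np.array([0.4, 0.4, 0.2])
higher_is_worse = np.array([True, True, False])

# normalize each metric to [0, 1], flipping those where low values are worse
norm = (metrics - metrics.min(axis=0)) / (metrics.max(axis=0) - metrics.min(axis=0))
norm[:, ~higher_is_worse] = 1.0 - norm[:, ~higher_is_worse]

vulnerability = norm @ weights
for i, score in enumerate(vulnerability):
    print(f"cell {i}: vulnerability = {score:.2f}")
```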

  6. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  7. PSIM: A TOOL FOR ANALYSIS OF DEVICE PAIRING METHODS

    Directory of Open Access Journals (Sweden)

    Yasir Arfat Malkani

    2009-10-01

    Full Text Available Wireless networks are commonplace nowadays and almost all modern devices support wireless communication in some form. These networks differ from more traditional computing systems due to the ad-hoc and spontaneous nature of interactions among devices. These systems are prone to security risks, such as eavesdropping, and require different techniques as compared to traditional security mechanisms. Recently, secure device pairing in wireless environments has received substantial attention from many researchers. As a result, a significant set of techniques and protocols have been proposed to deal with this issue. Some of these techniques consider devices equipped with infrared, laser, ultrasound transceivers or 802.11 network interface cards; while others require embedded accelerometers, cameras and/or LEDs, displays, microphones and/or speakers. However, many of the proposed techniques or protocols have not been implemented at all, while others are implemented and evaluated in a stand-alone manner without being compared with other related work [1]. We believe that this is because of the lack of specialized tools that provide a common platform to test the pairing methods. As a consequence, we designed such a tool. In this paper, we present the design and development of the Pairing Simulator (PSim) that can be used to perform the analysis of device pairing methods.

  8. Best-practice checklists for tephra collection, analysis and reporting - a draft consensus from the Tephra 2014 workshop

    Science.gov (United States)

    Wallace, K.; Bursik, M. I.; Kuehn, S. C.

    2015-12-01

    The Tephra 2014 Workshop (3-7 August 2014) discussed major developments, best practices, future directions, and critical needs in tephra studies from both volcanological and tephrochronological perspectives. In a consensus-seeking session held at the end of the workshop, the international group of over 70 tephra scientists focused on two complementary themes: (A) the need for common best practices in tephra data collection and reporting among different scientific disciplines, and (B) the need to establish common, accessible mechanisms for tephra data archiving and retrieval. Tephra is the focus of a wide range of research in volcanology, petrology, tephrochronology and tephrostratigraphy (with applications in studies of environmental/climate change, surface processes, paleolimnology, etc.), ash dispersion and fallout modeling, and archaeology, paleoanthropology, and human origins. Researchers in each field have specific objectives that may or may not overlap. The focus on best practices is a first step towards standardized protocols for the collection, analysis and reporting of tephra data across and within disciplines. Such uniformity will facilitate the development and population of useful tephra databases. Current initiatives include the development of best practice checklists as a starting point for ensuring uniformity and completeness. The goals of the checklists are to: 1) ensure consistency among tephra scientists, regardless of research focus, 2) provide basic, comprehensible metadata requirements, especially for those who collect tephra as a peripheral part of their research, 3) help train students, and 4) help journal editors to know which essential metadata should be included in published works. Consistency in tephra sample collection, analysis, and reporting attained by use of these checklists should ultimately aid in improving correlation of tephras across geographically large areas, and facilitate collaborative tephra research. Current and future

  9. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms from either an IT or a marketing branch. The paper contributes by highlighting the support that web analytics and web metrics tools available on the market offer to management, based on the growing need to understand and predict global market trends.

  10. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that has to be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is offered by the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of the elements’ layout. The authors’ research examines the ability of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to the criteria required for diagram perception.

  11. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is VERY important to note that the tests below must be performed for both the Graphical User Interface and the script. The examples are illustrated from a scripting perspective, because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.
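
    A minimal sketch of the table-driven approach such a plan implies, in Python: the field names, allowed ranges and the accept/reject checker below are hypothetical stand-ins, not GMAT's actual fields or test harness.

      # Table-driven input-range test sketch. Field names, ranges, and the checker
      # are hypothetical stand-ins; a real harness would drive GMAT scripts or the GUI.
      ALLOWED = {
          "Spacecraft.DryMass": lambda v: isinstance(v, (int, float)) and v >= 0.0,
          "Propagator.MaxStep": lambda v: isinstance(v, (int, float)) and v > 0.0,
      }

      CASES = {
          "Spacecraft.DryMass": {"valid": [0.0, 850.0, 1e6], "invalid": [-1.0, "abc"]},
          "Propagator.MaxStep": {"valid": [1e-3, 60.0], "invalid": [0.0, -5.0]},
      }

      def check_all():
          failures = []
          for field, groups in CASES.items():
              accepts = ALLOWED[field]   # stand-in for "run GMAT and see if it accepts"
              for v in groups["valid"]:
                  if not accepts(v):
                      failures.append((field, v, "rejected a valid input"))
              for v in groups["invalid"]:
                  if accepts(v):
                      failures.append((field, v, "accepted an invalid input"))
          return failures

      print(check_all())   # an empty list means every case behaved as expected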

  12. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    International Nuclear Information System (INIS)

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, the FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of the FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The speed of the calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered
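
    As a rough illustration of the component-by-component bookkeeping such a tool performs, the sketch below computes natural-uranium feed, tails and separative work for an enrichment step using the standard feed/SWU balance; the scenario numbers are illustrative and not taken from FAST.

      import math

      def V(x):
          """Separation potential used in enrichment calculations."""
          return (2.0 * x - 1.0) * math.log(x / (1.0 - x))

      def enrichment_balance(product_kg, xp, xf=0.00711, xt=0.0025):
          """Natural-uranium feed, tails and separative work for a product batch."""
          feed = product_kg * (xp - xt) / (xf - xt)
          tails = feed - product_kg
          swu = product_kg * V(xp) + tails * V(xt) - feed * V(xf)
          return feed, tails, swu

      # Illustrative once-through scenario: 20 t of 4.5 wt% enriched fuel per reload.
      feed, tails, swu = enrichment_balance(20_000.0, 0.045)
      print(f"natural U feed: {feed/1000:.1f} t, tails: {tails/1000:.1f} t, SWU: {swu:,.0f} kg-SWU")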

  13. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
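
    The abstract does not name the package, so as a hedged analogue of the first group of methods (robust outlier detection in higher dimensions), the sketch below uses scikit-learn's minimum covariance determinant estimator and a chi-square cutoff on robust Mahalanobis distances; it illustrates the idea rather than the R package itself.

      import numpy as np
      from scipy.stats import chi2
      from sklearn.covariance import MinCovDet

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 5))
      X[:10] += 6.0                            # plant a few obvious outliers

      mcd = MinCovDet(random_state=0).fit(X)   # robust location/scatter (MCD)
      d2 = mcd.mahalanobis(X)                  # squared robust Mahalanobis distances
      cutoff = chi2.ppf(0.975, df=X.shape[1])  # usual chi-square outlier cutoff
      print("flagged outliers:", np.flatnonzero(d2 > cutoff))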

  14. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  15. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: The funnel diagram is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, where PET is an estimate. The FAT-PET MRA is a meta regression analysis...
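
    A FAT-PET regression of this kind amounts to a precision-weighted regression of the reported estimates on their standard errors; the sketch below, on simulated estimates, shows the idea (the intercept plays the role of the meta average, the slope coefficient tests funnel asymmetry).

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      se = rng.uniform(0.05, 0.5, size=80)          # standard errors of the estimates
      effect = 0.3 + 1.0 * se + rng.normal(0, se)   # simulated estimates with some reporting bias

      X = sm.add_constant(se)                       # columns: intercept (PET), SE (FAT)
      fit = sm.WLS(effect, X, weights=1.0 / se**2).fit()
      print(fit.params)       # params[0]: asymmetry-corrected meta average (PET)
      print(fit.pvalues[1])   # FAT: is the SE coefficient significant (funnel asymmetry)?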

  16. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  17. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history, supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behaviour of 9528 online gamblers who voluntarily used an RG tool was analysed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a higher extent and were found to have a greater risk of excessive gambling than the other classes.
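
    As a loose analogue of the latent class analysis described (which properly works on categorical indicators with dedicated software), the sketch below clusters simulated per-user counts of the three observed variables with a Gaussian mixture; the data and class structure are invented for illustration, not the study's.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)
      # Hypothetical usage counts per user: [site visits, self-tests, advice used]
      X = np.vstack([
          rng.poisson([1, 0, 0], size=(300, 3)),   # low-engagement users
          rng.poisson([6, 3, 0], size=(150, 3)),   # "self-tester"-like users
          rng.poisson([8, 4, 5], size=(50, 3)),    # "multi-function"-like users
      ]).astype(float)

      gm = GaussianMixture(n_components=3, n_init=5, random_state=0).fit(X)
      labels = gm.predict(X)
      for k in range(3):
          print(k, X[labels == k].mean(axis=0).round(1), int((labels == k).sum()))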

  18. SisRadiologia: a new software tool for analysis of radiological accidents and incidents in industrial radiography

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da, E-mail: araujocamila@yahoo.com.br, E-mail: dasilva@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Araujo, Rilton A., E-mail: consultoria@maximindustrial.com.br [Maxim Industrial Assessoria TI, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    According to the International Atomic Energy Agency (IAEA), many efforts have been made by Member States aiming at better control of radioactive sources. Accidents have mostly happened in practices classified by the IAEA as high radiological risk (categories 1 and 2), notably those related to radiotherapy, large irradiators and industrial radiography. Worldwide, more than 40 radiological accidents have been recorded in the industrial radiography area, involving 37 workers, 110 members of the public and 12 fatalities. Records display 5 severe radiological accidents in industrial radiography activities in Brazil, in which 7 workers and 19 members of the public were involved. Such events led to radiodermatitis of hands and fingers, but to no death occurrence. The purpose of this study is to present a computational program that allows data acquisition and recording in the company, in such a way as to ease a further detailed analysis of the radiological event, besides providing the learning cornerstones aimed at avoiding future occurrences. After one year of application of the 'Industrial SisRadiologia' computational program - and mostly based upon the workshop about Analysis and Dose Calculation of Radiological Accidents in Industrial Radiography (Workshop sobre Analise e Calculo de dose de acidentes Radiologicos em Radiografia Industrial - IRD 2012), in which several Radiation Protection officers took part - it can be concluded that the computational program is a powerful tool for data acquisition, as well as for recording and surveying accident and incident events in Industrial Radiography. The program proved to be efficient in the elaboration of reports to the Brazilian Regulatory Authority, and very useful in training workers to fix the lessons learned from radiological events.

  19. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 90's to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1] but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2] but appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. Weights given to spacecraft allow one to minimize the influence of a spacecraft if its location or the quality of its data is not appropriate, or simply to extract subsets of spacecraft from the cluster. Estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
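
    For the standard four-spacecraft case, the reciprocal vectors and the barycentric gradient estimator take only a few lines; the sketch below follows the usual tetrahedron construction and checks it on a linear field (the generalized, weighted case presented here is not reproduced).

      import numpy as np

      def reciprocal_vectors(r):
          """Reciprocal vectors of a 4-spacecraft tetrahedron (positions r: shape (4, 3))."""
          k = np.zeros((4, 3))
          for a in range(4):
              b, c, d = [i for i in range(4) if i != a]
              cross = np.cross(r[c] - r[b], r[d] - r[b])
              k[a] = cross / np.dot(r[a] - r[b], cross)
          return k

      # Illustrative tetrahedron (km) and a linear scalar field f(x) = g . x
      r = np.array([[0., 0., 0.], [100., 0., 0.], [0., 120., 0.], [0., 0., 90.]])
      g_true = np.array([1.5e-3, -2.0e-3, 0.5e-3])
      f = r @ g_true

      k = reciprocal_vectors(r)
      grad_est = (k * f[:, None]).sum(axis=0)   # barycentric gradient estimate: sum_a k_a * f_a
      print(np.allclose(grad_est, g_true))      # exact for a linear field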

  20. Research on Tool Management Based Total Life for DNC Workshop (面向DNC车间的刀具全寿命周期管理技术研究)

    Institute of Scientific and Technical Information of China (English)

    殷锐; 陈金亮

    2013-01-01

    Tools are an indispensable resource of the DNC workshop, and tool management based on the total life cycle has attracted the attention of many large manufacturing enterprises. Tool marking, identification and life prediction are the difficult and pivotal parts of tool management. For tool marking and identification, direct marking technology and automatic identification technology for tools are discussed, and a strip light source is proposed as an auxiliary illuminant to improve the barcode recognition rate. For tool life prediction, a forecasting method based on a BP (back-propagation) neural network is proposed; the basic idea of the algorithm is introduced and experimental results are given.
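
    As a hedged sketch of the BP-neural-network life-prediction idea, the code below fits a small feed-forward network to synthetic tool-life data generated from a Taylor-like law; the cutting parameters, data and network size are illustrative, not the paper's.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(3)
      # Hypothetical cutting parameters: [speed m/min, feed mm/rev, depth of cut mm]
      X = rng.uniform([80, 0.05, 0.5], [250, 0.4, 3.0], size=(400, 3))
      # Synthetic Taylor-like tool life (minutes) with noise, as a stand-in for shop data
      life = 3.0e5 / (X[:, 0] ** 1.6 * X[:, 1] ** 0.6 * X[:, 2] ** 0.3)
      life *= rng.lognormal(0.0, 0.1, size=len(life))

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
      model.fit(X[:300], np.log(life[:300]))        # log target stabilises the fit
      print("R^2 on held-out cuts:", round(model.score(X[300:], np.log(life[300:])), 2))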

  1. Micropollutants in urban watersheds : substance flow analysis as management tool

    Science.gov (United States)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted by traffic, such as copper or PAHs, reach surface water during rain events, often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination by these compounds in urban areas is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to have flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we test the application of SFA to a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment
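
    A substance flow analysis of this kind reduces, at its simplest, to source inventories multiplied by transfer coefficients along the stormwater, sewer and treatment pathways; the sketch below shows that bookkeeping with invented flows and coefficients, not the Lausanne data.

      # Minimal substance-flow sketch for one metal in an urban watershed.
      # All flows (kg/yr) and transfer coefficients are illustrative placeholders.
      sources = {"roof/facade runoff": 900.0, "traffic areas": 700.0, "wastewater": 400.0}

      to_stormwater = {"roof/facade runoff": 1.0, "traffic areas": 1.0, "wastewater": 0.0}
      wwtp_removal = 0.85    # fraction retained in sludge at the treatment plant
      cso_fraction = 0.05    # share of combined sewage spilled untreated

      storm_load = sum(q * to_stormwater[s] for s, q in sources.items())
      sewer_input = sources["wastewater"]
      cso_load = sewer_input * cso_fraction
      wwtp_effluent = sewer_input * (1 - cso_fraction) * (1 - wwtp_removal)

      to_surface_water = storm_load + cso_load + wwtp_effluent
      print(f"load reaching the receiving water: {to_surface_water:.0f} kg/yr")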

  2. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperature and pressure in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans etc. The buckling initiation force is sensitive to small changes of any initial geometric out-of-straightness, thus the modeling of the as-laid state of the pipeline is an important part of the design process. Recently some dedicated finite element programs have been created, making modeling of the offshore environment more convenient than has been the case with the use of general purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement of the buckling results in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)
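
    The driving force for lateral buckling is the effective axial compressive force that builds up in a restrained, heated and pressurised pipeline; a commonly quoted expression for the fully restrained case is sketched below with illustrative inputs, not the paper's case data.

      import math

      # Fully restrained effective axial force sketch (commonly quoted screening formula).
      D_o   = 0.3239      # outer diameter [m]
      t     = 0.0175      # wall thickness [m]
      E     = 207e9       # Young's modulus [Pa]
      alpha = 1.17e-5     # thermal expansion coefficient [1/K]
      nu    = 0.3         # Poisson ratio
      dT    = 80.0        # temperature rise [K]
      dP    = 20e6        # internal overpressure [Pa]

      D_i = D_o - 2 * t
      A_s = math.pi / 4 * (D_o**2 - D_i**2)    # steel cross-section
      A_i = math.pi / 4 * D_i**2               # internal bore area

      N_temp  = E * A_s * alpha * dT           # thermal term
      N_press = dP * A_i * (1 - 2 * nu)        # pressure (Poisson/end-cap) term
      print(f"effective compressive force: {(N_temp + N_press)/1e6:.1f} MN")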

  3. Tool for Sizing Analysis of the Advanced Life Support System

    Science.gov (United States)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated either dynamically or in a steady-state manner. Using Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
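
    The Equivalent System Mass metric mentioned above combines mass, volume, power, cooling and crew time into a single kg-equivalent figure; the sketch below shows the usual form of the calculation with placeholder equivalency factors and subsystem numbers, not ALSSAT's values.

      # Equivalent System Mass (ESM) sketch for comparing life-support options.
      # Equivalency factors and subsystem numbers are illustrative placeholders.
      def esm(mass_kg, volume_m3, power_kw, cooling_kw, crew_time_hr_per_yr, duration_yr,
              v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=0.5):
          """ESM = M + V*Veq + P*Peq + C*Ceq + CT*D*CTeq (kg-equivalent)."""
          return (mass_kg
                  + volume_m3 * v_eq
                  + power_kw * p_eq
                  + cooling_kw * c_eq
                  + crew_time_hr_per_yr * duration_yr * ct_eq)

      option_a = esm(1200.0, 4.0, 2.5, 2.5, 120.0, 2.0)
      option_b = esm(900.0, 6.0, 4.0, 4.0, 60.0, 2.0)
      print(f"option A: {option_a:.0f} kg-eq, option B: {option_b:.0f} kg-eq")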

  4. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  5. Bioanalyzer: An Efficient Tool for Sequence Retrieval, Analysis and Manipulation

    Directory of Open Access Journals (Sweden)

    Hassan Tariq

    2010-12-01

    Full Text Available Bioanalyzer provides a combination of tools that have not previously been assembled together. The software offers a list of tools that can be important for different researchers. The aim of developing this kind of software is to provide a unique set of tools on one platform in a more efficient and better way than the software or web tools currently available. It is a stand-alone application, so it can save the time and effort of locating individual tools on the net. A flexible design has made it easy to expand in the future. We will make it publicly available soon.
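
    For comparison, typical retrieval-and-manipulation steps of this kind (parsing a FASTA record, computing GC content, taking a reverse complement) can be sketched with Biopython; this is an assumed stand-in for illustration, not Bioanalyzer's own code.

      from io import StringIO
      from Bio import SeqIO

      # Parse a tiny in-memory FASTA record (a stand-in for a retrieved sequence).
      fasta = StringIO(">demo\nATGGCGTACGCTAGCTAGGCTTAA\n")
      record = next(SeqIO.parse(fasta, "fasta"))

      gc = 100.0 * sum(record.seq.count(b) for b in "GC") / len(record.seq)
      print(record.id, len(record.seq), f"GC {gc:.1f}%")
      print("reverse complement:", record.seq.reverse_complement())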

  6. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.;

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material is...
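
    The crack-growth part of such a life estimate is typically a Paris-law integration from the initiated crack size to a critical size; the sketch below shows that calculation with illustrative material constants, geometry factor and stress range, not the values of the investigated die insert.

      import math

      # Paris-law sketch of the crack-growth part of a die-life estimate (illustrative inputs).
      C, m = 1.0e-11, 3.0          # da/dN = C * (dK)^m, with dK in MPa*sqrt(m), a in m
      Y = 1.12                     # geometry factor
      d_sigma = 400.0              # stress range per forming cycle [MPa]
      a0, a_crit = 0.2e-3, 3.0e-3  # initial and critical crack depths [m]

      a, cycles, da = a0, 0.0, 1.0e-6
      while a < a_crit:
          dK = Y * d_sigma * math.sqrt(math.pi * a)
          a += da
          cycles += da / (C * dK**m)
      print(f"forming cycles from crack initiation to failure: {cycles:,.0f}")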

  7. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
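
    Comparable waveform pre-processing (detrend, taper, band-pass) can be sketched with ObsPy, used here purely as an assumed illustration; SeismicCanvas itself is the C++/Qt application described above.

      import obspy

      st = obspy.read()                        # ObsPy's bundled example stream (3 traces)
      st.detrend("linear")                     # remove linear trend
      st.taper(max_percentage=0.05)            # cosine taper before filtering
      st.filter("bandpass", freqmin=1.0, freqmax=10.0)
      print(st)
      st.plot()                                # quick record display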

  8. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring about many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  9. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  10. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
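
    One measurement such a tool reports is the total harmonic distortion of the response to a sine input; the sketch below computes THD from an FFT of a synthetic, memoryless "tube-like" nonlinearity, as a hedged stand-in for a real measured device.

      import numpy as np

      # THD of a simulated nonlinear response to a 1 kHz sine (synthetic signal only).
      fs, f0, n = 48_000, 1_000, 48_000
      t = np.arange(n) / fs
      x = np.sin(2 * np.pi * f0 * t)
      y = np.tanh(2.0 * x)                     # simple memoryless "tube-like" nonlinearity

      spec = np.abs(np.fft.rfft(y * np.hanning(n)))
      freqs = np.fft.rfftfreq(n, 1 / fs)
      bins = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, 11)]
      fund, harms = spec[bins[0]], spec[bins[1:]]
      thd = np.sqrt(np.sum(harms**2)) / fund
      print(f"THD = {100*thd:.2f} %")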

  11. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, geochemistry and so on. However, considering the advantages of NAA, samples available only in small amounts or precious samples are the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to get a vast amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, while carbonaceous materials and heavy metals are concentrated in PM in urban areas, because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air change with season. For example, crustal elements increase in the dry season, and sea salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM, and increase in concentration during the winter season, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of some environmental samples, and source apportionment techniques are useful. (author)

  12. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool,...

  13. Highlights of the Workshop

    Science.gov (United States)

    Noor, Ahmed K.

    1997-01-01

    Economic stresses are forcing many industries to reduce cost and time-to-market, and to insert emerging technologies into their products. Engineers are asked to design faster, ever more complex systems. Hence, there is a need for novel design paradigms and effective design tools to reduce the design and development times. Several computational tools and facilities have been developed to support the design process. Some of these are described in subsequent presentations. The focus of the workshop is on the computational tools and facilities which have high potential for use in future design environment for aerospace systems. The outline for the introductory remarks is given. First, the characteristics and design drivers for future aerospace systems are outlined; second, simulation-based design environment, and some of its key modules are described; third, the vision for the next-generation design environment being planned by NASA, the UVA ACT Center and JPL is presented. The anticipated major benefits of the planned environment are listed; fourth, some of the government-supported programs related to simulation-based design are listed; and fifth, the objectives and format of the workshop are presented.

  14. CLARINET workshop 2001

    Energy Technology Data Exchange (ETDEWEB)

    Wensem, J. van [Soil Protection Technical Committee, The Hague (Netherlands)

    2003-07-01

    In spring 2001, the CLARINET workshop (CLARINET, 2001) on ecological risk assessment agreed on an outline of an EU framework on site-specific ecological risk assessment (SS-ERA). The main final conclusion of this workshop was: 'On the one hand there is agreement on the outline of an EU framework on ERA. On the other hand many details have not been filled in yet or have not been discussed yet. From these two facts it can be concluded that there is a good basis for filling in the ERA framework in the future, and ongoing discussion is recommended'. The following were identified as important common elements for a European framework for SS-ERA: - Generic values in the first tier; - Bioassays; - Bioavailability; - Land use specific; - Negotiable with stakeholders. Although the workshop agreed on a tiered approach, no final conclusion was drawn about the elements of each tier. In this special session of ConSoil we will continue the discussion about the use of SS-ERA and the possibilities for a European framework. We will start with introductory presentations focusing on four main topics for SS-ERA: - Implementation of site-specific ecological risk assessment as a regulatory tool: what to take into consideration; - The feasibility of bio-assays in site-specific ecological risk assessment; - Bio-availability; - Higher tier field research in ecological risk assessment: a case study. (orig.)

  15. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  16. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are really scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The actual emergency plan for the study area is composed of a real-time, highly detailed, decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  17. Analysis of Facial Injuries Caused by Power Tools.

    Science.gov (United States)

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of the use of power tools and its economic impact, and the characteristics of the hand injuries caused by power saws have been described. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data in relation to facial injuries caused by power saws that were gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of the facial injuries caused by power saws. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurements and functional disabilities. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  18. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.
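
    A quasi-steady heat-exchanger block of the kind described is often built on the effectiveness-NTU method; the sketch below shows a counter-flow example with illustrative capacity rates and conductance, not values from the blockset.

      import math

      # Effectiveness-NTU sketch of a counter-flow heat exchanger (illustrative numbers).
      def eff_counterflow(ntu, cr):
          if abs(cr - 1.0) < 1e-9:
              return ntu / (1.0 + ntu)
          return (1.0 - math.exp(-ntu * (1.0 - cr))) / (1.0 - cr * math.exp(-ntu * (1.0 - cr)))

      c_hot, c_cold = 1.2e3, 2.0e3          # capacity rates m_dot*cp [W/K]
      ua = 1.5e3                            # overall conductance [W/K]
      c_min, c_max = min(c_hot, c_cold), max(c_hot, c_cold)
      eps = eff_counterflow(ua / c_min, c_min / c_max)

      t_hot_in, t_cold_in = 150.0, 40.0     # inlet temperatures [deg C]
      q = eps * c_min * (t_hot_in - t_cold_in)
      print(f"effectiveness {eps:.2f}, heat duty {q/1e3:.1f} kW")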

  19. The Utility of a Geostationary Doppler radar applied to the hurricane analysis and prediction problem: A Report on the 1st Nexrad in Space Workshop

    Science.gov (United States)

    Tripoli, G. J.; Chandrasekar, V.; Chen, S. S.; Holland, G. J.; Im, E.; Kakar, R.; Lewis, W. E.; Marks, F. D.; Smith, E. A.; Tanelli, S.

    2007-12-01

    Last April the first Nexrad in Space (NIS) workshop was held in Miami, Florida to discuss the value and requirements for a possible satellite mission featuring a Doppler radar in geostationary orbit capable of measuring the internal structure of tropical cyclones over a circular scan area 50 degrees of latitude in diameter. The proposed NIS technology, based on the PR2 radar design developed at JPL and an innovative deployable antenna design developed at UCLA, would be capable of 3D volume sampling with 12 km horizontal and 300 m vertical resolution and a 1 hour scan period. The workshop participants consisted of the JPL and UCLA design teams and a cross-section of tropical cyclone forecasters, researchers and modelers who could potentially benefit from this technology. The consensus of the workshop included: (a) the NIS technology would provide observations to benefit hurricane forecasters, real-time weather prediction models and model researchers; (b) the most important feature of NIS was its high-frequency coverage together with its 3D observation capability. These features were found to fill a data gap now developing within cloud-resolving analysis and prediction systems, for which there is no other proposed solution, particularly over the oceans where TCs form. Closing this data gap is important to the improvement of TC intensity prediction. A complete description of the potential benefits and recommended goals for this technology, as concluded by the workshop participants, will be given in the oral presentation.

  20. Creating Fantastic PI Workshops

    Energy Technology Data Exchange (ETDEWEB)

    Biedermann, Laura B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clark, Blythe G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Colbert, Rachel S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dagel, Amber Lynn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gupta, Vipin P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hibbs, Michael R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perkins, David Nikolaus [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); West, Roger Derek [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The goal of this SAND report is to provide guidance for other groups hosting workshops and peer-to-peer learning events at Sandia. Thus this SAND report provides detail about our team structure, how we brainstormed workshop topics and developed the workshop structure. A Workshop “Nuts and Bolts” section provides our timeline and check-list for workshop activities. The survey section provides examples of the questions we asked and how we adapted the workshop in response to the feedback.

  1. SIMS applications workshop. Proceedings

    International Nuclear Information System (INIS)

    The first ANSTO/AINSE SIMS Workshop drew together a mixture of Surface Analysis experts and Surface Analysis users with the concept that SIMS analysis has to be enfolded within the spectrum of surface analysis techniques and that the user should select the technique most applicable to the problem. With this concept in mind the program was structured as sessions on SIMS Facilities; Applications to Mineral Surfaces; Applications to Biological Systems; Applications to Surfaces as Semiconductors, Catalysts and Surface Coatings; and Applications to Ceramics

  2. Proceedings of the workshop on applications of synchrotron radiation to trace impurity analysis for advanced silicon processing

    Energy Technology Data Exchange (ETDEWEB)

    Laderman, S [Integrated Circuits Business Div., Hewlett Packard Co., Palo Alto, CA (United States); Pianetta, P [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1993-03-01

    Wafer surface trace impurity analysis is essential for the development of competitive Si circuit technologies. Today's grazing incidence x-ray fluorescence techniques with rotating anodes fall short of requirements for the future. Hewlett Packard/Toshiba experiments indicate that with second generation synchrotron sources such as SSRL, the techniques can be extended sufficiently to meet important needs of the leading edge Si circuit industry through nearly all of the 1990's. This workshop was held to identify people interested in the use of synchrotron radiation-based methods and to document needs and concerns for further development. Viewgraphs are included for the following presentations: microcontamination needs in silicon technology (M. Liehr), analytical methods for wafer surface contamination (A. Schimazaki), trace impurity analysis of liquid drops using synchrotron radiation (D. Wherry), TRXRF using synchrotron sources (S. Laderman), potential role of synchrotron radiation TRXRF in Si process R&D (M. Scott), potential development of synchrotron radiation facilities (S. Brennan), and identification of goals, needs and concerns (M. Garner).

  3. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    to the collision scenario where a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large scale FEA, and fairly good agreement is achieved. The applicability, limitations and future enhancement of the present tool are discussed in detail....

  4. In Silico Analysis of Crop Science: Report on the First China-UK Workshop on Chips, Computers and Crops

    Institute of Scientific and Technical Information of China (English)

    Ming Chen; Andrew Harrison

    2008-01-01

    A workshop on "Chips, Computers and Crops" was held in Hangzhou, China during September 26-27, 2008. The main objective of the workshop was to bring together China and UK scientists from mathematics, bioinformatics and plant molecular biology communities to exchange ideas, enhance awareness of each others' fields,explore synergisms and make recommendations on fruitful future directions in crop science. Here we describe the contributions to the workshop, and examine some conceptual issues that lie at the foundations and future of crop systems biology.

  5. Collaborative authoring workshop

    NARCIS (Netherlands)

    Klemke, Roland; Schmitz, Birgit

    2009-01-01

    Klemke, R., & Schmitz, B. (2009). Collaborative authoring workshop. Workshop presentation at the Joint Technology Enhanced Learning Summerschool (JTELSS 2009). May, 30-June, 6, 2009, Terchova, Slovakia.

  6. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

    Full Text Available This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each of which typically consists of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form, applying the concepts and tools introduced.

  7. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  8. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  9. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    Science.gov (United States)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.
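
    The actual interfaces are C++ classes in the ATLAS software, but the dual-use idea can be sketched schematically in Python: a tool is written against an abstract interface with a systematic-variation hook, so the same tool code can be driven from either framework. All class, method and variation names below are invented for illustration, not the ATLAS API.

      from abc import ABC, abstractmethod

      class AnalysisTool(ABC):
          """Schematic dual-use tool interface (invented names, not the ATLAS classes)."""
          @abstractmethod
          def initialize(self): ...
          @abstractmethod
          def apply_systematic_variation(self, variation: str): ...
          @abstractmethod
          def execute(self, event) -> float: ...

      class JetCalibrationTool(AnalysisTool):
          def initialize(self):
              self._shift = 0.0
          def apply_systematic_variation(self, variation: str):
              # e.g. a "+1 sigma" jet-energy-scale shift; numbers are placeholders
              self._shift = {"nominal": 0.0, "JES_up": 0.02, "JES_down": -0.02}[variation]
          def execute(self, event) -> float:
              return event["jet_pt"] * (1.0 + self._shift)

      tool = JetCalibrationTool()
      tool.initialize()
      for syst in ("nominal", "JES_up", "JES_down"):
          tool.apply_systematic_variation(syst)
          print(syst, tool.execute({"jet_pt": 50.0}))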

  10. TPC Workshop

    International Nuclear Information System (INIS)

    The first workshop to focus on time projection chambers was held at TRIUMF (Canada) this summer. Some 75 participants came from groups in Europe and North America using TPCs in a variety of applications in experimental physics. Reports included several general descriptions of existing detectors as well as some proposals for new instruments. A time projection chamber (TPC) is the name given to a class of large volume drift chambers which operate generally with parallel electric and magnetic fields. Applications span energies from a few MeV in double beta decay searches, through intermediate energies in muon decay studies to large high energy arrays planned for LEP at CERN

  11. Workshop experience

    Directory of Open Access Journals (Sweden)

    Georgina Holt

    2007-04-01

    Full Text Available The setting for the workshop was a heady mix of history, multiculturalism and picturesque riverscapes. Within the group there was, as in many food studies, a preponderance of female scientists (or ethnographers), but the group interacted on lively, non-gendered terms - focusing instead on an appreciation of local food and an enthusiasm for research shared by all, and points of theoretical variance within that. The food provided by our hosts was of the very highest eating and local food qualities...

  12. A policy model to initiate environmental negotiations: Three hydropower workshops

    Science.gov (United States)

    Lamb, Berton Lee; Taylor, Jonathan G.; Burkardt, Nina; Ponds, Phadrea D.

    1998-01-01

    How do I get started in natural resource negotiations? Natural resource managers often face difficult negotiations when they implement laws and policies regulating such resources as water, wildlife, wetlands, endangered species, and recreation. As a result of these negotiations, managers must establish rules, grant permits, or create management plans. The Legal‐Institutional Analysis Model (LIAM) was designed to assist managers in systematically analyzing the parties in natural resource negotiations and using that analysis to prepare for bargaining. The LIAM relies on the theory that organizations consistently employ behavioral roles. The model uses those roles to predict likely negotiation behavior. One practical use of the LIAM is when all parties to a negotiation conduct a workshop as a way to open the bargaining on a note of trust and mutual understanding. The process and results of three LIAM workshops designed to guide hydroelectric power licensing negotiations are presented. Our experience with these workshops led us to conclude that the LIAM can be an effective tool to begin a negotiation and that trust built through the workshops can help create a successful result.

  13. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  14. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    Science.gov (United States)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  15. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  16. Online Analysis of Wind and Solar Part II: Transmission Tool

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  17. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  18. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  19. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  20. Brief of BES-Belle-CLEO-BaBar 2007 joint workshop on charm physics

    Institute of Scientific and Technical Information of China (English)

    WANG Yi-Fang; ZHANG Chang-Chun

    2008-01-01

    The Institute of High Energy Physics of the Chinese Academy of Sciences organized a workshop to establish closer contacts between experimentalists and theorists involved in the studies of charm physics from both the c and B communities. The workshop covered talks on physics analyses and results from four electron-positron colliding experiments (BES, Belle, CLEO and BaBar). Presentations at the workshop were organized in the following sessions: (1) Hadron spectroscopy and new resonances; (2) D0-D0 mixing; (3) Charmonium decays; (4) Charm hadronic and (semi-)leptonic decays; (5) QCD at low energy and τ physics; (6) Partial wave analysis and Dalitz analysis, MC generators and tools; (7) Detector upgrade.

  1. Recent Workshops

    CERN Multimedia

    Wickens, F. J.

    Since the previous edition of ATLAS e-news, the NIKHEF Institute in Amsterdam has hosted not just one but two workshops related to ATLAS TDAQ activities. The first in October was dedicated to the Detector Control System (DCS). Just three institutes, CERN, NIKHEF and St Petersburg, provide the effort for the central DCS services, but each ATLAS sub-detector provides effort for their own controls. Some 30 people attended, including representatives for all of the ATLAS sub-detectors, representatives of the institutes working on the central services and the project leader of JCOP, which brings together common aspects of detector controls across the LHC experiments. During the three-day workshop the common components were discussed, and each sub-detector described their experiences and plans for their future systems. Whilst many of the components to be used are standard commercial components, a key custom item for ATLAS is the ELMB (Embedded Local Monitor Board). Prototypes for this have now been extensively test...

  2. Applied climate-change analysis: the climate wizard tool.

    Directory of Open Access Journals (Sweden)

    Evan H Girvetz

    Full Text Available BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0-20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally
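
    In its simplest form, the quantile ensemble analysis mentioned above comes down to taking percentiles of the projected change across the GCM ensemble for each area of interest. The sketch below illustrates that idea with synthetic numbers; it is not Climate Wizard code or output.

        import numpy as np

        # Synthetic projected temperature changes (deg C) for one country,
        # one value per model in a 16-member GCM ensemble (illustrative only)
        rng = np.random.default_rng(0)
        projected_change = rng.normal(loc=2.5, scale=0.8, size=16)

        # Quantile summary of the ensemble: spread and median projected change
        low, median, high = np.percentile(projected_change, [10, 50, 90])
        print(f"10th percentile {low:.2f} C, median {median:.2f} C, 90th percentile {high:.2f} C")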

  3. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  4. Analysis of Marketing Mix Tools in a Chosen Company

    OpenAIRE

    Havlíková, Žaneta

    2008-01-01

    The complex structure of the marketing mix of the Velteko s.r.o. company is analysed in this work. The main objectives are to characterise the company, to analyse the individual tools of its marketing mix, and to evaluate a survey of its clients. The conclusion is focused on rating the level and efficiency of the utilization of the tools used, and relevant recommendations are added.

  5. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement.

    Science.gov (United States)

    Shatkin, J A; Ong, Kimberly J; Beaudrie, Christian; Clippinger, Amy J; Hendren, Christine Ogilvie; Haber, Lynne T; Hill, Myriam; Holden, Patricia; Kennedy, Alan J; Kim, Baram; MacDonell, Margaret; Powers, Christina M; Sharma, Monita; Sheremeta, Lorraine; Stone, Vicki; Sultan, Yasir; Turley, Audrey; White, Ronald H

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article.

  6. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement.

    Science.gov (United States)

    Shatkin, J A; Ong, Kimberly J; Beaudrie, Christian; Clippinger, Amy J; Hendren, Christine Ogilvie; Haber, Lynne T; Hill, Myriam; Holden, Patricia; Kennedy, Alan J; Kim, Baram; MacDonell, Margaret; Powers, Christina M; Sharma, Monita; Sheremeta, Lorraine; Stone, Vicki; Sultan, Yasir; Turley, Audrey; White, Ronald H

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article. PMID:27510619

  7. 21st Century Kinematics : The 2012 NSF Workshop

    CERN Document Server

    2013-01-01

    21st Century Kinematics focuses on algebraic problems in the analysis and synthesis of mechanisms and robots, compliant mechanisms, cable-driven systems and protein kinematics. The specialist contributors provide the background for a series of presentations at the 2012 NSF Workshop. The text shows how the analysis and design of innovative mechanical systems yield increasingly complex systems of polynomials, characteristic of those systems. In doing so, it takes advantage of increasingly sophisticated computational tools developed for numerical algebraic geometry and demonstrates the now routine derivation of polynomial systems dwarfing the landmark problems of even the recent past. The 21st Century Kinematics workshop echoes the NSF-supported 1963 Yale Mechanisms Teachers Conference that taught a generation of university educators the fundamental principles of kinematic theory. As such these proceedings will provide admirable supporting theory for a graduate course in modern kinematics and should be of consid...

  8. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth’s magnetic field’s past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
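
    To make the standard ratio concrete: in the multispecimen protocol each specimen yields a ratio comparing the remanence after a single in-field heating with its original NRM, the ratios are regressed against the applied laboratory field, and the paleointensity is taken where the regression crosses zero. The sketch below shows that core calculation with a bootstrap over specimens; it uses the widely published Dekkers-Böhnel form of the standard ratio, omits the fraction- and domain-state corrections, and all numbers are made up, so it should not be read as MSP-Tool's implementation.

        import numpy as np

        def msp_db_ratio(nrm0, m_lab):
            """Standard multispecimen ratio Q = (m_lab - nrm0) / nrm0."""
            return (m_lab - nrm0) / nrm0

        def paleointensity_estimate(fields, q):
            """Field at which the linear regression of Q on lab field crosses Q = 0."""
            slope, intercept = np.polyfit(fields, q, 1)
            return -intercept / slope

        # Illustrative specimen data: applied lab fields (microtesla) and remanences
        fields = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
        nrm0 = np.array([1.00, 0.95, 1.10, 1.02, 0.98])
        m_lab = np.array([0.70, 0.85, 1.05, 1.15, 1.25])
        q = msp_db_ratio(nrm0, m_lab)

        # Bootstrap over specimens for a simple confidence interval
        rng = np.random.default_rng(1)
        estimates = []
        for _ in range(1000):
            idx = rng.integers(0, len(fields), len(fields))
            if len(np.unique(fields[idx])) < 2:
                continue  # skip degenerate resamples
            estimates.append(paleointensity_estimate(fields[idx], q[idx]))
        print(np.percentile(estimates, [2.5, 50, 97.5]))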

  9. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, intelligent functions have become one of the major items in the development of machine tools. One crucial technology is machinery status monitoring, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit drives the most frequently used and most important components, such as the automatic tool changer. The vibration detection system therefore includes both hardware and software development, such as a vibration meter, a signal acquisition card, a data processing platform, and a machine control program. Because mechanical configurations and desired characteristics differ between machines, it is difficult to build a vibration detection system directly from commercially available kits. For this reason, the system was developed in-house, together with a parametric study to identify parameters sufficient to represent the machine characteristics and states; the functional parts of the system were developed in parallel. Finally, the conditions and parameters derived from the machine states and characteristics were entered into the developed system to verify its feasibility.
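
    As a rough illustration of the kind of parameters such a monitoring function can extract, the sketch below computes an RMS level and a dominant frequency from a vibration record and compares the RMS against an alarm threshold. It is a generic example with synthetic data and an assumed threshold and sampling rate, not the system developed in the paper.

        import numpy as np

        def vibration_features(signal, fs):
            """Return RMS amplitude and dominant frequency (Hz) of a vibration record."""
            signal = np.asarray(signal, dtype=float)
            signal = signal - signal.mean()
            rms = np.sqrt(np.mean(signal ** 2))
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
            return rms, freqs[np.argmax(spectrum)]

        # Synthetic 1 kHz record: a 120 Hz component plus noise (illustrative only)
        fs = 1000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        x = 0.5 * np.sin(2 * np.pi * 120 * t) + 0.05 * np.random.randn(t.size)

        rms, f_dom = vibration_features(x, fs)
        print(f"RMS = {rms:.3f}, dominant frequency = {f_dom:.1f} Hz")
        if rms > 0.8:  # assumed alarm threshold
            print("Abnormal spindle vibration - flag for inspection")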

  10. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  11. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing; ...

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model Physics in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools alre...

  12. Computational proteomics pitfalls and challenges: HavanaBioinfo 2012 workshop report.

    Science.gov (United States)

    Perez-Riverol, Yasset; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart; Creasy, David; Cox, Jürgen; Leprevost, Felipe; Shan, Baozhen Paul; Pérez-Nueno, Violeta I; Blazejczyk, Michal; Punta, Marco; Vierlinger, Klemens; Valiente, Pedro A; Leon, Kalet; Chinea, Glay; Guirola, Osmany; Bringas, Ricardo; Cabrera, Gleysin; Guillen, Gerardo; Padron, Gabriel; Gonzalez, Luis Javier; Besada, Vladimir

    2013-07-11

    The workshop "Bioinformatics for Biotechnology Applications (HavanaBioinfo 2012)", held December 8-11, 2012 in Havana, aimed at exploring new bioinformatics tools and approaches for large-scale proteomics, genomics and chemoinformatics. Major conclusions of the workshop include the following: (i) development of new applications and bioinformatics tools for proteomic repository analysis is crucial; current proteomic repositories contain enough data (spectra/identifications) that can be used to increase the annotations in protein databases and to generate new tools for protein identification; (ii) spectral libraries, de novo sequencing and database search tools should be combined to increase the number of protein identifications; (iii) protein probabilities and FDR are not yet sufficiently mature; (iv) computational proteomics software needs to become more intuitive; and at the same time appropriate education and training should be provided to help in the efficient exchange of knowledge between mass spectrometrists and experimental biologists and bioinformaticians in order to increase their bioinformatics background, especially statistics knowledge. PMID:23376229

  13. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed
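
    Much of the photometry mentioned above ultimately rests on the conversion of background-subtracted counts to instrumental magnitudes, m = -2.5 log10(counts / exposure). The short sketch below shows a differential measurement between a target and a comparison star; the counts are made up for illustration, not data from the BYU telescope or the cited software packages.

        import math

        def instrumental_magnitude(counts, exposure_s):
            """Instrumental magnitude from background-subtracted counts."""
            return -2.5 * math.log10(counts / exposure_s)

        # Made-up background-subtracted counts for a target and a comparison star
        target_counts, comparison_counts, exposure = 15400.0, 48200.0, 60.0
        delta_mag = (instrumental_magnitude(target_counts, exposure)
                     - instrumental_magnitude(comparison_counts, exposure))
        print(f"Target is {delta_mag:+.3f} mag relative to the comparison star")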

  14. Fatigue analysis-based numerical design of stamping tools made of cast iron

    OpenAIRE

    Ben Slima, Khalil; Penazzi, Luc; Mabru, Catherine; Ronde-Oustau, François

    2013-01-01

    International audience This work concerns stress and fatigue analysis of stamping tools made of cast iron with an essentially pearlitic matrix and containing foundry defects. Our approach consists, first, in coupling the numerical simulation of the stamping process with structural analysis in order to improve the tool geometry and stiffness, minimizing the stress state and optimizing the fatigue lifetime. The method consists in simulating the stamping process by considering the tool as a per...

  15. Workshop "Risk and multicriteria Analysis. An application to natural resources management"

    OpenAIRE

    Bushenkov, Vladimir; Oliveira, Manuela

    2011-01-01

    Program Alexander Lotov (CCRAS, Russia) "Computer visualization of production possibility set in Data Envelopment Analysis" Roman Efremov (URJC, Spain) “Methodology for modeling processes of participatory decision-making in the environmental field with examples from the Water Debate in Catalonia” Maria João Batista (LNEG, Portugal) “Data analysis applied to mineral resources management: Exploration and environmental diagnosis” Susete Marques (ISA, Portugal) “Assessing wildfire ris...

  16. “DRYPACK” - a calculation and analysis tool

    DEFF Research Database (Denmark)

    Andreasen, M.B.; Toftegaard, R.; Schneider, P.;

    2013-01-01

    “DryPack” is a calculation tool that visualises the energy consumption of air-based and superheated steam drying processes. With “DryPack”, it is possible to add different components to a simple drying process, and thereby increase the flexibility, which makes it possible to analyse the most common drying processes. Moreover, it is possible to change the configuration of the dryer by including/changing energy saving components to illustrate the potential of the new configuration. With the “DryPack” calculation tool, it is possible to calculate four different unit operations with moist air (dehumidification of air, humidification of air, mixing of two air streams, and heating of air). In addition, a Mollier diagram... The calculation tool is demonstrated in four different case studies, where the actual energy consumption and possible energy consumption reductions by using “DryPack” are calculated.
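
    One of the four unit operations listed above, mixing of two moist air streams, is conventionally computed by mass- and energy-weighted averaging of humidity ratio and enthalpy. The sketch below uses the usual textbook psychrometric approximation for moist-air enthalpy; it illustrates the calculation in general and is not DryPack's implementation, and the flow rates and states are invented.

        def moist_air_enthalpy(t_c, w):
            """Specific enthalpy of moist air, kJ per kg dry air (standard approximation)."""
            return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

        def mix_streams(m1, t1, w1, m2, t2, w2):
            """Adiabatic mixing of two moist air streams (mass flows in kg dry air per s)."""
            w_mix = (m1 * w1 + m2 * w2) / (m1 + m2)
            h_mix = (m1 * moist_air_enthalpy(t1, w1)
                     + m2 * moist_air_enthalpy(t2, w2)) / (m1 + m2)
            return w_mix, h_mix

        # Illustrative case: warm humid dryer exhaust mixed with cooler ambient air
        w_mix, h_mix = mix_streams(2.0, 60.0, 0.050, 1.0, 20.0, 0.007)
        print(f"Mixed stream: w = {w_mix:.4f} kg/kg, h = {h_mix:.1f} kJ/kg dry air")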

  17. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting, are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors insist that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of the plane motion, whereas both groups of data are necessary in the case of the spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
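
    For reference, the plane one-dimensional conservation equations that Lagrangian analysis integrates are commonly written, in the Lagrangian coordinate h (initial position), with particle velocity u, specific volume v, specific internal energy E, initial density \rho_0 and compressive stress \sigma taken positive, as

        \rho_0 \, \frac{\partial u}{\partial t} = -\frac{\partial \sigma}{\partial h} \qquad \text{(momentum)}
        \frac{\partial v}{\partial t} = \frac{1}{\rho_0} \frac{\partial u}{\partial h} \qquad \text{(mass)}
        \frac{\partial E}{\partial t} = -\sigma \, \frac{\partial v}{\partial t} \qquad \text{(energy)}

    This is the standard textbook form rather than a quotation from the work itself: given gauge records of \sigma(h_i, t) or u(h_i, t) at several Lagrangian positions, the h-derivatives are estimated from the records and the equations are integrated in time to recover the remaining kinematical and mechanical quantities.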

  18. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning

    2014-01-01

    This paper introduces an interactive tool developed to integrate structural analysis in the architectural design environment from the early conceptual design stage. The tool improves exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot Struct...

  19. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  20. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper.
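
    As a small illustration of the temporal and frequency analyses such a tool wraps, the sketch below estimates the power spectral density of a synthetic EEG-like signal and reports the relative power in the alpha band. It is a generic example, not code from the described GUI, and the signal and band limits are assumptions.

        import numpy as np
        from scipy.signal import welch

        # Synthetic "EEG" channel: 10 Hz alpha rhythm plus noise, sampled at 256 Hz
        fs = 256.0
        t = np.arange(0, 10, 1 / fs)
        eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

        # Welch power spectral density and relative power in the alpha band (8-13 Hz)
        freqs, psd = welch(eeg, fs=fs, nperseg=512)
        alpha = (freqs >= 8) & (freqs <= 13)
        print(f"Relative alpha-band power: {psd[alpha].sum() / psd.sum():.2f}")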

  1. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    Science.gov (United States)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
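
    The non-uniform rational B-splines underlying this approach are built from B-spline basis functions combined with control-point weights. The sketch below evaluates a quadratic NURBS curve with the Cox-de Boor recursion; the quarter-circle example is a standard textbook case, and the code is only an illustration of the NURBS machinery, not the isogeometric solver described in the paper.

        def bspline_basis(i, p, u, knots):
            """Cox-de Boor recursion for the i-th B-spline basis function of degree p."""
            if p == 0:
                return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
            left = right = 0.0
            if knots[i + p] != knots[i]:
                left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
            if knots[i + p + 1] != knots[i + 1]:
                right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, u, knots)
            return left + right

        def nurbs_point(u, ctrl, weights, knots, p):
            """Evaluate a NURBS curve point as a rational combination of control points."""
            ns = [bspline_basis(i, p, u, knots) for i in range(len(ctrl))]
            wsum = sum(n * w for n, w in zip(ns, weights))
            return tuple(sum(n * w * c[k] for n, w, c in zip(ns, weights, ctrl)) / wsum for k in range(2))

        # Quadratic NURBS representing a quarter circle (a common arch-segment test case)
        knots = [0, 0, 0, 1, 1, 1]
        ctrl = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
        weights = [1.0, 2 ** 0.5 / 2, 1.0]
        print(nurbs_point(0.5, ctrl, weights, knots, p=2))  # ~ (0.7071, 0.7071)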

  2. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.

    2002-01-01

    Three fractured tool dies used for industrial cold forging have been investigated using light optical microscopy and scanning electron microscopy. Two of the specimens were produced using the traditional Böhler P/M steel grade s790, while the last specimen was a third generation P/M steel produced... resistance towards abrasive wear compared with the traditional P/M steel.

  3. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  4. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings
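
    To make the FMEA step concrete: failure modes are commonly ranked by a risk priority number, the product of severity, occurrence and detectability scores (often each on a 1-10 scale). The sketch below ranks a few hypothetical radiotherapy-workflow failure modes this way; the entries and scores are invented for illustration only.

        # Illustrative FMEA entries: (failure mode, severity, occurrence, detectability),
        # each scored 1-10; the process steps and scores are hypothetical
        failure_modes = [
            ("Wrong patient plan loaded",     9, 2, 3),
            ("Incorrect couch shift entered", 7, 4, 4),
            ("Imaging step skipped",          6, 3, 2),
        ]

        ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
        for name, s, o, d in ranked:
            print(f"RPN {s * o * d:3d}  {name}")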

  5. Django girls workshop at IdeaSquare

    CERN Multimedia

    2016-01-01

    Short video highlights of the Django Girls coding workshop organized at IdeaSquare on Feb 26-27, 2016 by the Rosehipsters non profit organization, supported by the CERN diversity team and the IT department, attracting 39 women from 15 countries. The aim of the workshop was to introduce participants to the world of computer programming and technology by teaching them how to successfully create a blog application and deploy it to the internet. Most of the 16 volunteer mentors were female. Django Girls is a non-profit organization and a community that empowers and helps women to organize free, one-day programming workshops by providing tools, resources and support.

  6. An Equal Employment Opportunity Sensitivity Workshop

    Science.gov (United States)

    Patten, Thomas H., Jr.; Dorey, Lester E.

    1972-01-01

    The equal employment opportunity sensitivity workshop seems to be a useful training device for getting an organization started on developing black and white change agents. This is a report on the establishment of such a workshop at the U.S. Army Tank Automotive Command (TACOM). It includes charts of design, characteristics, analysis of results, program…

  7. Workshop on PSA applications, Sofia, Bulgaria, 7-11 October 1996. Lecturing materials

    International Nuclear Information System (INIS)

    The objective of this workshop was to present detailed, systematic and useful information about PSA-based tools and PSA applications. The first presentation of the workshop was titled ''The role of PSA in safety management''. This topic served to introduce the workshop and to highlight several concepts that were afterwards stressed during the week, i.e. the defence in depth principle and the use of deterministic and probabilistic approaches in a complementary way. This presentation provided a basis for the discussion of ''PSA applications''. As a complement to the theoretical lectures, there was a workshop during which three different exercises were run in parallel. For two of these, computer-based PSA tools were used. One of them was focused towards the analysis of design modifications and the other one towards demonstrating configuration control strategies. The objective of the third practice was to obtain Allowed Outage Times using different PSA-based approaches and to discuss the differences observed and the insights obtained. To conclude the workshop, stress was put on the importance of the quality of the PSA (the development of a high quality Living PSA should be the first step), the necessity to be cautious (before taking decisions both the qualitative and numerical results should be carefully analyzed), and the logical order for the implementation of PSA applications. Refs, figs, tabs

  8. PREFACE: Collapse Calderas Workshop

    Science.gov (United States)

    Gottsmann, Jo; Aguirre-Diaz, Gerardo

    2008-10-01

    Caldera-formation is one of the most awe-inspiring and powerful displays of nature's force. Resultant deposits may cover vast areas and significantly alter the immediate topography. Post-collapse activity may include resurgence, unrest, intra-caldera volcanism and potentially the start of a new magmatic cycle, perhaps eventually leading to renewed collapse. Since volcanoes and their eruptions are the surface manifestation of magmatic processes, calderas provide key insights into the generation and evolution of large-volume silicic magma bodies in the Earth's crust. Despite their potentially ferocious nature, calderas play a crucial role in modern society's life. Collapse calderas host essential economic deposits and supply power for many via the exploitation of geothermal reservoirs, and thus receive considerable scientific, economic and industrial attention. Calderas also attract millions of visitors world-wide with their spectacular scenic displays. To build on the outcomes of the 2005 calderas workshop in Tenerife (Spain) and to assess the most recent advances on caldera research, a follow-up meeting was proposed to be held in Mexico in 2008. This abstract volume presents contributions to the 2nd Calderas Workshop held at Hotel Misión La Muralla, Querétaro, Mexico, 19-25 October 2008. The title of the workshop `Reconstructing the evolution of collapse calderas: Magma storage, mobilisation and eruption' set the theme for five days of presentations and discussions, both at the venue as well as during visits to the surrounding calderas of Amealco, Amazcala and Huichapan. The multi-disciplinary workshop was attended by more than 40 scientist from North, Central and South America, Europe, Australia and Asia. Contributions covered five thematic topics: geology, geochemistry/petrology, structural analysis/modelling, geophysics, and hazards. The workshop was generously supported by the International Association of Volcanology and the Chemistry of The Earth's Interior

  9. Proceedings of the workshop on world oil supply-demand analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, K.C. (ed.)

    1977-01-01

    Twelve papers and four panel discussions are included. A separate abstract was prepared for each paper. The panel discussions were on: technical and physical policy elements affecting world oil supply and demand; financial, tax, and tariff issues in world oil supply and demand; the world economy as influenced by world oil prices and availability; the use of models and analysis in the policy process. (DLC)

  10. PIXE and μ-PIXE analysis of glazes from terracotta sculptures of the della Robbia workshop

    International Nuclear Information System (INIS)

    A series of PIXE analyses has been performed on glazes from terracotta sculptures of the Italian Renaissance and on reference standards. The problems related to the investigation of such heterogeneous materials are discussed and the experimental uncertainties are evaluated, for each element, from the PIXE analysis of standard glasses. Some examples from artefacts coming from Italian collections are given. This research has been conducted in the framework of the COST-G1 European action

  11. The Future Workshop: Democratic problem solving

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui

    2005-01-01

    The origins, principles and practice of a very popular method known as The Future Workshop are presented. The fundamental theory and principles of this method are presented in an introductory way. In addition, practical guidelines to carry out such a workshop are outlined and several types of applications are briefly described. The crucial importance of both the facilitation process and the use of creative tools in team work is emphasised.

  12. The Future Workshop: Democratic problem solving

    Directory of Open Access Journals (Sweden)

    Rene Victor Valqui Vidal

    2006-03-01

    Full Text Available The origins, principles and practice of a very popular method known as The Future Workshop are presented. The fundamental theory and principles of this method are presented in an introductory way. In addition, practical guidelines to carry out such a workshop are outlined and several types of applications are briefly described. The crucial importance of both the facilitation process and the use of creative tools in team work is emphasised.

  13. Graphical Models for Security : Second International Workshop

    OpenAIRE

    Kordy, Barbara; Mauw, Sjouke; Jajodia, Sushil

    2016-01-01

    International audience This volume constitutes the thoroughly refereed post-conference proceedings of the Second International Workshop on Graphical Models for Security, GraMSec 2015, held in Verona, Italy, in July 2015. The 5 revised full papers presented together with one short tool paper and one invited article were carefully reviewed and selected from 13 submissions. The GraMSec workshop contributes to the development of well-founded graphical security models, efficient algorithms for t...

  14. Analysis of online quizzes as a teaching and assessment tool

    Directory of Open Access Journals (Sweden)

    Lorenzo Salas-Morera

    2012-03-01

    Full Text Available This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the effectiveness of quizzes on student performance when used not only as an isolated assessment tool, but also when integrated into a combined strategy that supports the overall programming of the subject. The results obtained during the five years of experimentation show that such quizzes have a proven positive influence on students' academic performance. Furthermore, surveys conducted at the end of each course revealed the high value students accord to the use of online quizzes in course instruction.

  15. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams can predict chatter by providing graphical representations of the stable combinations of axial depth of cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the influence of bearing preload on system stability with uncertainty taken into account.
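
    For orientation, in the classical single-degree-of-freedom regenerative chatter model the limiting axial depth of cut is a_lim = -1 / (2 K_f Re[G(iw)]), evaluated where the real part of the structure's frequency response G is negative; sweeping frequency and mapping each point to spindle speeds produces the lobes. The sketch below computes only the critical depth of cut for an assumed single mode and adds a crude Monte Carlo loop over an uncertain stiffness as a stand-in for the preload study; it is not the paper's 5-degree-of-freedom spindle-bearing model, and all parameter values are assumptions.

        import numpy as np

        def receptance(omega, k, wn, zeta):
            """Frequency response of a single-DOF structure (assumed modal parameters)."""
            r = omega / wn
            return 1.0 / (k * (1 - r**2 + 2j * zeta * r))

        def min_stable_depth(k, wn, zeta, kf, omegas):
            """Minimum of a_lim = -1 / (2 Kf Re[G]) over frequencies where Re[G] < 0."""
            re_g = receptance(omegas, k, wn, zeta).real
            return (-1.0 / (2.0 * kf * re_g[re_g < 0])).min()

        omegas = np.linspace(100, 4000, 4000) * 2 * np.pi   # frequency sweep, rad/s
        kf = 600e6                                          # cutting coefficient, N/m^2 (assumed)
        wn, zeta = 2 * np.pi * 800, 0.03                    # 800 Hz mode, 3% damping (assumed)

        # Monte Carlo over an uncertain stiffness, standing in for bearing-preload uncertainty
        rng = np.random.default_rng(2)
        depths = [min_stable_depth(k, wn, zeta, kf, omegas)
                  for k in rng.normal(2.0e7, 2.0e6, 500)]
        print(f"Critical depth of cut: {np.mean(depths)*1e3:.2f} +/- {np.std(depths)*1e3:.2f} mm")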

  16. Storybuilder-A tool for the analysis of accident reports

    International Nuclear Information System (INIS)

    As part of an ongoing effort by the ministry of Social Affairs and Employment of The Netherlands, a research project is being undertaken to construct a causal model for the most commonly occurring scenarios related to occupational risk. This model should provide quantitative insight into the causes and consequences of occupational accidents. The results should be used to help select optimal strategies to reduce these risks, taking the costs of accidents and of measures into account. The research is undertaken by an international consortium under the name of Workgroup Occupational Risk Model. One of the components of the model is a tool to systematically classify and analyse past accidents. This tool, 'Storybuilder', and its place in the Occupational Risk Model (ORM) are described in the paper. The paper gives some illustrations of the application of the Storybuilder, drawn from the study of ladder accidents, which forms one of the biggest single accident categories in the Dutch data

  17. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  18. Proteomic tools for the analysis of transient interactions between metalloproteins.

    Science.gov (United States)

    Martínez-Fábregas, Jonathan; Rubio, Silvia; Díaz-Quintana, Antonio; Díaz-Moreno, Irene; De la Rosa, Miguel Á

    2011-05-01

    Metalloproteins play major roles in cell metabolism and signalling pathways. In many cases, they show moonlighting behaviour, acting in different processes, depending on the physiological state of the cell. To understand these multitasking proteins, we need to discover the partners with which they carry out such novel functions. Although many technological and methodological tools have recently been reported for the detection of protein interactions, specific approaches to studying the interactions involving metalloproteins are not yet well developed. The task is even more challenging for metalloproteins, because they often form short-lived complexes that are difficult to detect. In this review, we gather the different proteomic techniques and biointeractomic tools reported in the literature. All of them have shown their applicability to the study of transient and weak protein-protein interactions, and are therefore suitable for metalloprotein interactions.

  19. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of 'what-if' options in building design - a limited search for an optimal solution, or 'optimization'. Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.
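
    In outline, the architecture reduces to an optimization loop that writes simulation inputs, invokes the simulation engine, and reads back an objective such as annual energy use. The sketch below shows such a loop in its simplest, exhaustive form; the evaluate function is a made-up placeholder standing in for the preprocessor, the EnergyPlus run and the output parsing, and none of the names reflect the tool's actual interface.

        import itertools

        def evaluate_design(options):
            """Placeholder for writing input files, running EnergyPlus and parsing results.
            Returns a fake annual energy use (kWh) so the loop is runnable."""
            glazing, insulation, lighting = options
            return 100000 - 5000 * glazing - 3000 * insulation - 4000 * lighting

        # Exhaustive search over a small discrete option space (illustrative only);
        # a real optimization engine would search far larger spaces more efficiently
        glazing_opts, insulation_opts, lighting_opts = [0, 1, 2], [0, 1], [0, 1]
        best = min(itertools.product(glazing_opts, insulation_opts, lighting_opts),
                   key=evaluate_design)
        print("Best option combination:", best, "->", evaluate_design(best), "kWh")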

  20. Monitoring SOA Applications with SOOM Tools: A Competitive Analysis

    OpenAIRE

    Ivan Zoraja; Goran Trlin; Marko Matijević

    2013-01-01

    Background: Monitoring systems decouple monitoring functionality from application and infrastructure layers and provide a set of tools that can invoke operations on the application to be monitored. Objectives: Our monitoring system is a powerful yet agile solution that is able to observe and manipulate SOA (Service-oriented Architecture) applications online. The basic monitoring functionality is implemented via lightweight components inserted into SOA frameworks, thereby keeping the monitoring...

  1. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    OpenAIRE

    Tousif ur Rehman; Muhammad Naeem Ahmed Khan; Naveed Riaz

    2013-01-01

    Requirement engineering is an integral part of the software development lifecycle since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The stu...

  2. CeTA - A Tool for Certified Termination Analysis

    OpenAIRE

    Sternagel, Christian; Thiemann, René; Winkler, Sarah; Zankl, Harald

    2012-01-01

    Since the first termination competition in 2004, it has been of great interest whether a proof that has been automatically generated by a termination tool is indeed correct. The increasing number of termination proving techniques, as well as the increasing complexity of generated proofs (e.g., combinations of several techniques, exhaustive labelings, tree automata, etc.), make certifying (i.e., checking the correctness of) such proofs more and more tedious for humans. Hence the interest in automate...

  3. Design tools for daylighting illumination and energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, S.

    1982-07-01

    This report reviews the problems and potential of using daylighting to provide illumination in building interiors. It describes some of the design tools now or soon to be available for incorporating daylighting into the building design process, as well as state-of-the-art methods for analyzing the impacts daylighting can have on the selection of lighting controls, lighting energy consumption, heating and cooling loads, and peak power demand.

  4. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  5. MACD - Analysis of weaknesses of the most powerful technical analysis tool

    Directory of Open Access Journals (Sweden)

    Sanel Halilbegovic

    2016-05-01

    Full Text Available Due to the huge popularization of stock trading among young people, in recent years more and more trading and brokerage houses have been trying to find a single 'easy to understand' tool for novice traders. Moving average convergence divergence (MACD) seems to be the main pick, and unfortunately inexperienced traders rely on this one tool for the analysis and trading of various securities. In this paper, I investigate the validity of MACD as the 'magic wand' when used solely in making investment trading decisions. The main limitation of this study is that it could be extended more widely across industries and various sizes of companies, funds, and other trading instruments.
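
    As a concrete illustration of the indicator under discussion, the sketch below computes the conventional MACD formulation (12- and 26-period exponential moving averages plus a 9-period signal line) with pandas. The parameter choices, price series and column names are assumptions for illustration, not values taken from the paper.

```python
# Sketch of the conventional MACD calculation (12/26-period EMAs plus a
# 9-period signal line); parameters and data are illustrative assumptions.
import pandas as pd

def macd(close: pd.Series, fast: int = 12, slow: int = 26, signal: int = 9) -> pd.DataFrame:
    ema_fast = close.ewm(span=fast, adjust=False).mean()
    ema_slow = close.ewm(span=slow, adjust=False).mean()
    macd_line = ema_fast - ema_slow            # difference of the two EMAs
    signal_line = macd_line.ewm(span=signal, adjust=False).mean()
    histogram = macd_line - signal_line        # crossings of zero are the usual trade signal
    return pd.DataFrame({"macd": macd_line, "signal": signal_line, "hist": histogram})

# Example with a short synthetic price series
prices = pd.Series([100, 101, 102, 101, 103, 105, 104, 106, 108, 107, 109, 111, 110])
print(macd(prices).tail())
```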

  6. Matlab symbolic circuit analysis and simulation tool using PSpice netlist for circuits optimization

    OpenAIRE

    Ushie, OJ; Abbod, M; Ashigwuike, E

    2015-01-01

    This paper presents a new Matlab symbolic circuit analysis and simulation (MSCAM) tool that makes use of netlists from PSpice to generate matrices. These matrices can be used to calculate circuit parameters or for optimization. The tool can handle active and passive components such as resistors, capacitors, inductors, operational amplifiers, and transistors. Transistors are converted into their small-signal equivalents, and operational amplifiers also make use of the small-signal analysis, which can easily ...
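
    MSCAM itself is a Matlab tool; purely as an illustration of what symbolic nodal analysis of a netlist-style circuit description involves, here is a minimal SymPy sketch for a two-resistor divider. The component names, values and the single-node circuit are hypothetical, not taken from the paper.

```python
# Minimal illustration of symbolic nodal analysis (not the MSCAM tool itself):
# solve for the node voltage of a resistive divider driven by a source Vs.
import sympy as sp

Vs, R1, R2, v1 = sp.symbols('Vs R1 R2 v1')

# Kirchhoff's current law at the single unknown node:
# current through R1 into the node equals current through R2 to ground.
kcl = sp.Eq((Vs - v1) / R1, v1 / R2)

solution = sp.solve(kcl, v1)[0]
print(solution)                                    # R2*Vs/(R1 + R2)
print(solution.subs({Vs: 5, R1: 1e3, R2: 2e3}))    # numeric check: about 3.33 V
```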

  7. GammaWorkshops Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Ramebaeck, H. (ed.) (Swedish Defence Research Agency (Sweden)); Straalberg, E. (Institute for Energy Technology, Kjeller (Norway)); Klemola, S. (Radiation and Nuclear Safety Authority, STUK (Finland)); Nielsen, Sven P. (Technical Univ. of Denmark. Risoe National Lab. for Sustainable Energy, Roskilde (Denmark)); Palsson, S.E. (Icelandic Radiation Safety Authority (Iceland))

    2012-01-15

    Due to sparse interaction in recent years between practitioners in gamma ray spectrometry in the Nordic countries, an NKS activity was started in 2009. This GammaSem was focused on seminars relevant to gamma spectrometry. A follow-up seminar was held in 2010. As an outcome of these activities it was suggested that the 2011 meeting should focus on practical issues, e.g. different corrections needed in gamma spectrometric measurements. This three-day meeting, GammaWorkshops, was held in September at Risoe-DTU. Experts on different topics relevant to gamma spectrometric measurements were invited to the GammaWorkshops. The topics included efficiency transfer, true coincidence summing corrections, self-attenuation corrections, measurement of natural radionuclides (natural decay series), combined measurement uncertainty calculations, and detection limits. The topics were covered in both lectures and practical sessions. The practical sessions included demonstrations of tools for, e.g., corrections and calculations related to the above-mentioned topics. (Author)

  8. Effectiveness of Workshop on Evaluation Methodology for Medical Teachers

    OpenAIRE

    Chinmay Shah; P. A. Gokhale; Mehta, H. B.

    2011-01-01

    A workshop on evaluation methodology was designed at Government Medical College, Bhavnagar. The workshop comprised six modules, namely: Mechanics of Paper Setting, MCQ Formulation & Item Analysis, Mini-CEX, OSPE, OSCE and Structured Viva. The study was carried out with the aim of finding out the effectiveness of the workshop in changing knowledge of, and attitudes towards, the different evaluation methodologies. Method: Instruction was provided during a one day workshop wit...

  9. An analysis of the ``accidental painting'' technique of D.A. Siqueiros: the Rayleigh Taylor instability as a tool to create explosive textures

    Science.gov (United States)

    Zetina, S.; Zenit, R.

    2012-11-01

    In the spring of 1936, the famous Mexican muralist David Alfaro Siqueiros organized an experimental painting workshop in New York: a group of artists focused on developing painting techniques through empirical experimentation with modern and industrial materials and tools. Among the young artists attending the workshop was Jackson Pollock. They tested different lacquers and a number of experimental techniques. One of the techniques, named by Siqueiros a ``controlled accident,'' consisted in pouring layers of paint of different colors on top of each other. After a brief time, the paint from the lower layer emerged from bottom to top, creating a relatively regular pattern of blobs. This technique led to the creation of explosion-inspired textures and catastrophic images. We conducted an analysis of this process. We experimentally reproduced the patterns ``discovered'' by Siqueiros and analyzed the behavior of the flow. We found that the flow is driven by the well-known Rayleigh Taylor instability: paints of different colors have different densities; a heavy layer on top of a light one is an unstable configuration. The blobs and plumes that result from the instability create the aesthetically pleasing patterns. We discuss the importance of fluid mechanics in artistic creation.
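
    For orientation, the classical inviscid linear-stability result for the configuration described above is sketched below. This is the textbook Rayleigh-Taylor growth rate, not a formula quoted from the paper.

```latex
% Classical (inviscid, linear) Rayleigh-Taylor growth rate for a perturbation
% of wavenumber k when a fluid of density rho_2 rests on a lighter fluid rho_1:
\[
  \sigma = \sqrt{A\, g\, k},
  \qquad
  A = \frac{\rho_2 - \rho_1}{\rho_2 + \rho_1},
\]
% so denser paint poured over lighter paint (A > 0) is unstable, and the
% perturbation grows fastest at short wavelengths until viscosity and surface
% tension intervene.
```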

  10. The Gender Analysis Tools Applied in Natural Disasters Management: A Systematic Literature Review

    OpenAIRE

    Sohrabizadeh, Sanaz; Tourani, Sogand; Khankeh, Hamid Reza

    2014-01-01

    Background: Although natural disasters have caused considerable damages around the world, and gender analysis can improve community disaster preparedness or mitigation, there is little research about the gendered analytical tools and methods in communities exposed to natural disasters and hazards. These tools evaluate gender vulnerability and capacity in pre-disaster and post-disaster phases of the disaster management cycle. Objectives: Identifying the analytical gender tools and the strength...

  11. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  12. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for complex fluid/structure interaction phenomena is increasing as proven numerical and visualization...

  13. ROOT User Workshop 2013

    CERN Document Server

    2013-01-01

    For almost two decades, ROOT has established itself as the framework for HENP data processing and analysis. The LHC upgrade program and the new experiments being designed at CERN and elsewhere will pose even more formidable challenges in terms of data complexity and size. The new parallel and heterogeneous computing architectures that are either announced or already available will call for a deep rethinking of the code and the data structures to be exploited efficiently. This workshop, following from a successful series of such events, will allow you to learn in detail about the new ROOT 6 and will help shape the future evolution of ROOT.

  14. Workshop in economics - the problem of climate change benefit-cost analysis

    International Nuclear Information System (INIS)

    Could benefit-cost analysis play a larger role in the discussion of policies to deal with the greenhouse effect? The paper also investigates the causes of this lack of influence. Selected forms of benefit-cost research are probed, particularly the critical discussions raised by this type of research, in an effort to suggest where the chances of greater acceptance lie. The paper begins by discussing the search for an appropriate policy: optimal, targeted, or incremental. It then describes the work being done in specifying and estimating climate change damage relationships. A consideration of the work being done in specifying and estimating abatement (both mitigation and adaptation) cost relationships follows. Finally, the paper ends with an examination of the search for the appropriate policy instrument. International and methodological concerns cut across these areas and are discussed in each section. This paper concludes that there seem to be a number of reasons that benefit-cost results play only a limited role in policy development. There is some evidence that the growing interest in market-based approaches to climate change policy and to other environmental control matters is a sign of increased acceptance. Suggestions about research directions are made throughout this paper

  15. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one hand, it facilitates the selection of proper monitoring and analysis tools for a given application or process. On the other hand, it permits the identification of potential applications for a given monitoring technique or tool. An efficient inference system based on forward as well as reverse search...

  16. Pathway-based Analysis Tools for Complex Diseases: A Review

    Directory of Open Access Journals (Sweden)

    Lv Jin

    2014-10-01

    Full Text Available Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods—the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.

  17. Cellular barcoding tool for clonal analysis in the hematopoietic system

    NARCIS (Netherlands)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J.; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J. C.; de Haan, Gerald; Bystrykh, Leonid V.

    2010-01-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blott

  18. Ecotoxicological mechanisms and models in an impact analysis tool for oil spills

    NARCIS (Netherlands)

    Laender, de F.; Olsen, G.H.; Frost, T.; Grosvik, B.E.; Klok, T.C.

    2011-01-01

    In an international collaborative effort, an impact analysis tool is being developed to predict the effect of accidental oil spills on recruitment and production of Atlantic cod (Gadus morhua) in the Barents Sea. The tool consisted of three coupled ecological models that describe (1) plankton biomas

  19. Limits, limits everywhere the tools of mathematical analysis

    CERN Document Server

    Applebaum, David

    2012-01-01

    A quantity can be made smaller and smaller without it ever vanishing. This fact has profound consequences for science, technology, and even the way we think about numbers. In this book, we will explore this idea by moving at an easy pace through an account of elementary real analysis and, in particular, will focus on numbers, sequences, and series.Almost all textbooks on introductory analysis assume some background in calculus. This book doesn't and, instead, the emphasis is on the application of analysis to number theory. The book is split into two parts. Part 1 follows a standard university

  20. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful to perceive the current state of the affairs pertaining to the requirement engineering research and to understand the strengths and limitations of the existing requirement engineering techniques. The study also summarizes the best practices and how to use a blend of the requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. The study also highlights the importance of security requirements as though they are part of the non-functional requirement, yet are naturally considered fundamental to secure software development.

  1. National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion

    Science.gov (United States)

    Follen, G.; Naiman, C.; Evans, A.

    1999-01-01

    Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

  2. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays, the internet acts as a huge database in which many websites, information sources and search engines are available. But due to the unstructured and semi-structured data in web pages, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because it consists of many discovered patterns and contains a lot of noise and uncertainty. In this paper, web usage mining is analyzed with the help of web log data, using the web log analyzer tool "Deep Log Analyzer" to extract summary information from a particular server, to characterize user behavior, and to develop an ontology that captures the relations among the relevant parts of web usage mining.
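
    Deep Log Analyzer is a packaged GUI product; as a rough sketch of the kind of per-visitor summary such tools derive from raw server logs, the snippet below parses NCSA common-log-format lines and counts successful requests per client IP. The regular expression, sample lines and field choices are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of web-log summarisation (not Deep Log Analyzer itself):
# parse NCSA common-log-format lines and count page requests per client IP.
import re
from collections import Counter

LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

sample_lines = [
    '10.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '10.0.0.2 - - [10/Oct/2023:13:55:40 +0000] "GET /about.html HTTP/1.1" 200 512',
    '10.0.0.1 - - [10/Oct/2023:13:56:02 +0000] "GET /products.html HTTP/1.1" 404 128',
]

hits_per_ip = Counter()
for line in sample_lines:
    match = LOG_RE.match(line)
    if match and match.group("status").startswith("2"):   # count successful requests only
        hits_per_ip[match.group("ip")] += 1

print(hits_per_ip.most_common())   # crude view of per-visitor activity
```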

  3. Design and Analysis Tools for Deployable Solar Array Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  4. IPHE Infrastructure Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    2010-02-01

    This proceedings contains information from the IPHE Infrastructure Workshop, a two-day interactive workshop held on February 25-26, 2010, to explore the market implementation needs for hydrogen fueling station development.

  5. R Markdown: Integrating A Reproducible Analysis Tool into Introductory Statistics

    OpenAIRE

    Baumer, Ben; Cetinkaya-Rundel, Mine; Bray, Andrew; Loi, Linda; Horton, Nicholas J

    2014-01-01

    Nolan and Temple Lang argue that “the ability to express statistical computations is an essential skill.” A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R M...

  6. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    OpenAIRE

    Chun-Hsiao Wu; Tsai-Yen Li

    2016-01-01

    In this research, we propose a new concept for social media analysis called Social Sensor, which is an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and appropriate social sensors for analysis. By adopting paramet...

  7. Markov Chains as Tools for Jazz Improvisation Analysis

    OpenAIRE

    Franz, David Matthew

    1998-01-01

    This thesis describes an exploratory application of a statistical analysis and modeling technique (Markov chains) to the modeling of jazz improvisation, with the sub-objective of providing increased insight into an improviser's style and creativity through the postulation of quantitative measures of style and creativity based on the constructed Markovian analysis techniques. Using the Visual Basic programming language, Markov chains of orders one to three are created using transcriptio...
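
    The thesis builds Markov chains of orders one to three in Visual Basic; the sketch below shows the core idea for order one only, estimating a transition table from a toy note sequence in Python. The note names and sequence are invented.

```python
# Sketch of a first-order Markov model of a melodic line: count note-to-note
# transitions and normalise them into probabilities P(next | current).
from collections import Counter, defaultdict

notes = ["C", "E", "G", "E", "C", "E", "G", "A", "G", "E", "C"]

counts = defaultdict(Counter)
for current, nxt in zip(notes, notes[1:]):
    counts[current][nxt] += 1

transition = {
    current: {nxt: n / sum(followers.values()) for nxt, n in followers.items()}
    for current, followers in counts.items()
}

for current, probs in transition.items():
    print(current, probs)
```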

  8. Applying observations of work activity in designing prototype data analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

  9. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    Science.gov (United States)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  10. Integrative genomic analysis by interoperation of bioinformatics tools in GenomeSpace

    Science.gov (United States)

    Thorvaldsdottir, Helga; Liefeld, Ted; Ocana, Marco; Borges-Rivera, Diego; Pochet, Nathalie; Robinson, James T.; Demchak, Barry; Hull, Tim; Ben-Artzi, Gil; Blankenberg, Daniel; Barber, Galt P.; Lee, Brian T.; Kuhn, Robert M.; Nekrutenko, Anton; Segal, Eran; Ideker, Trey; Reich, Michael; Regev, Aviv; Chang, Howard Y.; Mesirov, Jill P.

    2015-01-01

    Integrative analysis of multiple data types to address complex biomedical questions requires the use of multiple software tools in concert and remains an enormous challenge for most of the biomedical research community. Here we introduce GenomeSpace (http://www.genomespace.org), a cloud-based, cooperative community resource. Seeded as a collaboration of six of the most popular genomics analysis tools, GenomeSpace now supports the streamlined interaction of 20 bioinformatics tools and data resources. To facilitate the ability of non-programming users to leverage GenomeSpace in integrative analysis, it offers a growing set of ‘recipes’, short workflows involving a few tools and steps to guide investigators through high-utility analysis tasks. PMID:26780094

  11. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  12. The Danish Scenario Workshop Report

    DEFF Research Database (Denmark)

    Brodersen, Søsser; Jørgensen, Michael Søgaard

    with informal drinks) and planned and carried out as recommended in Ahumada (2003). We have, however, not developed all the material recommended by Ahumada (2003) as informative material prior to the workshop (e.g. a SWOT analysis), due to a wish to produce only material for the participants which we found useful...

  13. The Temporary City Workshop

    OpenAIRE

    Moore, Niamh; McCarthy, Linda

    2014-01-01

    The Temporary City Workshop was hosted by Dr Niamh Moore-Cherry on Tuesday 21 October in Nova UCD. The workshop is part of the Greening as Spatial Politics project funded by the IRC New Foundations scheme 2013 and is a collaboration between geographers at University College Dublin and the University of Wisconsin-Milwaukee. The goal of the workshop was to facilitate networking across a diversity of stakeholders and initiate discussion on temporary urban interventions in Dublin. The workshop wa...

  14. Uraninite chemistry as forensic tool for provenance analysis

    International Nuclear Information System (INIS)

    Highlights: • Uraninite chemistry can be used as fingerprint and provenance tool. • U/Th ratio and total REE contents are good indicators of crystallisation temperature. • REE fractionation is strongly dependent on uraninite genesis. • Application to uraninite from the Witwatersrand Basin highlights its detrital nature. • Witwatersrand uraninite is derived from a variety of magmatic sources. - Abstract: Electron microprobe and laser ablation-inductively coupled plasma mass spectrometric (LA-ICPMS) analyses were carried out on individual uraninite grains from several localities worldwide, representing a variety of different U-deposit types ranging in age from Mesoarchaean to the Mesozoic. For the first time, concentration data on a comprehensive set of minor/trace elements in uraninite are presented, i.e. LA-ICPMS concentration data for Th, Si, Al, Fe, Mn, Ca, Mg, P, Ti, V, Cr, Co, Ni, Pb, Zn, As, rare earth elements (REE), Y, Zr, Nb, Mo, Ag, Ta, W, Bi, and Au. Most of these elements could be detected in significant quantities in many of the studied examples. The results obtained in this study, supplemented by previously published data on major element and REE concentrations, reveal systematic differences in uraninite composition between genetically different deposit types and also, for a given genetic type, between different locations. Low-temperature hydrothermal uraninite is marked by U/Th >1000, whereas high-temperature metamorphic and magmatic (granitic, pegmatitic) uraninite has U/Th <100. Our new data also confirm previous observations that low-temperature, hydrothermal uraninite has low total REE contents (<1 wt%) whereas higher temperature uraninite can contain as much as several percent total REE. Genetically different uraninite types can be further identified by means of different REE fractionation patterns. Systematic differences between primary uraninite from different localities could also be noted with respect to the abundances of especially

  15. High-Performance Integrated Virtual Environment (HIVE Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  16. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method which can be used by researchers to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and highlights the ability of Principal Component Analysis (PCA) to reduce a number of variables that could be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables is used.
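
    A minimal sketch of the procedure the paper walks through, using scikit-learn on a hypothetical block of correlated rating-scale items: standardise the variables, fit PCA, and inspect how much variance the leading components explain. The data and variable count are invented for illustration.

```python
# Minimal PCA sketch (hypothetical survey data): standardise the correlated
# ratings, then keep the components explaining most of the variance.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Rows = respondents, columns = correlated rating-scale items (toy numbers).
ratings = np.array([
    [5, 4, 5, 2, 1],
    [4, 4, 4, 3, 2],
    [2, 1, 2, 5, 4],
    [1, 2, 1, 4, 5],
    [3, 3, 3, 3, 3],
])

scaled = StandardScaler().fit_transform(ratings)
pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)

print(pca.explained_variance_ratio_)   # share of variance captured by each component
print(scores)                          # uncorrelated component scores per respondent
```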

  17. ICP-MS Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Carman, April J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Eiden, Gregory C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-11-01

    This short document describes the materials that will be transmitted to LLNL and DNN HQ regarding the ICP-MS Workshop held at PNNL on June 17-19. Its goal is to pass on to LLNL information about the planning and preparations for the workshop at PNNL, in preparation for the SIMS workshop at LLNL.

  18. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of the numerical algorithm of a network flow analysis code to perform multi-dimensional flow calculations. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
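
    For reference, the standard form of Prandtl's mixing-length closure mentioned in the abstract is sketched below; the exact discretised form used in the extended network flow code is not given here, so this is only the textbook relation.

```latex
% Prandtl's mixing-length hypothesis models the turbulent (eddy) viscosity from
% the local mean shear, so the shear stress added to the one-dimensional
% momentum equation takes the form
\[
  \tau = \rho\,(\nu + \nu_t)\,\frac{\partial u}{\partial y},
  \qquad
  \nu_t = l_m^{2}\left|\frac{\partial u}{\partial y}\right|,
\]
% where l_m is the mixing length. For laminar flow nu_t = 0 and the classical
% Poiseuille and Couette profiles are recovered, which is what the benchmark
% cases are meant to check.
```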

  19. MultiQC: summarize analysis results for multiple tools and samples in a single report

    Science.gov (United States)

    Ewels, Philip; Magnusson, Måns; Lundin, Sverker; Käller, Max

    2016-01-01

    Motivation: Fast and accurate quality control is essential for studies involving next-generation sequencing data. Whilst numerous tools exist to quantify QC metrics, there is no common approach to flexibly integrate these across tools and large sample sets. Assessing analysis results across an entire project can be time consuming and error prone; batch effects and outlier samples can easily be missed in the early stages of analysis. Results: We present MultiQC, a tool to create a single report visualising output from multiple tools across many samples, enabling global trends and biases to be quickly identified. MultiQC can plot data from many common bioinformatics tools and is built to allow easy extension and customization. Availability and implementation: MultiQC is available with a GNU GPLv3 license on GitHub, the Python Package Index and Bioconda. Documentation and example reports are available at http://multiqc.info Contact: phil.ewels@scilifelab.se PMID:27312411

  20. Risk Management Techniques and Practice Workshop Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, T; Zosel, M

    2008-12-02

    At the request of the Department of Energy (DOE) Office of Science (SC), Lawrence Livermore National Laboratory (LLNL) hosted a two-day Risk Management Techniques and Practice (RMTAP) workshop held September 18-19 at the Hotel Nikko in San Francisco. The purpose of the workshop, which was sponsored by the SC/Advanced Scientific Computing Research (ASCR) program and the National Nuclear Security Administration (NNSA)/Advanced Simulation and Computing (ASC) program, was to assess current and emerging techniques, practices, and lessons learned for effectively identifying, understanding, managing, and mitigating the risks associated with acquiring leading-edge computing systems at high-performance computing centers (HPCCs). Representatives from fifteen high-performance computing (HPC) organizations, four HPC vendor partners, and three government agencies attended the workshop. The overall workshop findings were: (1) Standard risk management techniques and tools are in the aggregate applicable to projects at HPCCs and are commonly employed by the HPC community; (2) HPC projects have characteristics that necessitate a tailoring of the standard risk management practices; (3) All HPCC acquisition projects can benefit by employing risk management, but the specific choice of risk management processes and tools is less important to the success of the project; (4) The special relationship between the HPCCs and HPC vendors must be reflected in the risk management strategy; (5) Best practices findings include developing a prioritized risk register with special attention to the top risks, establishing a practice of regular meetings and status updates with the platform partner, supporting regular and open reviews that engage the interests and expertise of a wide range of staff and stakeholders, and documenting and sharing the acquisition/build/deployment experience; and (6) Top risk categories include system scaling issues, request for proposal/contract and acceptance testing, and

  1. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models.

    Science.gov (United States)

    Aurich, Maike K; Fleming, Ronan M T; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community. PMID:27536246

  2. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models

    Science.gov (United States)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  3. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which are reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of

  4. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA s changing environment.

  5. Measures of radioactivity: a tool for understanding statistical data analysis

    OpenAIRE

    Montalbano, Vera; Quattrini, Sonia

    2012-01-01

    A learning path on radioactivity in the last class of high school is presented. An introduction to radioactivity and nuclear phenomenology is followed by measurements of natural radioactivity. Background and weak sources are monitored for days or weeks. The data are analyzed in order to understand the importance of statistical analysis in modern physics.
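
    Counting experiments of this kind are normally analysed with Poisson statistics; the sketch below simulates repeated background counts and compares the sample standard deviation with the Poisson expectation of sqrt(mean). The count rate and sample size are invented numbers, not data from the learning path.

```python
# Sketch of the statistics behind repeated background counts: for Poisson data
# the standard deviation should be close to sqrt(mean). Counts are simulated.
import numpy as np

rng = np.random.default_rng(seed=1)
true_rate = 12.0                       # hypothetical mean background counts per interval
counts = rng.poisson(true_rate, size=200)

mean = counts.mean()
std = counts.std(ddof=1)

print(f"mean counts        : {mean:.2f}")
print(f"sample std dev     : {std:.2f}")
print(f"Poisson prediction : {np.sqrt(mean):.2f}")   # roughly equal for counting statistics
```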

  6. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with ability to add complimentary analytics · interoperable: integrating with existing environments and eases information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry leading multidimensional analytics that can be run on-demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  7. Sequential meta-analysis : an efficient decision-making tool

    NARCIS (Netherlands)

    van der Tweel, Ingeborg; Bollen, Casper

    2010-01-01

    Background A cumulative meta-analysis of successive randomized controlled trials (RCTs) can be used to decide whether enough evidence has been obtained comparing a control and an intervention treatment or whether a new RCT should be initiated. In general, no adjustment is made for repeatedly testing
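
    As a rough sketch of the cumulative idea described above, the snippet below re-pools a fixed-effect (inverse-variance) estimate after each successive trial. The effect sizes and standard errors are invented, and the formal sequential stopping boundaries that the paper is concerned with are not implemented here.

```python
# Cumulative fixed-effect (inverse-variance) meta-analysis sketch: pool the
# treatment effect again after each new trial. Effects and SEs are hypothetical.
import math

effects = [0.30, 0.18, 0.25, 0.12]      # e.g. log odds ratios from successive RCTs
std_errors = [0.15, 0.12, 0.10, 0.08]

sum_w, sum_wy = 0.0, 0.0
for i, (y, se) in enumerate(zip(effects, std_errors), start=1):
    w = 1.0 / se**2                      # inverse-variance weight
    sum_w += w
    sum_wy += w * y
    pooled = sum_wy / sum_w
    pooled_se = math.sqrt(1.0 / sum_w)
    print(f"after trial {i}: pooled effect = {pooled:.3f} "
          f"(95% CI {pooled - 1.96*pooled_se:.3f} to {pooled + 1.96*pooled_se:.3f})")
```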

  8. ProbFAST: Probabilistic Functional Analysis System Tool

    Directory of Open Access Journals (Sweden)

    Oliveira Thiago YK

    2010-03-01

    Full Text Available Abstract Background The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be performed by analyzing the Multiple Differential Expression (MDE of genes associated with normal and abnormal biological processes. Currently MDE analyses are limited to usual methods of differential expression initially designed for paired analysis. Results We proposed a web platform named ProbFAST for MDE analysis which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives a better performance when compared to other approaches and when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions ProbFAST is a free accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off related to functional information during the evolution of a tumor or tissue differentiation. ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  9. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
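
    The paper uses Zernike moments and Fourier transforms; purely to illustrate the idea of reducing a full-field map to a handful of descriptors before comparing experiment and model, the sketch below keeps only the low-order 2D FFT coefficients of two synthetic strain fields. The fields, descriptor count and comparison metric are assumptions for illustration.

```python
# Sketch of image decomposition for model validation: treat two strain fields
# as images, keep only low-order 2D Fourier coefficients as descriptors, and
# compare those few numbers instead of every pixel. Fields here are synthetic.
import numpy as np

def low_order_descriptors(field: np.ndarray, k: int = 3) -> np.ndarray:
    """Return the magnitudes of the k x k lowest-frequency FFT coefficients."""
    spectrum = np.fft.fft2(field)
    return np.abs(spectrum[:k, :k]).ravel()

x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
measured = np.sin(2 * np.pi * x) * np.cos(np.pi * y)          # stand-in "experiment"
predicted = measured + 0.02 * np.random.default_rng(0).normal(size=measured.shape)

d_meas = low_order_descriptors(measured)     # 9 descriptors instead of 4096 pixels
d_pred = low_order_descriptors(predicted)

# A simple quantitative comparison of the two descriptor vectors.
print("relative descriptor difference:",
      np.linalg.norm(d_meas - d_pred) / np.linalg.norm(d_meas))
```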

  10. The ATLAS Electromagnetic Calorimeter Calibration Workshop

    CERN Multimedia

    Hong Ma; Isabelle Wingerter

    The ATLAS Electromagnetic Calorimeter Calibration Workshop took place at LAPP-Annecy from the 1st to the 3rd of October; 45 people attended the workshop. A detailed program was set up before the workshop. The agenda was organised around very focused presentations where questions were raised to allow arguments to be exchanged and answers to be proposed. The main topics were: electronics calibration; handling of problematic channels; cluster-level corrections for electrons and photons; absolute energy scale; streams for calibration samples; calibration constants processing; and learning from commissioning. The workshop was on the whole lively and fruitful. Based on years of experience with test beam analysis and Monte Carlo simulation, and the recent operation of the detector in the commissioning, the methods to calibrate the electromagnetic calorimeter are well known. Some of the procedures are being exercised in the commissioning, which have demonstrated the c...

  11. The GRIP method for collaborative roadmapping workshops

    DEFF Research Database (Denmark)

    Piirainen, Kalle

    2015-01-01

    Technology roadmapping is a well-known tool for technology management, but practical advice for facilitating collaborative roadmapping workshops is relatively scarce. To cater for this need, we have designed a method for collaborative roadmapping, dubbed the GRIP method, for facilitating group work in TRM workshops. The design is based on established best practices in facilitation, and our experiences with the method suggest it is a feasible tool for technology managers. The benefits of the method are that it enables engaging a diverse group of individuals in the roadmapping process effectively...

  12. Proceedings 15th International Refinement Workshop

    CERN Document Server

    Derrick, John; Reeves, Steve; 10.4204/EPTCS.55

    2011-01-01

    Refinement is one of the cornerstones of a formal approach to software engineering: the process of developing a more detailed design or implementation from an abstract specification through a sequence of mathematically-based steps that maintain correctness with respect to the original specification. The aim of this BCS FACS Refinement Workshop is to bring together people who are interested in the development of more concrete designs or executable programs from abstract specifications using formal notations, tool support for formal software development, and practical experience with formal refinement methodologies. The purpose of the workshop is to provide a forum for the exchange of ideas, and discussion of common ground and key differences. This 15th workshop continued a 20-year tradition of refinement workshops run under the auspices of the British Computer Society (BCS) FACS special interest group. After the first seven editions had been held in the UK, in 1998 it was combined with the Australasian Refine...

  13. Design and Analysis Tools for Concurrent Blackboard Systems

    Science.gov (United States)

    McManus, John W.

    1991-01-01

    A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them. " The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.
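
    A toy sketch of the blackboard pattern described above: a shared data structure, independent knowledge sources that read from and write to it, and an opportunistic control loop that activates whichever source can contribute. The problem and the source names are invented and have no connection to the paper's tools.

```python
# Toy sketch of the blackboard pattern: knowledge sources read from and write
# to a shared blackboard, and a simple control loop activates whichever source
# can contribute next. Problem and names are invented.

blackboard = {"raw": [3, 1, 2], "sorted": None, "summary": None}

def sorter(bb):
    if bb["raw"] is not None and bb["sorted"] is None:
        bb["sorted"] = sorted(bb["raw"])
        return True
    return False

def summariser(bb):
    if bb["sorted"] is not None and bb["summary"] is None:
        bb["summary"] = {"min": bb["sorted"][0], "max": bb["sorted"][-1]}
        return True
    return False

knowledge_sources = [summariser, sorter]   # order does not matter: control is opportunistic

progress = True
while progress:                            # keep cycling until no source can act
    progress = any(ks(blackboard) for ks in knowledge_sources)

print(blackboard)
```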

  14. SYSTID - A flexible tool for the analysis of communication systems.

    Science.gov (United States)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  15. Analysis of spreadable cheese by Raman spectroscopy and chemometric tools.

    Science.gov (United States)

    Oliveira, Kamila de Sá; Callegaro, Layce de Souza; Stephani, Rodrigo; Almeida, Mariana Ramos; de Oliveira, Luiz Fernando Cappa

    2016-03-01

    In this work, FT-Raman spectroscopy was explored to evaluate spreadable cheese samples. A partial least squares discriminant analysis was employed to identify the spreadable cheese samples containing starch. To build the models, two types of samples were used: commercial samples and samples manufactured in local industries. The method of supervised classification PLS-DA was employed to classify the samples as adulterated or without starch. Multivariate regression was performed using the partial least squares method to quantify the starch in the spreadable cheese. The limit of detection obtained for the model was 0.34% (w/w) and the limit of quantification was 1.14% (w/w). The reliability of the models was evaluated by determining the confidence interval, which was calculated using the bootstrap re-sampling technique. The results show that the classification models can be used to complement classical analysis and as screening methods.
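
    The paper builds PLS-DA classification and PLS regression models on FT-Raman spectra; the sketch below shows the regression half of that idea with scikit-learn on synthetic spectra, predicting an adulterant (starch) level and reporting a prediction error. All data, band positions and component counts are invented, not values from the paper.

```python
# Minimal PLS-regression sketch in the spirit of the paper: predict an
# adulterant (starch) level from spectra. Spectra here are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(42)
n_samples, n_wavenumbers = 40, 200

starch = rng.uniform(0, 10, n_samples)                      # % (w/w), synthetic
peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 80) / 5) ** 2)
spectra = (starch[:, None] * peak[None, :]                  # starch-related band
           + rng.normal(scale=0.2, size=(n_samples, n_wavenumbers)))

model = PLSRegression(n_components=3)
model.fit(spectra[:30], starch[:30])                        # calibration set
predicted = model.predict(spectra[30:]).ravel()             # validation set

rmsep = np.sqrt(np.mean((predicted - starch[30:]) ** 2))
print(f"RMSEP: {rmsep:.2f} % (w/w)")
```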

  16. Application of Multivariate Analysis Tools to Industrial Scale Fermentation Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregård, Rasmus; Stocks, Stuart M.;

    The analysis of batch process data can provide insight into the process operation, and there is a vast amount of historical data available for data mining. Empirical modelling utilising this data is desirable where there is a lack of understanding regarding the underlying process (Formenti et al. …). … to the different batch lengths, different data sampling intervals, noise in the measurements, and both online and offline data. The importance of the pre-processing stages is often underappreciated (Gurden et al. 2001). In this work, a 30 batch dataset from a production process operating at Novozymes A/S is analysed by multivariate analysis with the aim of predicting the final product concentration, which is measured offline at the end of each batch. Many modelling iterations were required using different pre-processing methods, in order to extract the trends from the data set. The final model gave an average …

  17. Bayesian networks as a tool for epidemiological systems analysis

    OpenAIRE

    Lewis, F.I.

    2012-01-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter ...

  18. Soldier Station: A Tool for Dismounted Infantry Analysis

    OpenAIRE

    Pratt, Shirley; Ohman, David; Brown, Steve; Galloway, John; Pratt, David

    1997-01-01

    Soldier Station is a networked, human-in-the-loop, virtual dismounted infantryman (DI) simulator with underlying constructive model algorithms for movement, detection, engagement, and damage assessment. It is being developed by TRADOC Analysis Center - White Sands Missile Range, New Mexico, to analyze DI issues pertaining to situational awareness, command and control, and tactics techniques and procedures. It is unique in its design to integrate virtual and constructive simulation...

  19. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
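
    The levelized-cost calculation at the heart of such a model is standard: discounted lifetime costs divided by discounted lifetime generation. The sketch below applies that textbook formula to made-up plant figures; it is not the Power L-CAT model, and the numbers are placeholders rather than NETL values.

```python
# Generic levelized cost of electricity (LCOE) calculation; the figures below
# are placeholders, not values from the Power L-CAT model or the NETL reports.
def lcoe(capital, annual_om, annual_fuel, annual_mwh, lifetime_years, discount_rate):
    """Discounted lifetime cost divided by discounted lifetime generation ($/MWh)."""
    costs = capital            # overnight capital spent in year 0
    energy = 0.0
    for t in range(1, lifetime_years + 1):
        df = (1 + discount_rate) ** -t
        costs += (annual_om + annual_fuel) * df
        energy += annual_mwh * df
    return costs / energy

# Hypothetical 500 MW plant at 85% capacity factor.
mwh_per_year = 500 * 8760 * 0.85
print(f"LCOE: {lcoe(1.2e9, 3.0e7, 9.0e7, mwh_per_year, 30, 0.07):.1f} $/MWh")
```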

  20. Geothermal systems materials: a workshop/symposium

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Sixteen papers are included. A separate abstract was prepared for each. Summaries of workshops on the following topics are also included in the report: non-metallic materials, corrosion, materials selection, fluid chemistry, and failure analysis. (MHR)

  1. DYNAMICS ANALYSIS OF SPECIAL STRUCTURE OF MILLING-HEAD MACHINE TOOL

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The milling-head machine tool is a sophisticated, high-quality machine tool whose spindle system is made up of a special multi-element structure. Two particular mechanical configurations degrade the cutting performance of the machine tool. One is the milling-head spindle, supported on two sets of complex bearings; the dynamic rigidity of the milling-head structure is studied on the designed digital prototype with finite element analysis (FEA) and modal synthesis analysis (MSA) to identify the weak structures. The other is the ram structure on which the milling head hangs; its dynamic performance during cutting is studied at different ram extension positions. The analysis results for the spindle and ram are used to improve the mechanical configurations and structure in the design. The machine tool built with the modified structure shows better dynamic rigidity than before.

  2. Writing and Learning in the Business Classroom: The Workshop Approach

    Science.gov (United States)

    Fernsten, Linda; Fernsten, Jeffrey

    2008-01-01

    A writing workshop is a pedagogical tool that can create a more productive experience for teachers and students alike. Business students who have used this technique with experienced instructors agree that a well-planned writing workshop can be useful for dispelling writing fears, furthering understanding of business communication skills,…

  3. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    OpenAIRE

    ASLANTAŞ, Kubilay

    2003-01-01

    Coated tools are regularly used in today's metal cutting industry because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular offer high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage in cutt...

  4. Time-frequency tools of signal processing for EISCAT data analysis

    Directory of Open Access Journals (Sweden)

    J. Lilensten

    Full Text Available We demonstrate the usefulness of some signal-processing tools for the EISCAT data analysis. These tools are somewhat less classical than the familiar periodogram, squared modulus of the Fourier transform, and therefore not as commonly used in our community. The first is a stationary analysis, "Thomson's estimate" of the power spectrum. The other two belong to time-frequency analysis: the short-time Fourier transform with the spectrogram, and the wavelet analysis via the scalogram. Because of the highly non-stationary character of our geophysical signals, the latter two tools are better suited for this analysis. Their results are compared with both a synthetic signal and EISCAT ion-velocity measurements. We show that they help to discriminate patterns such as gravity waves from noise.
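
    For readers unfamiliar with these time-frequency tools, the sketch below computes a spectrogram with SciPy and a simple Morlet scalogram by direct convolution, applied to a synthetic chirp. It is purely illustrative and not the EISCAT analysis code; the multitaper (Thomson) estimate is omitted.

```python
# Short-time Fourier (spectrogram) and a simple Morlet scalogram for a
# non-stationary test signal; illustrative only, not the EISCAT analysis code.
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
signal = np.sin(2 * np.pi * (50 + 40 * t) * t)          # chirp: frequency drifts upward

# Spectrogram: squared magnitude of the short-time Fourier transform.
f, tau, Sxx = spectrogram(signal, fs=fs, nperseg=256, noverlap=192)

# Scalogram: convolve with complex Morlet wavelets at several centre frequencies.
def morlet(freq, fs, n_cycles=6):
    dur = n_cycles / freq
    tt = np.arange(-dur, dur, 1 / fs)
    return np.exp(2j * np.pi * freq * tt) * np.exp(-tt**2 / (2 * (dur / 3) ** 2))

freqs = np.linspace(20, 150, 40)
scalogram = np.array([np.abs(np.convolve(signal, morlet(fq, fs), mode="same")) for fq in freqs])

print(Sxx.shape, scalogram.shape)   # time-frequency maps ready for plotting
```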

  5. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement: Advancing Risk Analysis for Nanoscale Materials

    Energy Technology Data Exchange (ETDEWEB)

    Shatkin, J. A. [Vireo Advisors, Boston MA USA; Ong, Kimberly J. [Vireo Advisors, Boston MA USA; Beaudrie, Christian [Compass RM, Vancouver CA USA; Clippinger, Amy J. [PETA International Science Consortium Ltd, London UK; Hendren, Christine Ogilvie [Center for the Environmental Implications of NanoTechnology, Duke University, Durham NC USA; Haber, Lynne T. [TERA, Cincinnati OH USA; Hill, Myriam [Health Canada, Ottawa Canada; Holden, Patricia [UC Santa Barbara, Bren School of Environmental Science & Management, ERI, and UC CEIN, University of California, Santa Barbara CA USA; Kennedy, Alan J. [U.S. Army Engineer Research and Development Center, Environmental Laboratory, Vicksburg MS USA; Kim, Baram [Independent, Somerville MA USA; MacDonell, Margaret [Argonne National Laboratory, Environmental Science Division, Argonne IL USA; Powers, Christina M. [U.S. Environmental Protection Agency, Office of Air and Radiation, Office of Transportation and Air Quality, Ann Arbor MI USA; Sharma, Monita [PETA International Science Consortium Ltd, London UK; Sheremeta, Lorraine [Alberta Ingenuity Labs, Edmonton Alberta Canada; Stone, Vicki [John Muir Building Gait 1 Heriot-Watt University, Edinburgh Scotland UK; Sultan, Yasir [Environment Canada, Gatineau QC Canada; Turley, Audrey [ICF International, Durham NC USA; White, Ronald H. [RH White Consultants, Silver Spring MD USA

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article.

  6. Stakeholder analysis: a useful tool for biobank planning.

    Science.gov (United States)

    Bjugn, Roger; Casati, Bettina

    2012-06-01

    Stakeholders are individuals, groups, or organizations that are affected by or can affect a particular action undertaken by others. Biobanks relate to a number of donors, researchers, research institutions, regulatory bodies, funders, and others. These stakeholders can potentially have a strong influence upon the organization and operation of a biobank. A sound strategy for stakeholder engagement is considered essential in project management and organization theory. In this article, we review relevant stakeholder theory and demonstrate how a stakeholder analysis was undertaken in the early stage of a planned research biobank at a public hospital in Norway. PMID:24835062

  7. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  8. Neutron activation analysis: a powerful tool in provenance investigations

    International Nuclear Information System (INIS)

    It is well known that neutron activation analysis (NAA), both instrumental and destructive, allows the simultaneous determination of a number of elements, mostly trace elements, with high levels of precision and accuracy. These peculiar properties of NAA are very useful when applied to provenance studies, i.e. to the identification of the origin of raw materials with which artifacts had been manufactured in ancient times. Data reduction by statistical procedures, especially multivariate analysis techniques, provides a statistical 'fingerprint' of investigated materials, both raw materials and archaeological artifacts, that, upon comparison, allows the identification of the provenance of the raw materials used for artifact manufacturing. Information can thus be obtained on the quarries and sources exploited in antiquity, on the technological processing of raw materials, on trade routes, and on the circulation of fakes. In the present paper two case studies are reported. The first one deals with the identification of the provenance of clay used to make ceramic materials, mostly bricks and tiles, recovered from the excavation of a Roman 'villa' in Lomello (Roman name Laumellum) and of Roman settlements in Casteggio (Roman name Clastidium). Both sites are located in the Province of Pavia in areas called Lomellina and Oltrepo respectively. The second one investigates the origin of the white marble used to build medieval arks, Carolingian age, located in the church of San Felice, now property of the University of Pavia. Experimental set-up, analytical results and data reduction procedures are presented and discussed. (author)

  9. Nested sampling as a tool for LISA data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gair, Jonathan R [Institute of Astronomy, Madingley Road, CB3 0HA, Cambridge (United Kingdom); Feroz, Farhan; Graff, Philip; Hobson, Michael P [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Babak, Stanislav; Petiteau, Antoine [Max-Planck-Institut fuer Gravitationsphysik, Am Muehlenberg 1, 14476, Potsdam (Germany); Porter, Edward K, E-mail: jgair@ast.cam.ac.u [APC, UMR 7164, Universite Paris 7 Denis Diderot, 10, rue Alice Domon et Leonie Duquet, 75205 Paris Cedex 13 (France)

    2010-05-01

    Nested sampling is a technique for efficiently computing the probability of a data set under a particular hypothesis, also called the Bayesian Evidence or Marginal Likelihood, and for evaluating the posterior. MULTINEST is a multi-modal nested sampling algorithm which has been designed to efficiently explore and characterize posterior probability surfaces containing multiple secondary solutions. We have applied the MULTINEST algorithm to a number of problems in gravitational wave data analysis. In this article, we describe the algorithm and present results for several applications of the algorithm to analysis of mock LISA data. We summarise recently published results for a test case in which we searched for two non-spinning black hole binary merger signals in simulated LISA data. We also describe results obtained with MULTINEST in the most recent round of the Mock LISA Data Challenge (MLDC), in which the algorithm was used to search for and characterise both spinning supermassive black hole binary inspirals and bursts from cosmic string cusps. In all these applications, the algorithm found the correct number of signals and efficiently recovered the posterior probability distribution. Moreover, in most cases the waveform corresponding to the best a-posteriori parameters had an overlap in excess of 99% with the true signal.
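
    The core idea of nested sampling can be shown in a toy example. The sketch below estimates the evidence for a one-dimensional Gaussian likelihood under a uniform prior, using plain rejection sampling to replace the worst live point; it illustrates the principle only and is in no way a substitute for MULTINEST's constrained, multimodal sampling.

```python
# Toy nested sampling: estimate the Bayesian evidence Z = integral of L(theta) pi(theta)
# for a 1-D Gaussian likelihood with a uniform prior on [-10, 10]. A bare-bones
# illustration of the idea behind MULTINEST, not the algorithm itself.
import numpy as np

rng = np.random.default_rng(1)
lo, hi = -10.0, 10.0

def log_like(theta, mu=2.0, sigma=0.5):
    return -0.5 * ((theta - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

n_live, n_iter = 400, 2000
live = rng.uniform(lo, hi, n_live)
live_logL = log_like(live)
logZ_terms = []

for i in range(n_iter):
    worst = np.argmin(live_logL)
    logL_star = live_logL[worst]
    # Prior volume shrinks geometrically: X_i ~ exp(-i / n_live).
    log_width = -i / n_live + np.log(1 - np.exp(-1 / n_live))
    logZ_terms.append(log_width + logL_star)
    # Replace the worst point with a new prior draw above the likelihood threshold
    # (rejection sampling; real implementations use smarter constrained sampling).
    while True:
        cand = rng.uniform(lo, hi)
        if log_like(cand) > logL_star:
            live[worst], live_logL[worst] = cand, log_like(cand)
            break

# Add the contribution of the remaining live points, then accumulate.
log_X_final = -n_iter / n_live - np.log(n_live)
logZ = np.logaddexp.reduce(logZ_terms + list(log_X_final + live_logL))
print(f"log-evidence ~ {logZ:.2f}  (analytic value: {np.log(1 / (hi - lo)):.2f})")
```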

  10. Tools for the analysis and characterization of therapeutic protein species

    Directory of Open Access Journals (Sweden)

    Fuh MM

    2016-05-01

    Full Text Available A continuously increasing number of therapeutic proteins are being released into the market, including biosimilars. In contrast to small organic drugs, therapeutic proteins require an extensive analysis of their exact chemical composition because of their complexity and proof of the absence of contaminants, such as host cell proteins and nucleic acids. Especially challenging is the detection of low abundant species of therapeutic proteins because these species are usually very similar to the target therapeutic protein. However, the detection of these species is very important for the safety of patients because a very small change of the exact chemical composition may cause serious side effects. In this review, we give a brief overview about the most important analytical approaches for characterizing therapeutic protein species and their contaminants and focus on the progress in this field during the past 3 years. Top-down mass spectrometry of intact therapeutic proteins in the future may solve many of the current problems in their analysis. Keywords: therapeutic protein species, biosimilars, liquid chromatography, mass spectrometry, capillary electrophoresis

  11. MATING DESIGNS: HELPFUL TOOL FOR QUANTITATIVE PLANT BREEDING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Athanase Nduwumuremyi

    2013-12-01

    Full Text Available Selection of parental materials and good mating designs in conventional plant breeding are the keys to a successful plant breeding programme; however, several factors affect the choice of mating design. A mating design is the procedure used to produce the progenies: in plant breeding, breeders and geneticists use, both theoretically and practically, different forms of mating designs and arrangements for targeted purposes. The choice of a mating design for estimating genetic variances should be dictated by the objectives of the study, time, space, cost and other biological limitations. In all mating designs, individuals are taken randomly and crossed to produce progenies which are related to each other as half-sibs or full-sibs. A form of multivariate analysis or the analysis of variance can be adopted to estimate the components of variance. This review therefore highlights the mating designs most used in plant breeding and genetics studies, providing quick and easy insight into the different forms of mating designs and some statistical components for successful plant breeding.

  12. Nested sampling as a tool for LISA data analysis

    International Nuclear Information System (INIS)

    Nested sampling is a technique for efficiently computing the probability of a data set under a particular hypothesis, also called the Bayesian Evidence or Marginal Likelihood, and for evaluating the posterior. MULTINEST is a multi-modal nested sampling algorithm which has been designed to efficiently explore and characterize posterior probability surfaces containing multiple secondary solutions. We have applied the MULTINEST algorithm to a number of problems in gravitational wave data analysis. In this article, we describe the algorithm and present results for several applications of the algorithm to analysis of mock LISA data. We summarise recently published results for a test case in which we searched for two non-spinning black hole binary merger signals in simulated LISA data. We also describe results obtained with MULTINEST in the most recent round of the Mock LISA Data Challenge (MLDC), in which the algorithm was used to search for and characterise both spinning supermassive black hole binary inspirals and bursts from cosmic string cusps. In all these applications, the algorithm found the correct number of signals and efficiently recovered the posterior probability distribution. Moreover, in most cases the waveform corresponding to the best a-posteriori parameters had an overlap in excess of 99% with the true signal.

  13. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    Directory of Open Access Journals (Sweden)

    Peter J.A. Cock

    2013-09-01

    Full Text Available The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  14. HTTR workshop (workshop on hydrogen production technology)

    International Nuclear Information System (INIS)

    Various research and development efforts have been made to solve the global energy and environmental problems caused by the large consumption of fossil fuels. Research activities on advanced hydrogen production technology using nuclear heat from high temperature gas cooled reactors, for example, have flourished in universities, research institutes and companies in many countries. The Department of HTTR Project and the Department of Advanced Nuclear Heat Technology of JAERI held the HTTR Workshop (Workshop on Hydrogen Production Technology) on July 5 and 6, 2004 to review the present status of R and D on HTGR technology and nuclear hydrogen production worldwide and to discuss the necessity of nuclear hydrogen production and the technical problems for the future development of the technology. More than 110 participants attended the Workshop, including foreign participants from the USA, France, Korea, Germany, Canada and the United Kingdom. In the Workshop, presentations were made on such topics as R and D programs for nuclear energy and hydrogen production technologies by thermo-chemical or other processes, and the possible role of nuclear hydrogen production in future society was discussed. The workshop showed that R and D on hydrogen production by the thermo-chemical process is being carried out in many countries and affirmed that nuclear hydrogen production could become one of the competitive suppliers of hydrogen in the future. The second HTTR Workshop will be held in the autumn of next year. (author)

  15. FACTORIAL CORRESPONDENCES ANALYSIS – A TOOL IN TOURISM MOTIVATION RESEARCH

    Directory of Open Access Journals (Sweden)

    Ion Danut I. JUGANARU

    2016-05-01

    Full Text Available This study analyzes the distribution of tourist flows in 2014 from 25 European countries across three main categories of trip purpose, and assumes that there are differences or similarities between the tourists’ countries of residence and their trip purposes. "Purpose" is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups united by similar characteristics. Since choice and purchase decisions are based on purposes, knowledge of them proves useful in designing strategies to increase customer satisfaction. The statistical method used in this paper is factorial correspondence analysis. In our opinion, identifying by this method the differences or similarities between the tourists’ countries of residence and their trip purposes can be a useful step in studying the tourism market and in choosing or reformulating strategies.
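
    Correspondence analysis itself reduces to a singular value decomposition of the standardized residuals of a contingency table. The sketch below runs that computation on an invented country-by-trip-purpose table; the figures are not the Eurostat data analysed in the paper.

```python
# Minimal correspondence analysis of an invented country-by-trip-purpose table;
# the data used in the paper are not reproduced here.
import numpy as np

counts = np.array([           # rows: countries, columns: trip purposes
    [120,  60,  20],          # leisure, business, other (made-up figures)
    [ 80,  90,  30],
    [ 40,  30,  50],
], dtype=float)

P = counts / counts.sum()                              # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)                    # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))     # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = (U * sv) / np.sqrt(r)[:, None]            # principal coordinates of countries
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]         # principal coordinates of purposes
inertia = sv**2 / (sv**2).sum()
print("share of inertia per axis:", np.round(inertia, 3))
```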

  16. Betweenness as a Tool of Vulnerability Analysis of Power System

    Science.gov (United States)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-06-01

    Complex network theory finds its application in analysis of power grid as both share some common characteristics. By using this theory finding critical elements in power network can be achieved. As vulnerabilities of elements of the network decide the vulnerability of the total network, in this paper, vulnerability of each element is studied using two complex network models—betweenness centrality and extended betweenness. The betweenness centrality considers only topological structure of power system whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, some of the electrical properties such as electrical distance, line flow limits, transmission capacities of lines and PTDF matrix are included. The standard IEEE 57 bus system has been studied based upon the above mentioned indices and following conclusions have been discussed.
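
    Plain topological betweenness is easy to compute with networkx, as the sketch below shows for a toy graph standing in for a small power network. The paper's extended betweenness, which weights paths by electrical quantities such as PTDF entries and line limits, is not reproduced here.

```python
# Topological betweenness on a toy network using networkx; the "extended
# betweenness" of the paper additionally uses electrical properties
# (PTDF, line flow limits), which this sketch does not attempt.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([          # (bus, bus, line "length" as a stand-in weight)
    (1, 2, 1.0), (2, 3, 1.0), (3, 4, 2.0),
    (1, 5, 1.5), (5, 4, 1.0), (2, 5, 0.5),
])

node_btw = nx.betweenness_centrality(G, weight="weight", normalized=True)
edge_btw = nx.edge_betweenness_centrality(G, weight="weight", normalized=True)

most_critical_bus = max(node_btw, key=node_btw.get)
most_critical_line = max(edge_btw, key=edge_btw.get)
print("most central bus:", most_critical_bus)
print("most loaded line (topologically):", most_critical_line)
```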

  17. Tool for automated method design in activation analysis

    International Nuclear Information System (INIS)

    A computational approach to the optimization of the adjustable parameters of nuclear activation analysis has been developed for use in comprehensive method design calculations. An estimate of sample composition is used to predict the gamma-ray spectra to be expected for given sets of values of experimental parameters. These spectra are used to evaluate responses such as detection limits and measurement precision for application to optimization by the simplex method. This technique has been successfully implemented for the simultaneous determination of sample size and irradiation, decay and counting times by the optimization of either detection limit or precision. Both single-element and multielement determinations can be designed with the aid of these calculations. The combination of advance prediction and simplex optimization is both flexible and efficient and produces numerical results suitable for use in further computations
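
    The combination of response prediction and simplex optimization can be mimicked with SciPy's Nelder-Mead optimizer. The objective below is a deliberately simple, made-up figure of merit over irradiation, decay and counting times; the real method predicts full gamma-ray spectra and evaluates detection limits, which is not reproduced here.

```python
# Simplex (Nelder-Mead) optimization of a toy figure of merit over irradiation,
# decay and counting times; stands in for the real spectrum-prediction model.
import numpy as np
from scipy.optimize import minimize

half_life = 120.0          # minutes, hypothetical analyte nuclide
interfering_hl = 10.0      # minutes, hypothetical short-lived interference

def negative_merit(x):
    t_irr, t_decay, t_count = np.abs(x)               # keep times positive
    lam, lam_i = np.log(2) / half_life, np.log(2) / interfering_hl
    signal = (1 - np.exp(-lam * t_irr)) * np.exp(-lam * t_decay) * (1 - np.exp(-lam * t_count))
    background = 1e-3 + (1 - np.exp(-lam_i * t_irr)) * np.exp(-lam_i * t_decay)
    total_time_penalty = 1 + (t_irr + t_decay + t_count) / 600.0
    return -(signal / np.sqrt(background)) / total_time_penalty   # maximize the merit

res = minimize(negative_merit, x0=[30.0, 30.0, 30.0], method="Nelder-Mead")
t_irr, t_decay, t_count = np.abs(res.x)
print(f"suggested times (min): irradiate {t_irr:.0f}, decay {t_decay:.0f}, count {t_count:.0f}")
```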

  18. "Boden macht Schule" - a soil awareness workshop for Austrian pupils

    Science.gov (United States)

    Foldal, Cecile B.; Aust, Günter; Baumgarten, Andreas; Berthold, Helene; Birli, Barbara; Englisch, Michael; Ferstl, Elsa; Leregger, Florian; Schwarz, Sigrid; Tulipan, Monika

    2014-05-01

    In order to raise awareness and understanding for the importance of soil, we developed a workshop for schoolchildren between the age of nine and thirteen. The workshop focuses on soil formation, soil functions and soil organisms. Guided by young soil scientist the children can actively explore different soil properties. Key elements are studies and identification of soil animals, small physical experiments and several games followed up with creative tasks. Our aim is to make the workshop an attractive tool for environmental education in public schools and by this to increase the interest in soil and soil protection. This poster gives a short overview of the contents of the workshop "Boden macht Schule"

  19. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    2000-01-01

    The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules. They are: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progress variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the AIAA 38th Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the AIAA 38th Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time of simulating the (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor will also be reported.

  20. The Arabidopsis co-expression tool (act): a WWW-based tool and database for microarray-based gene expression analysis

    DEFF Research Database (Denmark)

    Jen, C. H.; Manfield, I. W.; Michalopoulos, D. W.;

    2006-01-01

    We present a new WWW-based tool for plant gene analysis, the Arabidopsis Co-Expression Tool (act), based on a large Arabidopsis thaliana microarray data set obtained from the Nottingham Arabidopsis Stock Centre. The co-expression analysis tool allows users to identify genes whose expression patterns are correlated across selected experiments or the complete data set. Results are accompanied by estimates of the statistical significance of the correlation relationships, expressed as probability (P) and expectation (E) values. Additionally, highly ranked genes on a correlation list can be examined using the novel clique finder tool to determine the sets of genes most likely to be regulated in a similar manner. In combination, these tools offer three levels of analysis: creation of correlation lists of co-expressed genes, refinement of these lists using two-dimensional scatter plots...
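
    The core operation behind such a co-expression tool is ranking genes by correlation with a query gene and attaching significance estimates. The sketch below does this on a random expression matrix with SciPy's Pearson correlation; the E value shown is a simple Bonferroni-style estimate and is not necessarily how act computes its E values.

```python
# Ranking genes by co-expression with a query gene, the core operation behind
# a tool like act; the expression matrix here is random, purely for illustration.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
n_genes, n_arrays = 500, 60
expr = rng.normal(size=(n_genes, n_arrays))
expr[1] = expr[0] * 0.8 + rng.normal(scale=0.5, size=n_arrays)   # plant one correlated gene

query = 0
results = []
for g in range(n_genes):
    if g == query:
        continue
    r, p = pearsonr(expr[query], expr[g])
    e_value = p * (n_genes - 1)          # Bonferroni-style expectation value (illustrative)
    results.append((g, r, p, e_value))

results.sort(key=lambda t: -abs(t[1]))   # strongest correlations first
for g, r, p, e in results[:3]:
    print(f"gene {g}: r={r:+.2f}, P={p:.1e}, E={e:.1e}")
```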

  1. Bayesian networks as a tool for epidemiological systems analysis

    Science.gov (United States)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but to additionally, and empirically, separate these into those directly and indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.
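
    A stripped-down version of score-based structure discovery is shown below: a handful of candidate DAGs over three simulated variables are scored with BIC under a linear-Gaussian assumption and the best-scoring structure is reported. This is a toy illustration of the idea only; the additive Bayesian network machinery, Laplace approximations and bootstrapping described in the record are not reproduced.

```python
# Exhaustive BIC scoring of a few candidate DAGs over three variables, assuming
# linear-Gaussian node models; a toy version of structure discovery, not the
# additive Bayesian network approach used in the paper.
import numpy as np

rng = np.random.default_rng(3)
n = 500
exposure = rng.normal(size=n)
mediator = 0.8 * exposure + rng.normal(scale=0.6, size=n)
outcome = 1.2 * mediator + rng.normal(scale=0.5, size=n)
data = {"exposure": exposure, "mediator": mediator, "outcome": outcome}

def node_bic(child, parents):
    # Fit a linear-Gaussian model of the child given its parents, return its BIC term.
    y = data[child]
    X = np.column_stack([np.ones(n)] + [data[p] for p in parents])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = X.shape[1] + 1                       # regression coefficients plus the variance
    return loglik - 0.5 * k * np.log(n)

candidates = {
    "exposure -> mediator -> outcome": {"exposure": [], "mediator": ["exposure"], "outcome": ["mediator"]},
    "exposure -> outcome <- mediator": {"exposure": [], "mediator": [], "outcome": ["exposure", "mediator"]},
    "all independent":                  {"exposure": [], "mediator": [], "outcome": []},
}
scores = {name: sum(node_bic(c, ps) for c, ps in dag.items()) for name, dag in candidates.items()}
print("BIC-preferred structure:", max(scores, key=scores.get))
```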

  2. [Participation and creativity as tools of analysis of public policies].

    Science.gov (United States)

    Cordeiro, Joselma Cavalcanti; Villasante, Tomás Rodriguez Pietro; de Araújo, José Luiz do Amaral Correa

    2010-07-01

    In the context of current globalization, important modifications of the international relations and of the ideological, technical, and cultural components in the administration of the States are expressed by non-legitimate public action principles which account for social iniquity and the weakening of the role of the State. Regardless of its political origin or ideological orientation, the economic development plans and programs exhibit a prevailing uniformity. The challenge today implies mobilizing in local capacities with the objective of changing the quality of public action through the adoption of new development strategies able to integrate new social dimensions with other mechanisms of action. One of them, the intersectoral action, demands the structural revision of the administrative and cultural frontiers of the public and private social agents as a means of making a new tentative sociopolitical arrangement. The complexity of politics, projects and programs is taken as a methodological landmark based on the following theoretical presuppositions: integrality, social networks, and sociopraxis, constructing a participative process of knowledge to a political analysis in search of a change in the approach of the sociopolitical processes, starting from local social networks. PMID:20694334

  3. The geomatic like a tool for biodiversity analysis in Colombia

    International Nuclear Information System (INIS)

    Current biodiversity research recognizes geographic information and its variability in space as an essential characteristic for understanding the relationships between the components of biological communities and their environment. The description and quantification of their spatial and temporal attributes add important elements for their adequate management. The Convention on Biological Diversity (Law 165 of 1994) reaffirmed the importance of biodiversity and the necessity of its conservation and sustainable use, and emphasized that its components should be characterized and monitored and that the related data and information should be maintained and organized. The Alexander von Humboldt biological research institute is the Colombian entity in charge of promoting, coordinating and undertaking research that supports the conservation and sustainable use of biodiversity; it has defined the inventory of the country's fauna and flora resources as one of its priority research lines. Using geomatic techniques, the Humboldt institute has implemented and developed technologies to capture, clean, geocode and analyze geographic data related to biodiversity (Armenteras, 2001), among others. This has supported the development, structuring and management of projects such as the mapping of the Colombian Amazonian, Andean and Orinoco ecosystems (GIS-RS), the identification of conservation opportunities in rural landscapes (GIS-RS), a gazetteer of biological localities (GIS, databases, programming), the development of models that predict and explain species distributions (GIS, database management, modeling techniques), conservation vulnerability (GIS-RS) and environmental indicators (GIS, geostatistical analysis).

  4. IMPORTANT - PERFORMANCE ANALYSIS AS A TOOL IN DESTINATION MARKETING

    Directory of Open Access Journals (Sweden)

    Eleina QIRICI

    2011-06-01

    Full Text Available The Korça Region is located in the southeast of Albania and borders Greece and Macedonia to the south and east. It is a mountainous region with two major lakes: Lake Ohrid, the oldest lake in Europe, shared with Macedonia, and Lake Prespa, shared with Greece and Macedonia (100 km2 in Albania). In recent years there has been an increasing tendency to improve tourist facilities and to attract the tourist market interested in outdoor activities and relaxation in fresh, pure air. These demands can be met very well in the Korça destination, which is characterized by suitable climatic conditions and tourist services, and a combination of town tourism and tourist villages has helped the sustainable development of Korça as a tourist destination. The main purpose of this paper is to present the use of importance-performance analysis in destination marketing for the development of tourism. Highlights: (1) the paper considers the multifarious goals of destination management; (2) a computer booking system is used by hotels and guest houses in the region; (3) the relationship between what a tourist wants to find in a destination and what is actually found.

  5. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MCHD) or microstructure electrode (MSE) discharges, other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications, and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular, into a microplasma device (MPD), battery operation of a MPD and of a mini- in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. And an overall assessment of the state-of-the-art of analytical microplasma research is provided

  6. Computational tool for morphological analysis of cultured neonatal rat cardiomyocytes.

    Science.gov (United States)

    Leite, Maria Ruth C R; Cestari, Idágene A; Cestari, Ismar N

    2015-08-01

    This study describes the development and evaluation of a semiautomatic myocyte edge-detector using digital image processing. The algorithm was developed in Matlab 6.0 using the SDC Morphology Toolbox. Its conceptual basis is the mathematical morphology theory together with the watershed and Euclidean distance transformations. The algorithm enables the user to select cells within an image for automatic detection of their borders and calculation of their surface areas; these areas are determined by adding the pixels within each myocyte's boundaries. The algorithm was applied to images of cultured ventricular myocytes from neonatal rats. The edge-detector allowed the identification and quantification of morphometric alterations in cultured isolated myocytes induced by 72 hours of exposure to a hypertrophic agent (50 μM phenylephrine). There was a significant increase in the mean surface area of the phenylephrine-treated cells compared with the control cells (p < 0.05), corresponding to cellular hypertrophy of approximately 50%. In conclusion, this edge-detector provides a rapid, repeatable and accurate measurement of cell surface areas in a standardized manner. Other possible applications include morphologic measurement of other types of cultured cells and analysis of time-related morphometric changes in adult cardiac myocytes.
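
    An equivalent pipeline (threshold, Euclidean distance transform, marker-based watershed, area measurement) can be written with scikit-image, as in the sketch below on a synthetic image of two touching "cells". It mirrors the Matlab/SDC morphology approach in spirit only and is not the authors' detector.

```python
# Cell segmentation and area measurement with a distance-transform watershed,
# mirroring the Matlab/SDC morphology pipeline in spirit; synthetic image only.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.measure import regionprops

# Synthetic image: two overlapping bright "cells" on a dark background.
yy, xx = np.mgrid[0:200, 0:230]
img = np.exp(-((yy - 80) ** 2 + (xx - 80) ** 2) / (2 * 25 ** 2)) \
    + np.exp(-((yy - 80) ** 2 + (xx - 150) ** 2) / (2 * 25 ** 2))

binary = img > 0.3                                  # simple global threshold
distance = ndi.distance_transform_edt(binary)       # Euclidean distance transform
markers = np.zeros(img.shape, dtype=int)
markers[80, 80], markers[80, 150] = 1, 2            # manual seeds; real pipelines take distance maxima
labels = watershed(-distance, markers, mask=binary) # split the touching cells

for region in regionprops(labels):
    print(f"cell {region.label}: area = {region.area} pixels")
```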

  7. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
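
    In its simplest form, integrated data analysis reduces to combining independent measurements of the same quantity into a single posterior. The sketch below fuses two hypothetical electron-temperature estimates (standing in for SXR- and TS-derived values) under a flat prior, giving the precision-weighted mean; the actual MST tool involves full forward models and MCMC, which are not reproduced here.

```python
# Simplest possible "integrated data analysis": combine two independent Gaussian
# measurements of the same electron temperature into one posterior estimate.
# The numbers are invented and merely stand in for SXR- and TS-derived values.
import numpy as np

te_sxr, sigma_sxr = 410.0, 60.0     # eV, hypothetical soft-x-ray estimate
te_ts,  sigma_ts  = 380.0, 40.0     # eV, hypothetical Thomson scattering estimate

# With a flat prior, the posterior is Gaussian with a precision-weighted mean.
w_sxr, w_ts = 1 / sigma_sxr**2, 1 / sigma_ts**2
te_post = (w_sxr * te_sxr + w_ts * te_ts) / (w_sxr + w_ts)
sigma_post = np.sqrt(1 / (w_sxr + w_ts))
print(f"combined Te = {te_post:.0f} +/- {sigma_post:.0f} eV")   # tighter than either alone
```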

  8. An integrated data analysis tool for improving measurements on the MST RFP

    International Nuclear Information System (INIS)

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method

  9. Matlab Symbolic Circuit Analysis and Simulation Tool Using PSpice Netlist for Circuits Optimization

    Directory of Open Access Journals (Sweden)

    Ogri J. Ushie

    2015-04-01

    Full Text Available This paper presents a new Matlab symbolic circuit analysis and simulation (MSCAM) tool that makes use of netlists from PSpice to generate matrices. These matrices can be used to calculate circuit parameters or for optimization. The tool can handle active and passive components such as resistors, capacitors, inductors, operational amplifiers, and transistors. Transistors are converted to their small-signal models, and operational amplifiers likewise use the small-signal analysis, which can easily be implemented in a program as explained in the main work. Five examples are used to illustrate the potential of the approach. The results presented are similar to PSpice simulation. This approach can handle larger matrix dimensions than the symbolic circuit analysis tool (SCAM).
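
    The matrices such a netlist-driven tool assembles are those of modified nodal analysis. The sketch below stamps the conductance matrix for a tiny resistive netlist and solves it with NumPy; it is a generic MNA illustration, not the MSCAM implementation, and the netlist format shown is invented.

```python
# Modified nodal analysis of a tiny resistive netlist, the kind of matrix a
# netlist-driven tool assembles automatically; not the MSCAM implementation.
import numpy as np

# Netlist-like description: (component, node a, node b, value). Node 0 is ground.
netlist = [
    ("R1", 1, 2, 1000.0),    # 1 kOhm between nodes 1 and 2
    ("R2", 2, 0, 2000.0),    # 2 kOhm from node 2 to ground
    ("I1", 0, 1, 0.005),     # 5 mA current source injected into node 1
]

n_nodes = 2                                  # non-ground nodes
G = np.zeros((n_nodes, n_nodes))             # conductance matrix
I = np.zeros(n_nodes)                        # current injection vector

for name, a, b, value in netlist:
    if name.startswith("R"):
        g = 1.0 / value
        for node in (a, b):
            if node:                         # stamp diagonal entries
                G[node - 1, node - 1] += g
        if a and b:                          # stamp off-diagonal entries
            G[a - 1, b - 1] -= g
            G[b - 1, a - 1] -= g
    elif name.startswith("I"):
        if a: I[a - 1] -= value              # current drawn from node a
        if b: I[b - 1] += value              # current injected into node b

v = np.linalg.solve(G, I)
print("node voltages:", v)                   # expected: [15.0, 10.0] volts
```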

  10. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    Science.gov (United States)

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence might offer crucial information to a forensic investigation, when for instance, there is suspicion of the intentional use of ignitable liquids to initiate a fire. Although the evidence analysis in the laboratory is mainly conducted by a handful of well-established methodologies, during the last eight years several authors proposed noteworthy improvements on these methodologies, suggesting new interesting approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey about analytical tools covers works published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects of the ignitable liquid chemical fingerprints, which have to be considered during interpretation of results. PMID:27251852

  11. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  12. CoryneBase: Corynebacterium genomic resources and analysis tools at your fingertips.

    Directory of Open Access Journals (Sweden)

    Hamed Heydari

    Full Text Available Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide better understanding of their biology, phylogeny, virulence and taxonomy that may lead to the discoveries of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes aimed to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers the access of a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/.

  13. An Intelligent Tool to support Requirements Analysis and Conceptual Design of Database Design

    Institute of Scientific and Technical Information of China (English)

    王能斌; 刘海青

    1991-01-01

    As an application of artificial intelligence and expert system technology to database design, this paper presents an intelligent design tool, NITDT, which comprises a requirements specification language NITSL, a knowledge representation language NITKL, and an inference engine with uncertainty reasoning capability. NITDT now covers the requirements analysis and conceptual design stages of database design. However, it can be integrated with another database design tool, NITDBA, also developed at NIT, to become an integrated design tool supporting the whole process of database design.

  14. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  15. The Fuzzy Cluster Analysis in Identification of Key Temperatures in Machine Tool

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Thermal-induced error is a very important source of machining errors in machine tools. To compensate for thermal-induced machining errors, a relationship model between the thermal field and the deformations is needed. The relationship can be deduced by means of FEM (Finite Element Method), ANN (Artificial Neural Network) or MRA (Multiple Regression Analysis). MRA is based on a thorough understanding of the temperature distribution of the machine tool. Although the more the temperatures measu...

  16. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    Science.gov (United States)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

    1982-01-01

    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  17. MGAviewer: a desktop visualization tool for analysis of metagenomics alignment data

    OpenAIRE

    Zhu, Zhengwei; Niu, Beifang; Chen, Jing; Wu, Sitao; Sun, Shulei; Li, Weizhong

    2012-01-01

    Summary: Numerous metagenomics projects have produced tremendous amounts of sequencing data. Aligning these sequences to reference genomes is an essential analysis in metagenomics studies. Large-scale alignment data call for intuitive and efficient visualization tool. However, current tools such as various genome browsers are highly specialized to handle intraspecies mapping results. They are not suitable for alignment data in metagenomics, which are often interspecies alignments. We have dev...

  18. Evaluation of the accuracy of analysis tools for atmospheric new particle formation

    OpenAIRE

    Korhonen, H.; S.-L. Sihto; V.-M. Kerminen; Lehtinen, K. E. J.

    2010-01-01

    Several mathematical tools have been developed in recent years to analyze new particle formation rates and to estimate nucleation rates and mechanisms at sub-3nm sizes from atmospheric aerosol data. Here we evaluate these analysis tools using 1239 numerical nucleation events for which the nucleation mechanism and formation rates were known exactly. The accuracy of the estimates of particle formation rate at 3 nm (J3) showed significant sensitivity to ...

  19. Evaluation of the accuracy of analysis tools for atmospheric new particle formation

    OpenAIRE

    Korhonen, H.; Sihto, S.-L.; Kerminen, V.-M.; Lehtinen, K. E. J.

    2011-01-01

    Several mathematical tools have been developed in recent years to analyze new particle formation rates and to estimate nucleation rates and mechanisms at sub-3 nm sizes from atmospheric aerosol data. Here we evaluate these analysis tools using 1239 numerical nucleation events for which the nucleation mechanism and formation rates were known exactly. The accuracy of the estimates of particle formation rate at 3 nm (J3) showed significant sensitivity to...

  20. Summary and abstracts of the Planetary Data Workshop, June 2012

    Science.gov (United States)

    Gaddis, Lisa R.; Hare, Trent; Beyer, Ross

    2014-01-01

    The recent boom in the volume of digital data returned by international planetary science missions continues to both delight and confound users of those data. In just the past decade, the Planetary Data System (PDS), NASA’s official archive of scientific results from U.S. planetary missions, has seen a nearly 50-fold increase in the amount of data and now serves nearly half a petabyte. In only a handful of years, this volume is expected to approach 1 petabyte (1,000 terabytes or 1 quadrillion bytes). Although data providers, archivists, users, and developers have done a creditable job of providing search functions, download capabilities, and analysis and visualization tools, the new wealth of data necessitates more frequent and extensive discussion among users and developers about their current capabilities and their needs for improved and new tools. A workshop to address these and other topics, “Planetary Data: A Workshop for Users and Planetary Software Developers,” was held June 25–29, 2012, at Northern Arizona University (NAU) in Flagstaff, Arizona. A goal of the workshop was to present a summary of currently available tools, along with hands-on training and how-to guides, for acquiring, processing and working with a variety of digital planetary data. The meeting emphasized presentations by data users and mission providers during days 1 and 2, and developers had the floor on days 4 and 5 using an “unconference” format for day 5. Day 3 featured keynote talks by Laurence Soderblom (U.S. Geological Survey, USGS) and Dan Crichton (Jet Propulsion Laboratory, JPL) followed by a panel discussion, and then research and technical discussions about tools and capabilities under recent or current development. Software and tool demonstrations were held in break-out sessions in parallel with the oral session. Nearly 150 data users and developers from across the globe attended, and 22 National Aeronautics and Space Administration (NASA) and non-NASA data providers

  1. 7th International Workshop on Statistical Simulation

    CERN Document Server

    Mignani, Stefania; Monari, Paola; Salmaso, Luigi

    2014-01-01

    The Department of Statistical Sciences of the University of Bologna in collaboration with the Department of Management and Engineering of the University of Padova, the Department of Statistical Modelling of Saint Petersburg State University, and INFORMS Simulation Society sponsored the Seventh Workshop on Simulation. This international conference was devoted to statistical techniques in stochastic simulation, data collection, analysis of scientific experiments, and studies representing broad areas of interest. The previous workshops took place in St. Petersburg, Russia in 1994, 1996, 1998, 2001, 2005, and 2009. The Seventh Workshop took place in the Rimini Campus of the University of Bologna, which is in Rimini’s historical center.

  2. Technology transfer - insider protection workshop (Safeguards Evaluation Method - Insider Threat)

    International Nuclear Information System (INIS)

    The Safeguards Evaluation Method - Insider Threat, developed by Lawrence Livermore National Laboratory, is a field-applicable tool to evaluate facility safeguards against theft or diversion of special nuclear material (SNM) by nonviolent insiders. To ensure successful transfer of this technology from the laboratory to DOE field offices and contractors, LLNL developed a three-part package. The package includes a workbook, user-friendly microcomputer software, and a three-day training program. The workbook guides an evaluation team through the Safeguards Evaluation Method and provides forms for gathering data. The microcomputer software assists in the evaluation of safeguards effectiveness. The software is designed for safeguards analysts with no previous computer experience. It runs on an IBM Personal Computer or any compatible machine. The three-day training program is called the Insider Protection Workshop. The workshop students learn how to use the workbook and the computer software to assess insider vulnerabilities and to evaluate the benefits and costs of potential improvements. These activities increase the students' appreciation of the insider threat. The workshop format is informal and interactive, employing four different instruction modes: classroom presentations, small-group sessions, a practical exercise, and "hands-on" analysis using microcomputers. This approach to technology transfer has been successful: over 100 safeguards planners and analysts have been trained in the method, and it is being used at facilities throughout the DOE complex

  3. 2nd Ralf Yorque Workshop

    CERN Document Server

    1985-01-01

    These are the proceedings of the Second R. Yorque Workshop on Resource Management, which took place in Ashland, Oregon on July 23-25, 1984. The purpose of the workshop was to provide an informal atmosphere for the discussion of resource assessment and management problems. Each participant presented a one-hour morning talk; afternoons were reserved for informal discussion. The workshop was successful in stimulating ideas and interaction. The papers by R. Deriso, R. Hilborn and C. Walters all address the same basic issue, so they are grouped together. Other than that, the order of the papers in this volume was determined in the same fashion as the order of speakers during the workshop -- by random draw. Marc Mangel, Department of Mathematics, University of California, Davis, California, June 1985. Table of contents (excerpt): A General Theory for Fishery Modeling (Jon Schnute); Data Transformations in Regression Analysis with Applications to Stock-Recruitment Relationships (David Ruppert and Raymond J. Carroll) ...

  4. A Bluetooth low-energy capture and analysis tool using software-defined radio

    OpenAIRE

    Kilgour, Christopher David

    2013-01-01

    Wireless protocol analysis is a useful tool for researchers, engineers, and network security professionals. Exhaustive BTLE sniffing – the full capture and analysis of Bluetooth Low-Energy radio transmissions – has been out of reach for individuals to apply to research, engineering, and security analysis tasks. Discovering and following an arbitrary Bluetooth frequency-hopping pattern with a cheap narrow-band receiver is a complex undertaking with little chance of success. Further, the high-e...
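
    The abstract is truncated here, but following a connection's frequency-hopping pattern is the crux of the capture problem it describes. As a hedged illustration only (not the author's implementation), the sketch below reproduces the Bluetooth 4.x data-channel selection algorithm #1 as commonly documented: the next unmapped channel is (previous + hopIncrement) mod 37, remapped onto the used-channel list when the channel map excludes it.

```python
def csa1_hops(hop_increment, n_events, used_channels=None):
    """Bluetooth LE 4.x channel selection algorithm #1 (illustrative sketch).

    hop_increment : value 5..16 exchanged in the connection request.
    used_channels : data channels (0..36) enabled in the channel map; all 37 by default.
    """
    used = sorted(used_channels) if used_channels else list(range(37))
    unmapped = 0
    sequence = []
    for _ in range(n_events):
        unmapped = (unmapped + hop_increment) % 37
        if unmapped in used:
            sequence.append(unmapped)
        else:
            # Remapping rule: index into the used-channel table by unmapped % len(used).
            sequence.append(used[unmapped % len(used)])
    return sequence

# First ten connection events with hop increment 7 and a full channel map.
print(csa1_hops(7, 10))
```

    A sniffer that has recovered the hop increment and channel map can precompute this sequence and retune a narrow-band receiver ahead of each connection event.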

  5. Proceedings of the workshop: HERA and the LHC workshop series on the implications of HERA for LHC physics

    CERN Document Server

    Jung, H; Ajaltouni, Z J; Albino, S; Altarelli, G; Ambroglini, F; Anderson, J; Antchev, G; Arneodo, M; Aspell, P; Avati, V; Bahr, M; Bacchetta, A; Bagliesi, M G; Ball, R D; Banfi, A; Baranov, S; Bartalini, P; Bartels, J; Bechtel, F; Berardi, V; Berretti, M; Beuf, G; Biasini, M; Bierenbaum, I; Blumlein, J; Blair, R E; Bombonati, C; Boonekamp, M; Bottigli, U; Boutle, S; Bozzo, M; Brucken, E; Bracinik, J; Bruni, A; Bruno, G E; Buckley, A; Bunyatyan, A; Burkhardt, H; Bussey, P; Buzzo, A; Cacciari, M; Cafagna, F; Calicchio, M; Caola, F; Catanesi, M G; Catastini, P L; Cecchi, R; Ceccopieri, F A; Cerci, S; Chekanov, S; Chierici, R; Ciafaloni, M; Ciocci, M A; Coco, V; Colferai, D; Cooper-Sarkar, A; Corcella, G; Czakon, M; Dainese, A; Dasgupta, M; Deak, M; Deile, M; Delsart, P A; Del Debbio, L; de Roeck, A; Diaconu, C; Diehl, M; Dimovasili, E; Dittmar, M; Dremin, I M; Eggert, K; Engel, R; Eremin, V; Erhan, S; Ewerz, C; Fano, L; Feltesse, J; Ferrera, G; Ferro, F; Field, R; Forte, S; Garcia, F; Geiser, A; Gelis, F; Giani, S; Gieseke, S; Gigg, M A; Glazov, A; Golec-Biernat, K; Goulianos, K; Grebenyuk, J; Greco, V; Grellscheid, D; Grindhammer, G; Grothe, M; Guffanti, A; Gwenlan, C; Halyo, V; Hamilton, K; Hautmann, F; Heino, J; Heinrich, G; Hilden, T; Hiller, K; Hollar, J; Janssen, X; Joseph, S; Jung, A W; Jung, H; Juranek, V; Kaspar, J; Kepka, O; Khoze, V A; Kiesling, Ch; Klasen, M; Klein, S; Kniehl, B A; Knutsson, A; Kopal, J; Kramer, G; Krauss, F; Kundrat, V; Kurvinen, K; Kutak, K; Lonnblad, L; Lami, S; Latino, G; Latorre, J I; Latunde-Dada, O; Lauhakangas, R; Lendermann, V; Lenzi, P; Li, G; Likhoded, A; Lipatov, A; Lippmaa, E; Lokajicek, M; Vetere, M Lo; Rodriguez, F Lucas; Luisoni, G; Lytken, E; Muller, K; Macri, M; Magazzu, G; Majhi, A; Majhi, S; Marage, P; Marti, L; Martin, A D; Meucci, M; Milstead, D A; Minutoli, S; Nischke, A; Moares, A; Moch, S; Motyka, L; Namsoo, T; Newman, P; Niewiadomski, H; Nockles, C; Noschis, E; Notarnicola, G; Nystrand, J; Oliveri, E; Oljemark, F; Osterberg, K; Orava, R; Oriunno, M; Osman, S; Ostapchenko, S; Palazzi, P; Pedreschi, E; Pereira, A V; Perrey, H; Petajajarvi, J; Petersen, T; Piccione, A; Pierog, T; Pinfold, J L; Piskounova, O I; Platzer, S; Quinto, M; Rurikova, Z; Radermacher, E; Radescu, V; Radicioni, E; Ravotti, F; Rella, G; Richardson, P; Robutti, E; Rodrigo, G; Rodrigues, E; Rogal, M; Rogers, T C; Rojo, J; Roloff, P; Ropelewski, L; Rosemann, C; Royon, Ch; Ruggiero, G; Rummel, A; Ruspa, M; Ryskin, M G; Salek, D; Slominski, W; Saarikko, H; Vera, A Sabio; Sako, T; Salam, G P; Saleev, V A; Sander, C; Sanguinetti, G; Santroni, A; Schorner-Sadenius, Th; Schicker, R; Schienbein, I; Schmidke, W B; Schwennsen, F; Scribano, A; Sette, G; Seymour, M H; Sherstnev, A; Sjostrand, T; Snoeys, W; Somogyi, G; Sonnenschein, L; Soyez, G; Spiesberger, H; Spinella, F; Squillacioti, P; Stasto, A M; Starodumov, A; Stenzel, H; Stephens, Ph; Ster, A; Stocco, D; Strikman, M; Taylor, C; Teubner, T; Thorne, R S; Trocsanyi, Z; Treccani, M; Treleani, D; Trentadue, L; Trummal, A; Tully, J; Tung, W K; Turcato, M; Turini, N; Ubiali, M; Valkarova, A; van Hameren, A; Van Mechelen, P; Vermaseren, J A M; Vogt, A; Ward, B F L; Watt, G; Webber, B R; Weiss, Ch; White, Ch; Whitmore, J; Wolf, R; Wu, J; Yagues-Molina, A; Yost, S A; Zanderighi, G; Zotov, N; Nedden, M zur

    2009-01-01

    2nd workshop on the implications of HERA for LHC physics. Working groups: Parton Density Functions; Multi-jet final states and energy flows; Heavy quarks (charm and beauty); Diffraction; Cosmic Rays; Monte Carlos and Tools.

  6. MEL-IRIS: An Online Tool for Audio Analysis and Music Indexing

    Directory of Open Access Journals (Sweden)

    Dimitrios Margounakis

    2009-01-01

    Chroma is an important attribute of music and sound, although it has not yet been adequately defined in the literature. As such, it can be used for further analysis of sound, resulting in interesting colorful representations that can be used in many tasks: indexing, classification, and retrieval. Especially in Music Information Retrieval (MIR), the visualization of the chromatic analysis can be used for comparison, pattern recognition, melodic sequence prediction, and color-based searching. MEL-IRIS is the tool which has been developed in order to analyze audio files and characterize music based on chroma. The tool implements specially designed algorithms and a unique way of visualizing the results. The tool is network-oriented and can be installed on audio servers in order to manipulate large music collections. Several samples from world music have been tested and processed in order to demonstrate the possible uses of such an analysis.
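
    MEL-IRIS's chromatic-analysis algorithms and visualizations are specific to the tool and are not reproduced here. As a rough, hedged sketch of the general idea behind chroma-based indexing, the snippet below computes a 12-bin chromagram with the librosa library (an assumption made for illustration; the abstract does not say MEL-IRIS uses it) and averages it into a compact per-track signature.

```python
import librosa

def chroma_signature(path):
    """Average 12-bin chroma vector for one audio file (a crude index key)."""
    y, sr = librosa.load(path, sr=None, mono=True)    # decode to a mono waveform
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)  # shape: (12, n_frames)
    return chroma.mean(axis=1)                        # collapse time -> 12 values

# Tracks with similar tonal content yield nearby signatures, so a music server
# could index files by this vector and answer similarity queries, for example
# with cosine distance between signatures.
# signature = chroma_signature("example.wav")
```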

  7. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
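
    The CEM itself is FORTRAN and launches each discipline's analysis code as a separate process. As a hedged, minimal sketch of the same pattern (the performance indices, allowable values, and bounds below are made-up placeholders, not the O3 tool's interface), an optimizer can be driven by objective and constraint functions built from per-discipline indices:

```python
from scipy.optimize import minimize

# Placeholder performance indices; in the CEM these would be external analysis
# codes (weight, stress, flutter, ...) invoked via the FORTRAN CALL SYSTEM command.
def weight_index(x):
    return x[0] + x[1] + x[2]                  # stand-in for a structural-weight code

def stress_index(x):
    return 1.0 / (x[0] * x[1] * x[2])          # stand-in for a stress-analysis code

def stress_margin(x):
    # SLSQP inequality constraints must be >= 0 when satisfied.
    return 8.0 - stress_index(x)               # hypothetical allowable stress of 8.0

solution = minimize(weight_index, x0=[1.0, 1.0, 1.0],
                    bounds=[(0.1, 2.0)] * 3,
                    constraints=[{"type": "ineq", "fun": stress_margin}],
                    method="SLSQP")
print(solution.x, solution.fun)                # settles near x = [0.5, 0.5, 0.5]
```

    Keeping each discipline behind the same thin wrapper is what lets analysis codes be swapped without touching the optimizer, which is the integration point the abstract emphasizes.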

  8. New university workshops feature emerging technologies

    OpenAIRE

    Brunais, Andrea

    2010-01-01

    Three workshops in October and November aim to help presenters increase their knowledge of emerging technologies. Virginia Tech's Continuing and Professional Education Department is offering the courses in Roanoke: Ten Technology Tools to Use Today, Emerging Technologies for Business, and Second Life in Business.

  9. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang;

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...

  10. 2014 MICCAI Workshop

    CERN Document Server

    Nedjati-Gilani, Gemma; Rathi, Yogesh; Reisert, Marco; Schneider, Torben

    2014-01-01

    This book contains papers presented at the 2014 MICCAI Workshop on Computational Diffusion MRI, CDMRI’14. Detailing new computational methods applied to diffusion magnetic resonance imaging data, it offers readers a snapshot of the current state of the art and covers a wide range of topics from fundamental theoretical work on mathematical modeling to the development and evaluation of robust algorithms and applications in neuroscientific studies and clinical practice.   Inside, readers will find information on brain network analysis, mathematical modeling for clinical applications, tissue microstructure imaging, super-resolution methods, signal reconstruction, visualization, and more. Contributions include both careful mathematical derivations and a large number of rich full-color visualizations.   Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into the clinic. This volume will offer a valuable starting point for anyone interested i...

  11. Nuclear physics workshop

    International Nuclear Information System (INIS)

    This Workshop in Nuclear Physics, related to the TANDAR, took place in Buenos Aires on April 23-26, 1987, with foreign scientists in attendance. Four seminars and numerous studies were presented, dealing with the following fields: Nuclear Physics at medium energies, Nuclear Structure, Nuclear Reactions, Nuclear Matter, Instrumentation and Methodology for Nuclear Spectroscopy, Classical Physics, Quantum Mechanics and Field Theory. It must be emphasized that the electrostatic accelerator TANDAR allows work with high-energy heavy ions, which opens a new field of work in PIXE (particle-induced X-ray emission). This powerful analytical technique makes possible the analysis of nearly all the elements of the periodic table with the same accuracy. (M.E.L.)

  12. AAAI 2008 Workshop Reports

    OpenAIRE

    Bunescu, Razvan C; Ohio University; Carvalho, Vitor R.; Microsoft Live Labs; Chomicki, Jan; University of Buffalo; Conitzer, Vincent; Duke University; Cox, Michael T.; BBN Technologies; Dignum, Virginia; Utrecht University; Dodds, Zachary; Harvey Mudd College; Dredze, Mark; University of Pennsylvania; Furcy, David; University of Wisconsin Oshkosh; Gabrilovich, Evgeniy; Yahoo! Research; Göker, Mehmet H.; PricewaterhouseCoopers; Guesgen, Hans Werner; Massey University; Hirsh, Haym; Rutgers University; Jannach, Dietmar; Dortmund University of Technology; Junker, Ulrich; ILOG

    2009-01-01

    AAAI was pleased to present the AAAI-08 Workshop Program, to be held Sunday and Monday, July 13–14, in Chicago, Illinois, USA. The program included the following fifteen workshops: Advancements in POMDP Solvers, AI Education Workshop, Coordination, Organization, Institutions and Norms in Agent Systems, Enhanced Messaging, Human Implications of Human-Robot Interaction, Intelligent Techniques for Web Personalization and Recommender Systems, Metareasoning: Thinking about Thinking, Multidisciplin...

  13. Innovative confinement concepts workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kirkpatrick, R.C.

    1998-06-01

    The Innovative Confinement Concepts Workshop occurred in California during the week preceding the Second Symposium on Current Trends in International Fusion Research. An informal report was made to the Second Symposium. A summary of the Workshop concluded that some very promising ideas were presented, that innovative concept development is a central element of the restructured US DOE Fusion Energy Sciences program, and that the Workshop should promote real scientific progress in fusion.

  14. MathWeb: a concurrent image analysis tool suite for multispectral data fusion

    Science.gov (United States)

    Achalakul, Tiranee; Haaland, Peter D.; Taylor, Stephen

    1999-03-01

    This paper describes a preliminary approach to the fusion of multi-spectral image data for the analysis of cervical cancer. The long-term goal of this research is to define spectral signatures and automatically detect cancer cell structures. The approach combines a multi-spectral microscope with an image analysis tool suite, MathWeb. The tool suite incorporates a concurrent Principal Component Transform (PCT) that is used to fuse the multi-spectral data. This paper describes the general approach and the concurrent PCT algorithm. The algorithm is evaluated from both the perspective of image quality and performance scalability.
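
    MathWeb's concurrent PCT implementation is not described in detail in the abstract; the sketch below shows only the underlying (sequential) idea, assumed here for illustration: treat each pixel as a vector of band intensities and keep the projection onto the leading principal component as the fused image.

```python
import numpy as np

def pct_fuse(cube):
    """Fuse a multispectral cube of shape (bands, rows, cols) into one image via PCA."""
    bands, rows, cols = cube.shape
    X = cube.reshape(bands, -1).T               # (pixels, bands)
    X = X - X.mean(axis=0)                      # center each band
    cov = np.cov(X, rowvar=False)               # (bands, bands) covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    pc1 = eigvecs[:, -1]                        # dominant principal component
    fused = X @ pc1                             # project every pixel onto it
    return fused.reshape(rows, cols)

# Example with a synthetic 8-band, 64x64 cube.
cube = np.random.rand(8, 64, 64)
print(pct_fuse(cube).shape)                     # (64, 64)
```

    The concurrency in MathWeb comes from distributing this transform over image tiles or spectral blocks; the mathematics per pixel is the same.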

  15. The community development workshop, appendix B.

    Science.gov (United States)

    Brill, R.; Gastro, E.; Pennington, A. J.

    1973-01-01

    The Community Development Workshop is the name given to a collection of techniques designed to implement participation in the planning process. It is an eclectic approach, making use of current work in the psychology of groups, mathematical modeling and systems analysis, simulation gaming, and other techniques. An outline is presented for a session of the workshop which indicates some of the psychological techniques employed, including confrontation, synectics, and encounter micro-labs.

  16. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already in the public literature, as well as from NIST and other highly reputable sources. More recently, the tool has been updated with additional test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.
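
    TISTool's correlations and Cryogenics Test Laboratory data are not reproduced in the abstract. As a minimal, hedged sketch of the kind of calculation such a tool performs for its cylinder shape, the snippet below evaluates steady radial conduction through an insulation shell with an assumed constant effective thermal conductivity (real analyses use temperature- and pressure-dependent measured values).

```python
import math

def cylinder_heat_leak(k_eff, length, r_inner, r_outer, t_warm, t_cold):
    """Steady-state heat leak (W) through a cylindrical insulation shell.

    Q = 2*pi*k*L*(T_warm - T_cold) / ln(r_outer / r_inner)
    k_eff : effective thermal conductivity, W/(m*K), assumed constant here.
    """
    return (2.0 * math.pi * k_eff * length * (t_warm - t_cold)
            / math.log(r_outer / r_inner))

# Example: 1 m of pipe, insulation from 10 cm to 15 cm radius, 300 K ambient to 77 K.
print(cylinder_heat_leak(k_eff=1e-4, length=1.0, r_inner=0.10,
                         r_outer=0.15, t_warm=300.0, t_cold=77.0))
```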

  17. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
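
    The Sandia toolset and the simulators it links are not public in this abstract. As a hedged sketch of variability analysis in the same spirit (the circuit, tolerances, and limits below are illustrative assumptions, not the project's models), Monte Carlo sampling of component properties can be pushed through a response function to estimate design margin:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Component properties with assumed manufacturing tolerances (1-sigma).
R = rng.normal(1.0e3, 0.05 * 1.0e3, n)     # 1 kOhm +/- 5%
C = rng.normal(100e-9, 0.10 * 100e-9, n)   # 100 nF +/- 10%

f_c = 1.0 / (2.0 * np.pi * R * C)          # RC low-pass cutoff frequency, Hz

spec_min, spec_max = 1300.0, 1900.0        # hypothetical design limits, Hz
in_spec = np.mean((f_c >= spec_min) & (f_c <= spec_max))

print(f"mean cutoff = {f_c.mean():.1f} Hz, sigma = {f_c.std():.1f} Hz")
print(f"fraction within spec = {in_spec:.3f}")   # a crude design-margin estimate
```

    In a real workflow the analytic response would be replaced by a circuit-simulator run per sample, which is why linking the sampler to the simulation tools was the project's main deliverable.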

  18. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called die-away plots. To deliver usable tools as soon as possible, existing software was reused wherever practical: ParaView will be used for the 4D visualization of the results, whereas die-away plots will be analyzed with the ROOT toolkit using a tool named “diana”. To enable 4D visualization with ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data into a format readable by ParaView and to ease the visualization. (author)
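
    The angel2vtk converter is named but not documented in the abstract. As an illustration of the kind of conversion involved (the mesh spacing and file layout are assumptions for this sketch, not the PHITS/ANGEL format), the snippet below writes a 3-D tally array as a legacy-VTK structured-points file that ParaView can open; one such file per time step yields a loadable time series for the 4D view.

```python
import numpy as np

def write_vtk_scalar(path, field, spacing=(1.0, 1.0, 1.0)):
    """Write a 3-D scalar field of shape (nx, ny, nz) as a legacy-VTK structured-points file."""
    nx, ny, nz = field.shape
    with open(path, "w") as f:
        f.write("# vtk DataFile Version 3.0\n")
        f.write("tally snapshot\n")
        f.write("ASCII\nDATASET STRUCTURED_POINTS\n")
        f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
        f.write("ORIGIN 0 0 0\n")
        f.write(f"SPACING {spacing[0]} {spacing[1]} {spacing[2]}\n")
        f.write(f"POINT_DATA {nx * ny * nz}\n")
        f.write("SCALARS flux float 1\nLOOKUP_TABLE default\n")
        # Legacy VTK expects x varying fastest, then y, then z.
        for value in field.transpose(2, 1, 0).ravel():
            f.write(f"{value:.6e}\n")

# One file per time step (flux_000.vtk, flux_001.vtk, ...) gives ParaView a time series.
write_vtk_scalar("flux_000.vtk", np.random.rand(10, 10, 10))
```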

  19. MetLab: An In Silico Experimental Design, Simulation and Analysis Tool for Viral Metagenomics Studies

    Science.gov (United States)

    Gourlé, Hadrien; Bongcam-Rudloff, Erik; Hayer, Juliette

    2016-01-01

    Metagenomics, the sequence characterization of all genomes within a sample, is widely used as a virus discovery tool as well as a tool to study the viral diversity of animals. Metagenomics can be considered to have three main steps: sample collection and preparation, sequencing, and finally bioinformatics. Bioinformatic analysis of metagenomic datasets is in itself a complex process, involving few standardized methodologies, thereby hampering comparison of metagenomics studies between research groups. In this publication the new bioinformatics framework MetLab is presented, aimed at providing scientists with an integrated tool for experimental design and analysis of viral metagenomes. MetLab provides support in designing the metagenomics experiment by estimating the sequencing depth needed for the complete coverage of a species. This is achieved by applying a methodology to calculate the probability of coverage using an adaptation of Stevens' theorem. It also provides scientists with several pipelines aimed at simplifying the analysis of viral metagenomes, including quality control, assembly, and taxonomic binning. We also implement a tool for simulating metagenomics datasets from several sequencing platforms. The overall aim is to provide virologists with an easy-to-use tool for designing, simulating and analyzing viral metagenomes. The results presented here include a benchmark against other existing software, with emphasis on detection of viruses as well as speed of the applications. This is packaged as comprehensive software, readily available for Linux and OSX users at https://github.com/norling/metlab. PMID:27479078

  20. MetLab: An In Silico Experimental Design, Simulation and Analysis Tool for Viral Metagenomics Studies.

    Science.gov (United States)

    Norling, Martin; Karlsson-Lindsjö, Oskar E; Gourlé, Hadrien; Bongcam-Rudloff, Erik; Hayer, Juliette

    2016-01-01

    Metagenomics, the sequence characterization of all genomes within a sample, is widely used as a virus discovery tool as well as a tool to study the viral diversity of animals. Metagenomics can be considered to have three main steps: sample collection and preparation, sequencing, and finally bioinformatics. Bioinformatic analysis of metagenomic datasets is in itself a complex process, involving few standardized methodologies, thereby hampering comparison of metagenomics studies between research groups. In this publication the new bioinformatics framework MetLab is presented, aimed at providing scientists with an integrated tool for experimental design and analysis of viral metagenomes. MetLab provides support in designing the metagenomics experiment by estimating the sequencing depth needed for the complete coverage of a species. This is achieved by applying a methodology to calculate the probability of coverage using an adaptation of Stevens' theorem. It also provides scientists with several pipelines aimed at simplifying the analysis of viral metagenomes, including quality control, assembly, and taxonomic binning. We also implement a tool for simulating metagenomics datasets from several sequencing platforms. The overall aim is to provide virologists with an easy-to-use tool for designing, simulating and analyzing viral metagenomes. The results presented here include a benchmark against other existing software, with emphasis on detection of viruses as well as speed of the applications. This is packaged as comprehensive software, readily available for Linux and OSX users at https://github.com/norling/metlab. PMID:27479078
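
    MetLab's exact adaptation of Stevens' theorem lives in its source code; the sketch below implements only the classical circle-covering result usually attributed to Stevens (1939), as a hedged approximation of the sequencing-depth estimate described above: the probability that n uniformly placed reads of length L completely cover a single genome of length G treated as a circle.

```python
from math import exp, floor, lgamma, log

def p_full_coverage(n_reads, read_len, genome_len):
    """P(n random reads of length L cover a circular genome of length G), Stevens (1939)."""
    a = read_len / genome_len                      # read length as a fraction of the genome
    if a >= 1.0:
        return 1.0
    total = 0.0
    for k in range(floor(1.0 / a) + 1):
        frac = 1.0 - k * a
        if frac <= 0.0:
            break
        # k-th term: C(n, k) * (1 - k*a)^(n-1), evaluated in log space for stability.
        log_term = (lgamma(n_reads + 1) - lgamma(k + 1) - lgamma(n_reads - k + 1)
                    + (n_reads - 1) * log(frac))
        if log_term < -745.0:                      # term underflows; stop the sum here
            break
        total += (-1) ** k * exp(log_term)
    return min(1.0, max(0.0, total))

# How many 150 bp reads give near-certain coverage of a 30 kb viral genome?
for n in (1000, 2000, 3000, 4000):
    print(n, round(p_full_coverage(n, 150, 30_000), 4))
```

    In this example the coverage depth runs from roughly 5x to 20x, and the probability of gap-free coverage climbs from well below 1 toward essentially 1, which is the kind of number a depth-planning step needs; real designs must also account for multiple species and uneven abundances, which this sketch ignores.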