WorldWideScience

Sample records for analysis tools workshop

  1. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following topics: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  2. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  3. Video Analysis and Modeling Tool for Physics Education: A workshop for Redesigning Pedagogy

    CERN Document Server

    Wee, Loo Kang

    2012-01-01

    This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables and empowers teachers to be learners so that we can be leaders in our teaching practice. Through this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experiences with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.

  4. 6th International Parallel Tools Workshop

    CERN Document Server

    Brinkmann, Steffen; Gracia, José; Resch, Michael; Nagel, Wolfgang

    2013-01-01

    The latest advances in High Performance Computing hardware have significantly raised the level of available compute performance. At the same time, the growing hardware capabilities of modern supercomputing architectures have caused an increasing complexity of parallel application development. Despite numerous efforts to improve and simplify parallel programming, there is still a lot of manual debugging and tuning work required. This process is supported by special software tools, facilitating debugging, performance analysis, and optimization and thus making a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools, which were presented and discussed at the 6th International Parallel Tools Workshop, held in Stuttgart, Germany, 25-26 September 2012.

  5. UVI Cyber-security Workshop Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kuykendall, Tommie G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allsop, Jacob Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Benjamin Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boumedine, Marc [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carter, Cedric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Galvin, Seanmichael Yurko [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Oscar [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lee, Wellington K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Han Wei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morris, Tyler Jake [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nauer, Kevin S.; Potts, Beth A.; Ta, Kim Thanh; Trasti, Jennifer; White, David R.

    2015-07-08

    The cybersecurity consortium, which was established by DOE/NNSA’s Minority Serving Institutions Partnerships Program (MSIPP), allows students from any of the partner schools (13 HBCUs, two national laboratories, and a public school district) to have all consortium options available to them, to create career paths, and to open doors to DOE sites and facilities to student members of the consortium. As part of this year's consortium activities, Sandia National Laboratories and the University of the Virgin Islands conducted a week-long cyber workshop that consisted of three courses: Digital Forensics and Malware Analysis, Python Programming, and ThunderBird Cup. These courses are designed to enhance cyber defense skills and promote learning within STEM-related fields.

  6. Realtime simulation tools in the CHORALE workshop

    Science.gov (United States)

    Cathala, Thierry; Le Goff, Alain; Gozard, Patrick; Latger, Jean

    2006-05-01

    CHORALE (simulated Optronic Acoustic Radar battlefield) is used by the French DGA/DET (Directorate for Evaluation of the French Ministry of Defense) to perform multi-sensor simulations. CHORALE enables the user to create virtual and realistic multi-spectral 3D scenes, and to generate the physical signal received by a sensor, typically an IR sensor. To evaluate sensor efficiency at visible and infrared wavelengths, simulation tools that give a good representation of the physical phenomena are used. This article describes the elements used to prepare data (3D database, materials, scenario, ...) for the simulation, and the set of tools (SE-FAST-IR) used in CHORALE for real-time simulation in the infrared spectrum. The SE-FAST-IR package allows the compilation and visualization of 3D databases for infrared simulations. It enables one to visualize complex and large databases for a wide set of real and pseudo-real-time applications. SE-FAST-IR is based on the physical model of the non-real-time tool of the CHORALE workshop. It automatically computes radiance textures, OpenGL light source and fog-law parameters for predefined thermal and atmospheric conditions specified by the user.
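
    The "fog-law parameters" mentioned above come from standard atmospheric attenuation. As a hedged illustration (not SE-FAST-IR code), the exponential fog model that OpenGL implements has the same form as Beer-Lambert attenuation of target radiance along the line of sight; every number below is made up:

```python
import math

def apparent_radiance(l_target, l_path, extinction_per_km, range_km):
    # Beer-Lambert transmission along the line of sight; OpenGL's EXP fog
    # uses the same form, f = exp(-density * range).
    tau = math.exp(-extinction_per_km * range_km)
    # Attenuated target radiance plus path (atmosphere/fog) radiance.
    return tau * l_target + (1.0 - tau) * l_path

# A warm target seen through 5 km of atmosphere with extinction 0.2 /km:
print(apparent_radiance(l_target=10.0, l_path=2.0,
                        extinction_per_km=0.2, range_km=5.0))  # ~4.94
```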

  7. Collaboration tools for the global accelerator network: Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Olson, Gary [Univ. of Michigan, Ann Arbor, MI (United States); Olson, Judy [Univ. of Michigan, Ann Arbor, MI (United States)

    2002-09-15

    The concept of a "Global Accelerator Network" (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants in the workshop included accelerator physicists, high-energy physicists, operations staff, technology tool developers, and social scientists who study scientific collaboration.

  8. Workshop One : Risk Analysis

    NARCIS (Netherlands)

    Carlson, T.J.; Jong, C.A.F. de; Dekeling, R.P.A.

    2012-01-01

    The workshop looked at the assessment of risk to aquatic animals exposed to anthropogenic sound. The discussion focused on marine mammals given the worldwide attention being paid to them at the present time, particularly in relationship to oil and gas exploration, ocean power, and increases in ship

  9. Workshop Physics and Related Curricula: "A 25-Year History of Collaborative Learning Enhanced by Computer Tools for Observation and Analysis"

    Science.gov (United States)

    Laws, Priscilla W.; Willis, Maxine C.; Sokoloff, David R.

    2015-01-01

    This article describes the 25-year history of development of the activity-based Workshop Physics (WP) at Dickinson College, its adaptation for use at Gettysburg Area High School, and its synergistic influence on curricular materials developed at the University of Oregon and Tufts University and vice versa. WP and these related curricula: 1) are…

  10. The Tenth Thermal and Fluids Analysis Workshop

    Science.gov (United States)

    Majumdar, Alok (Compiler); McConnaughey, Paul (Technical Monitor)

    2001-01-01

    The Tenth Thermal and Fluids Analysis Workshop (TFAWS 99) was held at the Bevill Center, University of Alabama in Huntsville, Huntsville, Alabama, September 13-17, 1999. The theme for the hands-on training workshop and conference was "Tools and Techniques Contributing to Engineering Excellence". Forty-seven technical papers were presented in four sessions. The sessions were: (1) Thermal Spacecraft/Payloads, (2) Thermal Propulsion/Vehicles, (3) Interdisciplinary Papers, and (4) Fluids Papers. Forty papers were published in these proceedings. The remaining seven papers were not available in electronic format at the time of publication. In addition to the technical papers, there were (a) nine hands-on classes on thermal and flow analysis software, (b) twelve short courses, (c) thirteen product overview lectures, and (d) three keynote lectures. The workshop resulted in the participation of 171 persons representing NASA Centers, Government agencies, aerospace industries, academia, software providers, and private corporations.

  11. The EADGENE Microarray Data Analysis Workshop

    DEFF Research Database (Denmark)

    de Koning, Dirk-Jan; Jaffrézic, Florence; Lund, Mogens Sandø;

    2007-01-01

    Microarray analyses have become an important tool in animal genomics. While their use is becoming widespread, there is still a lot of ongoing research regarding the analysis of microarray data. In the context of a European Network of Excellence, 31 researchers representing 14 research groups from...... 10 countries performed and discussed the statistical analyses of real and simulated 2-colour microarray data that were distributed among participants. The real data consisted of 48 microarrays from a disease challenge experiment in dairy cattle, while the simulated data consisted of 10 microarrays...... statistical weights, to omitting a large number of spots or omitting entire slides. Surprisingly, these very different approaches gave quite similar results when applied to the simulated data, although not all participating groups analysed both real and simulated data. The workshop was very successful...

  12. The digital workshop: exploring the use of interactive and immersive visualisation tools in participatory planning.

    Science.gov (United States)

    Salter, Jonathan D; Campbell, Cam; Journeay, Murray; Sheppard, Stephen R J

    2009-05-01

    This paper examines the emerging role of digital tools in a collaborative planning process for British Columbia's Bowen Island. The goal of this research was to evaluate the effectiveness of a 'digital workshop', combining the interactive CommunityViz tool with the immersive lab facilities at the Collaborative for Advanced Landscape Planning (CALP). In support of the larger community planning process, two 3-h workshops were held at CALP's Landscape Immersion Lab. To facilitate collaborative exploration, the interactive landscape visualisation and real-time data analysis capabilities of CommunityViz were employed to illustrate the possible outcomes of residential density policies for Bowen Island's Snug Cove community. The community planning workshops were structured to provide the 14 semi-expert participants with the opportunity to explore and discuss the contentious residential density components of the draft Snug Cove Village Plan. The abilities to dynamically explore the visualisations of the planning proposals, and to see real-time changes in indicator metrics were considered particularly informative, and appeared to increase participants' understanding of the plan. Written and verbal responses indicated, however, that there was insufficient time to examine and interact with the information during the workshop, suggesting a need to examine in greater depth how and when these tools might best be employed in collaborative settings. Current and future research relating to this project is discussed.

  13. Proceedings of pollution prevention and waste minimization tools workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    Pollution Prevention (P2) has evolved into one of DOE's prime strategies to meet environmental, fiscal, and worker safety obligations. P2 program planning, opportunity identification, and implementation tools were developed under the direction of the Waste Minimization Division (EM-334). Forty experts from EM, DP, ER and DOE subcontractors attended this 2-day workshop to formulate the incentives to drive utilization of these tools. Plenary and small working group sessions were held on both days. Working Group 1 identified incentives for overcoming barriers in the area of P2 program planning and resource allocation. Working Group 2 identified mechanisms to drive the completion of P2 assessments and the generation of opportunities. Working Group 3 compiled and documented a broad range of potential P2 incentives that address fundamental barriers to implementation of cost-effective opportunities.

  14. 9th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  15. Seventh Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    This booklet contains the proceedings of the Seventh Workshop on Practical Use of Coloured Petri Nets and the CPN Tools, October 24-26, 2006.

  16. Twelfth Thermal and Fluids Analysis Workshop

    Science.gov (United States)

    Majumdar, Alok (Compiler)

    2002-01-01

    The Twelfth Thermal and Fluids Analysis Workshop (TFAWS 01) was held at the Bevill Center, The University of Alabama in Huntsville, Huntsville, Alabama, September 10-14, 2001. The theme for the hands-on training workshop and conference was "Engineering Excellence and Advances in the New Millennium." Forty-five technical papers were presented in four sessions: (1) Thermal Spacecraft/Payloads, (2) Thermal Propulsion/Vehicles, (3) Interdisciplinary Papers, and (4) Fluids Papers. Thirty-nine papers were published in these proceedings. The remaining six papers were not available in electronic format at the time of publication. In addition to the technical papers, there were (a) nine hands-on classes on thermal and flow analysis software, (b) thirteen short courses and product overview lectures, (c) five keynote lectures, and (d) panel discussions consisting of eight presentations. The workshop resulted in the participation of 195 persons representing NASA Centers, Government agencies, aerospace industries, academia, software providers, and private corporations.

  17. 7th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  18. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    Science.gov (United States)

    2012-03-01

    Presented by Paul Black, PhD, and Ralph Perona, DABT (Neptune and Company), March 2012. Asbestos-containing products discussed include coatings, vinyl/asbestos floor tile, automatic transmission components, clutch facings, disc brake pads, drum brake linings, and brake blocks.

  19. 2nd International Workshop on Isogeometric Analysis and Applications

    CERN Document Server

    Simeon, Bernd

    2015-01-01

    Isogeometric Analysis is a groundbreaking computational approach that promises the possibility of integrating the finite element method into conventional spline-based CAD design tools. It thus bridges the gap between numerical analysis and geometry, and moreover it allows one to tackle new cutting-edge applications at the frontiers of research in science and engineering. This proceedings volume contains a selection of outstanding research papers presented at the second International Workshop on Isogeometric Analysis and Applications, held at Annweiler, Germany, in April 2014.

  20. BENCHMARKING WORKSHOPS AS A TOOL TO RAISE BUSINESS EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Milos Jelic

    2011-03-01

    The annual competition for a national business excellence award is a good opportunity for participating organizations to demonstrate their practices, particularly those that enable them to excel. The national quality award competition in Serbia (and Montenegro), "OSKAR KVALITETA", started in 1995 but was initially limited to the competition cycle only. Upon establishing the Fund for Quality Culture and Excellence (FQCE) in 2002, which took over the OSKAR KVALITETA model, several changes took place. OSKAR KVALITETA remained an annual competition in business excellence, but at the same time FQCE started to offer a much wider portfolio of services, including levels-of-excellence programmes, assessment and self-assessment training courses, and benchmarking workshops. These benchmarking events have been hosted by award winners or other laureates of the OSKAR KVALITETA competition who demonstrated excellence against particular criteria and were thus in a position to share their practice with other organizations. In six years of organizing benchmarking workshops, FQCE has run 31 workshops covering the major part of the model's criteria. The increasing level of participation in the workshops and the distinctly positive trend in participants' expressed satisfaction serve as reliable indicators that the workshops have been effective in prompting people to think and move in the direction of business excellence.

  1. 100-KE REACTOR CORE REMOVAL PROJECT ALTERNATIVE ANALYSIS WORKSHOP REPORT

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON RA

    2010-01-15

    In brief, the Path Forward was developed to reconsider potential open-air demolition areas; characterize to determine if any zircaloy exists; evaluate existing concrete data to determine additional characterization needs; size the new building to accommodate the human-machine interface and tooling; consider a bucket thumb and the use of shape charges in design; and finally to utilize complex-wide and industry explosive demolition lessons learned in the design approach. Appendix B documents the results from the team's use of Value Engineering process tools entitled Weighted Analysis Alternative Matrix, Matrix Conclusions, Evaluation Criteria, and Alternative Advantages and Disadvantages. These results were further supported by the team's validation of parking-lot information sheets: memories (potential ideas to consider), issues/concerns, and assumptions, contained in Appendix C. Appendix C also includes the recorded workshop flipchart notes taken from the SAR Alternatives and Project Overview presentations. The SAR workshop presentations, including a 3-D graphic illustration demonstration video, have been retained in the CHPRC project file, and were not included in this report due to size limitations. The workshop concluded with a round-robin close-out where each member was engaged for any last-minute items and meeting utility. In summary, the team felt the session was value-added and looked forward to proceeding with the recommended actions and conceptual design.

  2. [A fragrance workshop, a mediation tool for teenagers].

    Science.gov (United States)

    Saada, Valérie; Harf, Aurélie; Le Camus, Sabine; Moro, Marie Rose

    2013-01-01

    The fragrance workshop is one of the therapies used with young people in the day hospital of the Adolescent Centre of Cochin hospital in Paris. This unique form of mediation offers, through the use of a sense which is often neglected, access to the imaginary world of teenagers, allowing regression and the evocation of memories in a contained framework.

  3. Fourth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    Coloured Petri Nets and the CPN tools are now used by more than 750 organisations in 50 different countries all over the world (including 150 commercial companies). The purpose of this event is to bring together some of the users and in this way provide a forum for those who are interested in the practical use of Coloured Petri Nets and the CPN tools. This booklet contains the proceedings of the Fourth Workshop on Practical Use of Coloured Petri Nets and the CPN Tools, August 28-30, 2002. The workshop is organised by the CPN group at the Department of Computer Science, University of Aarhus, Denmark.

  4. Summary of the Third International Planetary Dunes Workshop: remote sensing and image analysis of planetary dunes

    Science.gov (United States)

    Fenton, Lori K.; Hayward, Rosalyn K.; Horgan, Briony H.N.; Rubin, David M.; Titus, Timothy N.; Bishop, Mark A.; Burr, Devon M.; Chojnacki, Matthew; Dinwiddie, Cynthia L.; Kerber, Laura; Gall, Alice Le; Michaels, Timothy I.; Neakrase, Lynn D.V.; Newman, Claire E.; Tirsch, Daniela; Yizhaq, Hezi; Zimbelman, James R.

    2013-01-01

    The Third International Planetary Dunes Workshop took place in Flagstaff, AZ, USA during June 12–15, 2012. This meeting brought together a diverse group of researchers to discuss recent advances in terrestrial and planetary research on aeolian bedforms. The workshop included two and a half days of oral and poster presentations, as well as one formal (and one informal) full-day field trip. Similar to its predecessors, the presented work provided new insight on the morphology, dynamics, composition, and origin of aeolian bedforms on Venus, Earth, Mars, and Titan, with some intriguing speculation about potential aeolian processes on Triton (a satellite of Neptune) and Pluto. Major advancements since the previous International Planetary Dunes Workshop include the introduction of several new data analysis and numerical tools and utilization of low-cost field instruments (most notably the time-lapse camera). Most presentations represented advancement towards research priorities identified in both of the prior two workshops, although some previously recommended research approaches were not discussed. In addition, this workshop provided a forum for participants to discuss the uncertain future of the Planetary Aeolian Laboratory; subsequent actions taken as a result of the decisions made during the workshop may lead to an expansion of funding opportunities to use the facilities, as well as other improvements. The interactions during this workshop contributed to the success of the Third International Planetary Dunes Workshop, further developing our understanding of aeolian processes on the aeolian worlds of the Solar System.

  5. Pollution prevention and waste minimization tools workshops: Proceedings. Part 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-31

    The purpose of the second workshop was to bring together representatives of DOE and DOE contractor organizations to discuss four topics: process waste assessments (PWAs), a continuation of one of the sessions held at the first workshop in Clearwater; waste minimization reporting requirements; procurement systems for waste minimization; and heating, ventilating, and air conditioning (HVAC) and replacements for chlorofluorocarbons (CFCs). The topics were discussed in four concurrent group sessions. Participants in each group were encouraged to work toward achieving two main objectives: establish a "clear vision" of the overall target for their session's program, focusing not just on where the program is now but on where it should go in the long term; and determine steps to be followed to carry out the target program.

  6. Proceedings of the of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010)

    DEFF Research Database (Denmark)

    Brabrand, Claus

    2010-01-01

    This volume contains the proceedings of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010), held in Paphos, Cyprus on March 28--29, 2010. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) organized in cooperatio...

  7. 77 FR 14814 - Tobacco Product Analysis; Scientific Workshop; Request for Comments

    Science.gov (United States)

    2012-03-13

    ... to Attend the Workshop and Requests for Oral Presentations: If you wish to attend the workshop or make an oral presentation at the workshop, please email your registration to workshop.CTPOS@fda.hhs.gov... workshop to solicit feedback on analysis of tobacco products. The analyses of tobacco products...

  8. Tools for Project Management, Workshops and Consulting A Must-Have Compendium of Essential Tools and Techniques

    CERN Document Server

    Andler, Nicolai

    2012-01-01

    Typically today's tasks in management and consulting include project management, running workshops and strategic work - all complex activities, which require a multitude of skills and competencies. This standard work, which is also well accepted amongst consultants, gives you a reference or cookbook-style access to the most important tools, including a rating of each tool in terms of applicability, ease of use and effectiveness.In his book, Nicolai Andler presents about 120 of such tools, grouped into task-specific categories entitled Define Situation, Gather Information, Information Consolida

  9. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis, using building spatial data and selected building components from the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate corresponding optimized energy models; and a recommendation tool configured to assess the optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
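
    The claim-style sentence above describes a straightforward pipeline: a component library feeds a model, an engine computes a baseline, energy conservation measures (ECMs) produce optimized variants, and a recommender ranks them. A minimal sketch of that data flow follows; the class names, fields, and numbers are hypothetical, not the tool's actual design:

```python
from dataclasses import dataclass, field

@dataclass
class BuildingComponent:            # an entry in the component library
    name: str
    annual_kwh: float               # hypothetical per-component energy use
    cost: float

@dataclass
class BuildingModel:                # "building model of a building under analysis"
    components: list = field(default_factory=list)

    def baseline_energy(self) -> float:
        # "baseline energy model": total of component loads
        return sum(c.annual_kwh for c in self.components)

def apply_ecm(model: BuildingModel, target: str,
              replacement: BuildingComponent) -> BuildingModel:
    # an "energy conservation measure": swap one component for a substitute
    return BuildingModel([replacement if c.name == target else c
                          for c in model.components])

def recommend(model: BuildingModel, candidates: dict) -> list:
    # "recommendation tool": keep substitutions that beat the baseline
    base = model.baseline_energy()
    savings = [(target, sub.name,
                base - apply_ecm(model, target, sub).baseline_energy())
               for target, sub in candidates.items()]
    return sorted([s for s in savings if s[2] > 0], key=lambda s: -s[2])

model = BuildingModel([BuildingComponent("single-pane window", 1200.0, 150.0),
                       BuildingComponent("T12 lighting", 800.0, 60.0)])
upgrades = {"T12 lighting": BuildingComponent("LED lighting", 300.0, 90.0)}
print(recommend(model, upgrades))   # [('T12 lighting', 'LED lighting', 500.0)]
```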

  10. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
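
    All six reports derive from one underlying object: a mapping from failure modes to the tests that detect them (a detection, or "D", matrix). A toy illustration of the detectability, isolation, and sensor-sensitivity questions; the names are hypothetical and unrelated to the TEAMS Designer output format:

```python
from collections import defaultdict

# Hypothetical detection matrix: failure mode -> tests that detect it.
d_matrix = {
    "valve_stuck":   {"t_pressure", "t_flow"},
    "sensor_drift":  {"t_pressure"},
    "pump_degraded": {"t_pressure", "t_flow"},
    "line_leak":     set(),               # undetectable with current tests
}

# Detectability report: how (or whether) each failure mode is detected.
for mode, tests in d_matrix.items():
    print(mode, "->", sorted(tests) if tests else "UNDETECTED")

# Failure-mode isolation report: modes with identical test signatures
# form an ambiguity group and cannot be discriminated from one another.
groups = defaultdict(list)
for mode, tests in d_matrix.items():
    groups[frozenset(tests)].append(mode)
for modes in groups.values():
    if len(modes) > 1:
        print("ambiguity group:", modes)  # valve_stuck / pump_degraded

# Sensor-sensitivity variation: drop a test and recompute the reports,
# mirroring the "loss of sensor information" study described above.
without_flow = {m: t - {"t_flow"} for m, t in d_matrix.items()}
```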

  11. The EADGENE Microarray Data Analysis Workshop

    NARCIS (Netherlands)

    Koning, de D.J.; Jaffrezic, F.; Lund, M.S.; Watson, M.; Channing, C.; Hulsegge, B.; Pool, M.H.; Buitenhuis, B.; Hedegaard, J.; Hornshoj, H.; Sorensen, P.; Marot, G.; Delmas, C.; Lê Cao, K.A.; San Cristobal, M.; Baron, M.D.; Malinverni, R.; Stella, A.; Brunner, R.M.; Seyfert, H.M.; Jensen, K.; Mouzaki, D.; Waddington, D.; Jiménez-Marín, A.; Perez-Alegre, M.; Perez-Reinado, E.; Closset, R.; Detilleux, J.C.; Dovc, P.; Lavric, M.; Nie, H.; Janss, L.

    2007-01-01

    Microarray analyses have become an important tool in animal genomics. While their use is becoming widespread, there is still a lot of ongoing research regarding the analysis of microarray data. In the context of a European Network of Excellence, 31 researchers representing 14 research groups from 10

  12. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  13. PREFACE: EMAS 2013 Workshop: 13th European Workshop on Modern Developments and Applications in Microbeam Analysis

    Science.gov (United States)

    Llovet, Xavier, Dr; Matthews, Mr Michael B.; Brisset, François, Dr; Guimarães, Fernanda, Dr; Vieira, Professor Joaquim M., Dr

    2014-03-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 13th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from the 12th to the 16th of May 2013 in the Centro de Congressos do Alfândega, Porto, Portugal. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a very specific format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with LNEG - Laboratório Nacional de Energia e Geologia and SPMICROS - Sociedade Portuguesa de Microscopia. The technical programme included the following topics: electron probe microanalysis, future technologies, electron backscatter diffraction (EBSD), particle analysis, and applications. As at previous workshops there was also a special oral session for young scientists. The best presentation by a young scientist was awarded an invitation to attend the 2014 Microscopy and Microanalysis meeting at Hartford, Connecticut. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Plastic deformation studies with electron channelling contrast imaging and electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 74 posters from 21 countries were on display at the meeting and that the participants came from as far away as Japan, Canada and the USA. A

  14. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  15. Summary of Training Workshop on the Use of NASA tools for Coastal Resource Management in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Judd, Chaeli; Judd, Kathleen S.; Gulbransen, Thomas C.; Thom, Ronald M.

    2009-03-01

    A two-day training workshop was held in Xalapa, Mexico on March 10-11, 2009 with the goal of training end users from the southern Gulf of Mexico states of Campeche and Veracruz in the use of tools to support coastal resource management decision-making. The workshop was held at the computer laboratory of the Instituto de Ecología, A.C. (INECOL). This report summarizes the results of that workshop and is a deliverable to our NASA client.

  16. Proceedings of the of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011)

    DEFF Research Database (Denmark)

    This volume contains the proceedings of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011), held in Saarbrücken, Germany on March 26 & 27, 2011. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) and organized...... A primary focus of LDTA is grammarware that is generated from high-level grammar-centric specifications, and thus submissions on parser generation, attribute grammar systems, term/graph rewriting systems, and other grammar-related meta-programming tools, techniques, and formalisms were encouraged. For 2011......, as well as techniques and tools, to the test in a new way in the form of the LDTA Tool Challenge. Tool developers were invited to participate in the Challenge by developing solutions to a range of language processing tasks over a simple but evolving set of imperative programming languages. Tool challenge......

  17. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
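
    As a rough illustration of what one cascading sequence looks like in code, here is a toy steady-state-only cascade (not DCAT itself, which also interleaves dynamic simulation and protection models through the PSS/E API): trip a line, redistribute its flow, and trip anything pushed past its limit.

```python
def cascade(flows, limits, initiating_line):
    # flows/limits: MW flow and MW limit per line; all values hypothetical.
    tripped = [initiating_line]
    frontier = [initiating_line]
    while frontier:
        line = frontier.pop()
        lost = flows.pop(line)               # outage the line
        survivors = list(flows)
        if not survivors:
            break
        for other in survivors:              # crude redistribution rule
            flows[other] += lost / len(survivors)
        for other in flows:                  # steady-state overload check
            if flows[other] > limits[other] and other not in tripped:
                frontier.append(other)
                tripped.append(other)
    return tripped

flows  = {"A-B": 90.0, "B-C": 60.0, "A-C": 40.0}
limits = {"A-B": 100.0, "B-C": 100.0, "A-C": 45.0}
print(cascade(flows, limits, "A-B"))  # ['A-B', 'B-C', 'A-C']: a full cascade
```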

  18. Ninth Workshop and Tutorial on Practical Use of Coloured Petri Nets and the CPN Tools

    DEFF Research Database (Denmark)

    to more than 7,200 users in 138 countries. The aim of the workshop is to bring together some of the users and in this way provide a forum for those who are interested in the practical use of Coloured Petri nets and their tools. The submitted papers were evaluated by a programme committee...... have been published in four special sections in the International Journal on Software Tools for Technology Transfer (STTT). For more information see: www.sttt.cs.uni-dortmund.de/. After an additional round of reviewing and revision, some of the papers from this year's workshop will be published...... in Transactions of Petri Nets and Other Models of Concurrency (ToPNoC), which is a new journal subline of Lecture Notes in Computer Science. For more information see: www.springer.com/lncs/topnoc. Kurt Jensen, PC and OC chair

  19. 9th Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Decreusefond, Laurent; Stochastic Analysis and Related Topics

    2012-01-01

    Since the early eighties, Ali Suleyman Ustunel has been one of the main contributors to the field of Malliavin calculus. In a workshop held in Paris in June 2010, several prominent researchers gave exciting talks in honor of his 60th birthday. The present volume includes scientific contributions from this workshop. Probability theory is first and foremost aimed at solving real-life problems containing randomness. Markov processes are one of the key modeling tools and play a vital part in such problems. Contributions on inventory control, mutation-selection in genetics and public-pri

  20. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
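
    The frequency response measure itself is simple arithmetic. Below is a sketch assuming the usual NERC convention of MW/0.1 Hz computed from pre-event (Value A) and post-event stabilized (Value B) averages; the numbers are made up, and under this convention a responsive balancing authority yields a negative value:

```python
def frequency_response_mw_per_0p1hz(freq_a_hz, freq_b_hz, nia_a_mw, nia_b_mw):
    # Value A / Value B: pre-event and post-event stabilized averages of
    # frequency and net actual interchange (NIA).
    delta_f = freq_b_hz - freq_a_hz      # Hz; negative after a generation loss
    delta_nia = nia_b_mw - nia_a_mw      # MW picked up by the balancing authority
    return (delta_nia / delta_f) * 0.1   # normalize to a 0.1 Hz change

# Example: frequency sags from 60.000 to 59.950 Hz while the BA's net
# interchange rises by 120 MW as governors respond.
print(frequency_response_mw_per_0p1hz(60.000, 59.950, 500.0, 620.0))  # -240.0
```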

  1. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany – which is a forum to discuss the latest advancements in the parallel tools.

  2. IFPA Meeting 2013 Workshop Report II: use of 'omics' in understanding placental development, bioinformatics tools for gene expression analysis, planning and coordination of a placenta research network, placental imaging, evolutionary approaches to understanding pre-eclampsia.

    Science.gov (United States)

    Ackerman, W E; Adamson, L; Carter, A M; Collins, S; Cox, B; Elliot, M G; Ermini, L; Gruslin, A; Hoodless, P A; Huang, J; Kniss, D A; McGowen, M R; Post, M; Rice, G; Robinson, W; Sadovsky, Y; Salafia, C; Salomon, C; Sled, J G; Todros, T; Wildman, D E; Zamudio, S; Lash, G E

    2014-02-01

    Workshops are an important part of the IFPA annual meeting as they allow for discussion of specialized topics. At the IFPA meeting 2013 twelve themed workshops were presented, five of which are summarized in this report. These workshops related to various aspects of placental biology but collectively covered areas of new technologies for placenta research: 1) use of 'omics' in understanding placental development and pathologies; 2) bioinformatics and use of omics technologies; 3) planning and coordination of a placenta research network; 4) clinical imaging and pathological outcomes; 5) placental evolution.

  3. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information for TRMM and other satellite missions, but also allows observation of a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convective systems, etc. Basic functions include selection of area of

  4. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  5. The EADGENE Microarray Data Analysis Workshop (Open Access publication)

    Directory of Open Access Journals (Sweden)

    Jiménez-Marín Ángeles

    2007-11-01

    Microarray analyses have become an important tool in animal genomics. While their use is becoming widespread, there is still a lot of ongoing research regarding the analysis of microarray data. In the context of a European Network of Excellence, 31 researchers representing 14 research groups from 10 countries performed and discussed the statistical analyses of real and simulated 2-colour microarray data that were distributed among participants. The real data consisted of 48 microarrays from a disease challenge experiment in dairy cattle, while the simulated data consisted of 10 microarrays from a direct comparison of two treatments (dye-balanced). While there was broader agreement with regards to methods of microarray normalisation and significance testing, there were major differences with regards to quality control. The quality control approaches varied from none, through using statistical weights, to omitting a large number of spots or omitting entire slides. Surprisingly, these very different approaches gave quite similar results when applied to the simulated data, although not all participating groups analysed both real and simulated data. The workshop was very successful in facilitating interaction between scientists with a diverse background but a common interest in microarray analyses.

  6. Assessing the interactivity and prescriptiveness of faculty professional development workshops: The real-time professional development observation tool

    Science.gov (United States)

    Olmstead, Alice; Turpen, Chandra

    2016-12-01

    Professional development workshops are one of the primary mechanisms used to help faculty improve their teaching, and draw in many STEM instructors every year. Although workshops serve a critical role in changing instructional practices within our community, we rarely assess workshops through careful consideration of how they engage faculty. Initial evidence suggests that workshop leaders often overlook central tenets of education research that are well established in classroom contexts, such as the role of interactivity in enabling student learning [S. Freeman et al., Proc. Natl. Acad. Sci. U.S.A. 111, 8410 (2014)]. As such, there is a need to develop more robust, evidence-based models of how best to support faculty learning in professional development contexts, and to actively support workshop leaders in relating their design decisions to familiar ideas from other educational contexts. In response to these needs, we have developed an observation tool, the real-time professional development observation tool (R-PDOT), to document the form and focus of faculty engagement during workshops. In this paper, we describe the motivation and methodological considerations behind the development of the R-PDOT, justify our decisions to highlight particular aspects of workshop sessions, and demonstrate how the R-PDOT can be used to analyze three sessions from the Physics and Astronomy New Faculty Workshop. We also justify the accessibility and potential utility of the R-PDOT output as a reflective tool using preliminary data from interviews with workshop leaders, and consider the roles the R-PDOT could play in supporting future research on faculty professional development.

  8. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.
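    The abstract above describes time-tagged playback and 3D viewing; the fragment below is only a loose illustration of that idea on synthetic returns (JRAT itself is a Java program, and none of its code is reproduced here).

```python
# Illustrative sketch of time-filtered 3D viewing: synthetic time-tagged
# radar returns are restricted to one playback window and shown in 3D.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 600.0, 500))                   # event times, s
x, y = rng.uniform(-50.0, 50.0, (2, 500))                   # map position, km
alt = np.linspace(60.0, 20.0, 500) + rng.normal(0, 2, 500)  # altitude, km

window = (t > 120.0) & (t < 180.0)  # one playback slice
ax = plt.figure().add_subplot(projection="3d")
ax.scatter(x[window], y[window], alt[window], s=5)
ax.set_xlabel("East (km)"); ax.set_ylabel("North (km)"); ax.set_zlabel("Altitude (km)")
plt.show()
```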

  9. Second Workshop on Stochastic Analysis and Related Topics

    CERN Document Server

    Ustunel, Ali

    1990-01-01

    The Second Silivri Workshop functioned as a short summer school and a working conference, producing lecture notes and research papers on recent developments of Stochastic Analysis on Wiener space. The topics of the lectures concern short time asymptotic problems and anticipative stochastic differential equations. Research papers are mostly extensions and applications of the techniques of anticipative stochastic calculus.

  10. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs the oscillation analysis and identifies modes of oscillations (frequency, damping, energy, and shape). The tool also does oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
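    As a hedged illustration of mode identification of the sort OBAT performs, the sketch below fits a single damped sinusoid to synthetic PMU-like data to recover a frequency and damping ratio; OBAT's actual algorithms and data formats are not reproduced here.

```python
# Minimal ringdown-fit sketch (not OBAT's actual algorithm): estimate the
# frequency and damping of one oscillatory mode in synthetic PMU-like data.
import numpy as np
from scipy.optimize import curve_fit

fs = 30.0  # PMU-like reporting rate, samples/s
t = np.arange(0.0, 20.0, 1.0 / fs)
y = np.exp(-0.12 * t) * np.cos(2 * np.pi * 0.7 * t)        # 0.7 Hz mode
y += 0.02 * np.random.default_rng(2).normal(size=t.size)   # measurement noise

def ringdown(t, a, sigma, f, phi):
    return a * np.exp(-sigma * t) * np.cos(2 * np.pi * f * t + phi)

(a, sigma, f, phi), _ = curve_fit(ringdown, t, y, p0=[1.0, 0.1, 0.7, 0.0])
zeta = sigma / np.hypot(sigma, 2 * np.pi * f)  # damping ratio
print(f"mode: {f:.3f} Hz, damping ratio {100 * zeta:.1f} %")
```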

  11. Proceedings of the Airborne Imaging Spectrometer Data Analysis Workshop

    Science.gov (United States)

    Vane, G. (Editor); Goetz, A. F. H. (Editor)

    1985-01-01

    The Airborne Imaging Spectrometer (AIS) Data Analysis Workshop was held at the Jet Propulsion Laboratory on April 8 to 10, 1985. It was attended by 92 people who heard reports on 30 investigations currently under way using AIS data that have been collected over the past two years. Written summaries of 27 of the presentations are in these Proceedings. Many of the results presented at the Workshop are preliminary because most investigators have been working with this fundamentally new type of data for only a relatively short time. Nevertheless, several conclusions can be drawn from the Workshop presentations concerning the value of imaging spectrometry to Earth remote sensing. First, work with AIS has shown that direct identification of minerals through high spectral resolution imaging is a reality for a wide range of materials and geological settings. Second, there are strong indications that high spectral resolution remote sensing will enhance the ability to map vegetation species. There are also good indications that imaging spectrometry will be useful for biochemical studies of vegetation. Finally, there are a number of new data analysis techniques under development which should lead to more efficient and complete information extraction from imaging spectrometer data. The results of the Workshop indicate that as experience is gained with this new class of data, and as new analysis methodologies are developed and applied, the value of imaging spectrometry should increase.

  12. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  13. The Astronomy Workshop: Web Tools for Astronomy Students at All Levels

    Science.gov (United States)

    Hayes-Gehrke, Melissa N.; Hamilton, D.; Deming, G.

    2010-01-01

    The Astronomy Workshop (http://janus.astro.umd.edu/) is a collection of over 20 interactive web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes and by the general public. The goal of the website is to encourage students to learn about astronomy by exploiting their fascination with the internet. Two of the tools, Scientific Notation and Solar System Collisions, have instructor materials available to facilitate their use in undergraduate, high school, and junior high classes. The Scientific Notation web tool allows students to practice conversion, addition/subtraction, and multiplication/division with scientific notation, while the Solar System Collisions web tool explores the effects of impacts on the Earth and other solar system bodies. Some web tools allow students to explore our own solar system (Solar System Visualizer) and the Sun's past and future history (The Life of the Sun). Others allow students to experiment with changes in the solar system, such as changing the tilt of the Earth (Earth's Seasons) and the properties of the planets in the solar system (Build Your Own Solar System).

  14. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  15. Udder Hygiene Analysis tool

    OpenAIRE

    2013-01-01

    In this report, the pilot of UHC is described. The main objective of the pilot is to make farmers more aware of how to improve udder health in dairy herds, which is achieved by changing management practices related to hygiene. The report firstly provides general information about antibiotics and the processes that influence udder health. Secondly, six subjects related to udder health are described. Thirdly, the tools (checklists and a roadmap) are shown, and fourthly, advice written by UH...

  16. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel;

    2014-01-01

    As governments, citizens and organizations have moved online there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract and analyze web data in the process of investigating substantive questions ... analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks

  17. Development of Workshops on Biodiversity and Evaluation of the Educational Effect by Text Mining Analysis

    Science.gov (United States)

    Baba, R.; Iijima, A.

    2014-12-01

    Conservation of biodiversity is one of the key issues in environmental studies. As a means to address this issue, education is becoming increasingly important. In previous work, we developed a course of workshops on the conservation of biodiversity. To disseminate the course as a tool for environmental education, determination of its educational effect is essential. Text mining enables analyses of the frequency and co-occurrence of words in freely described texts. This study is intended to evaluate the effect of the workshop by using text mining techniques. We hosted the originally developed workshop on the conservation of biodiversity for 22 college students. The aim of the workshop was to convey the definition of biodiversity. Generally, biodiversity refers to the diversity of ecosystems, diversity between species, and diversity within species. To facilitate discussion, supplementary materials were used. For instance, field guides of wildlife species were used to discuss the diversity of ecosystems. Moreover, a hierarchical framework in an ecological pyramid was shown to explain the role of diversity between species. We also offered a document on the historical case of the Potato Famine in Ireland to discuss diversity within species from the genetic viewpoint. Before and after the workshop, we asked students for a free description of the definition of biodiversity, and analyzed the answers using Tiny Text Miner. This technique enables Japanese-language morphological analysis. Frequently used words were sorted into categories. Moreover, a principal component analysis was carried out. After the workshop, the frequency of words tagged to diversity between species and diversity within species significantly increased. From the principal component analysis, the first component consists of words such as producer, consumer, decomposer, and food chain. This indicates that the students have comprehended the close relationship between
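    To make the analysis steps concrete, here is a toy English-language stand-in for the described workflow (the study itself used Tiny Text Miner for Japanese morphological analysis): build term counts from free-text answers, then run a principal component analysis on them.

```python
# Toy stand-in for the described text-mining workflow: word frequencies
# from free-text answers, then PCA on the term-count matrix.
from sklearn.decomposition import PCA
from sklearn.feature_extraction.text import CountVectorizer

answers = [
    "biodiversity is the diversity of ecosystems and species",
    "producers consumers and decomposers form a food chain",
    "genetic diversity within species protects against disease",
    "diversity between species supports the food chain",
]
counts = CountVectorizer().fit_transform(answers).toarray()
scores = PCA(n_components=2).fit_transform(counts)
print(scores.round(2))  # each row: one answer in the first two components
```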

  18. Workshop tools and methodologies for evaluation of energy chains and for technology perspective

    Energy Technology Data Exchange (ETDEWEB)

    Appert, O. [Institut Francais du Petrole (IFP), 92 - Rueil-Malmaison (France); Maillard, D. [Energy and Raw Materials, 75 - Paris (France); Pumphrey, D. [Energy Cooperation, US Dept. of Energy (United States); Sverdrup, G.; Valdez, B. [National Renewable Energy Laboratory, Golden, CO (United States); Schindler, J. [LB-Systemtechnik (LBST), GmbH, Ottobrunn (Germany); His, St.; Rozakis, St. [Centre International de Recherche sur Environnement Developpement (CIRED), 94 - Nogent sur Marne (France); Sagisaka, M. [LCA Research Centre (Japan); Bjornstad, D. [Oak Ridge National Laboratory, Oak Ridge, Tennessee (United States); Madre, J.L. [Institut National de Recherche sur les Transports et leur Securite, 94 - Arcueil (France); Hourcade, J.Ch. [Centre International de Recherche sur l' Environnement le Developpement (CIRED), 94 - Nogent sur Marne (France); Ricci, A.; Criqui, P.; Chateau, B.; Bunger, U.; Jeeninga, H. [EU/DG-R (Italy); Chan, A. [National Research Council (Canada); Gielen, D. [IEA-International Energy Associates Ltd., Fairfax, VA (United States); Tosato, G.C. [Energy Technology Systems Analysis Programme (ETSAP), 75 - Paris (France); Akai, M. [Agency of Industrial Science and technology (Japan); Ziesing, H.J. [Deutsches Institut fur Wirtschaftsforschung, DIW Berlin (Germany); Leban, R. [Conservatoire National des Arts et Metiers (CNAM), 75 - Paris (France)

    2005-07-01

    The aim of this workshop is to better characterize the future by integrating all the dynamic interactions between the economy, the environment and society. It offers presentations on hydrogen chain evaluation, micro-economic modelling for the evaluation of bio-fuel options, life cycle assessment evolution and potentialities, consumer valuation of energy technology attributes, perspectives for the evaluation of changing behaviour, incentive systems and barriers to social acceptability, the internalization of external costs, endogenous technical change in long-term energy models, ETSAP/technology dynamics in partial equilibrium energy models, very long-term energy-environment modelling, ultra long-term energy technology perspectives, the socio-economic toolbox of the EU hydrogen road-map, a combined approach using technology-oriented optimization and evaluation of the impacts of individual policy measures, and the application of a suite of basic research portfolio management tools. (A.L.B.)

  19. "I'm Not Here to Learn How to Mark Someone Else's Stuff": An Investigation of an Online Peer-to-Peer Review Workshop Tool

    Science.gov (United States)

    Wilson, Michael John; Diao, Ming Ming; Huang, Leon

    2015-01-01

    In this article, we explore the intersecting concepts of fairness, trust and temporality in relation to the implementation of an online peer-to-peer review Moodle Workshop tool at a Sydney metropolitan university. Drawing on qualitative interviews with unit convenors and online surveys of students using the Workshop tool, we seek to highlight a…

  20. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  1. Proceedings of the CEC/USDOE workshop on uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Elderkin, C.E. (Pacific Northwest Lab., Richland, WA (USA)); Kelly, G.N. (eds.)(Commission of the European Communities, Brussels (Belgium))

    1990-09-01

    In recent years it has become increasingly important to specify the uncertainty inherent in consequence assessments and in the models that trace radionuclides from their source, through the environment, to their impacts on human health. European and US scientists have been independently developing and applying methods for analyzing uncertainty. It recently became apparent that a scientific exchange on this subject would be beneficial as improvements are sought and as uncertainty methods find broader application. The Commission of the European Communities (CEC) and the Office of Health and Environmental Research of the US Department of Energy (OHER/DOE), through their continuing agreement for cooperation, decided to co-sponsor the CEC/USDOE Workshop on Uncertainty Analysis. CEC's Radiation Protection Research Programme and OHER's Atmospheric Studies in Complex Terrain Program collaborated in planning and organizing the workshop, which was held in Santa Fe, New Mexico, on November 13 through 16, 1989. As the workshop progressed, the perspectives of individual participants, each with their particular background and interests in some segment of consequence assessment and its uncertainties, contributed to a broader view of how uncertainties are introduced and handled. These proceedings contain, first, the editors' introduction to the problem of uncertainty analysis and their general summary and conclusions. These are then followed by the results of the working groups, and the abstracts of individual presentations.

  2. Workshop on the applications of new computer tools to thermal engineering; Applications a la thermique des nouveaux outils informatiques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This workshop on the applications of new computer tools to thermal engineering was organized by the French society of thermal engineers. Seven papers were presented, of which two, dealing with thermal diffusivity measurements in materials and with the optimization of dryers, were selected for ETDE. (J.S.)

  3. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning

    NARCIS (Netherlands)

    Retalis, Symeon; Sloep, Peter

    2009-01-01

    Retalis, S., & Sloep, P. B. (Eds.) (2009). Collection of 4 symposium papers at EC-TEL 2009. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning. September, 30, 2009, Nice, France. http://sunsite.informatik.rwt

  4. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  5. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year the motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  6. Sandia PUF Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics, such as average noise and inter-chip Hamming distances, are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
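    As a small illustration of the inter-chip Hamming-distance metric mentioned above (not the tool's own code), the sketch below uses random bit strings as stand-ins for PUF signatures.

```python
# Sketch of the inter-chip Hamming-distance metric; random bit strings
# stand in for real PUF signatures.
import numpy as np

rng = np.random.default_rng(42)
n_chips, n_bits = 8, 128
signatures = rng.integers(0, 2, size=(n_chips, n_bits), dtype=np.uint8)

# Fractional Hamming distance for every chip pair; for independent,
# ideal PUFs these values cluster around 0.5.
for i in range(n_chips):
    for j in range(i + 1, n_chips):
        hd = np.count_nonzero(signatures[i] != signatures[j]) / n_bits
        print(f"chips {i}-{j}: {hd:.3f}")
```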

  7. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities.
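    CPN Tools itself is a graphical modelling environment, so the fragment below is only a generic Python illustration of why replications matter: repeating a stochastic simulation yields a confidence interval for the performance measure.

```python
# Not CPN Tools itself: a generic illustration of simulation replications,
# estimating a performance measure with a 95 % confidence interval.
import numpy as np
from scipy import stats

def simulate_protocol(rng, n_packets=1000, loss=0.1):
    # toy protocol: each lost packet is retransmitted until it gets through
    return rng.geometric(1.0 - loss, size=n_packets).mean()

rng = np.random.default_rng(7)
reps = np.array([simulate_protocol(rng) for _ in range(30)])
half = stats.t.ppf(0.975, reps.size - 1) * reps.std(ddof=1) / np.sqrt(reps.size)
print(f"mean attempts per packet: {reps.mean():.3f} +/- {half:.3f}")
```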

  8. Summary of the Third International Planetary Dunes Workshop: Remote Sensing and Image Analysis of Planetary Dunes, Flagstaff, Arizona, USA, June 12-15, 2012

    Science.gov (United States)

    Fenton, Lori K.; Hayward, Rosalyn K.; Horgan, Briony H. N.; Rubin, David M.; Titus, Timothy N.; Bishop, Mark A.; Burr, Devon M.; Chojnacki, Matthew; Dinwiddie, Cynthia L.; Kerber, Laura; Le Gall, Alice; Michaels, Timothy I.; Neakrase, Lynn D. V.; Newman, Claire E.; Tirsch, Daniela; Yizhaq, Hezi; Zimbelman, James R.

    2013-03-01

    The Third International Planetary Dunes Workshop took place in Flagstaff, AZ, USA during June 12-15, 2012. This meeting brought together a diverse group of researchers to discuss recent advances in terrestrial and planetary research on aeolian bedforms. The workshop included two and a half days of oral and poster presentations, as well as one formal (and one informal) full-day field trip. Similar to its predecessors, the presented work provided new insight on the morphology, dynamics, composition, and origin of aeolian bedforms on Venus, Earth, Mars, and Titan, with some intriguing speculation about potential aeolian processes on Triton (a satellite of Neptune) and Pluto. Major advancements since the previous International Planetary Dunes Workshop include the introduction of several new data analysis and numerical tools and utilization of low-cost field instruments (most notably the time-lapse camera). Most presentations represented advancement towards research priorities identified in both of the prior two workshops, although some previously recommended research approaches were not discussed. In addition, this workshop provided a forum for participants to discuss the uncertain future of the Planetary Aeolian Laboratory; subsequent actions taken as a result of the decisions made during the workshop may lead to an expansion of funding opportunities to use the facilities, as well as other improvements. The interactions during this workshop contributed to the success of the Third International Planetary Dunes Workshop, further developing our understanding of aeolian processes on the aeolian worlds of the Solar System.

  9. Statistical Analysis of CFD Solutions from the 6th AIAA CFD Drag Prediction Workshop

    Science.gov (United States)

    Derlaga, Joseph M.; Morrison, Joseph H.

    2017-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using both common and custom grid sequences as well as multiple turbulence models for the June 2016 6th AIAA CFD Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for both the 4th and 5th Drag Prediction Workshops. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.
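    As a toy illustration of an N-version summary of this kind (with invented numbers, not workshop data), one might pool one drag value per code and report a centre and scatter band:

```python
# Toy N-version summary with invented drag values (one per code/grid):
# report the median and a simple scatter band across submissions.
import numpy as np

drag_counts = np.array([252.1, 250.8, 254.0, 251.5, 249.9, 253.2, 250.4])
median = np.median(drag_counts)
sigma = drag_counts.std(ddof=1)
print(f"median {median:.1f} counts, sample sigma {sigma:.2f}")
print(f"approximate scatter band: [{median - 2*sigma:.1f}, {median + 2*sigma:.1f}]")
```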

  10. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  11. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  12. 6th International Workshop on Compositional Data Analysis

    CERN Document Server

    Thió-Henestrosa, Santiago

    2016-01-01

    The authoritative contributions gathered in this volume reflect the state of the art in compositional data analysis (CoDa). The respective chapters cover all aspects of CoDa, ranging from mathematical theory, statistical methods and techniques to its broad range of applications in geochemistry, the life sciences and other disciplines. The selected and peer-reviewed papers were originally presented at the 6th International Workshop on Compositional Data Analysis, CoDaWork 2015, held in L’Escala (Girona), Spain. Compositional data is defined as vectors of positive components and constant sum, and, more generally, all those vectors representing parts of a whole which only carry relative information. Examples of compositional data can be found in many different fields such as geology, chemistry, economics, medicine, ecology and sociology. As most of the classical statistical techniques are incoherent on compositions, in the 1980s John Aitchison proposed the log-ratio approach to CoDa. This became the foundation...
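    A minimal sketch of the log-ratio idea mentioned above, assuming a three-part composition; the centred log-ratio (clr) used here is one standard member of Aitchison's log-ratio family.

```python
# Minimal sketch of Aitchison's log-ratio idea: the centred log-ratio (clr)
# transform maps a composition into real space where standard statistics apply.
import numpy as np

def clr(x):
    """Centred log-ratio transform of a composition of positive parts."""
    g = np.exp(np.log(x).mean())  # geometric mean of the parts
    return np.log(x / g)

composition = np.array([0.6, 0.3, 0.1])  # parts of a whole, summing to 1
print(clr(composition))                  # clr coordinates sum to zero
```

    Because clr coordinates are unconstrained apart from summing to zero, ordinary multivariate methods can be applied to them coherently.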

  13. Assessing the interactivity and prescriptiveness of faculty professional development workshops: The Real-Time Professional Development Observation Tool (R-PDOT)

    CERN Document Server

    Olmstead, Alice

    2016-01-01

    Professional development workshops are one of the primary mechanisms used to help faculty improve their teaching, and draw in many STEM instructors every year. Although workshops serve a critical role in changing instructional practices within our community, we rarely assess workshops through careful consideration of how they engage faculty. Initial evidence suggests that workshop leaders often overlook central tenets of education research that are well-established in classroom contexts, such as the role of interactivity in enabling student learning. As such, there is a need to develop more robust, evidence-based models of how best to support faculty learning in professional development contexts, and to actively support workshop leaders in relating their design decisions to familiar ideas from other educational contexts. In response to these needs, we have developed an observation tool, the Real-Time Professional Development Observation Tool (R-PDOT), to document the form and focus of faculty's engagement dur...

  14. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

    Full Text Available This paper presents the evaluation, from a Decision Analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA) methodology is applied to clarify the decision and develop a decision model. One of the key advantages of IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, an addition to this methodology is proposed in this work: using the strategic Means-Ends Objective Network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives, and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification), CNC (installing an automatic lathe), and CF (installation of an automatic milling machine). The results are presented as a graph showing zones in which a particular alternative should be selected. The results show the potential of IDA to tackle technical decisions that are otherwise approached without the due care.
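    As a toy rendering of the final ranking step such a model performs (with invented profit distributions and risk tolerance, not the paper's data), one can compare the three alternatives by expected exponential utility:

```python
# Toy illustration only: rank alternatives by expected utility under an
# exponential (risk-averse) utility function. All numbers are invented.
import numpy as np

rng = np.random.default_rng(3)
profit_samples = {
    "Base": rng.normal(100.0, 10.0, 10_000),  # steady, low spread
    "CNC":  rng.normal(115.0, 35.0, 10_000),  # higher mean, higher risk
    "CF":   rng.normal(108.0, 20.0, 10_000),
}
rho = 50.0  # decision maker's risk tolerance

for name, sample in profit_samples.items():
    eu = (1.0 - np.exp(-sample / rho)).mean()  # expected exponential utility
    print(f"{name}: expected utility {eu:.4f}")
```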

  15. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.
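    As a small illustration of the summary statistics the tool reports for a user-selected parameter (using a synthetic wind-speed sample, not TCIS data):

```python
# Sketch of the summary statistics reported for a user-selected parameter;
# the wind-speed sample below is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
storm = pd.DataFrame({"wind_speed_kt": rng.gamma(9.0, 8.0, size=500)})
s = storm["wind_speed_kt"]
print(f"mean {s.mean():.1f}  std {s.std():.1f}  median {s.median():.1f}  "
      f"min {s.min():.1f}  max {s.max():.1f}")
```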

  16. Using data visualization in creativity workshops: a new tool in the designer's kit

    OpenAIRE

    Dove, G; Jones, S.; Dykes, J; Brown, A. E.; Duffy, A.

    2013-01-01

    Creativity workshops have proved effective in drawing out unexpected requirements and giving form to participants' novel ideas. Here, we introduce a new addition to the workshop designer's toolkit: interactive data visualization, used as stimuli to prompt insight and inspire creativity. We first describe a pilot study in which we compare the effectiveness of two different styles of data visualization. Here we found that a less ambiguous style was more effective in supporting idea generation. ...

  17. Haplotype sharing analysis with SNPs in candidate genes : The genetic analysis workshop 12 example

    NARCIS (Netherlands)

    Fischer, C; Beckmann, L; Majoram, P; Meerman, GT; Chang-Claude, J

    2003-01-01

    Haplotype sharing analysis was used to investigate the association of affection status with single nucleotide polymorphism (SNP) haplotypes within candidate gene 1 in one sample each from the isolated and the general population of Genetic Analysis Workshop (GAW) 12 simulated data. Gene 1 has direct

  18. Shot Planning and Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  19. Proceedings: Workshop on Advanced Mathematics and Computer Science for Power Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1991-08-01

    EPRI's Office of Exploratory Research sponsors a series of workshops that explore how to apply recent advances in mathematics and computer science to the problems of the electric utility industry. In this workshop, participants identified research objectives that may significantly improve the mathematical methods and computer architecture currently used for power system analysis.

  20. Graphical Multiprocessing Analysis Tool (GMAT)

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M.K.; Campbell, S.; Sikora, S.; Strout, R.; Zosel, M.

    1988-03-01

    The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development, we have developed two methodologies for the graphical display of the execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states, and timeline, which represents tasks as a flowing sequence of events. Information about the code can be displayed as the application runs (dynamic mode) or played back with time under user control (static mode). This document discusses the design and user interface issues involved in developing the GMAT family of parallel application displays. Also, we present an introductory user's guide for both tools. 4 figs.

  1. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  2. Chronic wasting disease risk analysis workshop: An integrative approach

    Science.gov (United States)

    Gillette, Shana; Dein, Joshua; Salman, Mo; Richards, Bryan; Duarte, Paulo

    2004-01-01

    Risk analysis tools have been successfully used to determine the potential hazard associated with disease introductions and have facilitated management decisions designed to limit the potential for disease introduction. Chronic Wasting Disease (CWD) poses significant challenges for resource managers due to an incomplete understanding of disease etiology and epidemiology and the complexity of management and political jurisdictions. Tools designed specifically to assess the risk of CWD introduction would be of great value to policy makers in areas where CWD has not been detected.

  3. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  4. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...
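    As a loose Python rendering of the interface-decoupling idea (the actual AIDA interfaces are defined in Java and C++; the names below only echo the categories listed above), two components might be sketched as abstract base classes:

```python
# Loose illustration of the decoupled-component idea; not the actual AIDA
# interface definitions, which are specified in Java and C++.
from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    @abstractmethod
    def fill(self, x: float, weight: float = 1.0) -> None: ...
    @abstractmethod
    def entries(self) -> int: ...

class IPlotter(ABC):
    @abstractmethod
    def plot(self, histogram: IHistogram1D) -> None: ...

# A Fitter or Analyzer would depend only on such interfaces, so any
# implementation (C++-backed or pure Python) can be swapped in freely.
```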

  5. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  6. Design and analysis tool validation

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.

    1981-07-01

    The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy that could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts and have been implemented on a sampling of the major BEAS; results have shown major problems in one of the BEAS tested. Furthermore, when one building design was run using several of the BEAS, large differences were found in the predicted annual cooling and heating loads. The empirical validation procedure has been developed, and five two-zone test cells have been constructed for validation; a summer validation run will take place as soon as the data acquisition system is completed. Additionally, a test validation exercise is now in progress using the low-cal house to fine-tune the empirical validation procedure and better define monitoring data requirements.

  7. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. The image analysis of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
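    As a minimal illustration of the Newton-Raphson balance-solving mentioned above (with invented placeholder properties, not EERC's models), one might solve a toy energy balance for an adiabatic flame temperature:

```python
# Minimal Newton-Raphson sketch in the spirit of the spreadsheet upgrades:
# solve a toy energy balance f(T) = 0 for an adiabatic flame temperature.
# The heat-capacity and heat-release numbers are invented placeholders.

def f(T):
    cp, q_release, T_in = 1.2, 2500.0, 300.0  # kJ/(kg K), kJ/kg, K
    return cp * (T - T_in) - q_release        # residual of the balance

def fprime(T, h=1e-6):
    return (f(T + h) - f(T - h)) / (2.0 * h)  # central difference

T = 1000.0  # initial guess, K
for _ in range(50):
    step = f(T) / fprime(T)
    T -= step
    if abs(step) < 1e-8:
        break
print(f"adiabatic flame temperature ~ {T:.0f} K")
```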

  8. Virtual Workshop

    DEFF Research Database (Denmark)

    Buus, Lillian; Bygholm, Ann

    In relation to the Tutor course in the Mediterranean Virtual University (MVU) project, a virtual workshop, “Getting experiences with different synchronous communication media, collaboration, and group work”, was held with all partner institutions in January 2006. More than 25 key-tutors within MVU participated from different institutions in the workshop. The result of the workshop was experience with different communication tools and media, facing the difficulties and possibilities of collaborating virtually on group work and the development of a shared presentation, all based on gathering experience for the learning design of MVU courses. The workshop intended to give the participants the possibility to draw their own experiences with issues of computer-supported collaboration, group work in a virtual environment, synchronous and asynchronous communication media, and different perspectives

  9. High-Speed Research: 1994 Sonic Boom Workshop. Configuration, Design, Analysis and Testing

    Science.gov (United States)

    McCurdy, David A. (Editor)

    1999-01-01

    The third High-Speed Research Sonic Boom Workshop was held at NASA Langley Research Center on June 1-3, 1994. The purpose of this workshop was to provide a forum for Government, industry, and university participants to present and discuss progress in their research. The workshop was organized into sessions dealing with atmospheric propagation; acceptability studies; and configuration design, analysis, and testing. Attendance at the workshop was by invitation only. The workshop proceedings include papers on design, analysis, and testing of low-boom high-speed civil transport configurations and experimental techniques for measuring sonic booms. Significant progress is noted in these areas in the time since the previous workshop a year earlier. The papers include preliminary results of sonic boom wind tunnel tests conducted during 1993 and 1994 on several low-boom designs. Results of a mission performance analysis of all low-boom designs are also included. Two experimental methods for measuring near-field signatures of airplanes in flight are reported.

  10. General Mission Analysis Tool (GMAT) Mathematical Specifications

    Science.gov (United States)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  11. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  12. Assessing the Interactivity and Prescriptiveness of Faculty Professional Development Workshops: The Real-Time Professional Development Observation Tool

    Science.gov (United States)

    Olmstead, Alice; Turpen, Chandra

    2016-01-01

    Professional development workshops are one of the primary mechanisms used to help faculty improve their teaching, and draw in many STEM instructors every year. Although workshops serve a critical role in changing instructional practices within our community, we rarely assess workshops through careful consideration of how they engage faculty.…

  13. Proceedings: Workshop on advanced mathematics and computer science for power systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Esselman, W.H.; Iveson, R.H. (Electric Power Research Inst., Palo Alto, CA (United States))

    1991-08-01

    The Mathematics and Computer Workshop on Power System Analysis was held February 21--22, 1989, in Palo Alto, California. The workshop was the first in a series sponsored by EPRI's Office of Exploratory Research as part of its effort to develop ways in which recent advances in mathematics and computer science can be applied to the problems of the electric utility industry. The purpose of this workshop was to identify research objectives in the field of advanced computational algorithms needed for the application of advanced parallel processing architecture to problems of power system control and operation. Approximately 35 participants heard six presentations on power flow problems, transient stability, power system control, electromagnetic transients, user-machine interfaces, and database management. In the discussions that followed, participants identified five areas warranting further investigation: system load flow analysis, transient power and voltage analysis, structural instability and bifurcation, control systems design, and proximity to instability. 63 refs.

  14. PREFACE: Seventh International Workshop: Group Analysis of Differential Equations and Integrable Systems (GADEISVII)

    Science.gov (United States)

    Vaneeva, Olena; Sophocleous, Christodoulos; Popovych, Roman; Boyko, Vyacheslav; Damianou, Pantelis

    2015-06-01

    The Seventh International Workshop "Group Analysis of Differential Equations and Integrable Systems" (GADEIS-VII) took place at Flamingo Beach Hotel, Larnaca, Cyprus during the period June 15-19, 2014. Fifty-nine scientists from nineteen countries participated in the Workshop, and forty-one lectures were presented. The Workshop topics ranged from theoretical developments of group analysis of differential equations, hypersymplectic structures, theory of Lie algebras, integrability and superintegrability to their applications in various fields. The Series of Workshops is a joint initiative by the Department of Mathematics and Statistics, University of Cyprus, and the Department of Applied Research of the Institute of Mathematics, National Academy of Sciences, Ukraine. The Workshops evolved from close collaboration among Cypriot and Ukrainian scientists. The first three meetings were held at the Athalassa campus of the University of Cyprus (October 27, 2005, September 25-28, 2006, and October 4-5, 2007). The fourth (October 26-30, 2008), the fifth (June 6-10, 2010) and the sixth (June 17-21, 2012) meetings were held at the coastal resort of Protaras. We would like to thank all the authors who have published papers in the Proceedings. All of the papers have been reviewed by at least two independent referees. We express our appreciation of the care taken by the referees. Their constructive suggestions have improved most of the papers. The importance of peer review in the maintenance of high standards of scientific research can never be overstated. Olena Vaneeva, Christodoulos Sophocleous, Roman Popovych, Vyacheslav Boyko, Pantelis Damianou

  15. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (Python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy for calibration (scipy.optimize)
    - rpy2 as a Python interface to R
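    MATK's own API is deliberately not shown here; as a hedged sketch of the calibration step it automates, the fragment below fits a toy model to observations with SciPy's Levenberg-Marquardt:

```python
# Not MATK's own API: a sketch of the calibration loop it automates, using
# SciPy's Levenberg-Marquardt on a toy exponential-decay model.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(11)
t_obs = np.linspace(0.0, 10.0, 25)
y_obs = 2.0 * np.exp(-0.3 * t_obs) + 0.05 * rng.normal(size=t_obs.size)

def residuals(p):
    a, k = p
    return a * np.exp(-k * t_obs) - y_obs  # model minus observations

fit = least_squares(residuals, x0=[1.0, 0.1], method="lm")
print("calibrated parameters:", fit.x.round(3))
```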

  16. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  17. 2010 Solar Market Transformation Analysis and Tools

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  18. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  19. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  20. Abstract Interfaces for Data Analysis —Component Architecture for Data Analysis Tools

    Institute of Scientific and Technical Information of China (English)

    G. Barrand; P. Binko; et al.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of each component individually. The interfaces have been defined in Java and C++, and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their abstract interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
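
    The decoupling principle described above can be sketched in a few lines. The following minimal Python sketch (interface and method names are simplified stand-ins, not the actual AIDA definitions) shows how an abstract histogram interface lets a plotter component depend only on the interface, never on a concrete implementation.

    ```python
    # Minimal sketch of interface-based decoupling in the spirit of AIDA
    # (names are hypothetical; the real AIDA interfaces are richer).
    from abc import ABC, abstractmethod

    class IHistogram1D(ABC):
        """Abstract histogram component: no plotting or fitting code here."""
        @abstractmethod
        def fill(self, x: float) -> None: ...
        @abstractmethod
        def bin_height(self, index: int) -> float: ...
        @abstractmethod
        def n_bins(self) -> int: ...

    class FixedBinHistogram(IHistogram1D):
        """One concrete implementation; others could be swapped in freely."""
        def __init__(self, n: int, lo: float, hi: float):
            self.n, self.lo, self.hi = n, lo, hi
            self.counts = [0.0] * n
        def fill(self, x: float) -> None:
            if self.lo <= x < self.hi:
                self.counts[int((x - self.lo) / (self.hi - self.lo) * self.n)] += 1.0
        def bin_height(self, index: int) -> float:
            return self.counts[index]
        def n_bins(self) -> int:
            return self.n

    def plot_ascii(h: IHistogram1D) -> None:
        """The 'Plotter' component couples only to the abstract interface."""
        for i in range(h.n_bins()):
            print(f"bin {i:2d} | " + "#" * int(h.bin_height(i)))

    h = FixedBinHistogram(5, 0.0, 5.0)
    for x in [0.5, 1.2, 1.7, 3.3, 3.4, 3.9, 4.2]:
        h.fill(x)
    plot_ascii(h)
    ```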

  1. Tools for Basic Statistical Analysis

    Science.gov (United States)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will fit data to a linear equation y = f(x) and perform an ANOVA to check its significance.
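
    For readers outside the Excel environment, the same basic calculations are straightforward in Python; the sketch below (an analogy using invented data, not the Toolset itself) reproduces the descriptive-statistics, normal-quantile, and regression-significance steps with scipy.

    ```python
    # Analogous calculations in Python (not the Excel Toolset itself).
    import numpy as np
    from scipy import stats

    x = np.array([4.1, 4.7, 5.0, 5.3, 5.9, 6.2, 6.8])

    # Descriptive statistics for a user-entered data set
    print("mean:", x.mean(), "std:", x.std(ddof=1), "median:", np.median(x))

    # Normal Distribution Estimates: value at a cumulative probability,
    # given a sample mean and standard deviation
    p = 0.95
    value = stats.norm.ppf(p, loc=x.mean(), scale=x.std(ddof=1))
    print(f"value at cumulative probability {p}: {value:.3f}")

    # Linear regression with a significance test (slope t-test), akin to
    # the Linear Regression-ANOVA program
    t = np.arange(x.size)
    res = stats.linregress(t, x)
    print("slope:", res.slope, "p-value:", res.pvalue)
    ```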

  2. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.
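
    Delay analysis of the kind PREDICT performs ultimately reduces to longest-path computations over a circuit graph. The toy sketch below (gate names and delays are invented for illustration) computes the critical-path delay of a small combinational netlist by relaxing nodes in topological order.

    ```python
    # Toy critical-path delay analysis over a combinational netlist
    # (gate names and delays are invented for illustration).
    from graphlib import TopologicalSorter

    gate_delay = {"in1": 0.0, "in2": 0.0, "and1": 1.2, "or1": 0.9, "out": 0.1}
    drives = {"in1": ["and1"], "in2": ["and1", "or1"],
              "and1": ["or1"], "or1": ["out"], "out": []}

    # Build predecessor lists and visit gates in topological order
    order = list(TopologicalSorter(
        {g: [d for d, outs in drives.items() if g in outs] for g in gate_delay}
    ).static_order())

    # Arrival time at each gate = max over fan-in arrivals + own delay
    arrival = {}
    for g in order:
        fanin = [d for d, outs in drives.items() if g in outs]
        arrival[g] = gate_delay[g] + max((arrival[d] for d in fanin), default=0.0)

    print("critical-path delay:", arrival["out"])  # 0 + 1.2 + 0.9 + 0.1 = 2.2
    ```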

  3. Energy demand analysis in the workshop on alternative energy strategies

    Energy Technology Data Exchange (ETDEWEB)

    Carhart, S C

    1978-04-01

    The Workshop on Alternative Energy Strategies, conducted from 1974 through 1977, was an international study group formed to develop consistent national energy alternatives within a common analytical framework and global assumptions. A major component of this activity was the demand program, which involved preparation of highly disaggregated demand estimates based upon estimates of energy-consuming activities and energy requirements per unit of activity reported on a consistent basis for North America, Europe, and Japan. Comparison of the results of these studies reveals that North America requires more energy per unit of activity in many consumption categories, that major improvements in efficiency will move North America close to current European and Japanese efficiencies, and that further improvements in European and Japanese efficiencies may be anticipated as well. When contrasted with expected availabilities of fuels, major shortfalls of oil relative to projected demands emerge in the eighties and nineties. Some approaches to investment in efficiency improvements which will offset these difficulties are discussed.

  4. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March, 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information: http://homer.ou.nl/lsa-workshop0

  5. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  6. Stochastic Simulation Tool for Aerospace Structural Analysis

    Science.gov (United States)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a means to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly identifying the design input variables whose variability most influences the response output parameters.
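
    The Monte Carlo scatter study described above follows a simple pattern: draw input variables from their tolerance distributions, run the baseline model on each draw, and correlate inputs with outputs. The sketch below illustrates that pattern with a stand-in algebraic model; the real tool drives a finite element analysis, and all variable names and distributions here are invented.

    ```python
    # Monte Carlo scatter study with a stand-in model (illustration only;
    # MSC.Robust Design drives a finite element model instead).
    import numpy as np

    rng = np.random.default_rng(42)
    n = 2000

    # Design input variables with assumed tolerances (hypothetical)
    thickness = rng.normal(2.0, 0.05, n)     # panel thickness [mm]
    modulus   = rng.normal(70e3, 2e3, n)     # Young's modulus [MPa]
    pressure  = rng.normal(0.12, 0.01, n)    # aero load [MPa]

    # Stand-in response model for a plate-like panel: deflection ~ p / (E * t^3)
    deflection = pressure / (modulus * thickness**3) * 1e9

    # Rank inputs by their influence on the response (correlation magnitude)
    for name, v in [("thickness", thickness), ("modulus", modulus),
                    ("pressure", pressure)]:
        r = np.corrcoef(v, deflection)[0, 1]
        print(f"{name:9s} correlation with deflection: {r:+.2f}")
    ```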

  7. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  8. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube is as follows:
    1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system.
    2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system.
    3. This procedure is continued for all combinations of cubes.
    4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system.
    In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used; when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
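
    Merging one cube's coordinate system into another's, given the same points expressed in both frames, is a rigid-body registration problem. A minimal sketch using the Kabsch (SVD) method is shown below; the point data are invented and the procedure is a generic illustration, not the tool's actual software.

    ```python
    # Rigid registration of one cube's frame into another's via the Kabsch
    # method (generic illustration; point data are invented).
    import numpy as np

    def rigid_transform(src, dst):
        """Find rotation R and translation t with dst ~= src @ R.T + t."""
        src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
        U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = dst.mean(axis=0) - src.mean(axis=0) @ R.T
        return R, t

    # Same four cube corners seen in cube-2's frame (src) and cube-1's frame (dst)
    src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
    theta = np.radians(30)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
    dst = src @ R_true.T + np.array([2.0, -1.0, 0.5])

    R, t = rigid_transform(src, dst)
    print("recovered translation:", np.round(t, 3))  # -> [ 2. -1.  0.5]
    ```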

  9. Performance analysis of GYRO: a tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Candy, J [General Atomics, PO Box 85608, San Diego, CA 92186-5608 (United States); Carrington, L [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Huck, K [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Kaiser, T [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Mahinthakumar, G [Department of Civil Engineering, North Carolina State University, Raleigh, NC 27695-7908 (United States); Malony, A [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Moore, S [Innovative Computing Laboratory, University of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Reed, D [Renaissance Computing Institute, University of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States); Roth, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Shan, H [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Shende, S [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Snavely, A [San Diego Supercomputer Center, Univ. of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Sreepathi, S [Dept. of Computer Science, North Carolina State Univ., Raleigh, NC 27695-7908 (United States); Wolf, F [Innovative Computing Lab., Univ. of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Zhang, Y [Renaissance Computing Inst., Univ. of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States)

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  10. Performance Analysis of GYRO: A Tool Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  11. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing (Dagstuhl Perspectives Workshop 14022)

    OpenAIRE

    Bremer, Peer-Timo; Mohr, Bernd; Pascucci, Valerio; Schulz, Martin

    2014-01-01

    In the first week of January 2014, Dagstuhl hosted a Perspectives Workshop on "Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing". The event brought together two previously separate communities - from Visualization and HPC Performance Analysis - to discuss a long-term joint research agenda. The goal was to identify and address the challenges in using visual representations to understand and optimize the performance of extreme-scale applications running...

  12. Requirements Engineering and Analysis Workshop Proceedings Held in Pittsburgh, Pennsylvania on 12-14 March 1991

    Science.gov (United States)

    1991-12-01

    urally. 6.5 Summary of Current or Potential Approaches Many approaches to context analysis were discussed by the group, including: * Causal Trees * SWOT ... This could come in the form of a purchase at a shopping mall, buying lunch, or delivering a pizza. The estimated economic loss due to congestion has... Requirements Engineering and Analysis Workshop in Pittsburgh, Pennsylvania, on March 12-14, 1991. The intention of the workshop was to focus (please turn

  13. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  14. General Analysis Tool Box for Controlled Perturbation

    CERN Document Server

    Osbild, Ralf

    2012-01-01

    The implementation of reliable and efficient geometric algorithms is a challenging task. The reason is the following conflict: on the one hand, computing with rounded arithmetic may question the reliability of programs while, on the other hand, computing with exact arithmetic may be too expensive and hence inefficient. One solution is the implementation of controlled perturbation algorithms, which combine the speed of floating-point arithmetic with a protection mechanism that nonetheless guarantees reliability. This paper is concerned with the theoretical performance analysis of controlled perturbation algorithms. We address this task by presenting a general analysis tool box. This tool box is separated into independent components which are presented individually with their interfaces. In this way, the tool box supports alternative approaches for the derivation of the most crucial bounds. We present three approaches for this task. Furthermore, we have thoroughly reworked the concept of controlled per...

  15. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  16. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.
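
    One simple way to turn the "degree of association" idea into a number is to compare digitized mark profiles with a normalized cross-correlation score. The sketch below is a generic illustration of that idea with synthetic profiles; it is not the project's actual data-reduction algorithm, which the abstract does not detail.

    ```python
    # Normalized cross-correlation as a toy "degree of association" between
    # two digitized toolmark profiles (synthetic data; illustration only).
    import numpy as np

    rng = np.random.default_rng(1)
    base = np.cumsum(rng.normal(size=400))          # striation profile of tool A

    mark1 = base + 0.2 * rng.normal(size=400)       # two marks made by tool A
    mark2 = base + 0.2 * rng.normal(size=400)
    other = np.cumsum(rng.normal(size=400))         # a mark from a different tool

    def association(a, b):
        """Peak of the normalized cross-correlation, roughly in [-1, 1]."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return np.correlate(a, b, mode="full").max() / len(a)

    print("same tool:     ", round(association(mark1, mark2), 2))  # near 1
    print("different tool:", round(association(mark1, other), 2))  # lower
    ```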

  17. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
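
    The event-tree formalism mentioned above reduces, at its simplest, to multiplying conditional probabilities along each branch of a scenario tree. The sketch below computes outcome probabilities for a toy eruption tree; the structure and numbers are invented for illustration and are not taken from the EXPLORIS software.

    ```python
    # Toy probabilistic event tree for a volcanic crisis (structure and
    # numbers invented; not the EXPLORIS implementation).
    tree = {
        "unrest": (1.0, {
            "no eruption": (0.6, {}),
            "eruption": (0.4, {
                "effusive": (0.7, {}),
                "explosive": (0.3, {
                    "column collapse / flows": (0.5, {}),
                    "no collapse": (0.5, {}),
                }),
            }),
        }),
    }

    def leaf_probabilities(node, p=1.0, path=()):
        """Multiply conditional probabilities down to each terminal outcome."""
        for name, (cond_p, children) in node.items():
            joint = p * cond_p
            if children:
                yield from leaf_probabilities(children, joint, path + (name,))
            else:
                yield " -> ".join(path + (name,)), joint

    for outcome, prob in leaf_probabilities(tree):
        print(f"{prob:.3f}  {outcome}")   # the four leaves sum to 1.0
    ```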

  18. Workshop on Scientific Analysis and Policy in Network Security

    Science.gov (United States)

    2010-09-10

    business process. The input elements which we need comprise: (1) a process model given in a notation such as EPC, BPEL, YAWL or BPMN that contains...with tool support, an appropriate formal representation has to be chosen because semi-formal languages such as BPMN allow to create models with...formal semantics here such as the approaches in [2] for BPMN or [1] for EPC that allow for computation of possible system's behaviour. Reachability

  19. 11th International Workshop in Model-Oriented Design and Analysis

    CERN Document Server

    Müller, Christine; Atkinson, Anthony

    2016-01-01

    This volume contains pioneering contributions to both the theory and practice of optimal experimental design. Topics include the optimality of designs in linear and nonlinear models, as well as designs for correlated observations and for sequential experimentation. There is an emphasis on applications to medicine, in particular, to the design of clinical trials. Scientists from Europe, the US, Asia, Australia and Africa contributed to this volume of papers from the 11th Workshop on Model Oriented Design and Analysis.

  20. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at an altitude in the range of 75-85 km. The surviving fragments can pose great hazard and risk to the ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools included objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the set of object shapes to 15 types and includes 51 predefined motions and relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.

  1. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  2. Summary and Statistical Analysis of the First AIAA Sonic Boom Prediction Workshop

    Science.gov (United States)

    Park, Michael A.; Morgenstern, John M.

    2014-01-01

    A summary is provided for the First AIAA Sonic Boom Workshop held 11 January 2014 in conjunction with AIAA SciTech 2014. Near-field pressure signatures extracted from computational fluid dynamics solutions are gathered from nineteen participants representing three countries for the two required cases, an axisymmetric body and a simple delta wing body. Structured multiblock, unstructured mixed-element, unstructured tetrahedral, overset, and Cartesian cut-cell methods are used by the participants. Participants provided signatures computed on participant-generated and solution-adapted grids. Signatures are also provided for a series of uniformly refined workshop-provided grids. These submissions are propagated to the ground and loudness measures are computed. This allows the grid convergence of a loudness measure and a validation metric (difference norm between computed and wind tunnel measured near-field signatures) to be studied for the first time. Statistical analysis is also presented for these measures. An optional configuration includes fuselage, wing, tail, flow-through nacelles, and blade sting. This full configuration exhibits more variation in eleven submissions than the sixty submissions provided for each required case. Recommendations are provided for potential improvements to the analysis methods and a possible subsequent workshop.

  3. RSAT 2015: Regulatory Sequence Analysis Tools.

    Science.gov (United States)

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/.
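
    Most of the motif-analysis operations RSAT exposes build on one primitive: scoring a sequence window against a position weight matrix (PWM). The sketch below shows that primitive in plain Python; the matrix, motif, and sequence are invented toy data, and this is not RSAT code.

    ```python
    # Scoring sequence windows against a position weight matrix (PWM):
    # the primitive underlying motif scanning (toy data; not RSAT code).

    # Log-odds PWM for a hypothetical 4-bp motif (rows: A, C, G, T)
    pwm = {
        "A": [ 1.2, -1.5, -1.5,  1.0],
        "C": [-1.5,  1.3, -1.5, -1.0],
        "G": [-1.5, -1.5,  1.3, -1.0],
        "T": [ 0.2, -1.5, -1.5,  0.8],
    }
    width = 4

    def scan(seq, threshold=2.0):
        """Yield (position, score) for windows scoring above threshold."""
        for i in range(len(seq) - width + 1):
            score = sum(pwm[base][j] for j, base in enumerate(seq[i:i + width]))
            if score >= threshold:
                yield i, score

    for pos, score in scan("TTACGTACATAGCT"):
        print(f"hit at {pos}: score {score:.2f}")   # one hit: ACGT at position 2
    ```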

  4. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Polishing is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotating and translating tool with dry abrasive at its front, mounted at the end of a robot. A tool-to-part concept is used, useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling industry, aeronautics or automotive. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.

  5. Scientific Ballooning Technologies Workshop STO-2 Thermal Design and Analysis

    Science.gov (United States)

    Ferguson, Doug

    2016-01-01

    The heritage thermal model for the full STO-2 (Stratospheric Terahertz Observatory II) vehicle has been updated to model the CSBF (Columbia Scientific Balloon Facility) SIP-14 (Scientific Instrument Package) in detail. Analysis of this model has been performed for the Antarctic FY2017 launch season. Model temperature predictions are compared to previous results from STO-2 review documents.

  6. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  7. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  8. Cretaceous oceanic red bed deposition, a tool for paleoenvironmental changes--Workshop of IGCP 463 & 494

    Institute of Scientific and Technical Information of China (English)

    Mihaela Carmen Melinte; Robert Scott; Chengshan WANG; Xiumian HU

    2005-01-01

    Members of IGCP 463, Cretaceous Oceanic Red Beds (CORBs), held the third workshop in Romania. In addition to scientific sessions,discussions of results and future plans, the participants examined exposures of Upper Cretaceous Red Beds of the Romanian Carpathians characterized both by pelagic/hemipelagic and turbiditic facies.

  9. A workshop report on the development of the Cow's Milk-related Symptom Score awareness tool for young children

    DEFF Research Database (Denmark)

    Vandenplas, Yvan; Dupont, Christophe; Eigenmann, Philippe;

    2015-01-01

    Clinicians with expertise in managing children with gastrointestinal problems and/or atopic diseases attended a workshop in Brussels in September 2014 to review the literature and determine whether a clinical score derived from symptoms associated with the ingestion of cow's milk proteins could...

  10. 1995 NASA High-Speed Research Program Sonic Boom Workshop. Volume 2; Configuration Design, Analysis, and Testing

    Science.gov (United States)

    Baize, Daniel G. (Editor)

    1999-01-01

    The High-Speed Research Program and NASA Langley Research Center sponsored the NASA High-Speed Research Program Sonic Boom Workshop on September 12-13, 1995. The workshop was designed to bring together NASA's scientists and engineers and their counterparts in industry, other Government agencies, and academia working together in the sonic boom element of NASA's High-Speed Research Program. Specific objectives of this workshop were to: (1) report the progress and status of research in sonic boom propagation, acceptability, and design; (2) promote and disseminate this technology within the appropriate technical communities; (3) help promote synergy among the scientists working in the Program; and (4) identify technology pacing the development of viable reduced-boom High-Speed Civil Transport concepts. The Workshop was organized in four sessions: Session 1 - Sonic Boom Propagation (Theoretical); Session 2 - Sonic Boom Propagation (Experimental); Session 3 - Acceptability Studies - Human and Animal; and Session 4 - Configuration Design, Analysis, and Testing.

  11. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while processing multiple users' tasks simultaneously. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
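
    A typical LCAT-style local analysis is a least-squares trend fit on a station's annual series, optionally followed by a composite over selected years. The sketch below shows both steps with synthetic data; the station series and the list of composite years are invented, while LCAT itself serves real station and climate-division records.

    ```python
    # LCAT-style local trend analysis and compositing on a synthetic
    # annual series (invented data; LCAT serves real station records).
    import numpy as np

    years = np.arange(1980, 2012)
    rng = np.random.default_rng(7)
    temps = 12.0 + 0.03 * (years - years[0]) + rng.normal(0.0, 0.4, years.size)

    # Least-squares linear trend (degrees per decade)
    slope, intercept = np.polyfit(years, temps, 1)
    print(f"trend: {slope * 10:+.2f} deg per decade")

    # Simple composite: mean anomaly over a hypothetical set of El Nino years
    elnino = np.isin(years, [1982, 1987, 1991, 1997, 2002, 2009])
    anomaly = temps - temps.mean()
    print(f"El Nino composite anomaly: {anomaly[elnino].mean():+.2f} deg")
    ```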

  12. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte Carlo simulations (MC). Several tools are considered: ROOT files filter, JavaScript Multivariable Cross-Filter, JavaScript ROOT browser and JavaScript Scatter-Matrix Libraries. Preliminary but satisfactory results have been deployed online for test and future upgrades.

  13. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...... of the graph. SaTool makes analysis of the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of possibility to diagnose faults and ability to make autonomous recovery should faults occur....

  14. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    The article describes the general situation of the power tools market, both in Russia and worldwide. It presents a comparative analysis of competitors, an analysis of the power tools market structure, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by companies selling tools, including a competitive analysis of the Bosch range, the leader in its segment of the power tools available on the Russian market.

  15. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    The purpose of this paper is to propose a methodology for setup analysis, which can be implemented mainly in small and medium enterprises that are not yet convinced to invest in setup improvement. The methodology was developed after research that identified the problem: companies still struggle with long setup times, and many of them do nothing to decrease them. A long setup is, by itself, not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED it is essential to analyse changeovers in order to uncover problems. The proposed methodology can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; there, the goal is to convince the management to begin actions concerning setup improvement. The last three steps are related to a particular setup, where the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis, FMEA and others were used.
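
    The Pareto step of such a setup analysis is easy to make concrete: rank the changeover activities by time consumed and find the few that dominate. The sketch below uses invented activity timings purely for illustration.

    ```python
    # Pareto analysis of changeover activities (invented timings).
    setup_minutes = {
        "find and fetch tooling": 18.0,
        "adjust and calibrate": 14.0,
        "clean machine": 7.0,
        "fetch materials": 5.0,
        "paperwork": 3.0,
        "trial runs": 2.0,
    }

    total = sum(setup_minutes.values())
    cumulative = 0.0
    print(f"{'activity':<24} {'min':>5} {'cum %':>6}")
    for activity, minutes in sorted(setup_minutes.items(), key=lambda kv: -kv[1]):
        cumulative += minutes
        print(f"{activity:<24} {minutes:5.1f} {100 * cumulative / total:6.1f}")
    # The top two activities account for ~65% of setup time: per SMED, these
    # are the first candidates to convert to external setup (done while the
    # machine is still running).
    ```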

  16. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  17. Using the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk is a combination of existing presentations and material; system user guide and technical documentation; a GMAT basics and overview, and technical presentations from the TESS projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.

  18. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today's markets. The authors address traditional machining topics, such as:
    - single and multiple point cutting processes
    - grinding
    - components accuracy and metrology
    - shear stress in cutting
    - cutting temperature and analysis
    - chatter
    They also address non-traditional machining, such as:
    - electrical discharge machining
    - electrochemical machining
    - laser and electron beam machining
    A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  19. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
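
    The attack-graph idea above can be made concrete with a small weighted digraph: nodes are attack states, edge weights are attacker effort, and low-cost paths from initial access to the goal are the high-risk paths. The sketch below (states and weights are invented for illustration) finds the cheapest path with Dijkstra's algorithm; epsilon-optimal enumeration would additionally keep every path whose cost is within (1 + epsilon) of that minimum.

    ```python
    # Toy attack graph: cheapest (highest-risk) path via Dijkstra
    # (states and effort weights are invented for illustration).
    import heapq

    graph = {  # state -> [(next_state, attacker_effort), ...]
        "internet": [("dmz web server", 2.0), ("phished laptop", 1.0)],
        "dmz web server": [("internal db", 4.0)],
        "phished laptop": [("internal db", 3.0), ("domain admin", 6.0)],
        "internal db": [("domain admin", 2.0)],
        "domain admin": [],
    }

    def cheapest_path(start, goal):
        frontier = [(0.0, start, [start])]
        best = {}
        while frontier:
            cost, state, path = heapq.heappop(frontier)
            if state == goal:
                return cost, path
            if best.get(state, float("inf")) <= cost:
                continue
            best[state] = cost
            for nxt, effort in graph[state]:
                heapq.heappush(frontier, (cost + effort, nxt, path + [nxt]))
        return float("inf"), []

    cost, path = cheapest_path("internet", "domain admin")
    print(f"highest-risk path (effort {cost}):", " -> ".join(path))
    # -> internet -> phished laptop -> internal db -> domain admin (effort 6.0)
    ```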

  20. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    The applications of ultrasonic vibration to material removal processes exist predominantly in the vertical processing of hard and brittle materials, because the power generated by vertically vibrating oscillators gives the greatest direct penetration for the grains to remove material from the workpiece. For milling processes, however, the vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to make use of ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for the design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to obtain the best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  1. Science for Managing Riverine Ecosystems: Actions for the USGS Identified in the Workshop "Analysis of Flow and Habitat for Instream Aquatic Communities"

    Science.gov (United States)

    Bencala, Kenneth E.; Hamilton, David B.; Petersen, James H.

    2006-01-01

    Federal and state agencies need improved scientific analysis to support riverine ecosystem management. The ability of the USGS to integrate geologic, hydrologic, chemical, geographic, and biological data into new tools and models provides unparalleled opportunities to translate the best riverine science into useful approaches and usable information to address issues faced by river managers. In addition to this capability to provide integrated science, the USGS has a long history of providing long-term and nationwide information about natural resources. The USGS is now in a position to advance its ability to provide the scientific support for the management of riverine ecosystems. To address this need, the USGS held a listening session in Fort Collins, Colorado in April 2006. Goals of the workshop were to: 1) learn about the key resource issues facing DOI, other Federal, and state resource management agencies; 2) discuss new approaches and information needs for addressing these issues; and 3) outline a strategy for the USGS role in supporting riverine ecosystem management. Workshop discussions focused on key components of a USGS strategy: Communications, Synthesis, and Research. The workshop identified 3 priority actions the USGS can initiate now to advance its capabilities to support integrated science for resource managers in partner government agencies and non-governmental organizations: 1) Synthesize the existing science of riverine ecosystem processes to produce broadly applicable conceptual models, 2) Enhance selected ongoing instream flow projects with complementary interdisciplinary studies, and 3) Design a long-term, watershed-scale research program that will substantively reinvent riverine ecosystem science. In addition, topical discussion groups on hydrology, geomorphology, aquatic habitat and populations, and socio-economic analysis and negotiation identified eleven important complementary actions required to advance the state of the science and to

  2. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary

    Science.gov (United States)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.

    2013-04-01

    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of directly observing foraging behavior, data from time-depth recorders and other types of behavioral data recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. Often the definitions of actual behavioral units and analytical approaches may vary substantially which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of various methods and approaches that are used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries from a diversity of backgrounds including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases when temporal autocorrelation structure is useful for identifying behaviors of interest. Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques in order to ensure that results are comparable across studies and taxonomic groups.

  3. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  4. Timeline analysis tools for law enforcement

    Science.gov (United States)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation, and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User funded enhancements and Rome Lab funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, the results of the initial NYSP evaluation, and the plan for a more comprehensive NYSP evaluation.

  5. Report from the 2nd Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Jacek Becla

    2009-03-01

    Full Text Available The complexity and sophistication of large scale analytics in science and industry have advanced dramatically in recent years. Analysts are struggling to use complex techniques such as time series analysis and classification algorithms because their familiar, powerful tools are not scalable and cannot effectively use scalable database systems. The 2nd Extremely Large Databases (XLDB) workshop was organized to understand these issues, examine their implications, and brainstorm possible solutions. The design of a new open source science database, SciDB, which emerged from the first workshop in this series, was also debated. This paper is the final report of the discussions and activities at this workshop.

  6. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab®-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  7. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  8. Team-Based Peer Review as a Form of Formative Assessment--The Case of a Systems Analysis and Design Workshop

    Science.gov (United States)

    Lavy, Ilana; Yadin, Aharon

    2010-01-01

    The present study was carried out within a systems analysis and design workshop. In addition to the standard analysis and design tasks, this workshop included practices designed to enhance student capabilities related to non-technical knowledge areas, such as critical thinking, interpersonal and team skills, and business understanding. Each task…

  9. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
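
    To make the bookkeeping concrete, here is a hedged sketch of the calculation chain the abstract describes: integrate an orbit-averaged proton flux spectrum into a mission fluence, weight it by NIEL to get a displacement damage dose, and degrade cell power with a Summers-style semi-empirical curve. All numbers, and the fitted constants C and Dx, are invented placeholders, not SAVANT outputs or flight data.

```python
# Hypothetical sketch of the displacement-damage bookkeeping SAVANT automates.
# Spectrum, NIEL values, and degradation constants are placeholders.
import numpy as np

energies_mev = np.array([1.0, 3.0, 10.0, 30.0, 100.0])   # spectrum grid (MeV)
flux = np.array([3e6, 1e6, 3e5, 8e4, 1e4])               # p/cm^2/s/MeV (placeholder)
niel = np.array([8e-3, 4e-3, 2e-3, 1.2e-3, 8e-4])        # MeV*cm^2/g (placeholder)
mission_seconds = 5 * 365.25 * 86400                      # 5-year mission

fluence = flux * mission_seconds                          # p/cm^2/MeV
dd_dose = np.trapz(fluence * niel, energies_mev)          # displacement damage dose, MeV/g

# Summers-style semi-empirical degradation: P/P0 = 1 - C*log10(1 + D/Dx)
C, Dx = 0.09, 1e9                                         # fitted constants (placeholders)
remaining = 1.0 - C * np.log10(1.0 + dd_dose / Dx)
print(f"Displacement damage dose: {dd_dose:.3e} MeV/g")
print(f"Predicted end-of-life power fraction: {remaining:.3f}")
```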

  10. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  11. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    Full Text Available We present an “Integrated Protein Analysis Tool” (IPAT) that is able to perform the following tasks in segregating and annotating genomic data. Protein Editor: enables the entry of nucleotide/amino acid sequences. Utilities: enables the conversion of a given nucleotide sequence to its equivalent amino acid sequence. Secondary Structure Prediction: possible using three algorithms (GOR-I, the Gibrat method, and DPM, the Double Prediction Method), with graphical display. Profiles and properties: allows calculation of eight physico-chemical profiles and properties, viz. Hydrophobicity, Hydrophilicity, Antigenicity, Transmembranous regions, Solvent Accessibility, Molecular Weight, Absorption factor and Amino Acid Content. IPAT has a provision for viewing the Helical-Wheel Projection of a selected region of a given protein sequence and a 2D representation of the alpha-carbon trace. IPAT was developed using UML (Unified Modeling Language) for modeling the project elements, coded in Java, and subjected to unit testing, path testing, and integration testing. This project mainly concentrates on Butyrylcholinesterase, predicting its secondary structure and its physico-chemical profiles and properties.
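
    As an illustration of the conversion utility above, the following minimal sketch translates a nucleotide sequence into its one-letter amino acid equivalent using the standard genetic code. IPAT itself is written in Java; this Python fragment is only a stand-in for the same mapping, with most of the 64-codon table elided for brevity.

```python
# Minimal nucleotide-to-amino-acid conversion (standard genetic code, frame 1).
CODON_TABLE = {
    "TTT": "F", "TTC": "F", "TTA": "L", "TTG": "L",
    "CTT": "L", "CTC": "L", "CTA": "L", "CTG": "L",
    "ATT": "I", "ATC": "I", "ATA": "I", "ATG": "M",
    "GTT": "V", "GTC": "V", "GTA": "V", "GTG": "V",
    # ... remaining 48 codons omitted for brevity ...
    "TAA": "*", "TAG": "*", "TGA": "*",
}

def translate(nucleotides: str) -> str:
    """Translate a DNA (or RNA) sequence into a one-letter amino acid string."""
    seq = nucleotides.upper().replace("U", "T")
    codons = (seq[i:i + 3] for i in range(0, len(seq) - 2, 3))
    return "".join(CODON_TABLE.get(c, "X") for c in codons)  # X = unknown codon

print(translate("ATGTTTCTG"))  # -> "MFL"
```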

  12. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  13. Development of Multiscale Biological Image Data Analysis: Review of 2006 International Workshop on Multiscale Biological Imaging, Data Mining and Informatics, Santa Barbara, USA (BII06)

    OpenAIRE

    Auer, Manfred; Peng, Hanchuan; Singh, Ambuj

    2007-01-01

    The 2006 International Workshop on Multiscale Biological Imaging, Data Mining and Informatics was held in Santa Barbara on Sept 7–8, 2006. Based on the presentations at the workshop, we selected and compiled this collection of research articles related to novel algorithms and enabling techniques for bio- and biomedical image analysis, mining, visualization, and biology applications.

  14. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 milestone, Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160), delivers the feature recognition capability required by the user community for certain verification and validation tasks focused on sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met, including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  15. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  16. The Use of Song to Open an Educational Development Workshop: Exploratory Analysis and Reflections

    Science.gov (United States)

    Lesser, Lawrence; An, Song; Tillman, Daniel

    2016-01-01

    Song has been used by faculty of many disciplines in their classrooms and, to a lesser extent, by educational developers in workshops. This paper shares and discusses a new song (about an instructor's evolving openness to alternatives to lecture-only teaching) and its novel use to open an educational development workshop. Self-reported participant…

  17. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show how all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistle blowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools and removing any contents that may have been hidden, as well as any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and make them available for the general public. The results presented in this work can also be seen as a useful

  18. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  19. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  20. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
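
    For readers unfamiliar with the method, the sketch below shows the leapfrog update structure that any FDTD code shares, in its simplest normalized 1D Cartesian form with a soft Gaussian (UWB-style) pulse source. The tools described in the paper solve the spherical-coordinate analogue derived from first principles; this fragment is illustrative only, and the grid sizes are arbitrary.

```python
# Illustrative 1D Cartesian FDTD loop (Yee scheme, normalized units).
import numpy as np

nz, nt = 400, 1000
ez = np.zeros(nz)          # electric field samples
hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
src = nz // 4              # source location (arbitrary)

for n in range(nt):
    hy += np.diff(ez)                         # H update (leapfrog half step)
    ez[1:-1] += np.diff(hy)                   # E update (interior cells)
    ez[src] += np.exp(-((n - 40) / 12) ** 2)  # soft Gaussian pulse excitation
```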

  1. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
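
    The following toy sketch illustrates the likelihood-ratio idea described above: a sample of lab-versus-field similarity scores is tested against a sample of lab-versus-lab scores. The normal-score assumption, the scores themselves, and the pooled null model are inventions for illustration; they are not the authors' statistic, which is built on profilometry comparisons and tool-angle modeling.

```python
# Toy likelihood-ratio test: do two samples of similarity scores share one
# distribution (H0) or have separate means and spreads (H1)?
import numpy as np
from scipy import stats

lab_vs_lab = np.array([0.82, 0.79, 0.85, 0.81, 0.78])    # same-tool baseline (invented)
lab_vs_field = np.array([0.80, 0.77, 0.83, 0.79, 0.76])  # field-mark comparisons (invented)

def log_lik(sample, mu, sigma):
    return stats.norm(mu, sigma).logpdf(sample).sum()

pooled = np.concatenate([lab_vs_lab, lab_vs_field])
ll0 = log_lik(pooled, pooled.mean(), pooled.std(ddof=1))
ll1 = (log_lik(lab_vs_lab, lab_vs_lab.mean(), lab_vs_lab.std(ddof=1))
       + log_lik(lab_vs_field, lab_vs_field.mean(), lab_vs_field.std(ddof=1)))

lr_stat = 2 * (ll1 - ll0)             # asymptotically chi^2 under H0
p_value = stats.chi2.sf(lr_stat, df=2)  # two extra parameters under H1
print(f"LR statistic {lr_stat:.2f}, p = {p_value:.3f}")
```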

  2. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  3. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order to ensure the finiteness of the protocol state-spaces while still being able to verify interesting protocol properties. The translations for different kinds of communication media have been implemented and successfully tested, among others, on agreement protocols from WS-Business Activity.

  4. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors

  5. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development.
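
    A minimal example of the kind of tool LOOS is meant to make short, using its Python interface: load a model, select the alpha-carbons with the C-style selection language, and print a per-frame radius of gyration. The call names (createSystem, selectAtoms, pyloos.Trajectory, radiusOfGyration) follow the LOOS documentation as remembered here and may differ between versions, so treat this as a sketch rather than verified API usage.

```python
# Hedged sketch of a minimal LOOS-based analysis tool (names may vary by version).
import loos
import loos.pyloos

model = loos.createSystem("model.pdb")              # any supported structure format
calphas = loos.selectAtoms(model, 'name == "CA"')   # C-expression selection language
traj = loos.pyloos.Trajectory("sim.dcd", model)     # coordinates update in place

for frame in traj:
    print(traj.index(), calphas.radiusOfGyration())
```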

  6. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
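
    A hedged sketch of the kind of surrogate rating such tools support: an ordinary least-squares fit of log10(SSC) against acoustic backscatter, then inversion of the rating for real-time readings. The numbers are synthetic, not USGS data, and a production rating would typically also apply a retransformation-bias correction that this sketch omits.

```python
# Illustrative surrogate-model fit: log-linear SSC rating from backscatter.
import numpy as np

backscatter_db = np.array([55.0, 60.0, 65.0, 70.0, 75.0, 80.0])   # synthetic
ssc_mg_per_l = np.array([12.0, 30.0, 70.0, 160.0, 420.0, 980.0])  # synthetic

slope, intercept = np.polyfit(backscatter_db, np.log10(ssc_mg_per_l), 1)

def predict_ssc(db):
    """Convert a backscatter reading (dB) into an SSC estimate (mg/L)."""
    return 10 ** (intercept + slope * db)

print(f"log10(SSC) = {intercept:.2f} + {slope:.3f} * backscatter")
print(f"SSC at 68 dB ~ {predict_ssc(68.0):.0f} mg/L")
```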

  7. MICCAI Workshops

    CERN Document Server

    Nedjati-Gilani, Gemma; Venkataraman, Archana; O'Donnell, Lauren; Panagiotaki, Eleftheria

    2014-01-01

    This volume contains the proceedings from two closely related workshops: Computational Diffusion MRI (CDMRI’13) and Mathematical Methods from Brain Connectivity (MMBC’13), held under the auspices of the 16th International Conference on Medical Image Computing and Computer Assisted Intervention, which took place in Nagoya, Japan, September 2013. Inside, readers will find contributions ranging from mathematical foundations and novel methods for the validation of inferring large-scale connectivity from neuroimaging data to the statistical analysis of the data, accelerated methods for data acquisition, and the most recent developments on mathematical diffusion modeling. This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity as well as offers new perspectives and insights on current research challenges for those currently in the field. It will be of interest to researchers and practitioners in computer science, ...

  8. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. An example of a prototype for a digital conceptual design tool with integrated real time structural analysis is presented and compared with a more common Building Information Modelling (BIM) approach. It is concluded that a digital conceptual design tool with embedded real time structural analysis could

  9. A Lexical Analysis Tool with Ambiguity Support

    CERN Document Server

    Quesada, Luis; Cortijo, Francisco J

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
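
    The idea of a lexical analysis graph can be illustrated by enumerating every tokenization of an input under an ambiguous lexicon; the graph Lamb produces represents this set of paths compactly rather than materializing it. The lexicon and input below are invented for the example.

```python
# Toy enumeration of all tokenizations under an ambiguous lexicon.
LEXICON = {"new": "ADJ", "york": "NOUN", "new york": "CITY",
           "times": "NOUN", "new york times": "NEWSPAPER"}

def tokenizations(words, start=0):
    """Yield every sequence of lexicon entries covering words[start:]."""
    if start == len(words):
        yield []
        return
    for end in range(start + 1, len(words) + 1):
        chunk = " ".join(words[start:end])
        if chunk in LEXICON:
            for rest in tokenizations(words, end):
                yield [(chunk, LEXICON[chunk])] + rest

for seq in tokenizations("new york times".split()):
    print(seq)
# Three readings: ADJ+NOUN+NOUN, CITY+NOUN, NEWSPAPER
```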

  10. Fully Parallel MHD Stability Analysis Tool

    Science.gov (United States)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Results of MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.
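
    As background for the abstract's second parallelization target, the sketch below shows shifted inverse iteration in its simplest dense serial form: the eigenvalue of A closest to a shift sigma is found by repeatedly solving a shifted linear system, and it is exactly that solve which a parallel eigensolver distributes. This is a generic illustration, not MARS code.

```python
# Generic shifted inverse iteration for the eigenvalue nearest a shift sigma.
import numpy as np

def inverse_iteration(A, sigma, tol=1e-10, max_iter=200):
    n = A.shape[0]
    x = np.random.default_rng(0).standard_normal(n)
    x /= np.linalg.norm(x)
    shifted = A - sigma * np.eye(n)
    for _ in range(max_iter):
        y = np.linalg.solve(shifted, x)   # the expensive, parallelizable step
        y /= np.linalg.norm(y)
        converged = np.linalg.norm(y - np.sign(y @ x) * x) < tol
        x = y
        if converged:
            break
    return x @ A @ x, x                   # Rayleigh quotient and eigenvector

A = np.diag([1.0, 2.0, 5.0])
print(inverse_iteration(A, sigma=1.9)[0])  # ~ 2.0, the eigenvalue nearest 1.9
```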

  11. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  12. Evaluation of a LASSO regression approach on the unrelated samples of Genetic Analysis Workshop 17.

    Science.gov (United States)

    Guo, Wei; Elston, Robert C; Zhu, Xiaofeng

    2011-11-29

    The Genetic Analysis Workshop 17 data we used comprise 697 unrelated individuals genotyped at 24,487 single-nucleotide polymorphisms (SNPs) from a mini-exome scan, using real sequence data for 3,205 genes annotated by the 1000 Genomes Project and simulated phenotypes. We studied 200 sets of simulated phenotypes of trait Q2. An important feature of this data set is that most SNPs are rare, with 87% of the SNPs having a minor allele frequency less than 0.05. For rare SNP detection, in this study we performed a least absolute shrinkage and selection operator (LASSO) regression and F tests at the gene level and calculated the generalized degrees of freedom to avoid any selection bias. For comparison, we also carried out linear regression and the collapsing method, which sums the rare SNPs, modified for a quantitative trait and with two different allele frequency thresholds. The aim of this paper is to evaluate these four approaches in this mini-exome data and compare their performance in terms of power and false positive rates. In most situations the LASSO approach is more powerful than linear regression and collapsing methods. We also note the difficulty in determining the optimal threshold for the collapsing method and the significant role that linkage disequilibrium plays in detecting rare causal SNPs. If a rare causal SNP is in strong linkage disequilibrium with a common marker in the same gene, power will be much improved.
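
    A hedged sketch of a gene-level LASSO screen of the kind described above, on synthetic data: simulate mostly rare genotypes for 697 individuals, fit a cross-validated LASSO, and report the SNPs with nonzero coefficients. scikit-learn's LassoCV stands in for the authors' implementation, and the sketch omits their F tests and generalized-degrees-of-freedom correction for selection bias.

```python
# Synthetic gene-level LASSO screen for rare-variant association.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(17)
n, p = 697, 50                                  # individuals x SNPs in one gene
maf = rng.uniform(0.001, 0.05, size=p)          # mostly rare variants
X = rng.binomial(2, maf, size=(n, p)).astype(float)
beta = np.zeros(p)
beta[:3] = [0.8, 0.6, 0.5]                      # three causal rare SNPs (simulated)
y = X @ beta + rng.standard_normal(n)           # quantitative trait

model = LassoCV(cv=5).fit(X, y)
print("SNPs retained by LASSO:", np.flatnonzero(model.coef_))
```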

  13. PREFACE: European Microbeam Analysis Society's 14th European Workshop on Modern Developments and Applications in Microbeam Analysis (EMAS 2015), Portorož, Slovenia, 3-7 May 2015

    Science.gov (United States)

    Llovet, Xavier; Matthews, Michael B.; Čeh, Miran; Langer, Enrico; Žagar, Kristina

    2016-02-01

    This volume of the IOP Conference Series: Materials Science and Engineering contains papers from the 14th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis, which took place from the 3rd to the 7th of May 2015 in the Grand Hotel Bernardin, Portorož, Slovenia. The primary aim of this series of workshops is to assess the state of the art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on a career in microbeam analysis can meet and discuss with the established experts. The workshops have a unique format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. This workshop was organized in collaboration with the Jožef Stefan Institute and SDM - Slovene Society for Microscopy. The technical programme included the following topics: electron probe microanalysis, STEM and EELS, materials applications, cathodoluminescence and electron backscatter diffraction (EBSD), and their applications. As at previous workshops there was also a special oral session for young scientists. The best presentation by a young scientist was awarded with an invitation to attend the 2016 Microscopy and Microanalysis meeting in Columbus, Ohio. The prize went to Shirin Kaboli, of the Department of Metals and Materials Engineering of McGill University (Montréal, Canada), for her talk entitled "Electron channelling contrast reconstruction with electron backscattered diffraction". The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 71 posters from 16 countries were on display at the meeting and that the participants came from as far away as Japan, Canada, USA, and Australia. A selection of participants with posters was invited

  14. Healthcare BI: a tool for meaningful analysis.

    Science.gov (United States)

    Rohloff, Rose

    2011-05-01

    Implementing an effective business intelligence (BI) system requires organization-wide preparation and education to allow for meaningful analysis of information. Hospital executives should take steps to ensure that: staff entering data are proficient in how the data are to be used for decision making, and integration is based on clean data from primary sources of entry; managers have the business acumen required for effective data analysis; and decision makers understand how multidimensional BI offers new ways of analysis that represent significant improvements over historical approaches using static reporting.

  15. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  16. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention preserving stochastic branching and parame...

  17. INDICO Workshop

    CERN Document Server

    CERN. Geneva; Fabbrichesi, Marco

    2004-01-01

    The INtegrated DIgital COnferencing EU project has finished building a complete software solution to facilitate the management of conferences, workshops, schools or simple meetings from their announcement to their archival. Everybody involved in the organization of events is welcome to join this workshop, in order to understand the scope of the project and to see demonstrations of the various features.

  18. Bayesian data analysis tools for atomic physics

    CERN Document Server

    Trassinelli, Martino

    2016-01-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested fit to calculate the different probability distrib...
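
    A toy version of the first example above: the Bayesian evidence for a spectrum model with and without a satellite line, computed by direct numerical integration over the line amplitude. Practical codes, including the Nested fit program mentioned in the abstract, use nested sampling instead; the data, priors, and line positions here are invented.

```python
# Toy Bayesian evidence comparison: satellite line present vs absent.
import numpy as np

x = np.linspace(-5, 5, 60)
rng = np.random.default_rng(1)
data = (np.exp(-x**2 / 2) + 0.3 * np.exp(-(x - 2.0)**2 / 2)
        + 0.05 * rng.standard_normal(x.size))   # synthetic spectrum
sigma = 0.05                                     # assumed noise level

def loglike(amp):
    model = np.exp(-x**2 / 2) + amp * np.exp(-(x - 2.0)**2 / 2)
    return -0.5 * np.sum((data - model) ** 2) / sigma**2

# Model 0: no satellite line (amplitude fixed at 0).
logZ0 = loglike(0.0)
# Model 1: satellite line with uniform prior amp ~ U(0, 1), integrated on a grid.
amps = np.linspace(0.0, 1.0, 200)
lls = np.array([loglike(a) for a in amps])
logZ1 = np.log(np.trapz(np.exp(lls - lls.max()), amps)) + lls.max()

print(f"log Bayes factor (line vs no line): {logZ1 - logZ0:.1f}")
```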

  19. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  20. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
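
    The core arithmetic behind any such financial analysis can be shown in a few lines: discount projected net cash flows and compare them with the up-front investment. The figures below are invented placeholders, not H2FAST defaults, and the real tool computes many more indicators than this sketch.

```python
# Back-of-envelope discounted cash flow for a hypothetical fueling station.
capital = 2_000_000.0            # up-front station cost ($), placeholder
annual_net_cash = 320_000.0      # yearly revenue minus operating cost ($), placeholder
rate, years = 0.08, 15           # discount rate and analysis horizon

npv = -capital + sum(annual_net_cash / (1 + rate) ** t for t in range(1, years + 1))
payback = capital / annual_net_cash

print(f"NPV over {years} y at {rate:.0%}: ${npv:,.0f}")
print(f"Simple payback: {payback:.1f} years")
```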

  1. Workshop on molecular animation.

    Science.gov (United States)

    Bromberg, Sarina; Chiu, Wah; Ferrin, Thomas E

    2010-10-13

    From February 25 to 26, 2010, in San Francisco, the Resource for Biocomputing, Visualization, and Informatics (RBVI) and the National Center for Macromolecular Imaging (NCMI) hosted a molecular animation workshop for 21 structural biologists, molecular animators, and creators of molecular visualization software. Molecular animation aims to visualize scientific understanding of biomolecular processes and structures. The primary goal of the workshop was to identify the necessary tools for producing high-quality molecular animations, understanding complex molecular and cellular structures, creating publication supplementary materials and conference presentations, and teaching science to students and the public. Another use of molecular animation emerged in the workshop: helping to focus scientific inquiry about the motions of molecules and enhancing informal communication within and between laboratories.

  2. WALLTURB International Workshop

    CERN Document Server

    Jimenez, Javier; Marusic, Ivan

    2011-01-01

    This book brings together selected contributions from the WALLTURB workshop on "Understanding and modelling of wall turbulence", held in Lille, France, on April 21st to 23rd 2009. This workshop was organized by the WALLTURB consortium, in order to present to the relevant scientific community the main results of the project and to stimulate scientific discussions around the subject of wall turbulence. The workshop reviewed the recent progress in theoretical, experimental and numerical approaches to wall turbulence. The problems of zero pressure gradient, adverse pressure gradient and separating turbulent boundary layers were addressed in detail with the three approaches, using the most advanced tools. This book is a milestone in the research field, thanks to the high level of the invited speakers and the involvement of the contributors, and a testimony to the achievements of the WALLTURB project.

  3. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  4. 3rd International Workshop on Intelligent Data Analysis and Management (IDAM)

    CERN Document Server

    Wang, Leon; Hong, Tzung-Pei; Yang, Hsin-Chang; Ting, I-Hsien

    2013-01-01

    These papers on Intelligent Data Analysis and Management (IDAM) examine issues related to the research and applications of Artificial Intelligence techniques in data analysis and management across a variety of disciplines. The papers derive from the 2013 IDAM conference in Kaohsiung, Taiwan. It is an interdisciplinary research field involving academic researchers in information technologies, computer science, public policy, bioinformatics, medical informatics, and social and behavior studies, etc. The techniques studied include (but are not limited to): data visualization, data pre-processing, data engineering, database mining techniques, tools and applications, evolutionary algorithms, machine learning, neural nets, fuzzy logic, statistical pattern recognition, knowledge filtering, and post-processing, etc.

  5. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, Judo competition is an intriguing complex nonlinear system, with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed beforehand, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team) like Judo, supporting in a useful way the difficult task of the coach, especially for national or Olympic coaching staffs. This paper presents a short summary of the most important methodological achievements in judo match analysis. It also presents, in light of the latest technological improvements, the first systematization toward new fiel...

  6. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: rittner@iqm.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Lab. de Fisico-Quimica Organica; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Quimica

    2011-07-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function, like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature on this research field imposed some limitations, which will be detailed in the Introduction, but it may be noted in advance that these limitations mostly concern the period when these results were published. (author)

  7. Serial concept maps: tools for concept analysis.

    Science.gov (United States)

    All, Anita C; Huycke, LaRae I

    2007-05-01

    Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments.

  8. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets are supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  9. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are a known approach to the mathematical modeling of physical engineering systems. In this paper, connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  10. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high speed milling of titanium alloys, the high rate of tool failure is the main reason for the high manufacturing cost. In this study, fractured tools which were used in a titanium alloy 5-axis milling process have been observed both in the macro scale, using a PG-1000 light microscope, and in the micro scale, using a Scanning Electron Microscope (SEM). These observations indicate that most of these tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed. Future work to determine the underlying causes of the fatigue phenomenon is then outlined.

  11. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  12. PREFACE: 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3)

    Science.gov (United States)

    Sakka, Yoshio; Hirota, Noriyuki; Horii, Shigeru; Ando, Tsutomu

    2009-07-01

    The 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3) was held on 14-16 May 2008 at the University of Tokyo, Japan. The first was held in March 2004 at the National High Magnetic Field Laboratory in Tallahassee, USA. Two years later the second took place in Grenoble, France. MAP3 was held as a University of Tokyo International Symposium, jointly with the MANA Workshop on Materials Processing by External Stimulation and the JSPS CORE Program of Construction of the World Center on Electromagnetic Processing of Materials. At the end of MAP3 it was decided that the next MAP4 will be held in Atlanta, USA in 2010. Processing in magnetic fields is a rapidly expanding research area with a wide range of promising applications in materials science. MAP3 focused on the magnetic field interactions involved in the study and processing of materials in all disciplines ranging from physics to chemistry and biology: magnetic field effects on chemical, physical, and biological phenomena; magnetic field effects on electrochemical phenomena; magnetic field effects on thermodynamic phenomena; magnetic field effects on hydrodynamic phenomena; magnetic field effects on crystal growth; magnetic processing of materials; diamagnetic levitation; the magneto-Archimedes effect; spin chemistry; application of magnetic fields to analytical chemistry; magnetic orientation; control of structure by magnetic fields; magnetic separation and purification; magnetic field-induced phase transitions; materials properties in high magnetic fields; development of NMR and MRI; medical application of magnetic fields; novel magnetic phenomena; physical property measurement in magnetic fields; and high magnetic field generation. MAP3 consisted of 84 presentations including 16 invited talks. This volume of Journal of Physics: Conference Series contains the proceedings of MAP3, with 34 papers that provide a scientific record of the topics covered by the conference with the special topics (13 papers) in

  13. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  14. Single-cell analysis tools for drug discovery and development.

    Science.gov (United States)

    Heath, James R; Ribas, Antoni; Mischel, Paul S

    2016-03-01

    The genetic, functional or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development. Such heterogeneity hinders the design of accurate disease models and can confound the interpretation of biomarker levels and of patient responses to specific therapies. The complex nature of virtually all tissues has motivated the development of tools for single-cell genomic, transcriptomic and multiplex proteomic analyses. Here, we review these tools and assess their advantages and limitations. Emerging applications of single cell analysis tools in drug discovery and development, particularly in the field of oncology, are discussed.

  15. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  16. EMERGING MODALITIES FOR SOIL CARBON ANALYSIS: SAMPLING STATISTICS AND ECONOMICS WORKSHOP.

    Energy Technology Data Exchange (ETDEWEB)

    WIELOPOLSKI, L.

    2006-04-01

    The workshop's main objectives are (1) to present the emerging modalities for analyzing carbon in soil, (2) to assess their error propagation, (3) to recommend new protocols and sampling strategies for the new instrumentation, and (4) to compare the costs of the new methods with traditional chemical ones.

  17. MSFC Skylab Orbital Workshop, volume 1. [systems analysis and equipment specifications for orbital laboratory

    Science.gov (United States)

    1974-01-01

    The technical aspects of the Skylab-Orbital Workshop are discussed. Original concepts, goals, design philosophy, hardware, and testing are reported. The final flight configuration, overall test program, and mission performance are analyzed. The systems which are examined are: (1) the structural system, (2) the meteoroid shield systems, and (3) the environmental/thermal control subsystem.

  18. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
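
    At its core, the analysis step compares measured values against expected values with an uncertainty band and flags significant differences, as in the minimal sketch below. The point names, values, and tolerances are invented; the actual tool encodes air-handling-unit test procedures and fault-cause reasoning well beyond this.

```python
# Minimal measured-vs-expected comparison with tolerance-based pass/fail flags.
MEASUREMENTS = {"supply_air_temp_C": 16.8, "mixed_air_temp_C": 22.1, "fan_power_kW": 7.9}
EXPECTED = {"supply_air_temp_C": (13.0, 1.5),   # (expected value, tolerance)
            "mixed_air_temp_C": (22.0, 1.0),
            "fan_power_kW": (7.5, 0.8)}

def assess(measured, expected):
    for key, value in measured.items():
        target, tolerance = expected[key]
        status = "PASS" if abs(value - target) <= tolerance else "FAIL"
        print(f"{key:22s} measured {value:6.1f} "
              f"expected {target:6.1f} +/- {tolerance:.1f} -> {status}")

assess(MEASUREMENTS, EXPECTED)  # the supply air temperature would fail here
```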

  19. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
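
    As a hedged example of calling one of the services named above, the fragment below retrieves a UniProtKB entry in FASTA format through the dbfetch REST interface. The URL pattern follows the EMBL-EBI documentation as remembered here; parameters may evolve, so treat it as illustrative rather than authoritative.

```python
# Fetch a UniProtKB entry in FASTA format via the EMBL-EBI dbfetch REST service.
import urllib.request

url = ("https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
       "?db=uniprotkb&id=P12345&format=fasta&style=raw")
with urllib.request.urlopen(url, timeout=30) as response:
    print(response.read().decode("utf-8"))
```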

  20. Workshop Proceedings

    DEFF Research Database (Denmark)

    2012-01-01

    This collection of papers stems from the Fifth Workshop on the Representation and Processing of Sign Languages, held in May 2012 as a satellite to the Language Resources and Evaluation Conference in Istanbul. While there has been occasional attention for sign languages at the main LREC conference, the main focus there is on spoken languages in their written and spoken forms. This series of workshops, however, offers a forum for researchers focussing on sign languages. For the third time, the workshop had sign language corpora as its main topic. This time, the focus was on the interaction between corpus and lexicon. More than half of the papers presented contribute to this topic. Once again, the papers at this workshop clearly identify the potentials of even closer cooperation between sign linguists and sign language engineers, and we think it is events like this that contribute a lot to a better...

  1. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavour tagging tools, and finally the association of Monte Carlo truth information, which is used to validate the software on simulated data and is an important part of the development of the physics analysis tools.

  2. [SIGAPS, a tool for the analysis of scientific publications].

    Science.gov (United States)

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential to the funding of teaching hospitals on the basis of their scientific publications. It is based on the analysis of articles indexed in Medline, with a score calculated by taking into account the position of the author and the ranking of the journal within its disciplinary field. It also offers tools for the bibliometric analysis of scientific production.

  3. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of the Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of the AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
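    The core ROC statistics reported by the suite can be illustrated with a short, generic Python example (using scikit-learn rather than Dinamica EGO, and synthetic data):

```python
# Generic ROC/AUC computation illustrating the statistics the Dinamica EGO
# tools report; the observed/predicted arrays here are synthetic.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
observed = rng.integers(0, 2, size=500)   # 1 = event (e.g., LUCC) occurred
# A model that is informative but noisy: probability correlates with outcome.
predicted = np.clip(observed * 0.4 + rng.random(500) * 0.6, 0, 1)

fpr, tpr, thresholds = roc_curve(observed, predicted)
auc = roc_auc_score(observed, predicted)
print(f"AUC = {auc:.3f}")                 # area under the ROC curve
```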

  4. Comparative analysis of marine ecosystems: workshop on predator-prey interactions

    DEFF Research Database (Denmark)

    Bailey, Kevin M.; Ciannelli, Lorenzo; Hunsicker, Mary

    2010-01-01

    Climate and human influences on marine ecosystems are largely manifested by changes in predator–prey interactions. It follows that ecosystem-based management of the world's oceans requires a better understanding of food web relationships. An international workshop on predator–prey interactions in marine ecosystems was held at Oregon State University, Corvallis, OR, USA on 16–18 March 2010. The meeting brought together scientists from diverse fields of expertise including theoretical ecology, animal behaviour, fish and seabird ecology, statistics, fisheries science and ecosystem modelling. The goals of the workshop were to critically examine the methods of scaling-up predator–prey interactions from local observations to systems, the role of shifting ecological processes with scale changes, and the complexity and organizational structure in trophic interactions.

  5. Emerging methods and tools for environmental risk assessment, decision-making, and policy for nanomaterials: summary of NATO Advanced Research Workshop

    DEFF Research Database (Denmark)

    Linkov, I; Steevens, J; Adlakha-Hutcheon, G;

    2009-01-01

    Nanomaterials and their associated technologies hold promising opportunities for the development of new materials and applications in a wide variety of disciplines, including medicine, environmental remediation, waste treatment, and energy conservation. However, current information regarding the environmental effects and health risks associated with nanomaterials is limited and sometimes contradictory. This article summarizes the conclusions of a 2008 NATO workshop designed to evaluate the wide-scale implications (e.g., benefits, risks, and costs) of the use of nanomaterials on human health and the environment. A unique feature of this workshop was its interdisciplinary nature and focus on the practical needs of policy decision makers. Workshop presentations and discussion panels were structured along four main themes: technology and benefits, human health risk, environmental risk, and policy implications.

  6. On the appraisal and analysis of the existing structure of a workshop building

    Institute of Scientific and Technical Information of China (English)

    乔宏伟

    2014-01-01

    Taking a workshop building as an example, this paper uses the MIDAS Gen and PKPM structural analysis packages to carry out stress analyses of the building's arch truss structure and of the overall structure, and performs a structural appraisal, providing a reference for the stress analysis and appraisal of similar industrial workshop buildings.

  7. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    The aim of this research is to look into integrated digital design and analysis tools in order to find out whether they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects and designers.

  8. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project "Mooring Solutions for Large Wave Energy Converters". The assessment covers potential candidate software and subsequently c...

  9. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of the significant risk groups associated with the implementation of logistics processes; composition of integrated strategies of risk management; and composition of tools for risk analysis in logistics processes.

  10. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  11. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  12. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a web-based geospatial application similar to Google Maps - in effect, a Google Maps for adversaries. It helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  13. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...

  14. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis (ICA) in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented as an extension to latent semantic indexing... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.
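    For readers unfamiliar with ICA, the following self-contained Python sketch (scikit-learn's FastICA on synthetic signals, not the thesis' own code) shows the basic separation step such a framework builds on:

```python
# Minimal ICA demonstration: recover two independent source signals from
# linear mixtures (synthetic data, not the thesis' chat-room application).
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # two independent signals
mixing = np.array([[1.0, 0.5], [0.5, 2.0]])
mixed = sources @ mixing.T                               # observed mixtures

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(mixed)  # estimated sources, up to scale/order
print(recovered.shape)                # (2000, 2)
```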

  15. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    Science.gov (United States)

    Clough, D.; Fletcher, S.; Longstaff, A. P.; Willoughby, P.

    2012-05-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of thermal errors. In particular, there is the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
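    A minimal sketch of the kind of temperature-to-error model this motivates, with invented data, might look as follows in Python:

```python
# Fit spindle displacement error against a measured temperature rise.
# Values are invented for illustration, not from the paper.
import numpy as np

temp_rise = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 10.3])   # deg C above ambient
error_um = np.array([0.0, 4.8, 9.5, 14.9, 19.6, 25.1])  # spindle growth, microns

slope, intercept = np.polyfit(temp_rise, error_um, deg=1)
predicted = slope * 5.0 + intercept   # estimated error at a 5 deg C rise
print(f"{slope:.2f} um/degC, predicted error at 5 degC: {predicted:.1f} um")
```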

  16. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

    SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high performance computing. A performance tool, which usually collects and analyzes performance data, is an effective way of improving performance. This paper explores both the collection and the analysis of performance data, and proposes two ideas: both types of runtime performance data, concerning both system load and application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; and the performance analysis should be Web-oriented, to exploit the excellent portability and usability brought by the Internet. This paper presents a Web-oriented HPC (high performance computing) performance tool, which can collect information about both resource utilization, including the utilization ratio of CPU and memory, and program behavior during runtime, including statuses such as sending and computing, and visualize the information in the user's browser window with Java applets in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependency between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program behavior and system load, find the performance bottleneck, and ultimately improve the performance of the application.

  17. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  18. Network workshop

    DEFF Research Database (Denmark)

    Bruun, Jesper; Evans, Robert Harry

    2014-01-01

    This paper describes the background for, realisation of and author reflections on a network workshop held at ESERA2013. As a new research area in science education, networks offer a unique opportunity to visualise and find patterns and relationships in complicated social or academic network data. These include student relations and interactions and epistemic and linguistic networks of words, concepts and actions. Network methodology has already found use in science education research. However, while networks hold the potential for new insights, they have not yet found wide use in the science education research community. With this workshop, participants were offered a way into network science based on authentic educational research data. The workshop was constructed as an inquiry lesson with emphasis on user autonomy. Learning activities had participants choose to work with one of two cases of networks...

  20. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
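    The pick-list recommendation idea can be sketched in a few lines of Python; the library entries below are generic placeholders, not the actual spaceflight failure-mode library:

```python
# Toy sketch of the pick-list idea: map a component's selected function to
# candidate failure modes drawn from a reusable library.
FAILURE_MODE_LIBRARY = {
    "provide fluid flow": ["loss of flow", "reduced flow", "leakage"],
    "provide electrical power": ["loss of output", "short circuit", "overvoltage"],
    "transmit data": ["loss of signal", "corrupted data", "intermittent signal"],
}

def recommend_failure_modes(function):
    """Return library failure modes for a user-selected component function."""
    return FAILURE_MODE_LIBRARY.get(function.lower(), [])

print(recommend_failure_modes("Provide fluid flow"))
# ['loss of flow', 'reduced flow', 'leakage']
```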

  1. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  2. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    The aim of this paper is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  3. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  4. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  5. Mathematical Modeling: An Integrated Algebra, Physics and Chemistry Workshop for Teachers as a Tool for Recruiting Science, Technology and Mathematics Students.

    Science.gov (United States)

    Obot, V.; Reiff, P.; Morris, P. A.; Humphrey, M.

    2004-12-01

    Based on the philosophy that there exists an artificial boundary between mathematics and the sciences, we have developed a series of workshops and modules on mathematical modeling suitable as teaching examples in secondary schools. The workshop is a 60-hour course held on the campus of Texas Southern University during the summer months, followed by a series of follow-up workshops on Saturdays during the academic year. Texas Southern University is a Historically Black University devoted to urban programming. The workshops use experimental and observational data from various fields, with particular emphasis on chemistry, physics, and earth and space sciences. The data are used to construct mathematical models. In the process of constructing the models, the students learn the appropriate mathematical and scientific concepts. We have studied linear, exponential and logarithmic functions and, using planetary data, derived and discussed Kepler's laws. We have learned how to balance chemical equations as a solution to a system of equations. We have studied and modeled electromagnetic waves using ham radio as our launching pad. Judging from participant evaluations, follow-up workshops and classroom visits, teachers who participated in this workshop have been re-invigorated in their teaching. They have incorporated our examples in their teaching and have reported increased attentiveness and excitement regarding science and mathematics from their students. It appears that this approach has caused some students to think seriously about pursuing science and engineering careers. An added benefit of this program is that the teachers have invited us and affiliated scientists into their classrooms for demonstrations. This gives the students an opportunity to interact with actual scientists and engineers. These interactions have resulted in several of the students being invited to serve as summer interns in our laboratories. For the past three years, almost all of the interns have...

  6. Accuracy Analysis and Calibration of Gantry Hybrid Machine Tool

    Institute of Scientific and Technical Information of China (English)

    唐晓强; 李铁民; 尹文生; 汪劲松

    2003-01-01

    The kinematic accuracy is a key factor in the design of parallel or hybrid machine tools. This analysis improved the accuracy of a 4-DOF (degree of freedom) gantry hybrid machine tool based on a 3-DOF planar parallel manipulator by compensating for various positioning errors. The machine tool architecture was described with the inverse kinematic solution. The control parameter error model was used to analyze the accuracy of the 3-DOF planar parallel manipulator and to develop a kinematic calibration method. The experimental results prove that the calibration method reduces the cutter nose errors from ±0.50 mm to ±0.03 mm for a horizontal movement of 600 mm by compensating for errors in the slider home position, the guide way distance and the extensible strut home position. The calibration method will be useful for similar types of parallel kinematic machines.

  7. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened up for the implementation of new features in the existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts generated by the heat and mass balance programs can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.

  8. EDITORIAL: Proceedings of the 12th Gravitational Wave Data Analysis Workshop (GWDAW 12), Cambridge, MA, USA, 13-16 December 2007

    Science.gov (United States)

    Hughes, S.; Katsavounidis, E.

    2008-09-01

    It was a great pleasure and an honor for us to host the 12th Gravitational Wave Data Analysis Workshop (GWDAW) at MIT and the LIGO Laboratory in Cambridge, Massachusetts, the place where this workshop series started in 1996. This time the conference was held at the conference facilities of the Royal Sonesta Hotel in Cambridge from 13-16 December 2007. This 12th GWDAW found us with the ground interferometers having just completed their most sensitive search for gravitational waves and starting their preparation to bring online and/or propose more sensitive instruments. Resonant mass detectors continued to observe the gravitational wave sky with instruments that have been operating now for many years. LISA, the Laser Interferometer Space Antenna, was recently reviewed by NASA's Beyond Einstein Program Assessment Committee (BEPAC), convened by the National Research Council (NRC), which found that 'on purely scientific grounds LISA is the mission that is the most promising and least scientifically risky…thus, the committee gave LISA its highest scientific ranking'. Even so, JDEM, the Joint Dark Energy Mission, was identified to go first, with LISA following a few years after. New methods, analysis ideas, results from the analysis of data collected by the instruments, as well as Mock Data Challenges for LISA were reported at this conference. While data from the most recent runs of the instruments are still being analyzed, the first upper limit results show how even non-detection statements can be interesting astrophysics. Beyond these traditional aspects of GWDAW, for the first time in this workshop we tried to bring the non-gravitational wave physics and astronomy community on board in order to present, discuss and propose ways to work together as we pursue the first detection of gravitational waves and as we hope to transition to gravitational wave astronomy in the near future. Overview talks by colleagues leading observations in the electromagnetic...

  9. AstroStat - A VO Tool for Statistical Analysis

    CERN Document Server

    Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

    2015-01-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analysis is done using the public domain statistical software R, and the output returned is presented in a neatly formatted form to the user. The analyses that can be performed include exploratory tests, visualizations, distribution fitting, correlation and causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features - as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which allows for both fetching data and performing powerful statistical analysis on ...

  10. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software packages. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive statistics as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (Lung Imaging Database Consortium) study.
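    In outline, the statistical comparison described here can be reproduced with SciPy; the measurements below are invented stand-ins for three vendors' tools:

```python
# One-way ANOVA across three (synthetic) sets of maximum-diameter
# measurements of the same nodules from three hypothetical tools.
from scipy import stats

tool_a = [8.1, 9.4, 10.2, 7.8, 8.9]   # mm, invented values
tool_b = [8.9, 10.1, 11.0, 8.5, 9.7]
tool_c = [7.6, 9.0, 9.8, 7.4, 8.5]

f_stat, p_value = stats.f_oneway(tool_a, tool_b, tool_c)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # small p suggests tools differ
```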

  11. PREFACE: Proceedings of the 11th European Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis

    Science.gov (United States)

    2010-07-01

    This volume of IOP Conference Series: Materials Science and Engineering contains papers from the 11th Workshop of the European Microbeam Analysis Society (EMAS) on Modern Developments and Applications in Microbeam Analysis which took place from 10-14 May 2009 in the Hotel Faltom, Gdynia, Poland. The primary aim of this series of workshops is to assess the state-of-the-art and reliability of microbeam analysis techniques. The workshops also provide a forum where students and young scientists starting out on careers in microbeam analysis can meet and discuss with the established experts. The workshops have a very distinct format comprising invited plenary lectures by internationally recognized experts, poster presentations by the participants and round table discussions on the key topics led by specialists in the field. For this workshop EMAS invited speakers on the following topics: EPMA, EBSD, fast energy-dispersive X-ray spectroscopy, three-dimensional microanalysis, and micro- and nanoanalysis in the natural resources industry. The continuing relevance of the EMAS workshops and the high regard in which they are held internationally can be seen from the fact that 69 posters from 16 countries were on display at the meeting and that the participants came from as far away as Japan and the USA. A number of participants with posters were invited to give short oral presentations of their work in two dedicated sessions. As at previous workshops there was also a special oral session for young scientists. Small cash prizes were awarded for the three best posters and for the best oral presentation by a young scientist. The prize for the best poster went to the contribution by G Tylko, S Dubchak, Z Banach and K Turnau, entitled 'Monte Carlo simulation for an assessment of standard validity and quantitative X-ray microanalysis in plant'. Joanna Wojewoda-Budka of the Institute of Metallurgy and Materials Science, Krakow, received the prize for the best oral presentation by a...

  12. TMVA - Tool-kit for Multivariate Data Analysis in ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Therhaag, Jan; Von Toerne, Eckhard [Univ. Bonn, Physikalisches Institut, Nussallee 12, 53115 Bonn (Germany); Hoecker, Andreas; Speckmayer, Peter [European Organization for Nuclear Research - CERN, CH-1211 Geneve 23 (Switzerland); Stelzer, Joerg [Deutsches Elektronen-Synchrotron - DESY, Platanenallee 6, D-15738 Zeuthen (Germany); Voss, Helge [Max-Planck-Institut fuer Kernphysik - MPI, Postfach 10 39 80, Saupfercheckweg 1, DE-69117 Heidelberg (Germany)

    2010-07-01

    Given the ever-increasing complexity of modern HEP data analysis, multivariate analysis techniques have proven an indispensable tool in extracting the most valuable information from the data. TMVA, the Tool-kit for Multivariate Data Analysis, provides a large variety of advanced multivariate analysis techniques for both signal/background classification and regression problems. In TMVA, all methods are embedded in a user-friendly framework capable of handling the pre-processing of the data as well as the evaluation of the results, thus allowing for a simple use of even the most sophisticated multivariate techniques. Convenient assessment and comparison of different analysis techniques enable the user to choose the most efficient approach for any particular data analysis task. TMVA is an integral part of the ROOT data analysis framework and is widely used in the LHC experiments. In this talk I will review recent developments in TMVA, discuss typical use-cases in HEP and present the performance of our most important multivariate techniques on example data by comparing it to theoretical performance limits. (authors)
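    A minimal sketch of a typical TMVA classification job driven from Python (PyROOT, ROOT 6-style DataLoader API) is shown below; the toy trees, variable names and option strings are illustrative, and details vary across ROOT versions:

```python
# Sketch of a TMVA classification job via PyROOT (ROOT 6 DataLoader API;
# toy data and option strings are illustrative).
import random
from array import array
import ROOT
from ROOT import TMVA

TMVA.Tools.Instance()

def make_tree(name, shift, n=1000):
    """Toy TTree with two Gaussian float branches, separated by `shift`."""
    tree = ROOT.TTree(name, name)
    v1, v2 = array('f', [0.0]), array('f', [0.0])
    tree.Branch("var1", v1, "var1/F")
    tree.Branch("var2", v2, "var2/F")
    for _ in range(n):
        v1[0] = random.gauss(shift, 1.0)
        v2[0] = random.gauss(-shift, 1.0)
        tree.Fill()
    return tree

sig, bkg = make_tree("sig", 0.5), make_tree("bkg", -0.5)

out = ROOT.TFile("tmva_out.root", "RECREATE")
factory = TMVA.Factory("job", out, "!V:AnalysisType=Classification")
loader = TMVA.DataLoader("dataset")
loader.AddVariable("var1", "F")
loader.AddVariable("var2", "F")
loader.AddSignalTree(sig, 1.0)
loader.AddBackgroundTree(bkg, 1.0)
loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random:!V")
factory.BookMethod(loader, TMVA.Types.kBDT, "BDT", "NTrees=200:MaxDepth=3")
factory.TrainAllMethods()   # train, then test and evaluate the booked method
factory.TestAllMethods()
factory.EvaluateAllMethods()
out.Close()
```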

  13. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
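    Schematically, knickpoint detection on a stream long-profile can be illustrated with Hack's stream-length gradient (SL) index in a few lines of NumPy; this is a simplified stand-in for what Knickpoint Finder extracts from a DEM, with a synthetic profile and an arbitrary anomaly threshold:

```python
# Schematic knickpoint detection on a synthetic stream long-profile using
# Hack's (1973) SL index; a simplification, not the actual tool's code.
import numpy as np

distance = np.linspace(100, 10000, 200)      # m from the source, synthetic
elevation = 500 * np.exp(-distance / 4000)   # smooth concave profile
elevation[120:] -= 15                        # a step: candidate knickzone

slope = -np.gradient(elevation, distance)    # downstream gradient
sl_index = slope * distance                  # stream-length gradient index
threshold = 2 * np.median(sl_index)          # simple anomaly criterion
knickpoints = distance[sl_index > threshold]
print(f"{knickpoints.size} candidate knickpoints flagged")
```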

  14. Poetry Workshop.

    Science.gov (United States)

    Janeczko, Paul B.

    2000-01-01

    This workshop offers activities to teach students about poetry. After describing haiku as a brief snapshot rather than a story, it explains how to teach poetry using an attached reproducible and poster. The tear-out reproducible sheet teaches students how to write their own haiku, offering a sample one as a model. The poster presents three sample…

  15. Physics Analysis Tools for the CMS experiment at LHC

    CERN Document Server

    Fabozzi, Francesco; Hegner, Benedikt; Lista, Luca

    2008-01-01

    The CMS experiment is expected to start data taking during 2008, and large data samples, on the petabyte scale, will be produced each year. The CMS Physics Tools package provides the CMS physicist with a powerful and flexible software layer for analysis of these huge datasets that is well integrated in the CMS experiment software. A core part of this package is the Candidate Model providing a coherent interface to different types of data. Standard tasks such as combinatorial analyses, generic cuts, MC truth matching and constrained fitting are supported. Advanced template techniques enable the user to add missing features easily. We explain the underlying model, certain details of the implementation and present some use cases showing how the tools are currently used in generator and full simulation studies as preparation for analysis of real data.

  16. SABRE: A Tool for Stochastic Analysis of Biochemical Reaction Networks

    CERN Document Server

    Didier, Frederic; Mateescu, Maria; Wolf, Verena

    2010-01-01

    The importance of stochasticity within biological systems has been shown repeatedly during the last years and has raised the need for efficient stochastic tools. We present SABRE, a tool for stochastic analysis of biochemical reaction networks. SABRE implements fast adaptive uniformization (FAU), a direct numerical approximation algorithm for computing transient solutions of biochemical reaction networks. Biochemical reaction networks represent biological systems studied at a molecular level, and the reactions can be modeled as transitions of a Markov chain. SABRE accepts as input the formalism of guarded commands, which it interprets either as continuous-time or as discrete-time Markov chains. Besides operating in a stochastic mode, SABRE may also perform a deterministic analysis by directly computing a mean-field approximation of the system under study. We illustrate the different functionalities of SABRE by means of biological case studies.
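    The numerical idea underlying FAU can be illustrated with plain (non-adaptive) uniformization, sketched below in Python with SciPy; this is the textbook method that SABRE's fast adaptive variant accelerates, shown here on an invented two-state reaction:

```python
# Plain (non-adaptive) uniformization for the transient solution of a CTMC.
import numpy as np
from scipy.stats import poisson

def transient(Q, p0, t, eps=1e-10):
    """p(t) = p0 * exp(Q t) computed via the uniformized DTMC."""
    lam = float(max(-np.diag(Q)))            # uniformization rate
    P = np.eye(len(Q)) + Q / lam             # DTMC kernel of uniformized chain
    n_max = int(poisson.ppf(1.0 - eps, lam * t)) + 1  # Poisson-sum truncation
    w = poisson.pmf(np.arange(n_max + 1), lam * t)
    v, acc = p0.copy(), w[0] * p0
    for n in range(1, n_max + 1):
        v = v @ P
        acc += w[n] * v
    return acc

# Two-state reversible reaction A <-> B with rates 2.0 and 1.0.
Q = np.array([[-2.0, 2.0], [1.0, -1.0]])
print(transient(Q, np.array([1.0, 0.0]), t=1.5))
```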

  17. COMPARISON OF MALAYSIA MANUFACTURING COMPANIES BY FINANCIAL STATEMENT ANALYSIS TOOLS

    OpenAIRE

    MALEK, Afagh; Mohammadi, Maryam; NASSIRI, Fardokht

    2012-01-01

    One of the best ways to get the expected results from trading in the stock market is to acquire a good evaluation of companies' performance. Accordingly, this study aims at comparing the financial performance of Lb Aluminium Berhad and Seal Incorporated Berhad, manufacturing companies which are listed in the main market of the Malaysian stock exchange. The data were gathered from the annual reports of the companies during the last three years and analysed by financial statement analysis tools, which are ...

  18. Ethics Auditing and Conflict Analysis as Management Tools

    OpenAIRE

    2008-01-01

    This paper deals with management tools like conflict analysis and ethics auditing. Ethics auditing is understood as an opportunity and agreement to devise a system to inform on ethical corporate behaviour. This system essentially aims to increase the transparency and credibility of a company's commitment to ethics. At the same time, the process of elaborating this system allows us to introduce the moral dimension into the company's actions and decisions, thereby completing a key dimension of ...

  19. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, which is a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottleneck of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  20. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  1. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT program uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
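    The temporal compositing step can be sketched generically in NumPy: given a stack of acquisitions with clouds masked out, a per-pixel best-value composite suppresses the noise. This is an illustration of the principle, not TSPT's MATLAB implementation:

```python
# Per-pixel maximum-value compositing of a cloudy NDVI time stack
# (clouds depress NDVI, so the maximum favors clear observations).
import numpy as np

rng = np.random.default_rng(1)
ndvi_stack = rng.uniform(0.2, 0.8, size=(8, 50, 50))  # 8 dates, 50x50 pixels
cloud_mask = rng.random(ndvi_stack.shape) < 0.3       # ~30% cloudy pixels
ndvi_stack[cloud_mask] = np.nan                       # flag clouds as missing

# All-NaN pixels yield NaN (and a RuntimeWarning) in the composite.
composite = np.nanmax(ndvi_stack, axis=0)  # per-pixel best-of-period value
print(np.isnan(composite).sum(), "pixels never observed cloud-free")
```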

  2. Judo match analysis,a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

    In this second paper on match analysis, we analyze in depth the competition steps, showing the evolution of this tool at National Federation level on the basis of our first classification. Furthermore, it is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, helping in a useful way with the difficult task of the coach, or better, of national or Olympic coaching teams. This paper presents a deeper study of judo competitions at high level, from both the male and female points of view, explaining in the light of biomechanics not only the evolution of throws over time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  3. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no "one-size-fits-all" solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  4. Shot planning and analysis tools on the NIF project

    Energy Technology Data Exchange (ETDEWEB)

    Beeler, R. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Casey, A., E-mail: casey20@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA (United States); Conder, A.; Fallejo, R.; Flegel, M.; Hutton, M.; Jancaitis, K.; Lakamsani, V.; Potter, D.; Reisdorf, S.; Tappero, J.; Whitman, P.; Carr, W.; Liao, Z. [Lawrence Livermore National Laboratory, Livermore, CA (United States)

    2012-12-15

    Highlights: ► Target shots in NIF, dozens a month, vary widely in laser and target configuration. ► A planning tool helps select shot sequences that optimize valuable facility time. ► Fabrication and supply of targets, diagnostics, etc. are integrated into the plan. ► Predictive modeling of aging parts (e.g., optics) aids maintenance decision support. ► We describe the planning/analysis tool and its use in NIF experimental operations. - Abstract: Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Campaign Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics

  5. Emerging methods and tools for environmental risk assessment, decision-making, and policy for nanomaterials: summary of NATO Advanced Research Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Linkov, Igor, E-mail: Igor.Linkov@usace.army.mi [U.S. Army Corps of Engineers, Environmental Laboratory (United States); Steevens, Jeffery, E-mail: Jeffery.A.Steevens@us.army.mi [U.S. Army ERDC (United States); Adlakha-Hutcheon, Gitanjali, E-mail: Gitanjali.Adlakha-Hutcheon@drdc-rddc.gc.c [Defense Research and Development Canada (Canada); Bennett, Erin, E-mail: ebennett@bioengineering.co [Intertox Inc. and Bioengineering Group (United States); Chappell, Mark, E-mail: Mark.a.chappell@usace.army.mi [U.S. Army Corps of Engineers, Environmental Laboratory (United States); Colvin, Vicki, E-mail: colvin@rice.ed [Rice University, ICON (United States); Davis, J. Michael, E-mail: Davis.Jmichael@epa.go [Office of Research and Development, U.S. Environmental Protection Agency, National Center for Environmental Assessment (United States); Davis, Thomas, E-mail: ta.davis@umontreal.c [University of Montreal, Environment Canada and Department of Chemistry (Canada); Elder, Alison, E-mail: Alison_Elder@urmc.rochester.ed [University of Rochester, Department of Environmental Medicine (United States); Foss Hansen, Steffen, E-mail: sfh@er.dtu.d [Technical University of Denmark, Department of Environmental Engineering, NanoDTU (Denmark); Hakkinen, Pertti Bert, E-mail: berthakkinen@gmail.co [Toxicology Excellence for Risk Assessment (TERA) (United States); Hussain, Saber M., E-mail: Saber.Hussain@wpafb.af.mi [Air Force Research Laboratory (United States); Karkan, Delara, E-mail: Delara_karkan@hc-sc.gc.c [Health Canada (Canada); Korenstein, Rafi, E-mail: korens@post.tau.ac.i [Marian Gertner Institute for Medical Nanosystems, Tel Aviv University, Department of Physiology and Pharmacology, Faculty of Medicine (Israel); Lynch, Iseult, E-mail: iseult@fiachra.ucd.i [School of Chemistry and Chemical Biology, University College Dublin, Irish Centre for Colloid Science and Biomaterials (Ireland); Metcalfe, Chris, E-mail: cmetcalfe@trentu.c [Trent University (Canada)

    2009-04-15

    Nanomaterials and their associated technologies hold promising opportunities for the development of new materials and applications in a wide variety of disciplines, including medicine, environmental remediation, waste treatment, and energy conservation. However, current information regarding the environmental effects and health risks associated with nanomaterials is limited and sometimes contradictory. This article summarizes the conclusions of a 2008 NATO workshop designed to evaluate the wide-scale implications (e.g., benefits, risks, and costs) of the use of nanomaterials on human health and the environment. A unique feature of this workshop was its interdisciplinary nature and focus on the practical needs of policy decision makers. Workshop presentations and discussion panels were structured along four main themes: technology and benefits, human health risk, environmental risk, and policy implications. Four corresponding working groups (WGs) were formed to develop detailed summaries of the state-of-the-science in their respective areas and to discuss emerging gaps and research needs. The WGs identified gaps between the rapid advances in the types and applications of nanomaterials and the slower pace of human health and environmental risk science, along with strategies to reduce the uncertainties associated with calculating these risks.

  6. Emerging methods and tools for environmental risk assessment, decision-making, and policy for nanomaterials: summary of NATO Advanced Research Workshop.

    Science.gov (United States)

    Linkov, Igor; Steevens, Jeffery; Adlakha-Hutcheon, Gitanjali; Bennett, Erin; Chappell, Mark; Colvin, Vicki; Davis, J Michael; Davis, Thomas; Elder, Alison; Foss Hansen, Steffen; Hakkinen, Pertti Bert; Hussain, Saber M; Karkan, Delara; Korenstein, Rafi; Lynch, Iseult; Metcalfe, Chris; Ramadan, Abou Bakr; Satterstrom, F Kyle

    2009-04-01

    Nanomaterials and their associated technologies hold promising opportunities for the development of new materials and applications in a wide variety of disciplines, including medicine, environmental remediation, waste treatment, and energy conservation. However, current information regarding the environmental effects and health risks associated with nanomaterials is limited and sometimes contradictory. This article summarizes the conclusions of a 2008 NATO workshop designed to evaluate the wide-scale implications (e.g., benefits, risks, and costs) of the use of nanomaterials on human health and the environment. A unique feature of this workshop was its interdisciplinary nature and focus on the practical needs of policy decision makers. Workshop presentations and discussion panels were structured along four main themes: technology and benefits, human health risk, environmental risk, and policy implications. Four corresponding working groups (WGs) were formed to develop detailed summaries of the state-of-the-science in their respective areas and to discuss emerging gaps and research needs. The WGs identified gaps between the rapid advances in the types and applications of nanomaterials and the slower pace of human health and environmental risk science, along with strategies to reduce the uncertainties associated with calculating these risks.

  7. Graphical tools for network meta-analysis in STATA.

    Science.gov (United States)

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper, we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  8. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Full Text Available Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper, we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.
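
    As a taste of the kind of graphical tool the paper describes, the sketch below re-creates the simplest one, a network plot of the evidence base, in Python with networkx rather than the authors' STATA routines; the treatments and trial counts are invented.

        # Network plot of the evidence base (illustrative only; the paper's
        # graphical routines are implemented in STATA, not Python).
        import networkx as nx
        import matplotlib.pyplot as plt

        # Hypothetical evidence base: (treatment, treatment, number of trials)
        comparisons = [("Placebo", "DrugA", 12), ("Placebo", "DrugB", 7),
                       ("DrugA", "DrugB", 3), ("Placebo", "DrugC", 5)]

        G = nx.Graph()
        for a, b, n_trials in comparisons:
            G.add_edge(a, b, weight=n_trials)

        pos = nx.circular_layout(G)                         # treatments on a circle
        widths = [G[u][v]["weight"] for u, v in G.edges()]  # width ~ number of trials
        nx.draw(G, pos, with_labels=True, width=widths, node_color="lightgray")
        plt.savefig("network_plot.png")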

  9. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.
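
    A toy illustration of the "executable model" idea surveyed in this review: a three-gene Boolean network, invented for this example, whose entire state space is enumerated to find the attractor reached from every initial state, a rudimentary cousin of the model checking techniques mentioned above.

        # Toy executable model: exhaustive state-space analysis of a three-gene
        # Boolean network (rules invented for this example).
        from itertools import product

        def update(state):
            a, b, c = state
            return (b and not c,   # A activated by B, repressed by C
                    a,             # B copies A
                    a or b)        # C activated by A or B

        # Follow every initial state until a state repeats: that cycle is the attractor.
        for start in product([False, True], repeat=3):
            seen, state = [], start
            while state not in seen:
                seen.append(state)
                state = update(state)
            cycle = seen[seen.index(state):]
            print(tuple(map(int, start)), "->", [tuple(map(int, s)) for s in cycle])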

  10. Evaluation of the Transfer of Permanent Formation: Analysis of an Experience of Workshops on Astronomy

    Science.gov (United States)

    Cano, Elena; Fabregat, Jaime; Ros, Rosa M.

    2016-08-01

    In the framework of a European project to bring astronomy near to children, several permanent teachers training activities were developed. These actions included workshops with teachers from various stages of the educational system. This paper presents the process and results of the evaluation of that training program. It intends to assess the satisfaction of the participants, as well as their learning and their later transfer of formation to the classroom. Barriers encountered in the transfer of formation, some of them linked to the type of training method chosen and other factors derived from personal and institutional conditions, are outlined. Finally, some guidelines for improving the transfer of scientific formation to the classroom in the future are pointed out.

  11. Use of Grid Tools to Support CMS Distributed Analysis

    CERN Document Server

    Fanfani, A; Anjum, A; Barrass, T; Bonacorsi, D; Bunn, J; Corvo, M; Darmenov, N; De Filippis, N; Donno, F; Donvito, G; Eulisse, G; Fanzago, F; Filine, A; Grandi, C; Hernández, J M; Innocente, V; Jan, A; Lacaprara, S; Legrand, I; Metson, S; Newman, H; Silvestris, L; Steenberg, C; Stockinger, H; Taylor, L; Thomas, M; Tuura, L; Van Lingen, F; Wildish, T

    2004-01-01

    In order to prepare the Physics Technical Design Report, due by the end of 2005, the CMS experiment needs to simulate, reconstruct and analyse about 100 million events, corresponding to more than 200 TB of data. The data will be distributed to several Computing Centres. In order to provide access to the whole data sample to all the world-wide dispersed physicists, CMS is developing a layer of software that uses the grid tools provided by the LCG project to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting analysis jobs. The GRID tools used are both those already available in the LCG-2 release and those being developed in the framework of the ARDA project. This work describes the current status and the future development...

  12. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architectures and for trade studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems and with the analysis of results is assumed on the part of the intended user group. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  13. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  14. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service-oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service-oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service-oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We described the process for enabling the fire location, smoke forecast, smoke observation, and
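
    A minimal sketch of the data-access pattern described above, retrieving a forecast field through an OGC WCS interface; the endpoint URL and coverage name are hypothetical, while the key-value parameters follow the standard WCS 1.0.0 GetCoverage convention.

        # Retrieving a forecast field through an OGC WCS interface. The endpoint
        # and coverage name are hypothetical; parameters follow WCS 1.0.0 KVP syntax.
        import requests

        params = {
            "service": "WCS",
            "version": "1.0.0",
            "request": "GetCoverage",
            "coverage": "calpuff_smoke_pm25",   # hypothetical coverage identifier
            "bbox": "-125,30,-110,45",          # lon/lat region of interest
            "crs": "EPSG:4326",
            "time": "2009-08-30T12:00:00Z",
            "format": "GeoTIFF",
        }
        resp = requests.get("https://example.org/wcs", params=params, timeout=60)
        resp.raise_for_status()
        with open("smoke_forecast.tif", "wb") as f:
            f.write(resp.content)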

  15. Collection Development: A Summary of Workshop Discussions.

    Science.gov (United States)

    Dudley, Norman

    1979-01-01

    Highlights from five workshop sessions held during the Preconference Institute on Collection Development in June 1977 focus on collection development policy statements, selection tools, budgeting, evaluation, and weeding. (MBR)

  16. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating first-order sensitivity coefficients, which applies sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity-coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity-coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity-coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
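
    The direct method described above can be illustrated on the smallest possible mechanism, a single reaction A → products: the model equation is integrated together with its first-order sensitivity equation. This sketch uses scipy in place of the FORTRAN packages named in the abstract.

        # Direct-method sensitivity analysis for a toy mechanism A -> products:
        # integrate the model equation dA/dt = -k*A together with its sensitivity
        # equation for s = dA/dk, which is ds/dt = J*s + df/dk = -k*s - A.
        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.5  # rate constant (arbitrary units)

        def rhs(t, y):
            A, s = y
            return [-k * A, -k * s - A]

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], rtol=1e-8, atol=1e-10,
                        dense_output=True)
        t = np.linspace(0.0, 10.0, 5)
        A, s = sol.sol(t)
        # Analytic check: A = exp(-k t), so dA/dk = -t A.
        print("matches analytic sensitivity:", np.allclose(s, -t * A, atol=1e-6))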

  17. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages shared code samples from the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  18. Activity-Centred Tool Integration

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius

    2003-01-01

    This paper is concerned with integration of heterogeneous tools for system development. We argue that such tools should support concrete activities (e.g., programming, unit testing, conducting workshops) in contrast to abstract concerns (e.g., analysis, design, implementation). A consequence of this is that tools (or components) that support activities well should be integrated in ad-hoc, dynamic, and heterogeneous ways. We present a peer-to-peer architecture for this based on type-based publish-subscribe and give an example of its use.
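
    A minimal in-process sketch of the type-based publish-subscribe style on which the paper's peer-to-peer architecture rests (not the paper's implementation): subscribers register for an event type, and publication dispatches on the runtime type of the event.

        # Minimal in-process type-based publish/subscribe: subscribers register
        # for an event type; publish dispatches on isinstance checks.
        from dataclasses import dataclass

        class EventBus:
            def __init__(self):
                self._subs = []                        # (event_type, callback) pairs

            def subscribe(self, event_type, callback):
                self._subs.append((event_type, callback))

            def publish(self, event):
                for event_type, callback in self._subs:
                    if isinstance(event, event_type):  # type-based matching
                        callback(event)

        @dataclass
        class UnitTestFinished:        # an event from a concrete activity
            name: str
            passed: bool

        bus = EventBus()
        bus.subscribe(UnitTestFinished,
                      lambda e: print(f"{e.name}: {'ok' if e.passed else 'FAIL'}"))
        bus.publish(UnitTestFinished("test_login", True))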

  19. GLIDER: Free tool imagery data visualization, analysis and mining

    Science.gov (United States)

    Ramachandran, R.; Graves, S. J.; Berendes, T.; Maskey, M.; Chidambaram, C.; Hogan, P.; Gaskin, T.

    2009-12-01

    Satellite imagery can be analyzed to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of such thematic information can vary from military applications such as detecting assets of interest to science applications such as characterizing land-use/land-cover change at local, regional and global scales. However, extracting thematic information using satellite imagery is a non-trivial task. It requires a user to preprocess the data by applying operations for radiometric and geometric corrections. The user also needs to be able to visualize the data and apply different image enhancement operations to digitally improve the images to identify subtle information that might otherwise be missed. Finally, the user needs to apply different information extraction algorithms to the imagery to obtain the thematic information. At present, there are limited tools that provide users with the capability to easily extract and exploit the information contained within satellite imagery. This presentation will present GLIDER, a free software tool addressing this void. GLIDER provides users with an easy-to-use tool to visualize, analyze and mine satellite imagery. GLIDER allows users to visualize and analyze satellite imagery in its native sensor view, an important capability because any transformation to either a geographic coordinate system or a projected coordinate system entails spatial and intensity interpolation, and hence loss of information. GLIDER allows users to perform their analysis in the native sensor view without any loss of information. GLIDER provides users with a full suite of image processing algorithms that can be used to enhance the satellite imagery. It also provides pattern recognition and data mining algorithms for information extraction. GLIDER allows its users to project satellite data and the analysis/mining results onto a globe and overlay additional data layers. Traditional analysis

  20. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting of the research findings. The statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  1. Basic statistical tools in research and data analysis

    Science.gov (United States)

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting of the research findings. The statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
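
    To make the closing point about parametric and non-parametric tests concrete, the sketch below runs both on the same two invented samples with scipy.

        # Parametric vs. non-parametric comparison of two independent samples
        # (invented data): a t-test assumes normality, Mann-Whitney does not.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        group_a = rng.normal(loc=5.0, scale=1.0, size=30)   # e.g., control
        group_b = rng.normal(loc=5.8, scale=1.0, size=30)   # e.g., treatment

        t_stat, t_p = stats.ttest_ind(group_a, group_b)
        u_stat, u_p = stats.mannwhitneyu(group_a, group_b)

        print(f"t-test:       p = {t_p:.4f}")
        print(f"Mann-Whitney: p = {u_p:.4f}")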

  2. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
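
    The enrichment score that tools of this kind build on is typically a hypergeometric tail probability; the sketch below shows that building block with invented counts (it is not GOMA's optimization model).

        # Classic GO-term enrichment p-value: the probability of seeing at least
        # k annotated genes in a hit list of size n, drawn from N genes of which
        # K carry the annotation (hypergeometric tail). Counts are invented.
        from scipy.stats import hypergeom

        N = 20000   # genes in the background
        K = 150     # background genes annotated with the GO term
        n = 300     # genes in the experimental hit list
        k = 12      # hit-list genes carrying the annotation

        p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
        print(f"enrichment p-value: {p_value:.3e}")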

  3. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining a multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions for data mining, but all boil down to the same idea: a process to extract new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  4. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  5. Anaphe—OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O. Couet; B. Ferrero-Merlino; et al.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries, a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component, in combination with the use of shared libraries for their implementation, provides an easy integration of existing libraries into modern scripting languages and thus allows for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. This paper will give an overview of the architecture and design choices and will present the current status and future developments of the project.

  6. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction in the plant, considering constraints on other parameters. The analysis results give a clear idea for deciding various parameter values before implementation of the actual plant in the field. They also give an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
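
    The core energy balance such a simulation optimizes is compact enough to show directly: for an ideal Linde-Hampson cycle the liquid yield is y = (h1 - h2)/(h1 - hf). The enthalpies below are invented placeholders; in practice they come from a property package such as the one inside Aspen HYSYS.

        # Ideal Linde-Hampson liquid yield from an energy balance around the cold
        # box. Enthalpies are invented placeholders, not HYSYS property-package output.
        h1 = 450.0   # kJ/kg, low-pressure return gas leaving at ambient temperature
        h2 = 420.0   # kJ/kg, high-pressure gas entering the cold box
        hf = 30.0    # kJ/kg, saturated liquid drawn from the separator

        y = (h1 - h2) / (h1 - hf)   # fraction of circulated gas liquefied
        print(f"liquid yield: {y:.3f} kg liquid per kg of compressed gas")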

  7. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program 2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and a Hybrid Wing Body (HWB) aircraft, using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel, is presented. The results are in the form of community noise metrics and

  8. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    Science.gov (United States)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  9. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  10. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

    Full Text Available The simulation of a rubber shock absorber for a machine tool was studied. A simple material model of the rubber was obtained through the finite element analysis software ABAQUS. The compression speed and the hardness of the rubber material were considered in order to obtain the deformation law of the rubber shock absorber. The locations of fatigue were identified from the simulation results. The results show that the fatigue positions are distributed at the corners of the shock absorber, that the degree of deformation increases with increasing compression speed, and that the hardness of the rubber material is proportional to the deformation.

  11. 3D-Aided-Analysis Tool for Lunar Rover

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; LI Guo-peng; REN Xin; LIU Jian-jun; GAO Xing-ye; ZOU Xiao-duan

    2013-01-01

    3D-Aided-Analysis Tool (3DAAT), a virtual reality system, is built in this paper. 3DAAT integrates the kinematics and dynamics models of the rover as well as a real lunar surface terrain model. The modeling methods proposed in this paper include constructing the lunar surface, constructing 3D models of the lander and rover, and building the kinematic model of the rover body. Photogrammetry techniques and remote sensing information are used to generate the terrain model of the lunar surface. According to the implementation results, 3DAAT is an effective assistance system for making exploration plans and analyzing the status of the rover.

  12. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
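
    A generic illustration of the bookkeeping such a tool performs, an empirical calendar-plus-cycle capacity-fade model with Arrhenius temperature dependence; the functional form and coefficients here are illustrative textbook choices, not NREL's degradation model.

        # Generic empirical capacity-fade sketch: calendar aging (square root of
        # time, Arrhenius in temperature) plus cycle aging (linear in cycle count).
        # Functional form and coefficients are illustrative, not NREL's model.
        import math

        def capacity_loss(days, cycles, temp_c, a=0.006, b=2.5e-5, ea=50e3):
            r, t_ref = 8.314, 298.15                  # J/(mol K), 25 C reference
            t_k = temp_c + 273.15
            arrhenius = math.exp(-ea / r * (1.0 / t_k - 1.0 / t_ref))
            return a * arrhenius * math.sqrt(days) + b * cycles

        # Five years of service, 250 cycles per year, hot vs. mild climate:
        for temp in (35.0, 20.0):
            loss = capacity_loss(days=5 * 365, cycles=5 * 250, temp_c=temp)
            print(f"{temp:.0f} C: {100 * (1 - loss):.1f}% capacity remaining")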

  13. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force manufacturers to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  14. Creating Fantastic PI Workshops

    Energy Technology Data Exchange (ETDEWEB)

    Biedermann, Laura B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Clark, Blythe G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Colbert, Rachel S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Dagel, Amber Lynn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Gupta, Vipin P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Hibbs, Michael R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Perkins, David Nikolaus [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; West, Roger Derek [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-10-01

    The goal of this SAND report is to provide guidance for other groups hosting workshops and peer-to-peer learning events at Sandia. Thus this SAND report provides detail about our team structure, how we brainstormed workshop topics, and how we developed the workshop structure. A Workshop "Nuts and Bolts" section provides our timeline and checklist for workshop activities. The survey section provides examples of the questions we asked and how we adapted the workshop in response to the feedback.

  15. Unified Airport Pavement Design and Analysis Concepts Workshops Held in Cambridge, Massachusetts on 16-17 July 1991

    Science.gov (United States)

    1992-07-01

    researcher but have not been used extensively for airport pavement analysis. Finally, there are models that have been developed in other fields that can be applied...primarily designed as a tool for the structural engineer, while GEOSYS is geared more toward geotechnical problems. Unfortunately, due to their general...York, (1982) 2. Arthur Q. Tool, "Relation Between Inelastic Deformability and Thermal Expansion of Glass in its Annealing Range," Journal of The

  16. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  17. Research on Tool Management Based on Total Life Cycle for the DNC Workshop

    Institute of Scientific and Technical Information of China (English)

    殷锐; 陈金亮

    2013-01-01

    Tools are indispensable resources in a DNC workshop, and total-life-cycle tool management has received attention from many large manufacturing enterprises. Tool information marking and identification and tool life prediction are the difficult and pivotal parts of tool management. For tool marking and identification, direct-marking technology and automatic identification technology for tools were discussed, and a strip light source is proposed as an auxiliary light source to improve the barcode recognition rate. For tool life prediction, a forecast method based on a BP neural network is proposed; the basic idea of the algorithm is introduced and experimental results are given.
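
    The record's second idea, BP (backpropagation) neural network prediction of tool life, can be sketched with scikit-learn's MLPRegressor; the cutting-parameter data below are synthetic, and this is not the authors' network or data.

        # Tool-life prediction with a small backpropagation network (scikit-learn
        # MLPRegressor). Data are synthetic: life shrinks with cutting speed,
        # feed rate and depth of cut, in a Taylor-equation-like way.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.uniform([80, 0.1, 0.5], [250, 0.4, 3.0], size=(200, 3))
        life = 1e6 / (X[:, 0] ** 2 * X[:, 1] ** 0.8 * X[:, 2] ** 0.4)  # minutes
        life += rng.normal(scale=0.05 * life)                          # noise

        scaler = StandardScaler().fit(X)
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
        model.fit(scaler.transform(X), life)

        new = np.array([[150.0, 0.2, 1.5]])  # speed (m/min), feed (mm/rev), depth (mm)
        print(f"predicted tool life: {model.predict(scaler.transform(new))[0]:.1f} min")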

  18. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis.

    Science.gov (United States)

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can make variant analysis better. Existing tools like HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  19. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    Science.gov (United States)

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  20. SIMS applications workshop. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    The first ANSTO/AINSE SIMS Workshop drew together a mixture of surface analysis experts and surface analysis users with the concept that SIMS analysis has to be enfolded within the spectrum of surface analysis techniques and that the user should select the technique most applicable to the problem. With this concept in mind, the program was structured as sessions on SIMS Facilities; Applications to Mineral Surfaces; Applications to Biological Systems; Applications to Surfaces as Semiconductors, Catalysts and Surface Coatings; and Applications to Ceramics.

  1. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
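
    The breakpoint-detection step can be illustrated with a naive binary segmentation of a log2-ratio profile, recursively splitting where the mean difference between the two sides is largest; this is a simplified relative of the Circular Binary Segmentation algorithm CGHPRO actually uses, run here on simulated data.

        # Naive binary segmentation of a log2-ratio profile: recursively split at
        # the index maximizing the mean difference between the two sides. A
        # simplified relative of Circular Binary Segmentation; data are simulated.
        import numpy as np

        def segment(ratios, start=0, min_size=5, threshold=0.4):
            best_i, best_gap = None, threshold
            for i in range(min_size, len(ratios) - min_size):
                gap = abs(ratios[:i].mean() - ratios[i:].mean())
                if gap > best_gap:
                    best_i, best_gap = i, gap
            if best_i is None:
                return []                              # no further breakpoint
            return (segment(ratios[:best_i], start, min_size, threshold)
                    + [start + best_i]
                    + segment(ratios[best_i:], start + best_i, min_size, threshold))

        # 30 normal clones followed by 20 clones with a single-copy gain
        # (log2(3/2) ~ 0.58 for a diploid genome).
        rng = np.random.default_rng(1)
        profile = np.concatenate([rng.normal(0.0, 0.1, 30), rng.normal(0.58, 0.1, 20)])
        print("breakpoints at clone indices:", segment(profile))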

  2. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command-line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  3. PyRAT (python radiography analysis tool): overview

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Jerawan C [Los Alamos National Laboratory]; Temple, Brian A [Los Alamos National Laboratory]; Buescher, Kevin L [Los Alamos National Laboratory]

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the Nonsmooth Optimization by MADS Algorithm (NOMAD). Some of PyRAT's features are: (1) a hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations (the MVO problem); (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method for the linear inverse problem to recover partial information about object configurations; (5) using a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem (initial data analysis plus a material library yields a priori knowledge); and (6) using the NOMAD (C++ version) software in the object.
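
    One ingredient named above, Tikhonov regularization of a linear inverse problem, fits in a few lines of numpy; this is a generic illustration, not PyRAT code, and the forward operator and data are random stand-ins.

        # Tikhonov-regularized solution of a linear inverse problem Ax = b:
        # minimize ||Ax - b||^2 + lam*||x||^2 via the normal equations
        # (A^T A + lam I) x = A^T b. Generic illustration, not PyRAT code.
        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(100, 40))                 # stand-in forward operator
        x_true = np.zeros(40)
        x_true[10:20] = 1.0                            # unknown object profile
        b = A @ x_true + rng.normal(scale=0.5, size=100)   # noisy measurements

        lam = 1.0                                      # regularization strength
        x_hat = np.linalg.solve(A.T @ A + lam * np.eye(40), A.T @ b)
        rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
        print(f"relative reconstruction error: {rel_err:.3f}")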

  4. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named the Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain-specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
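
    A miniature version of such a triggering rule, flagging a component that emits several error messages within a short window, is sketched below in Python; the real Message Analyzer expresses such rules in its own domain-specific language, and the component names here are invented.

        # Miniature correlation rule: alarm when one component logs `threshold`
        # error messages within `window` seconds. Component names are invented.
        from collections import defaultdict, deque

        class RateRule:
            def __init__(self, threshold=3, window=10.0):
                self.threshold, self.window = threshold, window
                self.errors = defaultdict(deque)   # component -> error timestamps

            def feed(self, timestamp, component, severity):
                if severity != "ERROR":
                    return None
                q = self.errors[component]
                q.append(timestamp)
                while q and timestamp - q[0] > self.window:
                    q.popleft()                    # drop timestamps outside window
                if len(q) >= self.threshold:
                    return f"ALARM: {component} logged {len(q)} errors in {self.window}s"

        rule = RateRule()
        log = [(0.0, "node01", "ERROR"), (2.1, "node01", "ERROR"),
               (3.0, "node02", "INFO"), (4.5, "node01", "ERROR")]
        for entry in log:
            alarm = rule.feed(*entry)
            if alarm:
                print(alarm)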

  5. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  6. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)]

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for energy planners considering their electric power options in the future.

  7. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design stage of the product development process.

  8. Stacks: an analysis tool set for population genomics.

    Science.gov (United States)

    Catchen, Julian; Hohenlohe, Paul A; Bassham, Susan; Amores, Angel; Cresko, William A

    2013-06-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics.
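
    The smoothed sliding window mentioned above is, at heart, a kernel-weighted average of per-SNP statistics along the genome; the sketch below applies one to invented Fst values (it is an illustration of the statistic, not Stacks code).

        # Kernel-smoothed sliding window over per-SNP statistics (invented Fst
        # values), the style of genome scan described above; not Stacks code.
        import numpy as np

        rng = np.random.default_rng(2)
        positions = np.sort(rng.integers(0, 1_000_000, size=500))  # SNP coords (bp)
        fst = rng.beta(1, 20, size=500)                            # per-SNP Fst
        fst[(positions > 400_000) & (positions < 450_000)] += 0.3  # divergent region

        def smoothed(center, bandwidth=50_000):
            w = np.exp(-0.5 * ((positions - center) / bandwidth) ** 2)
            return np.sum(w * fst) / np.sum(w)

        for c in range(0, 1_000_001, 200_000):
            print(f"{c:>9} bp: smoothed Fst = {smoothed(c):.3f}")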

  9. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  10. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis considered individually and, when possible, in statistical combination.

  11. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

    Full Text Available This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each typically consisting of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form, applying the concepts and tools introduced.

  12. CRITICA: coding region identification tool invoking comparative analysis

    Science.gov (United States)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
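
    The dicodon-bias component lends itself to a short sketch. The hexamer frequency tables below are invented toy values; CRITICA derives its tables iteratively from the data rather than from annotations.

```python
import math
from collections import Counter

# Conceptual sketch of dicodon (hexanucleotide) bias scoring in the spirit of
# CRITICA: frames whose in-frame hexamers look more "coding-like" than the
# background score higher.

def hexamer_counts(seq: str, frame: int) -> Counter:
    """Count in-frame hexamers (two consecutive codons) starting at `frame`."""
    return Counter(seq[i:i + 6] for i in range(frame, len(seq) - 5, 3))

def dicodon_score(seq, frame, coding_freq, background_freq):
    """Sum of log-likelihood ratios; positive favours the coding hypothesis."""
    score = 0.0
    for hexamer, n in hexamer_counts(seq, frame).items():
        fc = coding_freq.get(hexamer, 1e-6)      # pseudo-frequency for unseen hexamers
        fb = background_freq.get(hexamer, 1e-6)
        score += n * math.log(fc / fb)
    return score

seq = "ATGAAAGCTGCTAAAGCTGCTTAA"
coding = {"ATGAAA": 0.004, "GCTGCT": 0.006, "AAAGCT": 0.005}      # toy values
background = {"ATGAAA": 0.001, "GCTGCT": 0.001, "AAAGCT": 0.002}  # toy values
print(dicodon_score(seq, 0, coding, background))
```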

  13. Workshop experience

    Directory of Open Access Journals (Sweden)

    Georgina Holt

    2007-04-01

    Full Text Available The setting for the workshop was a heady mix of history, multiculturalism and picturesque riverscapes. Within the group there was, as in many food studies, a preponderance of female scientists (or ethnographers), but the group interacted on lively, non-gendered terms, focusing instead on an appreciation of local food and an enthusiasm for research shared by all, and on points of theoretical variance within that. The food provided by our hosts was of the very highest eating and local food qualities...

  14. Workshop proceedings

    DEFF Research Database (Denmark)

    investigation already, but for many other domains, such as books, news, scientific articles, and Web pages, we do not know if and how these data sources should be combined to provide the best recommendation performance. The CBRecSys 2014 workshop aims to address this by providing a dedicated venue for papers dedicated to all aspects of content-based recommendation. We issued a Call for Papers asking for submissions of novel research papers (both long and short) addressing recommendation in domains where textual content is abundant (e.g., books, news, scientific articles, jobs, educational resources, Web pages...

  15. A policy model to initiate environmental negotiations: Three hydropower workshops

    Science.gov (United States)

    Lamb, Berton Lee; Taylor, Jonathan G.; Burkardt, Nina; Ponds, Phadrea D.

    1998-01-01

    How do I get started in natural resource negotiations? Natural resource managers often face difficult negotiations when they implement laws and policies regulating such resources as water, wildlife, wetlands, endangered species, and recreation. As a result of these negotiations, managers must establish rules, grant permits, or create management plans. The Legal‐Institutional Analysis Model (LIAM) was designed to assist managers in systematically analyzing the parties in natural resource negotiations and using that analysis to prepare for bargaining. The LIAM relies on the theory that organizations consistently employ behavioral roles. The model uses those roles to predict likely negotiation behavior. One practical use of the LIAM is when all parties to a negotiation conduct a workshop as a way to open the bargaining on a note of trust and mutual understanding. The process and results of three LIAM workshops designed to guide hydroelectric power licensing negotiations are presented. Our experience with these workshops led us to conclude that the LIAM can be an effective tool to begin a negotiation and that trust built through the workshops can help create a successful result.

  16. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  17. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2014-02-03

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggested that IPMP 2013 could be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology.
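
    As a flavor of the kind of fit IPMP 2013 automates, the sketch below fits one common three-parameter logistic parameterization of a lag-free growth curve with SciPy; the exact model forms and fitting engine in IPMP may differ, and the data points are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal growth-curve fitting sketch (conceptual, not IPMP's own code).

def logistic(t, ymax, k, tc):
    """Log10 counts: ymax / (1 + exp(-k (t - tc)))."""
    return ymax / (1.0 + np.exp(-k * (t - tc)))

t = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)   # hours
y = np.array([2.1, 3.0, 4.4, 6.0, 7.2, 7.8, 8.0])    # log10 CFU/g

params, cov = curve_fit(logistic, t, y, p0=[8.0, 0.8, 5.0])
ymax, k, tc = params
rmse = np.sqrt(np.mean((logistic(t, *params) - y) ** 2))
print(f"ymax={ymax:.2f} log10 CFU/g, k={k:.2f} 1/h, tc={tc:.2f} h, RMSE={rmse:.3f}")
```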

  18. Recent Workshops

    CERN Multimedia

    Wickens, F. J.

    Since the previous edition of ATLAS e-news, the NIKHEF Institute in Amsterdam has hosted not just one but two workshops related to ATLAS TDAQ activities. The first in October was dedicated to the Detector Control System (DCS). Just three institutes, CERN, NIKHEF and St Petersburg, provide the effort for the central DCS services, but each ATLAS sub-detector provides effort for their own controls. Some 30 people attended, including representatives for all of the ATLAS sub-detectors, representatives of the institutes working on the central services and the project leader of JCOP, which brings together common aspects of detector controls across the LHC experiments. During the three-day workshop the common components were discussed, and each sub-detector described their experiences and plans for their future systems. Whilst many of the components to be used are standard commercial components, a key custom item for ATLAS is the ELMB (Embedded Local Monitor Board). Prototypes for this have now been extensively test...

  19. Cellular barcoding tool for clonal analysis in the hematopoietic system.

    Science.gov (United States)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J C; de Haan, Gerald; Bystrykh, Leonid V

    2010-04-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting or polymerase chain reaction-based methods. Although these methods are useful in principle, they generally provide a low-resolution, biased, and incomplete assessment of clonality. To overcome those limitations, we labeled retroviral vectors with random sequence tags or "barcodes." On integration, each vector introduces a unique, identifiable, and heritable mark into the host cell genome, allowing the clonal progeny of each cell to be tracked over time. By coupling the barcoding method to a sequencing-based detection system, we could identify major and minor clones in 2 distinct cell culture systems in vitro and in a long-term transplantation setting. In addition, we demonstrate how clonal analysis can be complemented with transgene expression and integration site analysis. This cellular barcoding tool permits a simple, sensitive assessment of clonality and holds great promise for future gene therapy protocols in humans, and any other applications when clonal tracking is important.
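
    The counting step of such a sequencing-based readout can be sketched simply; the flanking sequences, barcode length and reads below are invented, not the published vector design.

```python
import re
from collections import Counter

# Hypothetical sketch of clone tracking with barcoded vectors: extract the
# random tag between fixed flanking sequences and tally clone sizes.

FLANK = re.compile(r"ACGT([ACGT]{8})TGCA")  # 8-nt barcode between fixed flanks

def clone_counts(reads):
    counts = Counter()
    for read in reads:
        m = FLANK.search(read)
        if m:
            counts[m.group(1)] += 1
    return counts

reads = [
    "TTACGTAAGGCCTTTGCAGG",   # barcode AAGGCCTT
    "CCACGTAAGGCCTTTGCAAA",   # same clone, second read
    "GGACGTTTTTAAAATGCACC",   # barcode TTTTAAAA
]
counts = clone_counts(reads)
total = sum(counts.values())
for bc, n in counts.most_common():
    print(f"{bc}: {n} reads ({100 * n / total:.0f}% of sample)")
```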

  20. In Silico Analysis of Crop Science: Report on the First China-UK Workshop on Chips, Computers and Crops

    Institute of Scientific and Technical Information of China (English)

    Ming Chen; Andrew Harrison

    2008-01-01

    A workshop on "Chips, Computers and Crops" was held in Hangzhou, China during September 26-27, 2008. The main objective of the workshop was to bring together Chinese and UK scientists from the mathematics, bioinformatics and plant molecular biology communities to exchange ideas, enhance awareness of each other's fields, explore synergisms and make recommendations on fruitful future directions in crop science. Here we describe the contributions to the workshop, and examine some conceptual issues that lie at the foundations and future of crop systems biology.

  1. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry because thin, hard coatings are well known to reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular combine high strength with a low friction coefficient. During the cutting process, a low friction coefficient reduces damage to the cutting tool; in addition, the maximum stress values between coating and substrate decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated, and an attempt is made to determine the damage zones arising during the cutting process. The finite element method is used for the solution of the problem, with the FRANC2D finite element program selected for the numerical solutions.

  2. Resubmission of Gap Analysis Workshop for Training for Reintegration of Surgical Skills

    Science.gov (United States)

    2011-10-01

    based training • Variety of procedure-specific simulators (Red Llama, SimPraxis) • Web-based tools: Red Llama, Discourse Virtual Surgical... ORReady - www.ORReady.org; Red Llama - http://www.redllamainc.com/; SCORE - General Surgery Resident Curriculum Portal - https://portal.surgicalcore.org

  3. SisRadiologia: a new software tool for analysis of radiological accidents and incidents in industrial radiography

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da, E-mail: araujocamila@yahoo.com.br, E-mail: dasilva@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Araujo, Rilton A., E-mail: consultoria@maximindustrial.com.br [Maxim Industrial Assessoria TI, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    According to the International Atomic Energy Agency (IAEA), many efforts have been made by Member States aiming at better control of radioactive sources. Accidents have mostly happened in practices classified by the IAEA as high radiological risk (categories 1 and 2), notably radiotherapy, large irradiators and industrial radiography. Worldwide, more than 40 radiological accidents have been recorded in the industrial radiography area, involving 37 workers, 110 members of the public and 12 fatalities. Records show 5 severe radiological accidents in industrial radiography activities in Brazil, in which 7 workers and 19 members of the public were involved. Such events led to radiodermatitis of the hands and fingers, but to no deaths. The purpose of this study is to present a computational program that allows data acquisition and recording in the company, in such a way as to ease further detailed analysis of radiological events, besides providing the lessons needed to avoid future occurrences. After one year of application of the 'Industrial SisRadiologia' computational program - and mostly based upon the workshop on Analysis and Dose Calculation of Radiological Accidents in Industrial Radiography (Workshop sobre Analise e Calculo de dose de acidentes Radiologicos em Radiografia Industrial - IRD 2012), in which several radiation protection officers took part - it can be concluded that the program is a powerful tool for data acquisition and for recording and surveying accident and incident events in industrial radiography. The program proved to be efficient in preparing reports for the Brazilian regulatory authority, and very useful in training workers to fix the lessons learned from radiological events.

  4. Proceedings of the workshop on applications of synchrotron radiation to trace impurity analysis for advanced silicon processing

    Energy Technology Data Exchange (ETDEWEB)

    Laderman, S [Integrated Circuits Business Div., Hewlett Packard Co., Palo Alto, CA (United States); Pianetta, P [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1993-03-01

    Wafer surface trace impurity analysis is essential for the development of competitive Si circuit technologies. Today's grazing incidence x-ray fluorescence techniques with rotating anodes fall short of requirements for the future. Hewlett Packard/Toshiba experiments indicate that with second generation synchrotron sources such as SSRL, the techniques can be extended sufficiently to meet important needs of the leading edge Si circuit industry through nearly all of the 1990's. This workshop was held to identify people interested in the use of synchrotron radiation-based methods and to document needs and concerns for further development. Viewgraphs are included for the following presentations: microcontamination needs in silicon technology (M. Liehr), analytical methods for wafer surface contamination (A. Schimazaki), trace impurity analysis of liquid drops using synchrotron radiation (D. Wherry), TRXRF using synchrotron sources (S. Laderman), potential role of synchrotron radiation TRXRF in Si process R&D (M. Scott), potential development of synchrotron radiation facilities (S. Brennan), and identification of goals, needs and concerns (M. Garner).

  5. Brief of BES-Belle-CLEO-BaBar 2007 joint workshop on charm physics

    Institute of Scientific and Technical Information of China (English)

    WANG Yi-Fang; ZHANG Chang-Chun

    2008-01-01

    The Institute of High Energy Physics of the Chinese Academy of Sciences organized a workshop to establish closer contacts between experimentalists and theorists involved in the studies of charm physics from both the c and B communities. The workshop covered talks on physics analyses and their results from four electron-positron colliding experiments (BES, Belle, CLEO and BaBar). Presentations at the workshop were organized in the following sessions: (1) Hadron spectroscopy and new resonances; (2) D0-D0bar mixing; (3) Charmonium decays; (4) Charm hadronic and (semi-)leptonic decays; (5) QCD at low energy and τ physics; (6) Partial wave analysis and Dalitz analysis, MC generators and tools; (7) Detector upgrades.

  6. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind: it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
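
    A toy keyword-in-context (KWIC) concordancer conveys the basic query idea; TranslatorBank's actual concordancer, web corpus building and reliability measures are of course much richer than this sketch.

```python
import re

# Minimal KWIC concordance: show each hit of a term with its surrounding context.

def kwic(corpus, term, width=30):
    """Return keyword-in-context lines for every hit of `term`."""
    lines = []
    for m in re.finditer(re.escape(term), corpus, re.IGNORECASE):
        left = corpus[max(0, m.start() - width):m.start()]
        right = corpus[m.end():m.end() + width]
        lines.append(f"{left:>{width}} [{m.group(0)}] {right}")
    return lines

corpus = ("The torque converter transfers torque from the engine. "
          "A worn torque converter may slip under load.")
for line in kwic(corpus, "torque converter"):
    print(line)
```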

  7. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
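
    For intuition only, a stylized capacity-fade model of the calendar-plus-cycling type shows how alternative use strategies can be compared quantitatively; the functional form and coefficients here are invented, and NREL's BLAST degradation model is far more detailed.

```python
import numpy as np

# Illustrative capacity-fade model, NOT NREL's BLAST degradation model:
# calendar fade scaling with sqrt(time) plus cycling fade linear in
# cumulative cycle count. Coefficients are invented placeholders.

def relative_capacity(years, cycles_per_day, a=0.025, b=8e-5):
    """Fraction of original capacity after `years` of combined aging."""
    t = np.asarray(years, dtype=float)
    calendar_fade = a * np.sqrt(t)               # storage-driven loss
    cycling_fade = b * cycles_per_day * 365 * t  # use-driven loss
    return 1.0 - calendar_fade - cycling_fade

years = np.arange(0, 11)
for strategy, cpd in [("gentle (0.5 cyc/day)", 0.5), ("heavy (2 cyc/day)", 2.0)]:
    cap = relative_capacity(years, cpd)
    print(strategy, f"-> {cap[-1]:.1%} capacity after 10 years")
```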

  8. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  9. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3,000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper; here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  10. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms from either an IT or a marketing branch. The paper contributes to highlighting the support that the web analytics and web metrics tools available on the market have to offer to management, based on the growing need to understand and predict global market trends.

  11. GATA: a graphic alignment tool for comparative sequence analysis

    Directory of Open Access Journals (Sweden)

    Nix David A

    2005-01-01

    Full Text Available Abstract Background Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution yet the enhancer retains its activity. Dot plot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments. Results To address some of these issues, we created a stand-alone, platform-independent, graphic alignment tool for comparative sequence analysis (GATA, http://gata.sourceforge.net/). GATA uses the NCBI-BLASTN program and extensive post-processing to identify all small sub-alignments above a low cut-off score. These are graphed as two shaded boxes, one for each sequence, connected by a line using the coordinate system of their parent sequence. Shading and colour are used to indicate score and orientation. A variety of options exist for querying, modifying and retrieving conserved sequence elements. Extensive gene annotation can be added to both sequences using a standardized General Feature Format (GFF) file. Conclusions GATA uses the NCBI-BLASTN program in conjunction with post-processing to exhaustively align two DNA

  12. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  13. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data.

  14. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is offered by the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of UML sequence diagrams and analyses them according to criteria required for diagram perception.

  15. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  16. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the graphical user interface and the script. The examples are illustrated from a scripting perspective, because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.

  17. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  18. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test (FAT); and the meta average, of which the precision-effect test (PET) provides an estimate. The FAT-PET MRA is a meta regression analysis...
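
    A compact sketch of the FAT-PET regression on simulated estimates, using weighted least squares via the usual t-statistic transformation (a standard formulation of the method, not the authors' own code):

```python
import numpy as np

# FAT-PET meta-regression: effect_i = beta0 + beta1 * SE_i, precision-weighted.
# The slope (FAT) tests funnel asymmetry; the intercept (PET) estimates the
# underlying parameter. The "literature" below is simulated with bias 0.8*SE.

rng = np.random.default_rng(1)
se = rng.uniform(0.05, 0.5, size=60)
true_effect = 0.2
estimates = true_effect + 0.8 * se + rng.normal(0, se)

# Dividing the model by SE gives: t_i = beta1 + beta0 * (1/se_i)
X = np.column_stack([np.ones_like(se), 1.0 / se])
coef, *_ = np.linalg.lstsq(X, estimates / se, rcond=None)
fat_slope, pet_intercept = coef
print(f"FAT (asymmetry) slope:  {fat_slope:.3f}")
print(f"PET (corrected) effect: {pet_intercept:.3f}")
```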

  19. Multi-tool design and analysis of an automotive HUD

    Science.gov (United States)

    Irving, Bruce; Hasenauer, David; Mulder, Steve

    2016-10-01

    Design and analysis of an optical system is often a multidisciplinary task, and can involve the use of specialized software packages for imaging, mechanics, and illumination. This paper will present a case study on the design and analysis of a basic heads-up display (HUD) for automotive use. The emphasis will be on the special requirements of a HUD visual system and on the tools and techniques needed to accomplish the design. The first section of this paper will present an overview of the imaging design using commercially available imaging design software. Topics addressed in this section include modeling the windshield, visualizing the imaging performance, using constraints and freeform surfaces to improve the system, and meeting specific visual performance specifications with design/analysis methods. The second section will address the use of a CAD program to design a basic mechanical structure to support and protect the optics. This section will also discuss some of the issues and limitations involved in translating data between a CAD program and a lens design or illumination program. Typical issues that arise include the precision of optical surface prescriptions, surface and material properties, and the management of large data files. In the final section, the combined optical and mechanical package will be considered, using an illumination design program for stray light analysis. The stray light analysis will be directed primarily toward finding, visualizing, and quantifying unexpected ray paths. Techniques for sorting optical ray paths by path length, power, and elements or materials encountered will be discussed, along with methods for estimating the impact of stray light on the optical system performance.

  20. The Astronomy Workshop

    Science.gov (United States)

    Hamilton, Douglas P.

    2013-05-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive online educational tools developed for use by students, educators, professional astronomers, and the general public. The more than 20 tools in the Astronomy Workshop are rated for ease-of-use, and have been extensively tested in large university survey courses as well as more specialized classes for undergraduate majors and graduate students. Here we briefly describe the tools most relevant for the Professional Dynamical Astronomer. Solar Systems Visualizer: The orbital motions of planets, moons, and asteroids in the Solar System as well as many of the planets in exoplanetary systems are animated at their correct relative speeds in accurate to-scale drawings. Zoom in from the chaotic outer satellite systems of the giant planets all the way to their innermost ring systems. Orbital Integrators: Determine the orbital evolution of your initial conditions for a number of different scenarios including motions subject to general central forces, the classic three-body problem, and satellites of planets and exoplanets. Zero velocity curves are calculated and automatically included on relevant plots. Orbital Elements: Convert quickly and easily between state vectors and orbital elements with Changing the Elements. Use other routines to visualize your three-dimensional orbit and to convert between the different commonly used sets of orbital elements including the true, mean, and eccentric anomalies. Solar System Calculators: These tools calculate a user-defined mathematical expression simultaneously for all of the Solar System's planets (Planetary Calculator) or moons (Satellite Calculator). Key physical and orbital data are automatically accessed as needed.
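
    As a flavor of the anomaly conversions such tools perform, here is the standard Newton iteration for Kepler's equation (a generic textbook method, not the Astronomy Workshop's own code):

```python
import math

def eccentric_anomaly(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e sin E for E (radians) by Newton iteration."""
    E = M if e < 0.8 else math.pi          # standard starting guess
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            return E
    raise RuntimeError("Kepler iteration did not converge")

e = 0.3
M = math.radians(35.0)
E = eccentric_anomaly(M, e)
# True anomaly from E: tan(nu/2) = sqrt((1+e)/(1-e)) tan(E/2)
nu = 2 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                    math.sqrt(1 - e) * math.cos(E / 2))
print(f"E = {math.degrees(E):.4f} deg, true anomaly = {math.degrees(nu):.4f} deg")
```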

  1. 21st Century Kinematics : The 2012 NSF Workshop

    CERN Document Server

    2013-01-01

    21st Century Kinematics focuses on algebraic problems in the analysis and synthesis of mechanisms and robots, compliant mechanisms, cable-driven systems and protein kinematics. The specialist contributors provide the background for a series of presentations at the 2012 NSF Workshop. The text shows how the analysis and design of innovative mechanical systems yield increasingly complex systems of polynomials characteristic of those systems. In doing so, it takes advantage of increasingly sophisticated computational tools developed for numerical algebraic geometry and demonstrates the now routine derivation of polynomial systems dwarfing the landmark problems of even the recent past. The 21st Century Kinematics workshop echoes the NSF-supported 1963 Yale Mechanisms Teachers Conference that taught a generation of university educators the fundamental principles of kinematic theory. As such, these proceedings will provide admirable supporting theory for a graduate course in modern kinematics and should be of consid...

  2. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behaviour of 9,528 online gamblers who voluntarily used an RG tool was analysed. The number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that overall the functions of the tool had high initial usage and low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that the classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a greater extent and were found to have a greater risk of excessive gambling than the other classes.

  3. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 90's to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1] but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful, as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. Weights given to spacecraft allow one to minimize the influence of a spacecraft if its location or the quality of its data is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
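
    For the equally weighted four-spacecraft case, the reciprocal-vector gradient estimator can be sketched in a few lines; the weighted generalization described above follows the same pattern.

```python
import numpy as np

# Barycentric sketch for a four-spacecraft cluster: the reciprocal vectors
# k_i of the tetrahedron give a linear gradient estimate grad(f) ~ sum_i k_i f_i.

def reciprocal_vectors(r):
    """r: (4,3) spacecraft positions. Returns the (4,3) reciprocal vectors."""
    k = np.zeros((4, 3))
    for i in range(4):
        j, l, m = [n for n in range(4) if n != i]
        base = np.cross(r[l] - r[j], r[m] - r[j])
        k[i] = base / np.dot(r[i] - r[j], base)
    return k

def gradient(r, f):
    """Linear gradient estimate from four point measurements f."""
    return reciprocal_vectors(r).T @ f

r = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
true_grad = np.array([2.0, -1.0, 0.5])
f = r @ true_grad + 3.0    # linear field sampled at the four spacecraft
print(gradient(r, f))      # recovers [2, -1, 0.5] exactly for a linear field
```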

  4. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost-risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the General Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. TAT-C's current version includes uniform Walker constellations as well as ad-hoc constellations, and its cost model represents an aggregate model consisting of
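
    To illustrate one small piece of this trade space, here is a minimal generator of Walker delta-pattern constellation elements (standard i:t/p/f notation; an illustrative sketch, not TAT-C's implementation):

```python
import math

# Walker delta pattern: t satellites in p planes with phasing parameter f.
# Planes are spread evenly in RAAN; adjacent planes are phased by 360*f/t deg.

def walker_delta(inclination_deg, t, p, f):
    """Yield (inclination, RAAN, true anomaly) in degrees for each satellite."""
    s = t // p                           # satellites per plane
    for plane in range(p):
        raan = 360.0 * plane / p
        for slot in range(s):
            anomaly = (360.0 * slot / s + 360.0 * f * plane / t) % 360.0
            yield (inclination_deg, raan, anomaly)

for sat, elements in enumerate(walker_delta(55.0, t=12, p=3, f=1)):
    print(f"sat {sat:2d}: inc={elements[0]:.1f} RAAN={elements[1]:6.1f} "
          f"anomaly={elements[2]:6.1f}")
```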

  5. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperatures and pressures in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes in any initial geometric out-of-straightness, so the modelling of the as-laid state of the pipeline is an important part of the design process. Recently, some dedicated finite element programs have been created, making modelling of the offshore environment more convenient than has been the case with general purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)

  6. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  7. Tool for Sizing Analysis of the Advanced Life Support System

    Science.gov (United States)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and to process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems whether they are dynamic or steady-state in nature. Using Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
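
    A small sketch of the Equivalent System Mass figure of merit in one common ALS formulation; the equivalency factors below are placeholders, not ALSSAT's mission-specific values.

```python
# Equivalent System Mass (ESM): fold volume, power, cooling and crew time
# into a single mass-like figure of merit for comparing ALS technologies.
# All equivalency factors here are invented placeholders.

def esm(mass_kg, volume_m3, power_kw, cooling_kw, crewtime_h_per_yr, duration_yr,
        v_eq=66.7,    # kg per m^3 of pressurized volume (placeholder)
        p_eq=237.0,   # kg per kW of power (placeholder)
        c_eq=60.0,    # kg per kW of cooling (placeholder)
        ct_eq=0.465): # kg per crew-hour (placeholder)
    """ESM in kg-equivalent."""
    return (mass_kg
            + volume_m3 * v_eq
            + power_kw * p_eq
            + cooling_kw * c_eq
            + crewtime_h_per_yr * duration_yr * ct_eq)

water_recycler = esm(mass_kg=500, volume_m3=2.0, power_kw=1.5, cooling_kw=1.5,
                     crewtime_h_per_yr=26, duration_yr=3)
print(f"ESM = {water_recycler:.0f} kg-equivalent")
```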

  8. Bioanalyzer: An Efficient Tool for Sequence Retrieval, Analysis and Manipulation

    Directory of Open Access Journals (Sweden)

    Hassan Tariq

    2010-12-01

    Full Text Available Bioanalyzer provides combination of tools that are never assembled together. Software has list of tools that can be important for different researchers. The aim to develop this kind of software is to provide unique set of tools at one platform in a more efficient and better way than the software or web tools available. It is stand-alone application so it can save time and effort to locate individual tools on net. Flexible design has made it easy to expand it in future. We will make it available publicly soon.

  9. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.;

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models, the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material...
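
    The crack-growth half of such a life prediction is commonly based on the Paris law; the generic sketch below integrates it numerically with textbook-style constants, not the paper's measured material data.

```python
import math

# Cycles from an initial crack a0 to a critical size ac via the Paris law
# da/dN = C * (dK)^m, with dK = Y * dsigma * sqrt(pi * a). Generic constants.

def cycles_to_failure(a0_m, ac_m, dsigma_mpa, C=1e-11, m=3.0, Y=1.12,
                      steps=100_000):
    """Numerically integrate dN = da / (C * dK^m) from a0 to ac."""
    da = (ac_m - a0_m) / steps
    n = 0.0
    a = a0_m
    for _ in range(steps):
        dk = Y * dsigma_mpa * math.sqrt(math.pi * a)   # MPa sqrt(m)
        n += da / (C * dk ** m)
        a += da
    return n

print(f"{cycles_to_failure(a0_m=0.2e-3, ac_m=2.0e-3, dsigma_mpa=400):.2e} cycles")
```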

  10. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement.

    Science.gov (United States)

    Shatkin, J A; Ong, Kimberly J; Beaudrie, Christian; Clippinger, Amy J; Hendren, Christine Ogilvie; Haber, Lynne T; Hill, Myriam; Holden, Patricia; Kennedy, Alan J; Kim, Baram; MacDonell, Margaret; Powers, Christina M; Sharma, Monita; Sheremeta, Lorraine; Stone, Vicki; Sultan, Yasir; Turley, Audrey; White, Ronald H

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas (human health, ecological health, and exposure considerations) shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article.

  11. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  12. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence ("BI") is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a business intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is
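
    As a concrete illustration of the OLAP-style aggregation mentioned above, the short sketch below rolls a small fact table up along two dimensions. It uses Python with pandas purely for illustration; the table and the column names are hypothetical and are not drawn from the article.

```python
import pandas as pd

# Hypothetical sales fact table; names and numbers are illustrative only.
sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "South"],
    "quarter": ["Q1", "Q2", "Q1", "Q1", "Q2"],
    "revenue": [120_000, 135_000, 80_000, 95_000, 110_000],
})

# OLAP-style roll-up: aggregate by (region, quarter), then by region alone
cube = sales.groupby(["region", "quarter"])["revenue"].sum()
rollup = sales.groupby("region")["revenue"].sum()
print(cube, rollup, sep="\n\n")
```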

  13. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    Science.gov (United States)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally. About 30 peer-reviewed publications have made use of ZMAP. ZMAP code is open source, written in Matlab, a commercial language from The MathWorks that is widely used in the natural sciences. ZMAP was first published in 1994, and has continued to grow over the past 7 years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
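
    One of the quantities ZMAP maps, the Gutenberg-Richter b-value, can be estimated from a catalog with the maximum-likelihood formula of Aki (1965). The sketch below, in Python rather than ZMAP's Matlab, shows the idea on a synthetic catalog; the completeness magnitude and the catalog itself are assumptions for illustration.

```python
import numpy as np

def b_value_mle(mags, mc):
    """Aki (1965) maximum-likelihood b-value for magnitudes above Mc.

    Real catalogs with binned magnitudes also need Utsu's binning
    correction, omitted here for clarity.
    """
    m = np.asarray(mags)
    m = m[m >= mc]                      # keep only events above completeness
    return np.log10(np.e) / (m.mean() - mc)

# Synthetic catalog drawn from a Gutenberg-Richter law with b = 1.0 (assumption):
# magnitudes above Mc are exponential with rate b * ln(10).
rng = np.random.default_rng(0)
mc = 2.0
mags = mc + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=5000)
print(f"estimated b ~ {b_value_mle(mags, mc):.2f}")   # should be close to 1.0
```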

  14. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under a LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
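
    The processing chain described above (demeaning, detrending, tapering, filtering, resampling) can be sketched with the open-source ObsPy library, which is not part of SeismicCanvas but exposes the same standard operations:

```python
# A minimal sketch of a waveform processing chain like the one SeismicCanvas
# offers, using ObsPy (an independent library, not used by SeismicCanvas).
from obspy import read

st = read()                        # ObsPy ships a small example Stream
st.detrend("demean")               # remove the mean
st.detrend("linear")               # remove a linear trend
st.taper(max_percentage=0.05)      # cosine taper at both ends
st.filter("bandpass", freqmin=1.0, freqmax=10.0)  # band-pass filter
st.resample(50.0)                  # resample to 50 Hz
print(st)
```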

  15. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/11 academic year, the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Higher Education Area: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, or Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination the latter mark was obtained. Another group of 77 students was evaluated independently of the former group; these students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by a geometric mean point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or
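
    The georeferencing step described above (assigning projected UTM coordinates to geocoded postal addresses) can be sketched with pyproj; the coordinate reference systems and the sample addresses below are assumptions for illustration:

```python
from pyproj import Transformer

# WGS84 lon/lat to UTM zone 30N. EPSG:25830 (ETRS89 / UTM 30N) is a common
# choice for central Spain; treating it as the study's CRS is an assumption.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:25830", always_xy=True)

addresses = {
    "student_1": (-3.7038, 40.4168),   # hypothetical geocoded lon/lat (Madrid)
    "student_2": (-3.7290, 40.4530),
}
for name, (lon, lat) in addresses.items():
    x, y = transformer.transform(lon, lat)   # project to UTM easting/northing
    print(f"{name}: UTM x={x:.1f} m, y={y:.1f} m")
```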

  16. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
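
    The core measurement in such a distortion analysis can be sketched as a total harmonic distortion (THD) estimate: drive a nonlinearity with a sine wave and compare the harmonic energy with the fundamental. The tanh nonlinearity, drive level and FFT settings below are assumptions for illustration, and the sketch is in Python rather than the tool's Matlab.

```python
import numpy as np

fs, f0, n = 48000, 1000.0, 48000        # sample rate, test tone, record length
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
y = np.tanh(3.0 * x)                    # stand-in distorting device (assumption)

# f0 falls exactly on an FFT bin here, so single-bin amplitudes are comparable
spec = np.abs(np.fft.rfft(y * np.hanning(n)))
fund = spec[int(round(f0 * n / fs))]
harmonics = [spec[int(round(k * f0 * n / fs))] for k in range(2, 10)]
thd = np.sqrt(np.sum(np.square(harmonics))) / fund
print(f"THD ~ {100 * thd:.1f} %")
```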

  17. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  18. Django girls workshop at IdeaSquare

    CERN Document Server

    2016-01-01

    Short video highlights of the Django Girls coding workshop organized at IdeaSquare on Feb 26-27, 2016 by the Rosehipsters non-profit organization, supported by the CERN diversity team and the IT department, attracting 39 women from 15 countries. The aim of the workshop was to introduce participants to the world of computer programming and technology by teaching them how to successfully create a blog application and deploy it to the internet. Most of the 16 volunteer mentors were female. Django Girls is a non-profit organization and a community that empowers and helps women to organize free, one-day programming workshops by providing tools, resources and support.

  19. PREFACE: Collapse Calderas Workshop

    Science.gov (United States)

    Gottsmann, Jo; Aguirre-Diaz, Gerardo

    2008-10-01

    Caldera-formation is one of the most awe-inspiring and powerful displays of nature's force. Resultant deposits may cover vast areas and significantly alter the immediate topography. Post-collapse activity may include resurgence, unrest, intra-caldera volcanism and potentially the start of a new magmatic cycle, perhaps eventually leading to renewed collapse. Since volcanoes and their eruptions are the surface manifestation of magmatic processes, calderas provide key insights into the generation and evolution of large-volume silicic magma bodies in the Earth's crust. Despite their potentially ferocious nature, calderas play a crucial role in modern society's life. Collapse calderas host essential economic deposits and supply power for many via the exploitation of geothermal reservoirs, and thus receive considerable scientific, economic and industrial attention. Calderas also attract millions of visitors world-wide with their spectacular scenic displays. To build on the outcomes of the 2005 calderas workshop in Tenerife (Spain) and to assess the most recent advances on caldera research, a follow-up meeting was proposed to be held in Mexico in 2008. This abstract volume presents contributions to the 2nd Calderas Workshop held at Hotel Misión La Muralla, Querétaro, Mexico, 19-25 October 2008. The title of the workshop `Reconstructing the evolution of collapse calderas: Magma storage, mobilisation and eruption' set the theme for five days of presentations and discussions, both at the venue as well as during visits to the surrounding calderas of Amealco, Amazcala and Huichapan. The multi-disciplinary workshop was attended by more than 40 scientists from North, Central and South America, Europe, Australia and Asia. Contributions covered five thematic topics: geology, geochemistry/petrology, structural analysis/modelling, geophysics, and hazards. The workshop was generously supported by the International Association of Volcanology and Chemistry of the Earth's Interior

  20. Test and Analysis of Environmental Vibration for an Electronic Industrial Workshop

    Institute of Scientific and Technical Information of China (English)

    高广运; 李佳; 张博; 柴俊磊

    2012-01-01

    High-sensitivity accelerometers were employed in the micro-vibration test of a high-tech electronic industrial workshop in Yantai, and the acceleration time history of each measuring point was obtained. The measured data were processed with the internationally used one-third octave band analysis method, and the results were compared and analyzed to obtain the micro-vibration response of the workshop. Reasonable proposals for the micro-vibration engineering design of the workshop are put forward.
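
    One-third octave band analysis reduces a measured acceleration record to an RMS level per standardized frequency band. A minimal sketch of the idea, assuming base-2 band edges and a synthetic signal (real implementations follow the IEC 61260 filter specifications):

```python
import numpy as np
from scipy.signal import butter, sosfilt

def third_octave_rms(accel, fs, centers):
    """RMS acceleration in each one-third octave band (base-2 band edges)."""
    out = {}
    for fc in centers:
        lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)     # band edges
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        out[fc] = np.sqrt(np.mean(sosfilt(sos, accel) ** 2))
    return out

fs = 1024                                   # sampling rate, Hz (assumption)
t = np.arange(0, 10, 1 / fs)
# Synthetic record: a 12.5 Hz tone plus broadband noise (assumption)
accel = 1e-3 * np.sin(2 * np.pi * 12.5 * t) + 1e-4 * np.random.randn(t.size)
print(third_octave_rms(accel, fs, centers=[8, 10, 12.5, 16, 20]))
```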

  1. The Future Workshop: Democratic problem solving

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui

    2005-01-01

    The origins, principles and practice of a very popular method known as the Future Workshop are presented. The fundamental theory and principles of this method are presented in an introductory way. In addition, practical guidelines for carrying out such a workshop are outlined and several types of applications are briefly described. The crucial importance of both the facilitation process and the use of creative tools in team work is emphasized.

  2. 18th International Workshop on Advanced Computing and Analysis Techniques in Physics Research

    CERN Document Server

    2017-01-01

    The 18th edition of ACAT will bring together experts to explore and confront the boundaries of computing, automated data analysis, and theoretical calculation technologies, in particle and nuclear physics, astronomy and astrophysics, cosmology, accelerator science and beyond. ACAT provides a unique forum where these disciplines overlap with computer science, allowing for the exchange of ideas and the discussion of cutting-edge computing, data analysis and theoretical calculation technologies in fundamental physics research.

  3. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  4. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are really scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie Research Project, a Community-Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina di Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods; in particular, a large event in 1987 affected a large portion of the valley and left 58 dead. The actual emergency plan for the study area is composed of a real-time, highly detailed, decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  5. Analysis of Facial Injuries Caused by Power Tools.

    Science.gov (United States)

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of the use of power tools and its economic impact, and the characteristics of the hand injuries caused by power saws have been described. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data in relation to facial injuries caused by power saws that were gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of the facial injuries caused by power saws. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurements and functional disabilities. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  6. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets… It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools…

  7. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.
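
    The kind of lumped-parameter transient model such blocksets assemble can be illustrated in a few lines; the capacitance, conductance and load profile below are assumptions for illustration, and the actual tools use the Simulink blocks described above rather than Python:

```python
# Lumped thermal mass with a heat load and a heat exchanger to a sink:
#   m*c * dT/dt = Q_load(t) - UA * (T - T_sink)
m_c = 5.0e4       # thermal capacitance, J/K (assumption)
ua = 250.0        # heat-exchanger conductance, W/K (assumption)
t_sink = 288.0    # sink temperature, K (assumption)
dt, t_end = 1.0, 3600.0

temps, temp = [], 300.0
for step in range(int(t_end / dt)):
    q_load = 10e3 if step * dt < 1800 else 2e3   # step change in heat load, W
    temp += dt * (q_load - ua * (temp - t_sink)) / m_c   # explicit Euler step
    temps.append(temp)
print(f"peak temperature: {max(temps):.1f} K")
```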

  8. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Background: Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended, or is supposed to be performed with other command-line tools such as gnuplot. Results: Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single-sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions: Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  9. GammaWorkshops Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Ramebaeck, H. (ed.) (Swedish Defence Research Agency (Sweden)); Straalberg, E. (Institute for Energy Technology, Kjeller (Norway)); Klemola, S. (Radiation and Nuclear Safety Authority, STUK (Finland)); Nielsen, Sven P. (Technical Univ. of Denmark. Risoe National Lab. for Sustainable Energy, Roskilde (Denmark)); Palsson, S.E. (Icelandic Radiation Safety Authority (Iceland))

    2012-01-15

    Due to a sparse interaction during the last years between practioners in gamma ray spectrometry in the Nordic countries, a NKS activity was started in 2009. This GammaSem was focused on seminars relevant to gamma spectrometry. A follow up seminar was held in 2010. As an outcome of these activities it was suggested that the 2011 meeting should be focused on practical issues, e.g. different corrections needed in gamma spectrometric measurements. This three day's meeting, GammaWorkshops, was held in September at Risoe-DTU. Experts on different topics relevant for gamma spectrometric measurements were invited to the GammaWorkshops. The topics included efficiency transfer, true coincidence summing corrections, self-attenuation corrections, measurement of natural radionuclides (natural decay series), combined measurement uncertainty calculations, and detection limits. These topics covered both lectures and practical sessions. The practical sessions included demonstrations of tools for e.g. corrections and calculations of the above meantioned topics. (Author)

  10. Clinical decision support tools: analysis of online drug information databases

    Directory of Open Access Journals (Sweden)

    Seamon Matthew J

    2007-03-01

    Background: Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information by systematically comparing the most commonly used online drug information databases. Methods: Five commercially available and two freely available online drug information databases were evaluated according to scope (presence or absence of an answer), completeness (the comprehensiveness of the answers), and ease of use. Additionally, a composite score integrating all three criteria was utilized. Fifteen weighted categories comprising 158 questions were used to conduct the analysis. Descriptive statistics and chi-square tests were used to summarize the evaluation components and make comparisons between databases. Scheffe's multiple comparison procedure was used to determine statistically different scope and completeness scores. The composite score was subjected to sensitivity analysis to investigate the effect of the choice of percentages for scope and completeness. Results: The rankings for the databases from highest to lowest, based on composite scores, were Clinical Pharmacology, Micromedex, Lexi-Comp Online, Facts & Comparisons 4.0, Epocrates Online Premium, RxList.com, and Epocrates Online Free. Differences in scope produced three statistical groupings, with Group 1 (best performers) being Clinical Pharmacology, Micromedex, Facts & Comparisons 4.0, and Lexi-Comp Online; Group 2: Epocrates Premium and RxList.com; and Group 3: Epocrates Free. Conclusions: Online drug information databases, which belong to clinical decision support, vary in their ability to answer questions across a range of categories.
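
    The weighted composite score and its sensitivity analysis can be sketched as follows; the databases, scores and weights below are hypothetical and are not the study's data:

```python
# Hypothetical per-database scores on a 0-100 scale (illustrative only)
scores = {
    "Database A": {"scope": 90, "completeness": 80, "ease": 70},
    "Database B": {"scope": 75, "completeness": 85, "ease": 90},
}

def composite(s, w_scope, w_complete, w_ease):
    """Weighted composite of the three evaluation criteria."""
    return w_scope * s["scope"] + w_complete * s["completeness"] + w_ease * s["ease"]

# Sensitivity analysis: sweep the scope weight, keeping ease fixed at 0.1
for w in (0.4, 0.5, 0.6):
    w2 = 0.9 - w   # remaining weight goes to completeness
    ranked = sorted(scores, key=lambda k: composite(scores[k], w, w2, 0.1),
                    reverse=True)
    print(f"w_scope={w:.1f}: {ranked}")
```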

  11. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    to the collision scenario where a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large-scale FEA, and fairly good agreement is achieved. The applicability, limitation and future enhancement of the present tool are discussed in detail.

  12. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning; Mullins, Michael

    2006-01-01

    The digital design tools used by architects and engineers today are very useful with respect to their specific fields of aesthetical or technical evaluation. It is not yet possible to fully use the potential of the computer in the design process, as there is no well-functioning interplay between the two types of tools. The paper therefore looks at integration of the two types in a prototype for a tool which allows aesthetics evaluation, and at the same time gives the architect instant technical feedback on ideas already in the initial sketching phase. The aim of the research is to look at possible approaches for working with digital tectonics by means of acoustics: the architects, the architect-engineer or hybrid practitioner, and finally a prototype for a possible digital tectonic tool. For the third approach, in the case study, a prototype digital tectonic tool is tested on the design…

  13. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  14. PIXE and μ-PIXE analysis of glazes from terracotta sculptures of the della Robbia workshop

    Energy Technology Data Exchange (ETDEWEB)

    Zucchiatti, Alessandro E-mail: zucc@ge.infn.it; Bouquillon, Anne; Lanterna, Giancarlo; Franco, Lucarelli; Mando, Pier Andrea; Prati, Paolo; Salomon, Joseph; Vaccari, Maria Grazia

    2002-04-01

    A series of PIXE analyses has been performed on glazes from terracotta sculptures of the Italian Renaissance and on reference standards. The problems related to the investigation of such heterogeneous materials are discussed and the experimental uncertainties are evaluated, for each element, from the PIXE analysis of standard glasses. Some examples from artefacts coming from Italian collections are given. This research has been conducted in the framework of the COST-G1 European action.

  15. PIXE and μ-PIXE analysis of glazes from terracotta sculptures of the della Robbia workshop

    Science.gov (United States)

    Zucchiatti, Alessandro; Bouquillon, Anne; Giancarlo Lanterna; Lucarelli, Franco; Mandò, Pier Andrea; Prati, Paolo; Salomon, Joseph; Vaccari, Maria Grazia

    2002-04-01

    A series of PIXE analyses has been performed on glazes from terracotta sculptures of the Italian Renaissance and on reference standards. The problems related to the investigation of such heterogeneous materials are discussed and the experimental uncertainties are evaluated, for each element, from the PIXE analysis of standard glasses. Some examples from artefacts coming from Italian collections are given. This research has been conducted in the framework of the COST-G1 European action.

  16. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    Science.gov (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  17. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  18. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.
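
    The core of such a business-case calculation is discounted cash flow. A minimal sketch, with all financial inputs assumed for illustration (H2FAST itself computes far richer annual financial statements):

```python
# Generic discounted-cash-flow sketch of the sort of calculation a
# financial-analysis tool like H2FAST performs; all numbers are assumptions.
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

capex = -2.0e6                      # station capital cost in year 0 (assumption)
annual_net = 3.0e5                  # yearly revenue minus O&M (assumption)
flows = [capex] + [annual_net] * 15 # 15-year operating life (assumption)
print(f"NPV at 8% discount rate: ${npv(0.08, flows):,.0f}")
```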

  19. IPHE Infrastructure Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    2010-02-01

    These proceedings contain information from the IPHE Infrastructure Workshop, a two-day interactive workshop held on February 25-26, 2010, to explore the market implementation needs for hydrogen fueling station development.

  20. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks, such as refactoring or code navigation, have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language… can provide for tools that are less powerful in theory, but more practical for use under real-world conditions. We also point out some opportunities for future work in both areas, motivated by our successes and difficulties with the two techniques.

  1. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
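
    The underlying idea can be sketched as a percentile-based ramping requirement derived from net-load forecast errors; the error statistics and confidence level below are assumptions for illustration, and the actual tool uses CAISO forecast data and more elaborate uncertainty models:

```python
import numpy as np

# Percentile-based ramping requirement from simulated net-load forecast errors
rng = np.random.default_rng(1)
horizon = 8760                                     # hourly intervals, one year
net_load_error = rng.normal(0.0, 150.0, horizon)   # forecast error, MW (assumption)
ramp = np.diff(net_load_error)                     # hour-to-hour change

conf = 95                                          # confidence level, percent
up_req = np.percentile(ramp, conf)                 # extra upward ramping needed
down_req = np.percentile(ramp, 100 - conf)         # extra downward ramping needed
print(f"ramp-up requirement:   {up_req:6.1f} MW/h")
print(f"ramp-down requirement: {down_req:6.1f} MW/h")
```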

  2. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
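
    The conventional interpretation that AnalyzeHOLE improves upon can be sketched in a few lines: hydraulic conductivity is taken proportional to the flow change across each depth interval and scaled to an independent transmissivity estimate. The depths, flows and transmissivity below are assumptions for illustration.

```python
import numpy as np

depths = np.array([0, 10, 20, 30, 40, 50])     # interval boundaries, m (assumption)
q_up = np.array([100, 90, 60, 55, 20, 0.0])    # upward flow at each boundary, m3/d
t_total = 120.0                                # transmissivity from pump test, m2/d

inflow = q_up[:-1] - q_up[1:]                  # inflow to each interval
thickness = np.diff(depths)
# K proportional to flow change, scaled so sum(K_i * b_i) equals T
k = (inflow / inflow.sum()) * t_total / thickness
for top, bot, ki in zip(depths[:-1], depths[1:], k):
    print(f"{top:2.0f}-{bot:2.0f} m: K = {ki:6.2f} m/d")
```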

  3. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    Science.gov (United States)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  4. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    Science.gov (United States)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  5. A Geographic and Functional Network Flow Analysis Tool

    Science.gov (United States)

    2014-06-01

    [The retrieved record text consists only of table-of-contents fragments: chapter headings on information systems tools and network models, and a case study of a fiber-optic communications backbone in the fictional country of Dystopia, with figure captions such as "A simple fiber-optic backbone network for Dystopia."]

  6. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  7. Non-contact measurement and analysis of machine tool spindles

    OpenAIRE

    Clough, David A; Fletcher, Simon; Longstaff, Andrew P.

    2010-01-01

    Increasing demand on the manufacturing industry to produce tighter tolerance parts means it is necessary to gain a greater understanding of machine tool capabilities and error sources. A significant source of machine tool errors is down to spindle inaccuracies and performance, leading to part scrapping. Catastrophic spindle failure brings production to a standstill until a new spindle can be procured and installed, resulting in lost production time. This project aims to assess the effec...

  8. The Danish Scenario Workshop Report

    DEFF Research Database (Denmark)

    Brodersen, Søsser; Jørgensen, Michael Søgaard

    with informal drinks) and planned and carried out as recommended in Ahumada (2003). We have, however, not developed all the material recommended by Ahumada (2003) as informative material prior to the workshop (e.g. a SWOT analysis), due to a wish to produce only material for the participants which we found useful…

  9. MOOC Design Workshop

    DEFF Research Database (Denmark)

    Nørgård, Rikke Toft; Mor, Yishay; Warburton, Steven

    2016-01-01

    For the last two years we have been running a series of successful MOOC design workshops. These workshops build on previous work in learning design and MOOC design patterns. The aim of these workshops is to aid practitioners in defining and conceptualising educational innovations (predominantly...

  10. ICP-MS Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Carman, April J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Eiden, Gregory C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-11-01

    This is a short document that explains the materials that will be transmitted to LLNL and DNN HQ regarding the ICP-MS Workshop held at PNNL on June 17-19. The goal of the information is to pass on to LLNL the planning and preparations for the workshop at PNNL, in preparation for the SIMS workshop at LLNL.

  11. Risk Management Techniques and Practice Workshop Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, T; Zosel, M

    2008-12-02

    At the request of the Department of Energy (DOE) Office of Science (SC), Lawrence Livermore National Laboratory (LLNL) hosted a two-day Risk Management Techniques and Practice (RMTAP) workshop held September 18-19 at the Hotel Nikko in San Francisco. The purpose of the workshop, which was sponsored by the SC/Advanced Scientific Computing Research (ASCR) program and the National Nuclear Security Administration (NNSA)/Advanced Simulation and Computing (ASC) program, was to assess current and emerging techniques, practices, and lessons learned for effectively identifying, understanding, managing, and mitigating the risks associated with acquiring leading-edge computing systems at high-performance computing centers (HPCCs). Representatives from fifteen high-performance computing (HPC) organizations, four HPC vendor partners, and three government agencies attended the workshop. The overall workshop findings were: (1) Standard risk management techniques and tools are in the aggregate applicable to projects at HPCCs and are commonly employed by the HPC community; (2) HPC projects have characteristics that necessitate a tailoring of the standard risk management practices; (3) All HPCC acquisition projects can benefit by employing risk management, but the specific choice of risk management processes and tools is less important to the success of the project; (4) The special relationship between the HPCCs and HPC vendors must be reflected in the risk management strategy; (5) Best practices findings include developing a prioritized risk register with special attention to the top risks, establishing a practice of regular meetings and status updates with the platform partner, supporting regular and open reviews that engage the interests and expertise of a wide range of staff and stakeholders, and documenting and sharing the acquisition/build/deployment experience; and (6) Top risk categories include system scaling issues, request for proposal/contract and acceptance testing, and

  12. The ATLAS Electromagnetic Calorimeter Calibration Workshop

    CERN Multimedia

    Hong Ma; Isabelle Wingerter

    The ATLAS Electromagnetic Calorimeter Calibration Workshop took place at LAPP-Annecy from the 1st to the 3rd of October; 45 people attended the workshop. A detailed program was set up before the workshop. The agenda was organised around very focused presentations where questions were raised to allow arguments to be exchanged and answers to be proposed. The main topics were: electronics calibration, handling of problematic channels, cluster-level corrections for electrons and photons, absolute energy scale, streams for calibration samples, calibration constants processing, and learning from commissioning. The workshop was on the whole lively and fruitful. Based on years of experience with test beam analysis and Monte Carlo simulation, and the recent operation of the detector in the commissioning, the methods to calibrate the electromagnetic calorimeter are well known. Some of the procedures are being exercised in the commissioning, which have demonstrated the c

  13. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During machining, the motion of the spindle unit drives the most frequently used and most important parts, such as the automatic tool changer. The vibration detection system includes the development of hardware and software, such as the vibration meter, signal acquisition card, data processing platform, and machine control program. Meanwhile, given the differences between mechanical configurations and the desired characteristics, it is difficult to assemble a vibration detection system directly from commercially available kits. For this reason, it was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine's characteristics and states. In parallel, we launched the development of the functional parts of the system. Finally, we entered the conditions and parameters generated from both the states and the characteristics into the developed system to verify its feasibility.
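
    A minimal sketch of the kind of feature such a monitoring function computes: a sliding-window RMS of the spindle vibration signal compared against an alarm threshold. The sampling rate, synthetic signal and limit below are assumptions for illustration.

```python
import numpy as np

def rms_alarm(signal, fs, window_s=0.5, limit=2.0):
    """Sliding-window RMS of a vibration signal with a simple alarm threshold."""
    n = int(window_s * fs)
    windows = signal[: signal.size // n * n].reshape(-1, n)
    rms = np.sqrt((windows ** 2).mean(axis=1))
    return rms, rms > limit

fs = 10_000                                 # accelerometer sample rate (assumption)
t = np.arange(0, 5, 1 / fs)
# Healthy 120 Hz rotation signature, with a fault injected after t = 4 s
vib = np.sin(2 * np.pi * 120 * t) + np.where(t > 4, 4.0, 0.0) * np.random.randn(t.size)
rms, alarms = rms_alarm(vib, fs)
print(f"windows over limit: {alarms.sum()} of {alarms.size}")
```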

  14. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  15. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
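
    The bootstrap regression and y-intercept check described above can be sketched as follows; the applied fields, synthetic ratios and prescribed intercept range are assumptions for illustration (the theoretical intercept of -1 applies to the fraction-corrected ratio, treated here as an assumption):

```python
import numpy as np

# Bootstrap linear regression of multispecimen ratios Q against the lab field,
# with a check that the y-intercept falls in a prescribed range.
rng = np.random.default_rng(2)
h_lab = np.array([10, 20, 30, 40, 50, 60.0])       # applied fields, uT (assumption)
q = -1.0 + h_lab / 40.0 + rng.normal(0, 0.03, 6)   # synthetic ratios, true field 40 uT

boots = []
for _ in range(2000):
    idx = rng.integers(0, h_lab.size, h_lab.size)  # resample specimens
    if np.unique(h_lab[idx]).size < 2:
        continue                                   # skip degenerate resamples
    slope, intercept = np.polyfit(h_lab[idx], q[idx], 1)
    boots.append((-intercept / slope, intercept))  # paleofield where Q = 0

fields, intercepts = np.array(boots).T
lo_b, hi_b = np.percentile(fields, [2.5, 97.5])
print(f"paleointensity ~ {fields.mean():.1f} uT (95% CI {lo_b:.1f}-{hi_b:.1f})")
print(f"mean intercept {intercepts.mean():.2f} (prescribed range: about -1)")
```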

  16. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data, ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at its West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed
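
    The aperture photometry at the heart of such exercises can be sketched with the open-source photutils package (not one of the four tools the students used); the synthetic image and source positions below are assumptions for illustration:

```python
import numpy as np
from photutils.aperture import CircularAperture, aperture_photometry

# Synthetic image with two "stars" (2-D Gaussians) on a noisy background
rng = np.random.default_rng(3)
yy, xx = np.mgrid[0:100, 0:100]
img = rng.normal(10.0, 1.0, (100, 100))            # sky level 10, noise sigma 1
for (x0, y0, amp) in [(30, 40, 500.0), (70, 60, 200.0)]:
    img += amp * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * 2.0 ** 2))

apertures = CircularAperture([(30, 40), (70, 60)], r=6.0)
table = aperture_photometry(img, apertures)        # sum counts in each aperture
flux = table["aperture_sum"] - 10.0 * apertures.area   # subtract mean background
print(flux)                                        # brighter star gives larger flux
```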

  17. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural mechanics computing codes serve as tools for assessing leak and break probabilities of nuclear piping components. Probabilistic fracture mechanics tools were compared in different benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanics codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue in LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated, in order to identify the reasons for differences in the results of different codes. The comparison of the tools is used for further improvements of the codes applied by the partners.
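
    The basic Monte Carlo calculation such codes perform can be sketched as follows: sample the uncertain inputs, evaluate a fracture criterion, and count failures. The distributions, stress and geometry factor below are assumptions for illustration, not values from the benchmark.

```python
import numpy as np

# Monte Carlo failure probability: fail when the stress intensity factor
# K = Y * sigma * sqrt(pi * a) exceeds the fracture toughness K_Ic.
rng = np.random.default_rng(4)
n = 1_000_000
a = rng.lognormal(mean=np.log(5e-3), sigma=0.5, size=n)   # crack depth, m
k_ic = rng.normal(60.0, 6.0, size=n)                      # toughness, MPa*sqrt(m)
sigma, y = 250.0, 1.12                                    # stress, MPa; geometry factor

k = y * sigma * np.sqrt(np.pi * a)
p_fail = np.mean(k > k_ic)
print(f"estimated failure probability: {p_fail:.2e}")
```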

  18. An Evaluation of Visual and Textual Network Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Goodall, John R [ORNL

    2011-01-01

    User testing is an integral component of user-centered design, but has only rarely been applied to visualization for cyber security applications. This article presents the results of a comparative evaluation between a visualization-based application and a more traditional, table-based application for analyzing computer network packet captures. We conducted this evaluation as part of the user-centered design process. Participants performed both structured, well-defined tasks and exploratory, open-ended tasks with both tools. We measured accuracy and efficiency for the well-defined tasks, counted insights for the exploratory tasks, and recorded user perceptions of each tool. The results demonstrated that, with the visualization tool, users performed the well-defined tasks significantly more accurately, discovered a higher number of insights, and expressed a clear preference. The study design presented may be useful for future researchers performing user testing on visualization for cyber security applications.

  19. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    Science.gov (United States)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
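
    A hedged sketch of the post-processing step described above: convert the shear forces carried by the adhesive "springs" into shear stresses via each spring's tributary area, then sort in descending order. Node IDs, forces and areas are hypothetical; the real tools read them from the solver's input and output files.

```python
import numpy as np

node_ids = np.array([101, 102, 103, 104, 105])
shear_force = np.array([420.0, 310.0, 560.0, 150.0, 480.0])  # N
trib_area = np.array([25.0, 25.0, 12.5, 25.0, 12.5])         # mm^2

shear_stress = shear_force / trib_area                       # MPa (N/mm^2)
order = np.argsort(shear_stress)[::-1]                       # descending
for nid, tau in zip(node_ids[order], shear_stress[order]):
    print(f"node {nid}: {tau:.1f} MPa")
```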

  20. Geothermal systems materials: a workshop/symposium

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Sixteen papers are included. A separate abstract was prepared for each. Summaries of workshops on the following topics are also included in the report: non-metallic materials, corrosion, materials selection, fluid chemistry, and failure analysis. (MHR)

  1. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting, are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
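
    For reference, here is one common form of the plane one-dimensional conservation laws that such an analysis integrates, written in the Lagrangian coordinate h with initial density rho_0, specific volume v, particle velocity u, stress sigma and specific internal energy E. This is a hedged reconstruction of the standard system, not necessarily the exact sign convention used in the work.

```latex
% Plane one-dimensional conservation laws in Lagrangian form
% (one common convention; signs differ between authors):
\begin{align*}
  \frac{\partial v}{\partial t} &= \frac{1}{\rho_0}\,\frac{\partial u}{\partial h}
      && \text{(mass)}\\
  \frac{\partial u}{\partial t} &= -\frac{1}{\rho_0}\,\frac{\partial \sigma}{\partial h}
      && \text{(momentum, compressive stress positive)}\\
  \frac{\partial E}{\partial t} &= -\sigma\,\frac{\partial v}{\partial t}
      && \text{(energy)}
\end{align*}
% Given gauge records sigma(h_i, t) or u(h_i, t), the spatial derivatives are
% estimated across gauge positions and the system is integrated in time to
% recover the remaining kinematical and mechanical quantities.
```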

  2. Writing and Learning in the Business Classroom: The Workshop Approach

    Science.gov (United States)

    Fernsten, Linda; Fernsten, Jeffrey

    2008-01-01

    A writing workshop is a pedagogical tool that can create a more productive experience for teachers and students alike. Business students who have used this technique with experienced instructors agree that a well-planned writing workshop can be useful for dispelling writing fears, furthering understanding of business communication skills,…

  3. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.

    2002-01-01

    Three fractured tool dies used for industrial cold forging have been investigated using light optical microscopy and scanning electron microscopy. Two of the specimens were produced using the traditional Böhler P/M steel grade s790, while the last specimen was a third generation P/M steel produced... using new technology developed by Böhler. All three steels have the same nominal composition of alloying elements. The failure in both types of material occurs as a crack formation at a notch inside of the tool. Generally the cold forging dies constructed in third generation steels have a longer lifetime...

  4. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper.

  5. Risk Assessment and Life Cycle Assessment, Environmental Strategies, Nordic Workshop, Vedbæk 1999

    DEFF Research Database (Denmark)

    Poll, Christian

    At a Nordic workshop on Product-oriented Environmental Strategies, the roles of risk and hazard assessment and life cycle assessment of products in the future regulation of chemicals were discussed by participants representing administration, academia and industry from the Nordic countries... This report compiles the papers and presentations given at the workshop. The papers present and discuss the different assessment tools and procedures - for individual chemicals through hazard and risk assessments, and for products, materials and services through life-cycle assessment. The report also contains... Analyses, Input/output analysis, Environmental Audits and Performance evaluations and Cost Accounting, they constitute the toolbox of analysis and management tools required for a full product-related strategy towards a sustainable development.

  6. "Boden macht Schule" - a soil awareness workshop for Austrian pupils

    Science.gov (United States)

    Foldal, Cecile B.; Aust, Günter; Baumgarten, Andreas; Berthold, Helene; Birli, Barbara; Englisch, Michael; Ferstl, Elsa; Leregger, Florian; Schwarz, Sigrid; Tulipan, Monika

    2014-05-01

    In order to raise awareness and understanding for the importance of soil, we developed a workshop for schoolchildren between the age of nine and thirteen. The workshop focuses on soil formation, soil functions and soil organisms. Guided by young soil scientists, the children can actively explore different soil properties. Key elements are studies and identification of soil animals, small physical experiments and several games followed up with creative tasks. Our aim is to make the workshop an attractive tool for environmental education in public schools and thereby to increase the interest in soil and soil protection. This poster gives a short overview of the contents of the workshop "Boden macht Schule".

  7. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    Science.gov (United States)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
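
    As a worked miniature of the FMEA scoring step described above: each failure mode is rated for severity (S), occurrence (O) and detectability (D), typically on 1-10 scales, and the risk priority number RPN = S x O x D is used to rank the modes. The radiotherapy-flavoured failure modes and ratings below are invented for illustration.

```python
failure_modes = [
    ("wrong patient selected",        9, 2, 3),   # (name, S, O, D)
    ("incorrect beam energy",         8, 3, 4),
    ("couch position mis-set",        6, 4, 2),
    ("plan exported to wrong record", 7, 2, 6),
]

# Rank by RPN, highest risk first.
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:4d}  (S={s}, O={o}, D={d})  {name}")
```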

  8. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams could predict chatter by providing graphical representations of the stable combinations of the axial depth of the cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the bearing preload influence on the system stability with uncertainty taken into account.
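
    A hedged sketch of the Monte Carlo idea, using the classic single-degree-of-freedom chatter limit a_lim = -1 / (2 Kf Re[G(iw)]) rather than the paper's full 5-DOF spindle model; an uncertain modal stiffness stands in for the bearing-preload effect, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
w = 2 * np.pi * np.linspace(200, 2000, 4000)   # frequency grid (rad/s)
Kf, zeta, m = 8e8, 0.03, 1.0                   # cutting coeff (N/m^2), damping ratio, modal mass (kg)

a_min = []
for k in rng.normal(4e7, 4e6, size=2000):      # uncertain modal stiffness (N/m)
    G = 1.0 / (k - m * w**2 + 2j * zeta * np.sqrt(k * m) * w)  # receptance FRF
    reG = G.real
    a_lim = -1.0 / (2.0 * Kf * reG[reG < 0])   # stability limit where Re(G) < 0
    a_min.append(a_lim.min())                  # most restrictive depth of cut

a_min = np.array(a_min)
print(f"min stable depth of cut: mean {a_min.mean()*1e3:.2f} mm, "
      f"5th percentile {np.percentile(a_min, 5)*1e3:.2f} mm")
```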

  9. Proteomic tools for the analysis of transient interactions between metalloproteins.

    Science.gov (United States)

    Martínez-Fábregas, Jonathan; Rubio, Silvia; Díaz-Quintana, Antonio; Díaz-Moreno, Irene; De la Rosa, Miguel Á

    2011-05-01

    Metalloproteins play major roles in cell metabolism and signalling pathways. In many cases, they show moonlighting behaviour, acting in different processes, depending on the physiological state of the cell. To understand these multitasking proteins, we need to discover the partners with which they carry out such novel functions. Although many technological and methodological tools have recently been reported for the detection of protein interactions, specific approaches to studying the interactions involving metalloproteins are not yet well developed. The task is even more challenging for metalloproteins, because they often form short-lived complexes that are difficult to detect. In this review, we gather the different proteomic techniques and biointeractomic tools reported in the literature. All of them have shown their applicability to the study of transient and weak protein-protein interactions, and are therefore suitable for metalloprotein interactions.

  10. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of ''what-if'' options in building design--a limited search for an optimal solution, or ''optimization''. Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  11. Stakeholder Analysis of an Executable Achitecture Systems Engineering (EASE) Tool

    Science.gov (United States)

    2013-06-21

    regression representations of more complex M&S tools. C2WindTunnel is a software test bed developed by George Mason University for command and... (Roth, Karen; Barrett, Shelby. 2009 (July). Command and Control Wind Tunnel Integration..., Technology Project, U.S. Marine Corps Systems Command.)

  12. Design tools for daylighting illumination and energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, S.

    1982-07-01

    This report reviews the problems and potentials for using daylighting to provide illumination in building interiors. It describes some of the design tools now or soon to be available for incorporating daylighting into the building design process. It also describes state-of-the-art methods for analyzing the impacts daylighting can have on the selection of lighting controls, lighting energy consumption, heating and cooling loads, and peak power demand.

  13. Clinical decision support tools: analysis of online drug information databases

    OpenAIRE

    Seamon Matthew J; Polen Hyla H; Marsh Wallace A; Clauson Kevin A; Ortiz Blanca I

    2007-01-01

    Abstract Background Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information by systematically comparing the most commonly used online drug information datab...

  14. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    Directory of Open Access Journals (Sweden)

    Drechsel Marion

    2009-10-01

    Full Text Available Abstract Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge and basic statistical analysis requirements. Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
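
    A hedged Python re-sketch of the kind of basic analysis such an add-in offers (the real tool is VBA inside Excel): convert textual genotype calls into allele counts, then compute an allele frequency and a simple Hardy-Weinberg chi-square. The genotypes below are invented.

```python
from collections import Counter

genotypes = ["AA", "AG", "GG", "AG", "AA", "AG", "AA", "GG", "AG", "AA"]

counts = Counter(genotypes)                        # {'AA': 4, 'AG': 4, 'GG': 2}
n = len(genotypes)
p = (2 * counts["AA"] + counts["AG"]) / (2 * n)    # frequency of allele A
q = 1 - p

# Hardy-Weinberg expected genotype counts and chi-square statistic (1 df).
expected = {"AA": p * p * n, "AG": 2 * p * q * n, "GG": q * q * n}
chi2 = sum((counts[g] - e) ** 2 / e for g, e in expected.items())
print(f"freq(A) = {p:.2f}, HWE chi-square = {chi2:.3f}")
```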

  15. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  16. DFTCalc: reliability centered maintenance via fault tree analysis (tool paper)

    NARCIS (Netherlands)

    Guck, Dennis; Spel, Jip; Stoelinga, Mariëlle; Butler, Michael; Conchon, Sylvain; Zaïdi, Fatiha

    2015-01-01

    Reliability, availability, maintenance and safety (RAMS) analysis is essential in the evaluation of safety critical systems like nuclear power plants and the railway infrastructure. A widely used methodology within RAMS analysis are fault trees, representing failure propagations throughout a system.
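
    As a minimal illustration of the basic static fault-tree calculation such tools automate: combine independent basic-event probabilities through AND/OR gates up to the top event. The tiny tree below is invented; DFTCalc itself handles dynamic gates and maintenance strategies well beyond this sketch.

```python
def and_gate(*p):            # all inputs must fail
    out = 1.0
    for pi in p:
        out *= pi
    return out

def or_gate(*p):             # at least one input fails (independent events)
    out = 1.0
    for pi in p:
        out *= (1.0 - pi)
    return 1.0 - out

pump_a, pump_b, valve, power = 1e-3, 1e-3, 5e-4, 1e-5
cooling_lost = and_gate(pump_a, pump_b)      # redundant pumps must both fail
top = or_gate(cooling_lost, valve, power)    # any branch fails the system
print(f"top event probability: {top:.3e}")
```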

  17. On evaluation analysis of main structure of industrial workshops

    Institute of Scientific and Technical Information of China (English)

    吴果梅

    2014-01-01

    Taking the equipment stock of the Physical Supply Company of Xishan Coal and Electricity Power as an example, and based on the structural form of the building, the paper uses the structural calculation and analysis software MIDAS Gen and PKPM to analyze the grid structure and the supporting columns of the industrial workshop, and draws some valuable conclusions of importance for the safe use of the workshop in its later service period.

  18. Propositional Analysis: A Tool for Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce

    1989-01-01

    Reviews the use of propositional analysis in library and information science research. Evidence that different analysts produce similar judgments about texts and use the method consistently over time is presented, and it is concluded that propositional analysis is a reliable and valid research method. An example of an analysis is appended. (32…

  19. 7th International Workshop on Statistical Simulation

    CERN Document Server

    Mignani, Stefania; Monari, Paola; Salmaso, Luigi

    2014-01-01

    The Department of Statistical Sciences of the University of Bologna in collaboration with the Department of Management and Engineering of the University of Padova, the Department of Statistical Modelling of Saint Petersburg State University, and INFORMS Simulation Society sponsored the Seventh Workshop on Simulation. This international conference was devoted to statistical techniques in stochastic simulation, data collection, analysis of scientific experiments, and studies representing broad areas of interest. The previous workshops took place in St. Petersburg, Russia in 1994, 1996, 1998, 2001, 2005, and 2009. The Seventh Workshop took place in the Rimini Campus of the University of Bologna, which is in Rimini’s historical center.

  20. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement: Advancing Risk Analysis for Nanoscale Materials

    Energy Technology Data Exchange (ETDEWEB)

    Shatkin, J. A. [Vireo Advisors, Boston MA USA; Ong, Kimberly J. [Vireo Advisors, Boston MA USA; Beaudrie, Christian [Compass RM, Vancouver CA USA; Clippinger, Amy J. [PETA International Science Consortium Ltd, London UK; Hendren, Christine Ogilvie [Center for the Environmental Implications of NanoTechnology, Duke University, Durham NC USA; Haber, Lynne T. [TERA, Cincinnati OH USA; Hill, Myriam [Health Canada, Ottawa Canada; Holden, Patricia [UC Santa Barbara, Bren School of Environmental Science & Management, ERI, and UC CEIN, University of California, Santa Barbara CA USA; Kennedy, Alan J. [U.S. Army Engineer Research and Development Center, Environmental Laboratory, Vicksburg MS USA; Kim, Baram [Independent, Somerville MA USA; MacDonell, Margaret [Argonne National Laboratory, Environmental Science Division, Argonne IL USA; Powers, Christina M. [U.S. Environmental Protection Agency, Office of Air and Radiation, Office of Transportation and Air Quality, Ann Arbor MI USA; Sharma, Monita [PETA International Science Consortium Ltd, London UK; Sheremeta, Lorraine [Alberta Ingenuity Labs, Edmonton Alberta Canada; Stone, Vicki [John Muir Building Gait 1 Heriot-Watt University, Edinburgh Scotland UK; Sultan, Yasir [Environment Canada, Gatineau QC Canada; Turley, Audrey [ICF International, Durham NC USA; White, Ronald H. [RH White Consultants, Silver Spring MD USA

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas (human health, ecological health, and exposure considerations) shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article.

  1. Proceedings of the Thirteenth Annual Software Engineering Workshop

    Science.gov (United States)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  2. Summary and abstracts of the Planetary Data Workshop, June 2012

    Science.gov (United States)

    Gaddis, Lisa R.; Hare, Trent; Beyer, Ross

    2014-01-01

    The recent boom in the volume of digital data returned by international planetary science missions continues to both delight and confound users of those data. In just the past decade, the Planetary Data System (PDS), NASA’s official archive of scientific results from U.S. planetary missions, has seen a nearly 50-fold increase in the amount of data and now serves nearly half a petabyte. In only a handful of years, this volume is expected to approach 1 petabyte (1,000 terabytes or 1 quadrillion bytes). Although data providers, archivists, users, and developers have done a creditable job of providing search functions, download capabilities, and analysis and visualization tools, the new wealth of data necessitates more frequent and extensive discussion among users and developers about their current capabilities and their needs for improved and new tools. A workshop to address these and other topics, “Planetary Data: A Workshop for Users and Planetary Software Developers,” was held June 25–29, 2012, at Northern Arizona University (NAU) in Flagstaff, Arizona. A goal of the workshop was to present a summary of currently available tools, along with hands-on training and how-to guides, for acquiring, processing and working with a variety of digital planetary data. The meeting emphasized presentations by data users and mission providers during days 1 and 2, and developers had the floor on days 4 and 5 using an “unconference” format for day 5. Day 3 featured keynote talks by Laurence Soderblom (U.S. Geological Survey, USGS) and Dan Crichton (Jet Propulsion Laboratory, JPL) followed by a panel discussion, and then research and technical discussions about tools and capabilities under recent or current development. Software and tool demonstrations were held in break-out sessions in parallel with the oral session. Nearly 150 data users and developers from across the globe attended, and 22 National Aeronautics and Space Administration (NASA) and non-NASA data providers

  3. 2nd Ralf Yorque Workshop

    CERN Document Server

    1985-01-01

    These are the proceedings of the Second R. Yorque Workshop on Resource Management which took place in Ashland, Oregon on July 23-25, 1984. The purpose of the workshop is to provide an informal atmosphere for the discussion of resource assessment and management problems. Each participant presented a one hour morning talk; afternoons were reserved for informal chatting. The workshop was successful in stimulating ideas and interaction. The papers by R. Deriso, R. Hilborn and C. Walters all address the same basic issue, so they are lumped together. Other than that, the order of the papers in this volume was determined in the same fashion as the order of speakers during the workshop -- by random draw. Marc Mangel, Department of Mathematics, University of California, Davis, California, June 1985. TABLE OF CONTENTS: A General Theory for Fishery Modeling, Jon Schnute; Data Transformations in Regression Analysis with Applications to Stock-Recruitment Relationships, David Ruppert and Raymond J. Carroll...

  4. Generalized Aliasing as a Basis for Program Analysis Tools

    Science.gov (United States)

    2000-11-01

    applications are described in the next chapter, in Section 9.2.2.) For example, the Ladybug specification checker tool [44] has a user interface shell... any particular implementation of the interface. At run time, Ladybug uses reflection to load the engine class by name and create an object of that... supplied with Sun's JDK 1.1.7; Jess, the Java Expert System Shell version 4.4, from Sandia National Labs [35]; Ladybug, the Ladybug specification checker, by Craig...

  5. Le Nouveau Manuel de Formation sur l'Elaboration et la Gestion des Projets. (The New Project Design and Management Workshop Training Manual).

    Science.gov (United States)

    Peace Corps, Washington, DC. Information Collection and Exchange Div.

    A French-language version of a training manual that presents guidelines for planning and conducting a project design and management (PDM) workshop to teach Peace Corps volunteers to involve local community members in the process of using participatory analysis tools and planning and implementing projects meeting local desires and needs. The first…

  6. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  7. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for complex fluid/structure interaction phenomena is increasing as proven numerical and visualization...

  8. Proceedings Second Annual Cyber Security and Information Infrastructure Research Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Sheldon, Frederick T [ORNL; Krings, Axel [ORNL; Yoo, Seong-Moo [ORNL; Mili, Ali [ORNL; Trien, Joseph P [ORNL

    2006-01-01

    The workshop theme is Cyber Security: Beyond the Maginot Line. Recently the FBI reported that computer crime has skyrocketed, costing over $67 billion in 2005 alone and affecting 2.8M+ businesses and organizations. Attack sophistication is unprecedented, along with the concomitant availability of open source tools. Private, academic, and public sectors invest significant resources in cyber security. Industry primarily performs cyber security research as an investment in future products and services. While the public sector also funds cyber security R&D, the majority of this activity focuses on the specific mission(s) of the funding agency. Thus, broad areas of cyber security remain neglected or underdeveloped. Consequently, this workshop endeavors to explore issues involving cyber security and related technologies toward strengthening such areas and enabling the development of new tools and methods for securing our information infrastructure's critical assets. We aim to assemble new ideas and proposals about robust models on which we can build the architecture of a secure cyberspace, including but not limited to: * Knowledge discovery and management * Critical infrastructure protection * De-obfuscating tools for the validation and verification of tamper-proofed software * Computer network defense technologies * Scalable information assurance strategies * Assessment-driven design for trust * Security metrics and testing methodologies * Validation of security and survivability properties * Threat assessment and risk analysis * Early accurate detection of the insider threat * Security-hardened sensor networks and ubiquitous computing environments * Mobile software authentication protocols * A new "model" of the threat to replace the "Maginot Line" model, and more...

  9. The GRIP method for collaborative roadmapping workshops

    DEFF Research Database (Denmark)

    Piirainen, Kalle

    2015-01-01

    Technology roadmapping is a well-known tool for technology management, but practical advice for facilitating collaborative roadmapping workshops is relatively scarce. To cater for this need, we have designed a method for collaborative roadmapping, dubbed the GRIP method, for facilitating group work...

  10. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.
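
    A hedged sketch of the core matching idea (not MultiAlign's actual algorithm): group features across runs when they agree within a parts-per-million mass tolerance and a normalized elution time (NET) tolerance. The feature lists and tolerances below are invented.

```python
run_a = [(500.2680, 0.31), (622.3140, 0.55), (899.4710, 0.78)]  # (mass, NET)
run_b = [(500.2676, 0.32), (622.3201, 0.56), (899.4712, 0.90)]

PPM_TOL, NET_TOL = 10.0, 0.03

def matches(f1, f2):
    ppm = abs(f1[0] - f2[0]) / f1[0] * 1e6   # relative mass error in ppm
    return ppm <= PPM_TOL and abs(f1[1] - f2[1]) <= NET_TOL

for fa in run_a:
    hits = [fb for fb in run_b if matches(fa, fb)]
    print(fa, "->", hits if hits else "no match")
```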

  11. Code Analysis and Refactoring with Clang Tools, Version 0.1

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  12. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external...
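
    As a numeric illustration of the bounds network calculus yields: for a token-bucket arrival curve alpha(t) = b + r t served by a rate-latency curve beta(t) = R (t - T)+ with r <= R, the classic results give backlog <= b + r T and delay <= T + b / R. The sketch below uses invented numbers; CyNC computes such bounds for whole interconnected systems.

```python
b, r = 4000.0, 1e5    # burst size (bits), sustained arrival rate (bit/s)
R, T = 2e5, 0.002     # service rate (bit/s), service latency (s)

assert r <= R, "stability requires the service rate to cover the arrival rate"
backlog_bound = b + r * T    # max vertical gap between arrival and service curves
delay_bound = T + b / R      # max horizontal gap between the curves
print(f"backlog <= {backlog_bound:.0f} bits, delay <= {delay_bound*1e3:.2f} ms")
```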

  13. Ecotoxicological mechanisms and models in an impact analysis tool for oil spills

    NARCIS (Netherlands)

    Laender, de F.; Olsen, G.H.; Frost, T.; Grosvik, B.E.; Klok, T.C.

    2011-01-01

    In an international collaborative effort, an impact analysis tool is being developed to predict the effect of accidental oil spills on recruitment and production of Atlantic cod (Gadus morhua) in the Barents Sea. The tool consisted of three coupled ecological models that describe (1) plankton biomas

  14. 2014 MICCAI Workshop

    CERN Document Server

    Nedjati-Gilani, Gemma; Rathi, Yogesh; Reisert, Marco; Schneider, Torben

    2014-01-01

    This book contains papers presented at the 2014 MICCAI Workshop on Computational Diffusion MRI, CDMRI’14. Detailing new computational methods applied to diffusion magnetic resonance imaging data, it offers readers a snapshot of the current state of the art and covers a wide range of topics from fundamental theoretical work on mathematical modeling to the development and evaluation of robust algorithms and applications in neuroscientific studies and clinical practice.   Inside, readers will find information on brain network analysis, mathematical modeling for clinical applications, tissue microstructure imaging, super-resolution methods, signal reconstruction, visualization, and more. Contributions include both careful mathematical derivations and a large number of rich full-color visualizations.   Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into the clinic. This volume will offer a valuable starting point for anyone interested i...

  15. Open Source Software Tools for Anomaly Detection Analysis

    Science.gov (United States)

    2014-04-01

    Open source tools surveyed include the Environment for Developing KDD-Applications Supported by Index-Structures (ELKI), RapidMiner, the SHOGUN toolbox, the Waikato Environment for Knowledge Analysis (Weka), and Scikit-Learn. [Figure 2: RapidMiner output results]

  16. Pathway-based analysis tools for complex diseases: a review.

    Science.gov (United States)

    Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

    2014-10-01

    Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods: the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.
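
    As a minimal sketch of the most common pathway-based method, over-representation analysis: test whether a pathway's member genes appear in an experimental hit list more often than chance, using the hypergeometric distribution. The counts below are invented; this assumes SciPy is available.

```python
from scipy.stats import hypergeom

N = 20000   # genes in the background
K = 150     # genes annotated to the pathway
n = 300     # genes in the experimental hit list
k = 12      # pathway genes found among the hits

# P(X >= k) under random sampling without replacement
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p_value:.3e}")
```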

  17. Cellular barcoding tool for clonal analysis in the hematopoietic system

    NARCIS (Netherlands)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J.; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J. C.; de Haan, Gerald; Bystrykh, Leonid V.

    2010-01-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blott

  18. Pathway-based Analysis Tools for Complex Diseases: A Review

    Directory of Open Access Journals (Sweden)

    Lv Jin

    2014-10-01

    Full Text Available Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods—the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.

  19. Teaching the Tools of Pharmaceutical Care Decision-Analysis.

    Science.gov (United States)

    Rittenhouse, Brian E.

    1994-01-01

    A method of decision-analysis in pharmaceutical care that integrates epidemiology and economics is presented, including an example illustrating both the deceptive nature of medical decision making and the power of decision analysis. Principles in determining both general and specific probabilities of interest and use of decision trees for…
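
    A worked miniature of the decision-tree idea described above: fold back a two-option treatment tree by computing expected utilities from outcome probabilities. All probabilities and utilities below are invented for illustration.

```python
options = {
    "drug A": [(0.85, 0.95), (0.15, 0.40)],   # (probability, utility) pairs
    "drug B": [(0.70, 1.00), (0.30, 0.60)],
}

for name, branches in options.items():
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    eu = sum(p * u for p, u in branches)      # fold back the chance node
    print(f"{name}: expected utility = {eu:.3f}")

# drug A: 0.85*0.95 + 0.15*0.40 = 0.868; drug B: 0.70*1.00 + 0.30*0.60 = 0.880,
# so the tree favours drug B despite drug A's higher success probability,
# illustrating the "deceptive nature" of unaided medical decision making.
```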

  20. Miscue Analysis: A Transformative Tool for Researchers, Teachers, and Readers

    Science.gov (United States)

    Goodman, Yetta M.

    2015-01-01

    When a reader produces a response to a written text (the observed response) that is not expected by the listener, the result is called a miscue. Using psychosociolinguistic analyses of miscues in the context of an authentic text, miscue analysis provides evidence to discover how readers read. I present miscue analysis history and development and…

  1. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs in requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. Finally, the study highlights the importance of security requirements, which, although part of the non-functional requirements, are fundamental to secure software development.

  2. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays, the internet plays the role of a huge database in which many websites, information sources and search engines are available. Because the data in webpages is unstructured or semi-structured, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because it consists of many discovered patterns and contains a lot of noise and uncertainty. In this paper, an analysis of web usage mining has been made with the help of web log data, using the web log analyzer tool "Deep Log Analyzer" to extract information from a particular server, to discover user behavior, and to develop an ontology that captures the relations among the elements of web usage mining.
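
    A hedged sketch of the first step such log analyzers perform: parse Common Log Format lines and count requests per client to profile user behavior. The two log lines below are invented.

```python
import re
from collections import Counter

LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

lines = [
    '10.0.0.5 - - [01/Mar/2013:10:12:01 +0000] "GET /index.html HTTP/1.1" 200 5120',
    '10.0.0.9 - - [01/Mar/2013:10:12:07 +0000] "GET /about.html HTTP/1.1" 404 312',
]

hits = Counter()
for line in lines:
    m = LOG_RE.match(line)
    if m:
        host, ts, method, path, status, size = m.groups()
        hits[(host, path)] += 1             # requests per (client, page)

for (host, path), n in hits.most_common():
    print(f"{host} requested {path} {n} time(s)")
```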

  3. National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion

    Science.gov (United States)

    Follen, G.; Naiman, C.; Evans, A.

    1999-01-01

    Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

  4. Integration of management control tools. Analysis of a case study

    Directory of Open Access Journals (Sweden)

    Raúl Comas Rodríguez

    2015-09-01

    Full Text Available The objective of this article is to design and implement a procedure that integrates process-focused management control tools in order to improve efficiency and efficacy. An experimental study was carried out in which a procedure based on the Balanced Scorecard is defined, integrating process management into strategic planning and its evaluation. As results of this work, we define the key success factors associated with the four perspectives of the Balanced Scorecard, linked through cause-effect relations to obtain the strategic map that allows the enterprise strategy to be visualized and communicated. The indicators evaluate the key success factors, integrating the processes with the assistance of software. The implementation of the procedure in a commercialization enterprise helped integrate the process definitions into strategic planning. The alignment was evaluated, and the efficiency and efficacy indicators improved the company's performance.

  5. Limits, limits everywhere the tools of mathematical analysis

    CERN Document Server

    Applebaum, David

    2012-01-01

    A quantity can be made smaller and smaller without it ever vanishing. This fact has profound consequences for science, technology, and even the way we think about numbers. In this book, we will explore this idea by moving at an easy pace through an account of elementary real analysis and, in particular, will focus on numbers, sequences, and series.Almost all textbooks on introductory analysis assume some background in calculus. This book doesn't and, instead, the emphasis is on the application of analysis to number theory. The book is split into two parts. Part 1 follows a standard university

  6. Design and Analysis Tools for Deployable Solar Array Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  7. Integrative genomic analysis by interoperation of bioinformatics tools in GenomeSpace

    Science.gov (United States)

    Thorvaldsdottir, Helga; Liefeld, Ted; Ocana, Marco; Borges-Rivera, Diego; Pochet, Nathalie; Robinson, James T.; Demchak, Barry; Hull, Tim; Ben-Artzi, Gil; Blankenberg, Daniel; Barber, Galt P.; Lee, Brian T.; Kuhn, Robert M.; Nekrutenko, Anton; Segal, Eran; Ideker, Trey; Reich, Michael; Regev, Aviv; Chang, Howard Y.; Mesirov, Jill P.

    2015-01-01

    Integrative analysis of multiple data types to address complex biomedical questions requires the use of multiple software tools in concert and remains an enormous challenge for most of the biomedical research community. Here we introduce GenomeSpace (http://www.genomespace.org), a cloud-based, cooperative community resource. Seeded as a collaboration of six of the most popular genomics analysis tools, GenomeSpace now supports the streamlined interaction of 20 bioinformatics tools and data resources. To facilitate the ability of non-programming users to leverage GenomeSpace in integrative analysis, it offers a growing set of ‘recipes’, short workflows involving a few tools and steps to guide investigators through high utility analysis tasks. PMID:26780094

  8. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report

    Science.gov (United States)

    2013-01-01

    The Capability Portfolio Analysis Tool (CPAT) uses a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight to... used to measure the returns of scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME...

  9. Mutual Workshops enhancing Curriculum Integration

    DEFF Research Database (Denmark)

    Bjerregaard Jensen, Lotte; Markvorsen, Steen; Almegaard, Henrik

    2011-01-01

    to the joint project. The theme of the third semester is ‘structural design’. Structural design is defined as an integration of material science, statics and geometry in relation to an architectural project. Anticipating the implementation of CDIO and this theme, major changes were made to the curriculum...... course. This realized the full potential of structural design and firmly highlighted the creative potential in geometry for hesitant students. The joint workshop also showed potential as a general tool that can enhance curriculum integration....

  10. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang;

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...... (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  11. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  12. SSC workshop on environmental radiation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-01-09

    The Superconducting Super Collider is a 20 TeV-on-20 TeV proton beam collider where two 20-TeV proton accelerators whose beams, rotating in opposite senses, are brought into collision to provide 40 TeV in the center of mass. The scale of the project is set by the 6.6 tesla magnet guide field for the protons which results in a roughly circular machine with a circumference of 83 km (51.5 mi.). The energy scale of the proton beams and the physical scale of the machine are an order of magnitude greater than for any presently operating or contemplated proton accelerator yet the facility must be operated within the same strict radiological guidelines as existing accelerators in the US and Europe. To ensure that the facility conforms to existing and projected guidelines both in design and operation, the Workshop was charged to review the experience and practices of existing accelerator laboratories, to determine the relevant present and projected regulatory requirements, to review particle production and shielding data from accelerators and cosmic rays, to study the design and operational specifications of the Collider, to examine the parameters set forth in the Siting Parameters Document, and to evaluate the computational tools available to model the radiation patterns arising under various operational and failure scenarios. This report summarizes the extensive and intensive presentations and discussions of the Workshop. A great deal of material, much of it in the form of internal reports from the various laboratories and drafts of works in preparation, was provided by the participants for the various topics. This material, including the viewgraphs used by the presenters, forms the background and basis for the conclusions of the Workshop and, as such, is an important part of the Workshop. An introduction to the material and a catalog by topic are presented as section 6 of this report.

  13. High-Performance Integrated Virtual Environment (HIVE Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  14. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper is about an instrumental research regarding a powerful multivariate data analysis method which can be used by researchers in order to obtain valuable information for decision makers that need to solve the marketing problem a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and the ability of Principal Component Analysis (PCA) to reduce a number of variables that may be correlated with each other to a smaller number of uncorrelated principal components. In this respect, the paper presents step-by-step the process of applying PCA in marketing research when we use a large number of variables that are naturally collinear.
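
    A hedged numpy sketch of the procedure: standardize the observed variables, eigendecompose their correlation matrix, and keep the leading principal components. The survey-like data are simulated so that several variables are deliberately collinear.

```python
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))                 # two true underlying factors
X = np.column_stack([
    latent[:, 0] + 0.1 * rng.normal(size=200),     # three noisy copies of factor 1
    latent[:, 0] + 0.1 * rng.normal(size=200),
    latent[:, 0] + 0.1 * rng.normal(size=200),
    latent[:, 1] + 0.1 * rng.normal(size=200),     # two noisy copies of factor 2
    latent[:, 1] + 0.1 * rng.normal(size=200),
])

Z = (X - X.mean(axis=0)) / X.std(axis=0)           # standardize the variables
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z.T))  # correlation-matrix PCA
order = np.argsort(eigval)[::-1]                   # components, largest first
explained = eigval[order] / eigval.sum()
print("variance explained per component:", np.round(explained, 3))
scores = Z @ eigvec[:, order[:2]]                  # uncorrelated component scores
```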

  15. Droplet microfluidics--a tool for single-cell analysis.

    Science.gov (United States)

    Joensson, Haakan N; Andersson Svahn, Helene

    2012-12-03

    Droplet microfluidics allows the isolation of single cells and reagents in monodisperse picoliter liquid capsules and manipulations at a throughput of thousands of droplets per second. These qualities allow many of the challenges in single-cell analysis to be overcome. Monodispersity enables quantitative control of solute concentrations, while encapsulation in droplets provides an isolated compartment for the single cell and its immediate environment. The high throughput allows the processing and analysis of the tens of thousands to millions of cells that must be analyzed to accurately describe a heterogeneous cell population so as to find rare cell types or access sufficient biological space to find hits in a directed evolution experiment. The low volumes of the droplets make very large screens economically viable. This Review gives an overview of the current state of single-cell analysis involving droplet microfluidics and offers examples where droplet microfluidics can further biological understanding.

  16. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible set of tools with a multi-user interface.

  17. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    Directory of Open Access Journals (Sweden)

    Maike Kathrin Aurich

    2016-08-01

    Full Text Available Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
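
    MetaboTools itself is a Matlab toolbox; as a language-neutral sketch of the constraint-based modeling it builds on, the toy flux-balance problem below (network, bounds and numbers all invented) is solved as a linear program in Python:

        import numpy as np
        from scipy.optimize import linprog

        # Metabolites A, B; reactions R1 (-> A), R2 (A -> B), R3 (B ->).
        S = np.array([[1, -1, 0],
                      [0, 1, -1]])
        bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake R1 capped at 10 units
        res = linprog(c=[0, 0, -1],                # maximize v3 (linprog minimizes)
                      A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print(res.x)                               # all fluxes pinned at 10 by uptake

    The steady-state constraint S·v = 0 plus flux bounds is the core structure that measured extracellular exchange rates constrain in the real workflow.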

  18. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models.

    Science.gov (United States)

    Aurich, Maike K; Fleming, Ronan M T; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  19. Simpler methods do it better: Success of Recurrence Quantification Analysis as a general purpose data analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Charles L., E-mail: cwebber@lumc.edu [Department of Cell and Molecular Physiology, Loyola University Medical Center, Maywood, IL (United States); Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research (PIK), 14412 Potsdam (Germany); Facchini, Angelo, E-mail: a.facchini@unisi.it [Center for the Study of Complex Systems and Department of Information Engineering, University of Siena, 53100 Siena (Italy); Giuliani, Alessandro, E-mail: alessandro.giuliani@iss.it [Environment and Health Department, Istituto Superiore di Sanita, Roma (Italy)

    2009-10-05

    Over the last decade, Recurrence Quantification Analysis (RQA) has become a new standard tool in the toolbox of nonlinear methodologies. In this Letter we trace the history and utility of this powerful tool and cite some common applications. RQA continues to wend its way into numerous and diverse fields of study.
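
    A toy illustration (signal and threshold are arbitrary choices) of the object underlying RQA, the recurrence matrix, and its simplest quantifier:

        import numpy as np

        t = np.linspace(0, 8 * np.pi, 400)
        x = np.sin(t)                              # a trivially recurrent signal
        eps = 0.1                                  # arbitrary closeness threshold
        # R[i, j] = 1 when states i and j are closer than eps; full RQA would
        # first embed the series in a reconstructed phase space.
        R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
        print(f"recurrence rate: {R.mean():.3f}")  # the simplest RQA measure

    Measures such as determinism and laminarity are then computed from the diagonal and vertical line structures of R.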

  20. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools......, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one...... procedures has been developed to retrieve the data/information stored in the knowledge base....

  1. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
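
    As a hedged illustration of the kind of benchmark mentioned above, the analytic plane Poiseuille profile can be evaluated directly (pressure gradient, viscosity and gap are placeholder values):

        import numpy as np

        G, mu, h = 100.0, 1e-3, 0.01      # Pa/m, Pa*s, m -- all placeholders
        y = np.linspace(0, h, 11)
        u = G / (2 * mu) * y * (h - y)    # analytic plane Poiseuille profile
        print(f"max velocity {u.max():.3f} m/s at mid-gap")   # equals G*h**2/(8*mu)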

  2. VLEEM-2 technical workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    In order to overcome the limits of energy-environment models, and at the same time to benefit from their past experience, the VLEEM project proposes a genuine approach and innovative modelling tools to assess energy-environment systems over the very long term, which are based on the strengths of existing long term energy models but focussed on two major objectives: to describe normative futures which fit with a set of overall constraints, like the stabilisation of the concentration of greenhouse gases in the atmosphere, or the stabilisation of the overall inventory of plutonium and minor actinides, etc.; and to describe and formalize the association of causalities necessary to bring the system from the present situation to the targeted future, through a "back-casting" approach. This first technical workshop presents the state of the art of the different Work Programmes. WP1: Enhancement of the conceptual framework. WP2: Data base on conventional demand/supply technologies. WP3: Complement and up-date technology monographs. WP4: Formalization and computing of final VLEEM submodels. WP5: Case study 2030, 2050, 2100. (A.L.B.)

  3. Measures of radioactivity: a tool for understanding statistical data analysis

    CERN Document Server

    Montalbano, Vera

    2012-01-01

    A learning path on radioactivity in the last class of high school is presented. An introduction to radioactivity and nuclear phenomenology is followed by measurements of natural radioactivity. Background and weak sources are monitored for days or weeks. The data are analyzed in order to understand the importance of statistical analysis in modern physics.

  4. ProbFAST: Probabilistic Functional Analysis System Tool

    Directory of Open Access Journals (Sweden)

    Oliveira Thiago YK

    2010-03-01

    Full Text Available Abstract Background The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression initially designed for paired analysis. Results We proposed a web platform named ProbFAST for MDE analysis which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives a better performance when compared to other approaches and, when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off, related to functional information, during tumor evolution or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  5. Integral Design workshops: organization, structure and testing

    Directory of Open Access Journals (Sweden)

    Wim Zeiler

    2010-08-01

    Full Text Available The purpose of this paper is to achieve an understanding of design activities in the context of building design. The starting point is an overview of design research and design methodology. From the insights gained by this analysis of design in this specific context, we present an 'organization structure and design' workshop approach for collaborative multi-discipline design management. The workshop set-ups used to implement and test the approach are presented, as well as the experiences of the participants. The project was done in close cooperation with the professional societies within the Dutch building design field. More than one hundred experienced professionals participated in the workshops. The workshops have become part of the permanent professional training program of the Dutch architectural society.

  6. Warehouse Sanitation Workshop Handbook.

    Science.gov (United States)

    Food and Drug Administration (DHHS/PHS), Washington, DC.

    This workshop handbook contains information and reference materials on proper food warehouse sanitation. The materials have been used at Food and Drug Administration (FDA) food warehouse sanitation workshops, and are selected by the FDA for use by food warehouse operators and for training warehouse sanitation employees. The handbook is divided…

  7. SPLASH'13 workshops summary

    DEFF Research Database (Denmark)

    Balzer, S.; Schultz, U. P.

    2013-01-01

    Following its long-standing tradition, SPLASH 2013 will host 19 high-quality workshops, allowing their participants to meet and discuss research questions with peers, to mature new and exciting ideas, and to build up communities and start new collaborations. SPLASH workshops complement the main t...

  8. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
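
    A reduced sketch of the decomposition idea using a 2-D Fourier basis (the field is synthetic; Zernike moments, as used in the paper, would need an extra package):

        import numpy as np

        # Synthetic 256x256 stand-in for a full-field strain map.
        field = np.fromfunction(
            lambda i, j: np.sin(i / 20.0) * np.cos(j / 30.0), (256, 256))
        coeffs = np.fft.fft2(field)
        k = 8                                     # keep only the k x k lowest modes
        descriptors = np.abs(coeffs[:k, :k]).ravel()
        print(field.size, "pixels ->", descriptors.size, "descriptors")

    Compressing 10^4 to 10^6 pixels into tens of descriptors is what makes a statistical model-versus-experiment comparison tractable.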

  9. DYNAMICS ANALYSIS OF SPECIAL STRUCTURE OF MILLING-HEAD MACHINE TOOL

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The milling-head machine tool is a sophisticated, high-quality machine tool whose spindle system is made up of a special multi-element structure. Two special mechanical configurations degrade the cutting performance of the machine tool. One is the milling-head spindle supported on two sets of complex bearings. The mechanical dynamic rigidity of the milling-head structure is investigated on a digital prototype with finite element analysis (FEA) and modal synthesis analysis (MSA) to identify the weak structures. The other is the ram structure on which the milling head hangs. This structure is analyzed to obtain its dynamic performance when cutting at different ram extension positions. The analysis results on the spindle and ram are used to improve the mechanical configurations and structure in the design. The machine tool built with the modified structure achieves better dynamic rigidity than before.

  10. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Wenjie Tian

    2014-01-01

    Full Text Available Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy improvement by suitable measures, that is, component tolerancing in design, manufacturing, and assembly processes, and error compensation. The sensitivity analysis method is proposed, and the sensitivities of compensable and uncompensable pose accuracies are analyzed. The analysis results will be used for the precision design of the machine tool.

  11. PHASE ANALYSIS AS A TOOL OF PREESTIMATED ANALYSIS OF THE ACTIVITY OF A MULTIFUNCTIONAL CENTER

    Directory of Open Access Journals (Sweden)

    Kovaleva K. A.

    2015-03-01

    Full Text Available The article is devoted to phase analysis as a tool for the pre-estimated analysis of the activity of a multifunctional center. We consider the time series of the daily number of requests received, on the basis of the phase portraits of these time series; these series show strong cyclicity and periodicity. Practice has shown that in modern conditions, for example for the Russian economy with its instability and financial crises, classical economic theory and statistics built on linear models have turned out to be unproductive. A review of the approaches and economic-mathematical methods for preliminary analysis of evolutionary economic processes and the corresponding time series allows the following conclusion: no universal method of analysis and forecasting exists that satisfies all requirements and is free of shortcomings. Each approach and each method has its advantages, disadvantages and limits of use. Most of the known forecasting methods exploit the cyclicity and periodicity detected in the considered time series. Thus, the pronounced cyclicity at the different levels of the considered hierarchical model of the time series of the number of requests in a multifunctional center is an important indicator of the possibility of constructing an adequate predictive model of the number of requests in the multifunctional center.

  12. Modal interval analysis new tools for numerical information

    CERN Document Server

    Sainz, Miguel A; Calm, Remei; Herrero, Pau; Jorba, Lambert; Vehi, Josep

    2014-01-01

    This book presents an innovative new approach to interval analysis. Modal Interval Analysis (MIA) is an attempt to go beyond the limitations of classic intervals in terms of their structural, algebraic and logical features. The starting point of MIA is quite simple: It consists in defining a modal interval that attaches a quantifier to a classical interval and in introducing the basic relation of inclusion between modal intervals by means of the inclusion of the sets of predicates they accept. This modal approach introduces interval extensions of the real continuous functions, identifies equivalences between logical formulas and interval inclusions, and provides the semantic theorems that justify these equivalences, along with guidelines for arriving at these inclusions. Applications of these equivalences in different areas illustrate the obtained results. The book also presents a new interval object: marks, which aspire to be a new form of numerical treatment of errors in measurements and computations.

  13. Analysis of spreadable cheese by Raman spectroscopy and chemometric tools.

    Science.gov (United States)

    Oliveira, Kamila de Sá; Callegaro, Layce de Souza; Stephani, Rodrigo; Almeida, Mariana Ramos; de Oliveira, Luiz Fernando Cappa

    2016-03-01

    In this work, FT-Raman spectroscopy was explored to evaluate spreadable cheese samples. A partial least squares discriminant analysis was employed to identify the spreadable cheese samples containing starch. To build the models, two types of samples were used: commercial samples and samples manufactured in local industries. The method of supervised classification PLS-DA was employed to classify the samples as adulterated or without starch. Multivariate regression was performed using the partial least squares method to quantify the starch in the spreadable cheese. The limit of detection obtained for the model was 0.34% (w/w) and the limit of quantification was 1.14% (w/w). The reliability of the models was evaluated by determining the confidence interval, which was calculated using the bootstrap re-sampling technique. The results show that the classification models can be used to complement classical analysis and as screening methods.
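
    A hypothetical sketch of the chemometric regression step with scikit-learn (random stand-in spectra and labels, not Raman data from the study):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(40, 500))      # 40 samples x 500 wavenumbers
        starch = rng.uniform(0, 10, size=40)      # % (w/w), invented labels

        pls = PLSRegression(n_components=5).fit(spectra, starch)
        print(pls.predict(spectra[:3]).ravel())   # predicted starch content

    A real model would be trained on measured spectra and validated, for example with the bootstrap confidence intervals the authors describe.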

  14. An open source tool for heart rate variability spectral analysis.

    Science.gov (United States)

    Rodríguez-Liñares, L; Méndez, A J; Lado, M J; Olivieri, D N; Vila, X A; Gómez-Conde, I

    2011-07-01

    In this paper we describe a software package for developing heart rate variability analysis. This package, called RHRV, is a third party extension for the open source statistical environment R, and can be freely downloaded from the R-CRAN repository. We review the state of the art of software related to the analysis of heart rate variability (HRV). Based upon this review, we motivate the development of an open source software platform which can be used for developing new algorithms for studying HRV or for performing clinical experiments. In particular, we show how the RHRV package greatly simplifies and accelerates the work of the computer scientist or medical specialist in the HRV field. We illustrate the utility of our package with practical examples.
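
    RHRV is an R package; the sketch below shows the analogous spectral step in Python (synthetic tachogram; the 4 Hz resampling rate and band limits are common conventions, assumed here):

        import numpy as np
        from scipy.signal import welch

        fs = 4.0                                  # Hz, a common HRV resampling rate
        t = np.arange(0, 300, 1 / fs)
        rng = np.random.default_rng(2)
        rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.1 * t) + 0.005 * rng.normal(size=t.size)
        f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=256)

        lf_band = (f >= 0.04) & (f < 0.15)        # standard LF band
        hf_band = (f >= 0.15) & (f < 0.40)        # standard HF band
        lf = np.trapz(pxx[lf_band], f[lf_band])
        hf = np.trapz(pxx[hf_band], f[hf_band])
        print(f"LF/HF ratio: {lf / hf:.1f}")      # dominated by the injected 0.1 Hz wave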

  15. A Decision-Making Tools Workshop

    Science.gov (United States)

    1999-08-01

    Space and Electronic Warfare, an assignment he held until retirement. During this tour he crafted the Navy's C4I architecture, Copernicus, and Information...Identify Effective Decision Makers: The Marine Corps Wargaming Division, supported by GAMA Corporation of Falls Church, VA, had been seeking

  16. A tool for public analysis of scientific data

    Directory of Open Access Journals (Sweden)

    D Haglin

    2006-01-01

    Full Text Available The scientific method encourages sharing data with other researchers to independently verify conclusions. Currently, technical barriers impede such public scrutiny. A strategy for offering scientific data for public analysis is described. With this strategy, effectively no requirements of software installation (other than a web browser) or data manipulation are imposed on other researchers to prepare for perusing the scientific data. A prototype showcasing this strategy is described.

  17. Development of the Expert System Domain Advisor and Analysis Tool

    Science.gov (United States)

    1991-09-01

    analysis. Typical of the current methods in use at this time is the "TAROT metric". This method defines a decision rule whose output is whether to go...B - TAROT METRIC. INTRODUCTION: The system chart of ESEM, Figure 1, shows the following three risk-based decision points: i. At project initiation...Table B-1. Evaluation Factors for ES Development: FACTORS; POSSIBLE VALUE RATINGS. TAROT metric (overall suitability): Poor, Fair

  18. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
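
    Generic levelized-cost arithmetic of the kind such a model automates; every figure below is a placeholder, not a NETL or Sandia number:

        capacity_mw = 500
        capacity_factor = 0.85
        capital_cost = 1.2e9              # $ overnight (placeholder)
        fixed_om = 3.0e7                  # $/yr (placeholder)
        fuel_cost = 4.5e7                 # $/yr (placeholder)
        rate, years = 0.07, 30

        # Capital recovery factor annualizes the up-front investment.
        crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
        mwh_per_year = capacity_mw * capacity_factor * 8760
        lcoe = (capital_cost * crf + fixed_om + fuel_cost) / mwh_per_year
        print(f"LCOE: ${lcoe:.2f}/MWh")

    Sensitivity analysis then amounts to sweeping inputs such as the discount rate or capacity factor and re-evaluating this expression.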

  19. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    2000-01-01

    The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules. They are: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progress variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the 38th AIAA Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the 38th AIAA Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time for simulating the (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor are also reported.

  20. Analysis of Workshop Cost Management in the Machinery Manufacturing Industry

    Institute of Scientific and Technical Information of China (English)

    胡红彦

    2012-01-01

    This paper briefly analyzes the problems existing in workshop cost management, examines the four elements of workshop cost management, and proposes corresponding improvement measures for workshop cost management.

  1. Time-frequency tools of signal processing for EISCAT data analysis

    Directory of Open Access Journals (Sweden)

    J. Lilensten

    Full Text Available We demonstrate the usefulness of some signal-processing tools for EISCAT data analysis. These tools are somewhat less classical than the familiar periodogram (squared modulus of the Fourier transform) and therefore not as commonly used in our community. The first is a stationary analysis, Thomson's estimate of the power spectrum. The other two belong to time-frequency analysis: the short-time Fourier transform with the spectrogram, and the wavelet analysis via the scalogram. Because of the highly non-stationary character of our geophysical signals, the latter two tools are better suited for this analysis. Their results are compared with both a synthetic signal and EISCAT ion-velocity measurements. We show that they help to discriminate patterns such as gravity waves from noise.
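
    A small Python sketch of the time-frequency point (signal and parameters invented): a chirp whose drifting frequency a single periodogram would blur, but a spectrogram resolves:

        import numpy as np
        from scipy.signal import chirp, spectrogram

        fs = 1000
        t = np.arange(0, 2, 1 / fs)
        x = chirp(t, f0=10, f1=100, t1=2, method="linear")   # 10 -> 100 Hz sweep
        f, tt, sxx = spectrogram(x, fs=fs, nperseg=256)

        peak = f[sxx.argmax(axis=0)]              # dominant frequency per time slice
        print(peak[0], "Hz at start ->", peak[-1], "Hz at end")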

  2. Proceedings Third Workshop on Formal Aspects of Virtual Organisations

    CERN Document Server

    Bryans, Jeremy; 10.4204/EPTCS.83

    2012-01-01

    This volume contains the proceedings of the 3rd International Workshop on Formal Aspects of Virtual Organisations (FAVO 2011). The workshop was held in Sao Paulo, Brazil on October 18th, 2011 as a satellite event to the 12th IFIP Working Conference on Virtual Enterprises (PRO-VE'11). The FAVO workshop aims to provide a forum for researchers interested in the application of formal techniques in the design and analysis of Virtual Organisations.

  3. 75 FR 25281 - Food Protection Workshop; Public Workshop

    Science.gov (United States)

    2010-05-07

    ... about food safety, food defense, the regulations authorized by the Public Health Security and..., visit http://www.uark.edu/ua/foodpro/Workshops/Food_Safety_Defense_Workshop.html or contact Steven C... visit http://www.uark.edu/ua/foodpro/Workshops/Food_Safety_Defense_Workshop.html to register online...

  4. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis [version 1; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Eric Moyer

    2016-04-01

    Full Text Available Network analysis can make variant analysis better. There are existing tools like HotNet2 and dmGWAS that can provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image - ami-4510312f.

  5. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).
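
    The Galaxy wrappers drive standard command-line tools; as a hypothetical illustration (file and database names are placeholders, and NCBI BLAST+ must be installed and on PATH), the same wrapped search can be scripted directly:

        import subprocess

        # Run blastn on a set of candidate sequences against a local database.
        subprocess.run(
            ["blastn",
             "-query", "candidate_effectors.fasta",
             "-db", "nt",
             "-outfmt", "6",          # tabular output, easy to post-process
             "-out", "hits.tsv"],
            check=True)

    Galaxy's value, as the abstract notes, is chaining such steps into saved, shareable workflows without this scripting.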

  6. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    Directory of Open Access Journals (Sweden)

    Peter J.A. Cock

    2013-09-01

    Full Text Available The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  7. 11th Workshop Lie Theory and Its Applications in Physics

    CERN Document Server

    LT-11

    2016-01-01

    This volume presents modern trends in the area of symmetries and their applications based on contributions from the workshop "Lie Theory and Its Applications in Physics", held near Varna, Bulgaria, in June 2015. Traditionally, Lie theory is a tool to build mathematical models for physical systems. Recently, the trend has been towards geometrization of the mathematical description of physical systems and objects. A geometric approach to a system yields in general some notion of symmetry, which is very helpful in understanding its structure. Geometrization and symmetries are employed in their widest sense, embracing representation theory, algebraic geometry, number theory, infinite-dimensional Lie algebras and groups, superalgebras and supergroups, groups and quantum groups, noncommutative geometry, symmetries of linear and nonlinear partial differential operators (PDO), special functions, and others. Furthermore, the necessary tools from functional analysis are included. This is a large interdisciplinary a...

  8. MATING DESIGNS: HELPFUL TOOL FOR QUANTITATIVE PLANT BREEDING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Athanase Nduwumuremyi

    2013-12-01

    Full Text Available Selection of parental materials and good mating designs in conventional plant breeding are the keys to a successful plant breeding programme. However, there are several factors affecting the choice of mating designs. Mating design refers to the procedure of producing progenies; in plant breeding and genetics, breeders use different forms of mating designs and arrangements, both theoretically and practically, for targeted purposes. The choice of a mating design for estimating genetic variances should be dictated by the objectives of the study, time, space, cost and other biological limitations. In all mating designs, the individuals are taken randomly and crossed to produce progenies which are related to each other as half-sibs or full-sibs. A form of multivariate analysis or the analysis of variance can be adopted to estimate the components of variance. Therefore, this review aims at highlighting the mating designs most used in plant breeding and genetics studies. It provides an easy and quick insight into the different forms of mating designs and some statistical components for successful plant breeding.
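
    As a worked illustration of the variance-component arithmetic behind a half-sib design (the mean squares below are invented numbers):

        # One-way ANOVA of half-sib families, k progeny per sire.
        ms_sire, ms_within, k = 12.0, 4.0, 10
        sigma2_sire = (ms_sire - ms_within) / k     # sire variance component
        additive_variance = 4 * sigma2_sire         # half-sib expectation: VA = 4 * sigma2_sire
        print(sigma2_sire, additive_variance)       # 0.8, 3.2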

  9. Shell directions as a tool in palaeocurrent analysis

    Science.gov (United States)

    Wendt, Jobst

    1995-03-01

    Conical shells (mostly orthoconic nautiloids, locally gastropods and rugose corals) were used to determine current directions in Ludlovian to upper Famennian cephalopod limestones in the eastern Anti-Atlas of Morocco, Ougarta Aulacogen and Ahnet Basin (both Algeria). Data plots established on 50,413 measurements from 217 localities document rather consistent current patterns which show only minor variations through subsequent intervals. A conspicuous feature is the currents derived from pelagic platforms and directed towards adjacent basins. Shell accumulations decrease markedly towards platform margins, yielding less distinctive information on current directions which, due to lack of shells, cannot be established in the basins proper. Orientation patterns of styliolinids show such a puzzling variation in adjacent samples that their use for current analysis is doubtful. The same is true for the presumed down-stream position of goniatite apertures, which shows a highly variable pattern that is rarely consistent with that of concomitant orthoconic nautiloids. The direction of orthocones in cephalopod limestones onlapping lower Givetian mud mounds and ridges in the Ahnet Basin of Algeria shows a radial pattern which is the result of a mere gravitational deposition of shells on the steep slopes of these buildups. Apart from this exception, the applicability of conical shells for current analysis is confirmed.

  10. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  11. Betweenness as a Tool of Vulnerability Analysis of Power System

    Science.gov (United States)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-12-01

    Complex network theory finds its application in the analysis of power grids, as both share some common characteristics. Using this theory, critical elements in a power network can be found. As the vulnerabilities of the elements of the network decide the vulnerability of the total network, in this paper the vulnerability of each element is studied using two complex network models—betweenness centrality and extended betweenness. The betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, some of the electrical properties, such as electrical distance, line flow limits, transmission capacities of lines and the PTDF matrix, are included. The standard IEEE 57-bus system has been studied based upon the above-mentioned indices and the following conclusions are discussed.
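
    An illustrative sketch of the purely topological index on a toy graph (not the IEEE 57-bus system; extended betweenness would also need the electrical data), using networkx:

        import networkx as nx

        # Toy 5-node stand-in; real studies would load the IEEE 57-bus data.
        g = nx.Graph([(1, 2), (2, 3), (3, 4), (2, 4), (4, 5)])
        bc = nx.betweenness_centrality(g)
        # Rank nodes by how many shortest paths pass through them.
        print(sorted(bc.items(), key=lambda kv: -kv[1]))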

  12. Visual Data Exploration and Analysis - Report on the Visualization Breakout Session of the SCaLeS Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes; Frank, Randy; Fulcomer, Sam; Hansen, Chuck; Joy, Ken; Kohl, Jim; Middleton, Don

    2003-07-14

    Scientific visualization is the transformation of abstract information into images, and it plays an integral role in the scientific process by facilitating insight into observed or simulated phenomena. Visualization as a discipline spans many research areas from computer science, cognitive psychology and even art. Yet the most successful visualization applications are created when close synergistic interactions with domain scientists are part of the algorithmic design and implementation process, leading to visual representations with clear scientific meaning. Visualization is used to explore, to debug, to gain understanding, and as an analysis tool. Visualization is literally everywhere--images are present in this report, on television, on the web, in books and magazines--the common theme is the ability to present information visually that is rapidly assimilated by human observers, and transformed into understanding or insight. As an indispensable part of a modern science laboratory, visualization is akin to the biologist's microscope or the electrical engineer's oscilloscope. Whereas the microscope is limited to small specimens or use of optics to focus light, the power of scientific visualization is virtually limitless: visualization provides the means to examine data that can be at galactic or atomic scales, or at any size in between. Unlike the traditional scientific tools for visual inspection, visualization offers the means to "see the unseeable." Trends in demographics or changes in levels of atmospheric CO2 as a function of greenhouse gas emissions are familiar examples of such unseeable phenomena. Over time, visualization techniques evolve in response to scientific need. Each scientific discipline has its "own language," verbal and visual, used for communication. The visual language for depicting electrical circuits is much different than the visual language for depicting theoretical molecules or

  13. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  14. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  15. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
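
    A minimal sketch of the combination step (numbers invented; independent Gaussian errors assumed): two diagnostics measuring the same temperature merge into a posterior estimate tighter than either alone:

        # Hypothetical electron-temperature estimates in eV.
        sxr_te, sxr_err = 390.0, 40.0     # soft x-ray estimate
        ts_te, ts_err = 420.0, 25.0       # Thomson scattering estimate

        # Inverse-variance weighting is the Gaussian-posterior mean.
        w_sxr, w_ts = 1 / sxr_err**2, 1 / ts_err**2
        te = (w_sxr * sxr_te + w_ts * ts_te) / (w_sxr + w_ts)
        err = (w_sxr + w_ts) ** -0.5
        print(f"combined Te = {te:.0f} +/- {err:.0f} eV")   # tighter than either input

    The paper's Markov Chain Monte Carlo analysis generalizes this idea to non-Gaussian likelihoods and full forward models of each diagnostic.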

  16. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models) have been extended...... rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.......A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...

  17. Policy Analysis: A Tool for Setting District Computer Use Policy. Paper and Report Series No. 97.

    Science.gov (United States)

    Gray, Peter J.

    This report explores the use of policy analysis as a tool for setting computer use policy in a school district by discussing the steps in the policy formation and implementation processes and outlining how policy analysis methods can contribute to the creation of effective policy. Factors related to the adoption and implementation of innovations…

  18. A structured approach to forensic study of explosions: The TNO Inverse Explosion Analysis tool

    NARCIS (Netherlands)

    Voort, M.M. van der; Wees, R.M.M. van; Brouwer, S.D.; Jagt-Deutekom, M.J. van der; Verreault, J.

    2015-01-01

    Forensic analysis of explosions consists of determining the point of origin, the explosive substance involved, and the charge mass. Within the EU FP7 project Hyperion, TNO developed the Inverse Explosion Analysis (TNO-IEA) tool to estimate the charge mass and point of origin based on observed damage

  19. International development workshops. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-08-06

    The US Department of Energy (DOE) and the Nuclear Energy Agency of the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) began to act on their recognition of the importance of education in nuclear literacy, specifically in radioactive waste management (RWM), several years ago. To address this Goal for nuclear literacy, the US DOE, through the Information and Education Division of the Office of Civilian Radioactive Waste Management (OCRWM) and in cooperation with the OECD/NEA, organized an "International Workshop on Education in the Field of Radioactive Waste Management" in Engelberg, Switzerland in June of 1991. To this end, a grant to support nuclear literacy and RWM was written and funded by the OCRWM and the education division of the DOE Yucca Mountain Office in 1990. The over-riding Goal of that workshop and the DOE grant was to find ways of raising the level of nuclear literacy in the general public through educational programs in radioactive waste management (RWM). The two Main Objectives of the workshop were: first, to contribute to an information base for education systems, on global aspects of radioactive waste management; and second, to achieve international consensus on the basic tools and methods required to develop the information base. These two objectives also became the principal objectives of the DOE International Workshops grant. In other words, the global and local (Nevada) objectives were one and the same. Workshop overviews and accomplishments are summarized in this report.

  20. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements.

  1. Computational tool for morphological analysis of cultured neonatal rat cardiomyocytes.

    Science.gov (United States)

    Leite, Maria Ruth C R; Cestari, Idágene A; Cestari, Ismar N

    2015-08-01

    This study describes the development and evaluation of a semiautomatic myocyte edge-detector using digital image processing. The algorithm was developed in Matlab 6.0 using the SDC Morphology Toolbox. Its conceptual basis is the mathematical morphology theory together with the watershed and Euclidean distance transformations. The algorithm enables the user to select cells within an image for automatic detection of their borders and calculation of their surface areas; these areas are determined by adding the pixels within each myocyte's boundaries. The algorithm was applied to images of cultured ventricular myocytes from neonatal rats. The edge-detector allowed the identification and quantification of morphometric alterations in cultured isolated myocytes induced by 72 hours of exposure to a hypertrophic agent (50 μM phenylephrine). There was a significant increase in the mean surface area of the phenylephrine-treated cells compared with the control cells (p < 0.05), corresponding to cellular hypertrophy of approximately 50%. In conclusion, this edge-detector provides a rapid, repeatable and accurate measurement of cell surface areas in a standardized manner. Other possible applications include morphologic measurement of other types of cultured cells and analysis of time-related morphometric changes in adult cardiac myocytes.
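
    A reduced sketch of the area-measurement idea (synthetic binary image; the actual detector also applies watershed and Euclidean distance transforms, and was written in Matlab):

        import numpy as np
        from scipy import ndimage as ndi

        img = np.zeros((100, 100), dtype=bool)
        img[10:30, 10:40] = True          # synthetic "cell" 1
        img[60:90, 50:80] = True          # synthetic "cell" 2

        labels, n = ndi.label(img)        # connected components = candidate cells
        areas = ndi.sum(img, labels, index=range(1, n + 1))   # pixels per cell
        print(n, "cells, areas in pixels:", areas)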

  2. Job analysis and student assessment tool: perfusion education clinical preceptor.

    Science.gov (United States)

    Riley, Jeffrey B

    2007-09-01

    The perfusion education system centers on the cardiac surgery operating room and the perfusionist teacher who serves as a preceptor for the perfusion student. One method to improve the quality of perfusion education is to create a valid method for perfusion students to give feedback to clinical teachers. The preceptor job analysis consisted of a literature review and interviews with preceptors to list their critical tasks, critical incidents, and cognitive and behavioral competencies. Behaviorally anchored rating traits associated with the preceptors' tasks were identified. Students voted to validate the instrument items. The perfusion instructor rating instrument with a 0-4, "very weak" to "very strong" Likert rating scale was used. The five preceptor traits for student evaluation of clinical instruction (SECI) are as follows: The clinical instructor (1) encourages self-learning, (2) encourages clinical reasoning, (3) meets student's learning needs, (4) gives continuous feedback, and (5) represents a good role model. Scores from 430 student-preceptor relationships for 28 students rotating at 24 affiliate institutions with 134 clinical instructors were evaluated. The mean overall good preceptor average (GPA) was 3.45 +/- 0.76 and was skewed to the left, ranging from 0.0 to 4.0 (median = 3.8). Only 21 of the SECI relationships earned a GPA ... education program.

  3. IMPORTANCE-PERFORMANCE ANALYSIS AS A TOOL IN DESTINATION MARKETING

    Directory of Open Access Journals (Sweden)

    Eleina QIRICI

    2011-06-01

    Full Text Available The Korça Region is located in the Southeast of Albania and borders Greece and Macedonia to the South and the East. It is a mountainous region with two major lakes: Lake Ohrid, the oldest lake in Europe, which is shared with Macedonia, and Lake Prespa, which is shared with Greece and Macedonia (100 km2 in Albania). Over the last years there has been an increasing tendency to improve the tourist facilities and to attract the tourist market interested in activities in open nature and relaxation in fresh and pure air. These demands can be met very well in the Korça destination, which is characterized by suitable climatic conditions and tourist services. Eventually, a combination of the development of town tourism and tourist villages helped the sustainability of the development of Korça as a tourist destination in general. The main purpose of this paper is to present the use of importance-performance analysis in destination marketing for the development of tourism. Highlights: (1) the paper considers the multifarious goals of destination management; (2) a computer booking system is used by hotels and guest houses in the region; (3) the relationship between what a tourist wants to find in a destination and what he actually finds.

  4. Funtools: FITS Users Need Tools for Quick, Quantitative Analysis

    Science.gov (United States)

    Mandel, Eric; Brederkamp, Joe (Technical Monitor)

    2001-01-01

    The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

  5. ANALYSIS OF USING EFFICIENT LOGGING TOOLS AT PT. PURWA PERMAI IN CENTRAL KALIMANTAN

    Directory of Open Access Journals (Sweden)

    Sona Suhartana

    2008-06-01

    Full Text Available Log demand that often exceeds supply capability should be addressed by using appropriate logging tools. The many kinds and types of logging tools require careful planning in their utilization: deploying more or fewer tools than are actually needed can be disadvantageous for a company. With these aspects in mind, a study was carried out at a timber estate in Central Kalimantan in 2007 to determine an efficient number of logging tools, based on the target and realization of the company's log production. The results revealed that: (1) the optimum number of logging tools depended on the production target, i.e., 41 chainsaws for felling, 42 farm tractors for skidding, 9 loaders for loading and unloading, and 36 trucks for transportation; (2) the number of logging tools observed across all field activities was fewer than the number derived from the analysis of production target and realization, indicating that the number of logging tools used by the company was not yet efficient.
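
    The fleet sizes such a study reports follow from dividing the production target by each tool type's productivity and rounding up. The arithmetic is sketched below; the target and per-unit productivity figures are assumed for illustration and are not taken from the study.

    # Sketch of sizing a logging-tool fleet from a production target.
    # The target and productivity figures are assumed for illustration.
    import math

    annual_target_m3 = 300_000          # assumed log production target
    productivity_m3_per_year = {        # assumed output per unit of each tool
        "chainsaw (felling)":      7_400,
        "farm tractor (skidding)": 7_200,
        "loader (load/unload)":   34_000,
        "truck (transport)":       8_400,
    }

    for tool, output in productivity_m3_per_year.items():
        units = math.ceil(annual_target_m3 / output)
        print(f"{tool:25s} -> {units} units")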

  6. Nucleic acid tool enzymes-aided signal amplification strategy for biochemical analysis: status and challenges.

    Science.gov (United States)

    Qing, Taiping; He, Dinggeng; He, Xiaoxiao; Wang, Kemin; Xu, Fengzhou; Wen, Li; Shangguan, Jingfang; Mao, Zhengui; Lei, Yanli

    2016-04-01

    Owing to their highly efficient catalytic effects and substrate specificity, the nucleic acid tool enzymes are applied as 'nano-tools' for manipulating different nucleic acid substrates both in the test-tube and in living organisms. In addition to the function as molecular scissors and molecular glue in genetic engineering, the application of nucleic acid tool enzymes in biochemical analysis has also been extensively developed in the past few decades. Used as amplifying labels for biorecognition events, the nucleic acid tool enzymes are mainly applied in nucleic acids amplification sensing, as well as the amplification sensing of biorelated variations of nucleic acids. With the introduction of aptamers, which can bind different target molecules, the nucleic acid tool enzymes-aided signal amplification strategies can also be used to sense non-nucleic targets (e.g., ions, small molecules, proteins, and cells). This review describes and discusses the amplification strategies of nucleic acid tool enzymes-aided biosensors for biochemical analysis applications. Various analytes, including nucleic acids, ions, small molecules, proteins, and cells, are reviewed briefly. This work also addresses the future trends and outlooks for signal amplification in nucleic acid tool enzymes-aided biosensors.

  7. Designing budgeting tool and sensitivity analysis for a start-up. Case: Witrafi Oy

    OpenAIRE

    Arafath, Muhammad

    2014-01-01

    This study presents a thesis on designing a budgeting tool and sensitivity analysis for the commissioning company, a Finnish start-up currently focused on developing Intelligent Transport Systems using a network-based parking system. The aim of the thesis is to provide a ready-made budgeting tool that the commissioning company can use for its own purposes. This is a product-oriented thesis and it includes five project tasks. Pro...
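
    The thesis itself is not reproduced here, but the one-way sensitivity analysis such a budgeting tool typically performs can be sketched in a few lines. Everything in the model below (the revenue streams, cost figures, and the +/-20% swing) is hypothetical.

    # One-way sensitivity analysis on a hypothetical start-up budget.
    def operating_profit(parking_fees, subscriptions, fixed_costs, variable_rate):
        revenue = parking_fees + subscriptions
        return revenue - fixed_costs - variable_rate * revenue

    base = dict(parking_fees=120_000, subscriptions=80_000,
                fixed_costs=150_000, variable_rate=0.15)
    base_profit = operating_profit(**base)

    # Vary each input +/-20% in turn and report the swing in profit.
    for name in base:
        swings = []
        for factor in (0.8, 1.2):
            scenario = dict(base, **{name: base[name] * factor})
            swings.append(operating_profit(**scenario) - base_profit)
        print(f"{name:13s} profit swing: {min(swings):+,.0f} .. {max(swings):+,.0f}")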

  8. The analysis of the functionality of modern systems, methods and scheduling tools

    Directory of Open Access Journals (Sweden)

    Abramov Ivan

    2016-01-01

    Full Text Available Calendar planning is a key tool for efficient management in many industries: power, oil & gas, metallurgy, and construction. As projects grow more complex and the need to improve their efficiency rises, a large number of software tools for high-quality calendar planning have appeared, and construction companies face the challenge of optimally selecting such tools (programs) for distributing limited resources over time. The article analyses the main software packages and the capabilities they offer for improving project implementation efficiency.
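
    As a reminder of what such scheduling packages automate at their core, calendar planning rests on the critical-path computation: a forward pass through the activity network yields each activity's earliest finish and the project duration. A minimal sketch over a made-up activity network:

    # Forward-pass critical-path sketch over a made-up activity network.
    durations = {"excavate": 5, "foundation": 10, "frame": 12,
                 "roof": 6, "services": 9}
    predecessors = {"excavate": [], "foundation": ["excavate"],
                    "frame": ["foundation"], "roof": ["frame"],
                    "services": ["foundation"]}

    earliest_finish = {}
    def finish(activity):
        if activity not in earliest_finish:
            start = max((finish(p) for p in predecessors[activity]), default=0)
            earliest_finish[activity] = start + durations[activity]
        return earliest_finish[activity]

    project_duration = max(finish(a) for a in durations)
    print(earliest_finish)                        # earliest finish per activity
    print("project duration:", project_duration)  # 33 for this toy network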

  9. An Intelligent Tool to support Requirements Analysis and Conceptual Design of Database Design

    Institute of Scientific and Technical Information of China (English)

    王能斌; 刘海青

    1991-01-01

    As an application of artificial intelligence and expert system technology to database design, this paper presents an intelligent design tool, NITDT, which comprises a requirements specification language (NITSL), a knowledge representation language (NITKL), and an inference engine with uncertainty reasoning capability. NITDT currently covers the requirements analysis and conceptual design stages of database design; however, it can be integrated with NITDBA, another database design tool developed at NIT, to become an integrated design tool supporting the whole process of database design.
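
    The abstract does not describe NITDT's uncertainty mechanism, so the sketch below shows a generic MYCIN-style certainty-factor combination, a scheme common in expert systems of that era, purely to illustrate what "uncertainty reasoning" in such an inference engine can look like. The rule conclusions and certainty values are invented.

    # MYCIN-style certainty-factor (CF) combination: a generic illustration
    # of expert-system uncertainty reasoning, not NITDT's actual mechanism.
    def combine(cf1, cf2):
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)
        if cf1 <= 0 and cf2 <= 0:
            return cf1 + cf2 * (1 + cf1)
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

    # Two rules support "entity X needs its own table" with CF 0.6 and 0.4:
    supported = combine(0.6, 0.4)
    print(supported)                     # 0.76
    # A third rule argues against it with CF -0.5:
    print(combine(supported, -0.5))      # (0.76 - 0.5) / (1 - 0.5) = 0.52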

  10. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.
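
    TTECTrA itself is a NASA tool whose internals are not reproduced here; the sketch below only illustrates the general idea of a setpoint controller wrapped in a transient limiter, with the gains, limits, and one-pole "engine" response all invented for the example.

    # Generic sketch of a PI speed controller with an acceleration limiter,
    # illustrating the kind of limit logic a transient-analysis tool designs;
    # gains, limits, and the toy engine response are invented.
    kp, ki = 1.0, 0.5                     # assumed controller gains
    accel_limit = 900.0                   # assumed rpm/s transient limit
    dt, integrator, speed = 0.01, 0.0, 8000.0
    setpoint = 9500.0                     # commanded fan speed, rpm

    for _ in range(2000):                 # 20 s of simulated time
        error = setpoint - speed
        accel_cmd = kp * error + integrator
        accel = max(-accel_limit, min(accel_limit, accel_cmd))
        if accel == accel_cmd:            # freeze the integrator when limited
            integrator += ki * error * dt
        speed += accel * dt               # toy first-order engine response
    print(round(speed, 1))                # approaches the 9500 rpm setpoint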

  11. The Fuzzy Cluster Analysis in Identification of Key Temperatures in Machine Tool

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Thermally induced error is a very important source of machining errors in machine tools. To compensate for thermally induced machining errors, a model of the relationship between the thermal field and the deformations is needed. The relationship can be deduced by means of FEM (Finite Element Method), ANN (Artificial Neural Network), or MRA (Multiple Regression Analysis). MRA relies on a thorough understanding of the temperature distribution of the machine tool. Although the more the temperatures measu...
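
    To make the MRA step concrete: once key temperature points have been selected (for example, by fuzzy clustering of correlated sensors), the thermal error is regressed on those temperatures. A least-squares sketch with synthetic data, where all numbers are illustrative:

    # Multiple regression of thermal deformation on key temperatures,
    # sketched with synthetic sensor data (numbers are illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    t_spindle = 20 + 15 * rng.random(n)    # degC at spindle bearing
    t_bed     = 20 +  5 * rng.random(n)    # degC at machine bed
    # Synthetic "true" error model plus measurement noise, in micrometres:
    error = 1.8 * (t_spindle - 20) + 0.6 * (t_bed - 20) + rng.normal(0, 0.5, n)

    X = np.column_stack([t_spindle - 20, t_bed - 20, np.ones(n)])
    coef, *_ = np.linalg.lstsq(X, error, rcond=None)
    print("fitted coefficients:", coef.round(2))   # approx [1.8, 0.6, 0.0]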

  12. SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale

    Science.gov (United States)

    2014-06-01

    SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale. Mark Thomas, Leigh Metcalf, Jonathan Spring, Paul Krystosek, Katherine Prevost. netsa...make the problem manageable, but sampling unacceptably reduces the fidelity of analytic conclusions. In this paper we discuss SiLK, a tool suite...created to analyze this high-volume data source without sampling. SiLK implementation and architectural design are optimized to manage this Big Data

  13. Nuclear Innovation Workshops Report

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, John Howard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Allen, Todd Randall [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hildebrandt, Philip Clay [Idaho National Lab. (INL), Idaho Falls, ID (United States); Baker, Suzanne Hobbs [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The Nuclear Innovation Workshops were held at six locations across the United States on March 3-5, 2015. The data collected during these workshops has been analyzed and sorted to bring out consistent themes toward enhancing innovation in nuclear energy. These themes include development of a test bed and demonstration platform, improved regulatory processes, improved communications, and increased public-private partnerships. This report contains a discussion of the workshops and resulting themes. Actionable steps are suggested at the end of the report. This revision has a small amount of the data in Appendix C removed in order to avoid potential confusion.

  14. Ocean margins workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-12-31

    The Department of Energy (DOE) is announcing the refocusing of its marine research program to emphasize the study of ocean margins and their role in modulating, controlling, and driving Global Change phenomena. This is a proposal to conduct a workshop that will establish priorities and an implementation plan for a new research initiative by the Department of Energy on the ocean margins. The workshop will be attended by about 70 scientists who specialize in ocean margin research. The workshop will be held in the Norfolk, Virginia area in late June 1990.

  15. Synchrotron radiation micro-X-ray fluorescence analysis: A tool to increase accuracy in microscopic analysis

    CERN Document Server

    Adams, F

    2003-01-01

    Microscopic X-ray fluorescence (XRF) analysis has potential for development as a certification method and as a calibration tool for other microanalytical techniques. The interaction of X-rays with matter is well understood and modelling studies show excellent agreement between experimental data and calculations using Monte Carlo simulation. The method can be used for a direct iterative calculation of concentrations using available high accuracy physical constants. Average accuracy is in the range of 3-5% for micron sized objects at concentration levels of less than 1 ppm with focused radiation from SR sources. The end-station ID18F of the ESRF is dedicated to accurate quantitative micro-XRF analysis including fast 2D scanning with collection of full X-ray spectra. Important aspects of the beamline are the precise monitoring of the intensity of the polarized, variable energy beam and the high reproducibility of the set-up measurement geometry, instrumental parameters and long-term stability.

  16. MEL-IRIS: An Online Tool for Audio Analysis and Music Indexing

    Directory of Open Access Journals (Sweden)

    Dimitrios Margounakis

    2009-01-01

    Full Text Available Chroma is an important attribute of music and sound, although it has not yet been adequately defined in the literature. It can nevertheless be used for further analysis of sound, yielding informative, colorful representations applicable to many tasks: indexing, classification, and retrieval. Especially in Music Information Retrieval (MIR), the visualization of chromatic analysis can be used for comparison, pattern recognition, melodic sequence prediction, and color-based searching. MEL-IRIS is a tool developed to analyze audio files and characterize music based on chroma. The tool implements specially designed algorithms and a unique way of visualizing the results. It is network-oriented and can be installed on audio servers to handle large music collections. Several samples of world music have been processed to demonstrate the possible uses of such analysis.
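
    MEL-IRIS's own chromatic-analysis algorithms are not published in the abstract; the sketch below shows only the standard starting point for chroma analysis, folding spectral energy into the twelve pitch classes, using a synthetic test tone.

    # Standard chroma computation sketch: fold FFT bin energies into the
    # twelve pitch classes. (MEL-IRIS goes further; this is only the
    # common starting point, applied here to a synthetic A4 tone.)
    import numpy as np

    sr = 22050
    t = np.arange(sr) / sr
    signal = np.sin(2 * np.pi * 440 * t)          # one second of A4

    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1 / sr)

    chroma = np.zeros(12)
    valid = freqs > 30                            # ignore sub-audio bins
    midi = 69 + 12 * np.log2(freqs[valid] / 440.0)
    np.add.at(chroma, np.round(midi).astype(int) % 12, spectrum[valid] ** 2)

    print(np.argmax(chroma))   # 9 -> pitch class A, as expected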

  17. AN ANALYSIS OF THE CAUSES OF PRODUCT DEFECTS USING QUALITY MANAGEMENT TOOLS

    Directory of Open Access Journals (Sweden)

    Katarzyna MIDOR

    2014-10-01

    Full Text Available To maintain or strengthen its position in the market, a modern business needs to follow the principles of quality control in its actions. Especially important is the Zero Defects concept developed by Philip Crosby, which means flawless production: the concept consists in preventing the occurrence of defects and flaws at all production stages. Achieving that requires, among other things, the use of quality management tools. This article presents an analysis of the reasons for the return of damaged or faulty goods in the automotive industry by means of quality management tools, namely the Ishikawa diagram and Pareto analysis, which allow the causes of product defectiveness to be identified. Based on the results, preventive measures have been proposed. The actions presented in this article and the results of the analysis demonstrate the effectiveness of these quality management tools.
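
    Pareto analysis, as used in the article, ranks defect causes by frequency and accumulates their shares to isolate the "vital few" that account for most returns. A small sketch with invented return counts (the article's actual defect data is not reproduced):

    # Pareto analysis sketch over invented defect-return counts.
    returns = {"paint scratches": 120, "loose fasteners": 45, "wrong part": 30,
               "electrical fault": 18, "packaging damage": 12, "other": 8}

    total = sum(returns.values())
    cumulative = 0.0
    for cause, count in sorted(returns.items(), key=lambda kv: -kv[1]):
        cumulative += 100.0 * count / total
        marker = " <-- vital few" if cumulative <= 80.0 else ""
        print(f"{cause:18s} {count:4d} cum {cumulative:5.1f}%{marker}")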

  18. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross-platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN, and the script commands for each performance index were submitted through the FORTRAN Call System command. In the CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting values and side constraints for continuous as well as discrete design variables. Structural analysis modules, such as computations of structural weight, stress, deflection, buckling, and flutter and divergence speeds, have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
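
    The O3 tool's CEM is written in FORTRAN and its interfaces are not public; the sketch below only illustrates the architectural idea of a central executive dispatching discipline modules and scoring a design. Every class, analysis, and number in it is invented for the example.

    # Architectural sketch of a central executive coordinating discipline
    # modules; module names, analyses, and the objective are all invented.
    class Discipline:
        def analyze(self, design):        # returns performance indices
            raise NotImplementedError

    class Structures(Discipline):
        def analyze(self, design):
            return {"weight": 120.0 * design["skin_thickness"]}

    class Aeroelasticity(Discipline):
        def analyze(self, design):
            return {"flutter_speed": 300.0 + 900.0 * design["skin_thickness"]}

    class CentralExecutive:
        def __init__(self, disciplines):
            self.disciplines = disciplines
        def evaluate(self, design):
            indices = {}
            for d in self.disciplines:
                indices.update(d.analyze(design))
            return indices

    cem = CentralExecutive([Structures(), Aeroelasticity()])
    # Crude 1-D search: thinnest skin keeping flutter speed above 450.
    feasible = [t / 100 for t in range(10, 60)
                if cem.evaluate({"skin_thickness": t / 100})["flutter_speed"] >= 450]
    print(min(feasible))   # 0.17 -> thinnest feasible skin in this toy model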

  19. Second Annual Transformative Vertical Flight Concepts Workshop: Enabling New Flight Concepts Through Novel Propulsion and Energy Architectures

    Science.gov (United States)

    Dudley, Michael R. (Editor); Duffy, Michael; Hirschberg, Michael; Moore, Mark; German, Brian; Goodrich, Ken; Gunnarson, Tom; Petermaier, Korbinian; Stoll, Alex; Fredericks, Bill; Gibson, Andy; Newman, Aron; Ouellette, Richard; Antcliff, Kevin; Sinkula, Michael; Buettner-Garrett, Josh; Ricci, Mike; Keogh, Rory; Moser, Tim; Borer, Nick; Rizzi, Steve; Lighter, Gwen

    2015-01-01

    On August 3rd and 4th, 2015, a workshop was held at the NASA Ames Research Center, located at the Moffett Federal Airfield in California, to explore the aviation community's interest in Transformative Vertical Flight (TVF) concepts. The Workshop was sponsored by AHS International (AHS), the American Institute of Aeronautics and Astronautics (AIAA), and the National Aeronautics and Space Administration (NASA), and hosted by the NASA Aeronautics Research Institute (NARI). This second annual workshop built on the success and enthusiasm generated by the first TVF Workshop held in Washington, DC in August 2014. The previous Workshop identified the existence of a multi-disciplinary community interested in this topic and established a consensus among the participants that further collaborations in this area are warranted. A direct outcome of the first Workshop was the desire to conduct a series of annual workshops, augmented by online virtual technical seminars, to strengthen the TVF community and continue planning for advocacy and collaboration. The second Workshop's organizers focused on four desired action-oriented outcomes. The first was to establish and document common stakeholder needs and areas of potential collaboration, including advocacy strategies to encourage the future success of unconventional vertiport-capable flight concepts enabled by emerging technologies. The second was to assemble a community that can collaborate on new conceptual design and analysis tools permitting novel configuration paths with far greater multi-disciplinary coupling (i.e., aero-propulsive-control) to be investigated. The third was to establish a community to develop and deploy regulatory guidelines; this community would have the potential to initiate formation of an American Society for Testing and Materials (ASTM) F44 Committee Subgroup for the development of consensus-based certification standards for General Aviation scale vertiport

  20. Cybernetics and Workshop Design.

    Science.gov (United States)

    Eckstein, Daniel G.

    1979-01-01

    Cybernetic sessions allow for the investigation of several variables concurrently, resulting in a large volume of input compacted into a concise time frame. Three session questions are reproduced to illustrate the variety of ideas generated relative to workshop design. (Author)