WorldWideScience

Sample records for software analysis handbook

  1. Software quality assurance handbook

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  2. Handbook of Software Engineering and Knowledge Engineering

    2001-01-01

    This is the first handbook to cover comprehensively both software engineering and knowledge engineering - two important fields that have become interwoven in recent years. Over 60 international experts have contributed to the book. Each chapter has been written in such a way that a practitioner of software engineering and knowledge engineering can easily understand and obtain useful information. Each chapter covers one topic and can be read independently of other chapters, providing both a general survey of the topic and an in-depth exposition of the state of the art.

  3. Handbook of hardware/software codesign

    Teich, Jürgen

    2017-01-01

    This handbook presents fundamental knowledge on the hardware/software (HW/SW) codesign methodology. Contributing expert authors look at key techniques in the design flow as well as selected codesign tools and design environments, building on basic knowledge to consider the latest techniques. The book enables readers to gain real benefits from the HW/SW codesign methodology through explanations and case studies which demonstrate its usefulness. Readers are invited to follow the progress of design techniques through this work, which assists readers in following current research directions and learning about state-of-the-art techniques. Students and researchers will appreciate the wide spectrum of subjects that belong to the design methodology from this handbook.

  4. The data analysis handbook

    Frank, IE

    1994-01-01

    Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...

  5. Handbook of radioactivity analysis

    2012-01-01

    The updated and much expanded Third Edition of the "Handbook of Radioactivity Analysis" is an authoritative reference providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity, from the very low levels encountered in the environment to the higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities and in the implementation of nuclear forensic analysis and nuclear safeguards. The Third Edition contains seven new chapters, providing a reference text much broader in scope than the previous Second Edition, and all of the other chapters have been updated and expanded, many with new authors. The book describes the basic principles of radiation detection and measurement and the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-ar...

  6. Software exorcism a handbook for debugging and optimizing legacy code

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no bulls*** look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our atten...

  7. Being Geek The Software Developer's Career Handbook

    Lopp, Michael

    2010-01-01

    As a software engineer, you recognize at some point that there's much more to your career than dealing with code. Is it time to become a manager? Tell your boss he's a jerk? Join that startup? Author Michael Lopp recalls his own make-or-break moments with Silicon Valley giants such as Apple, Netscape, and Symantec in Being Geek -- an insightful and entertaining book that will help you make better career decisions. With more than 40 standalone stories, Lopp walks through a complete job life cycle, starting with the job interview and ending with the realization that it might be time to find an...

  8. DOE handbook table-top needs analysis

    NONE

    1996-03-01

    The purpose of this handbook is to establish guidelines for training personnel in developing training for operation, maintenance, and technical support personnel at DOE nuclear facilities. Information from this handbook can be used to conduct needs analyses as the information and methods apply. Operating contractors are encouraged to use good practices selectively in developing or improving programs to meet the specific needs of their facility.

  9. Handbook of Applied Analysis

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  10. Computing handbook computer science and software engineering

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Contents (partial): Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning), Computational Thinking (Valerie Barr); Algorithms and Complexity: Data Structures (Mark Weiss), Basic Techniques for Design and Analysis of Algorithms (Edward Reingold), Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari), Computational Geometry (Marc van Kreveld), Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan), Formal Models and Computability (Tao Jiang, Ming Li, and Bala...

  11. Statistical data analysis handbook

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  12. Handbook for structural analysis of radioactive material transport casks

    Ikushima, Takeshi

    1991-04-01

    This paper describes the structural analysis method for radioactive material transport casks, for use in a handbook of safety analysis and evaluation. Safety analysis conditions, computer codes for the analyses, and the stress evaluation method are also covered in the handbook. (author)

  13. Handbook of practical X-ray fluorescence analysis

    Beckhoff, B.; Wedell, R.; Wolff, H.

    2006-01-01

    X-ray fluorescence analysis (XRF) is a reliable multi-elemental and nondestructive analytical method widely used in research and industrial applications. This practical handbook provides self-contained modules featuring XRF instrumentation, quantification methods, and most of the current applications. The broad spectrum of topics is due to the efforts of a large number of authors from a variety of different types of institutions such as universities, research institutes, and companies. The book gives a survey of the theoretical fundamentals, analytical instrumentation, software for data processing, various excitation regimes including grazing incidence and microfocus measurements, quantitative analysis, applications in routine and micro analysis, mineralogy, biology, medicine, criminal investigations, archeology, metallurgy, abrasion, microelectronics, environmental air and water analysis. It gives the basic knowledge on this technique, information on analytical equipment and guides the reader to the various applications. This practical handbook is intended as a resource for graduate students, research scientists, and industrial users. (orig.)

  14. Handbook of practical X-ray fluorescence analysis

    Beckhoff, B. [Physikalisch-Technische Bundesanstalt, Berlin (Germany). X-ray Spectrometry; Kanngiesser, B. [Technische Univ. Berlin (Germany). Inst. fuer Atomare Physik und Fachdidaktik; Langhoff, N. [IfG-Institute for Scientific Instruments GmbH, Berlin (Germany); Wedell, R.; Wolff, H. (eds.) [Institut fuer Angewandte Photonik e.V., Berlin (Germany)

    2006-07-01

    X-ray fluorescence analysis (XRF) is a reliable multi-elemental and nondestructive analytical method widely used in research and industrial applications. This practical handbook provides self-contained modules featuring XRF instrumentation, quantification methods, and most of the current applications. The broad spectrum of topics is due to the efforts of a large number of authors from a variety of different types of institutions such as universities, research institutes, and companies. The book gives a survey of the theoretical fundamentals, analytical instrumentation, software for data processing, various excitation regimes including grazing incidence and microfocus measurements, quantitative analysis, applications in routine and micro analysis, mineralogy, biology, medicine, criminal investigations, archeology, metallurgy, abrasion, microelectronics, environmental air and water analysis. It gives the basic knowledge on this technique, information on analytical equipment and guides the reader to the various applications. This practical handbook is intended as a resource for graduate students, research scientists, and industrial users. (orig.)

  15. Handbook of Applied Behavior Analysis

    Fisher, Wayne W., Ed.; Piazza, Cathleen C., Ed.; Roane, Henry S., Ed.

    2011-01-01

    Describing the state of the science of ABA, this comprehensive handbook provides detailed information about theory, research, and intervention. The contributors are leading ABA authorities who present current best practices in behavioral assessment and demonstrate evidence-based strategies for supporting positive behaviors and reducing problem…

  16. Regulatory analysis technical evaluation handbook. Final report

    1997-01-01

    The purpose of this Handbook is to provide guidance to the regulatory analyst to promote preparation of quality regulatory analysis documents and to implement the policies of the Regulatory Analysis Guidelines of the US Nuclear Regulatory Commission (NUREG/BR-0058 Rev. 2). This Handbook expands upon policy concepts included in the NRC Guidelines and translates the six steps in preparing regulatory analyses into implementable methodologies for the analyst. It provides standardized methods of preparation and presentation of regulatory analyses, with the inclusion of input that will satisfy all backfit requirements and requirements of NRC's Committee to Review Generic Requirements. Information on the objectives of the safety goal evaluation process and potential data sources for preparing a safety goal evaluation is also included. Consistent application of the methods provided here will result in more directly comparable analyses, thus aiding decision-makers in evaluating and comparing various regulatory actions. The handbook is being issued in loose-leaf format to facilitate revisions. NRC intends to periodically revise the handbook as new and improved guidance, data, and methods become available

  17. Handbook of Fourier analysis & its applications

    Marks, Robert J

    2009-01-01

    Fourier analysis has many scientific applications - in physics, number theory, combinatorics, signal processing, probability theory, statistics, option pricing, cryptography, acoustics, oceanography, optics and diffraction, geometry, and other areas. In signal processing and related fields, Fourier analysis is typically thought of as decomposing a signal into its component frequencies and their amplitudes. This practical, applications-based professional handbook comprehensively covers the theory and applications of Fourier Analysis, spanning topics from engineering mathematics, signal process...
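
    As context for the decomposition described above, here is a minimal sketch of recovering component frequencies and amplitudes from a sampled signal (plain NumPy; illustrative, not code from the handbook):

      import numpy as np

      # Test signal: 3 Hz at amplitude 1.0 plus 7 Hz at amplitude 0.5,
      # sampled at 100 Hz for one second.
      fs = 100
      t = np.arange(0, 1, 1 / fs)
      signal = 1.0 * np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

      # rfft returns the complex coefficients for the non-negative
      # frequencies of a real-valued signal.
      coeffs = np.fft.rfft(signal)
      freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
      amplitudes = 2 * np.abs(coeffs) / len(signal)  # rescale to sine amplitudes

      for f, a in zip(freqs, amplitudes):
          if a > 0.1:
              print(f"{f:.0f} Hz: amplitude {a:.2f}")  # 3 Hz: 1.00, 7 Hz: 0.50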

  18. Fuel Cycle System Analysis Handbook

    Piet, Steven J.; Dixon, Brent W.; Gombert, Dirk; Hoffman, Edward A.; Matthern, Gretchen E.; Williams, Kent A.

    2009-01-01

    This Handbook aims to improve understanding and communication regarding nuclear fuel cycle options. It is intended to assist DOE, Campaign Managers, and other presenters in preparing presentations and reports. When looking for information, check here. The Handbook generally includes few details of how calculations were performed; these can be found by consulting the references provided to the reader. The Handbook emphasizes results in the form of graphics and diagrams, with only enough text to explain the graphic, to ensure that the messages associated with the graphic are clear, and to explain key assumptions and the methods that underlie the graphed results. Some of the material is new and is not found in previous reports, for example: (1) Section 3 has system-level mass flow diagrams for 0-tier (once-through), 1-tier (UOX to CR=0.50 fast reactor), and 2-tier (UOX to MOX-Pu to CR=0.50 fast reactor) scenarios - at both static and dynamic equilibrium. (2) To help inform fast reactor transuranic (TRU) conversion ratio and uranium supply behavior, section 5 provides the sustainable fast reactor growth rate as a function of TRU conversion ratio. (3) To help clarify the difference in recycling Pu, NpPu, NpPuAm, and all-TRU, section 5 provides mass fraction, gamma, and neutron emission for those four cases for MOX, heterogeneous LWR IMF (assemblies mixing IMF and UOX pins), and a CR=0.50 fast reactor. There are data for the first 10 LWR recycle passes and equilibrium. (4) Section 6 provides information on the cycle length, planned and unplanned outages, and TRU enrichment as a function of fast reactor TRU conversion ratio, as well as the dilution of TRU feedstock by uranium in making fast reactor fuel. (The recovered uranium is considered to be more pure than recovered TRU.) The latter parameter impacts the required TRU impurity limits specified by the Fuels Campaign. (5) Section 7 provides flows for an 800-tonne UOX separation plant. (6) To complement 'tornado' economic uncertainty...

  19. Handbook of radioactivity analysis. Second edition

    L'Annunziata, M.

    2003-07-01

    This updated and much expanded Second Edition is an authoritative handbook providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity from the very low levels encountered in the environment to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities, and in the implementation of nuclear safeguards. The book describes the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-art methods of analysis. Fundamentals of radioactivity properties, radionuclide decay, the calculations involved, and methods of detection provide the basis for a thorough understanding of the analytical procedures. The Handbook of Radioactivity Analysis, Second Edition is suitable as a teaching text for university and professional training courses

  20. Safety analysis and risk assessment handbook

    Peterson, V.L.; Colwell, R.G.; Dickey, R.L.

    1997-01-01

    This Safety Analysis and Risk Assessment Handbook (SARAH) provides guidance to the safety analyst at the Rocky Flats Environmental Technology Site (RFETS) in the preparation of safety analyses and risk assessments. Although the older guidance (the Rocky Flats Risk Assessment Guide) continues to be used for updating the Final Safety Analysis Reports developed in the mid-1980s, this new guidance is used with all new authorization basis documents. With the mission change at RFETS came the need to establish new authorization basis documents for its facilities, whose functions had changed. The methodology and databases for performing the evaluations that support the new authorization basis documents had to be standardized, to avoid the use of different approaches and/or databases for similar accidents in different facilities. This handbook presents this new standardized approach. The handbook begins with a discussion of the requirements of the different types of authorization basis documents and how to choose the one appropriate for the facility to be evaluated. It then walks the analyst through the process of identifying all the potential hazards in the facility, classifying them, and choosing the ones that need to be analyzed further. It then discusses the methods for evaluating accident initiation and progression and covers the basic steps in a safety analysis, including consequence and frequency binning and risk ranking. The handbook lays out standardized approaches for determining the source terms of the various accidents (including airborne release fractions, leakpath factors, etc.), the atmospheric dispersion factors appropriate for Rocky Flats, and the methods for radiological and chemical consequence assessments. The radiological assessments use a radiological "template", a spreadsheet that incorporates the standard values of parameters, whereas the chemical assessments use the standard codes ARCHIE and ALOHA.
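
    Source-term approaches of this kind are conventionally built on the five-factor formula of DOE-HDBK-3010 (release = MAR x DR x ARF x RF x LPF); a minimal sketch with purely illustrative numbers, not the RFETS template values:

      def source_term(mar_g, dr, arf, rf, lpf):
          """Respirable source term (g): MAR x DR x ARF x RF x LPF."""
          return mar_g * dr * arf * rf * lpf

      # Hypothetical free-fall spill of 1 kg of powder: damage ratio 1.0,
      # airborne release fraction 2e-3, respirable fraction 0.3, and a
      # leak path factor of 0.1 (all values illustrative only).
      st = source_term(mar_g=1000.0, dr=1.0, arf=2e-3, rf=0.3, lpf=0.1)
      print(f"respirable release: {st:.3g} g")  # 0.06 g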

  1. Handbook of software quality assurance techniques applicable to the nuclear industry

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic.

  2. Handbook of software quality assurance techniques applicable to the nuclear industry

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic

  3. Nuclear fuel cycle facility accident analysis handbook

    Ayer, J.E.; Clark, A.T.; Loysen, P.; Ballinger, M.Y.; Mishima, J.; Owczarski, P.C.; Gregory, W.S.; Nichols, B.D.

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH

  4. HAZARD ANALYSIS SOFTWARE

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which the software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.
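
    A minimal sketch of the record-keeping pattern described above, a relational table of hazards and selected controls behind a shared reporting query, using Python's built-in sqlite3 (the schema and field names are invented for illustration and are not the site software's):

      import sqlite3

      conn = sqlite3.connect(":memory:")  # stand-in for the site database
      conn.execute("""CREATE TABLE hazard (
          id INTEGER PRIMARY KEY, facility TEXT, event TEXT,
          frequency_bin TEXT, consequence_bin TEXT,
          control TEXT, control_class TEXT)""")
      conn.execute("INSERT INTO hazard VALUES (1, 'Bldg 105', 'solvent fire', "
                   "'unlikely', 'moderate', 'fire suppression', 'safety significant')")

      # One shared query per report section is what enforces consistent reporting.
      for row in conn.execute("SELECT event, control, control_class FROM hazard"):
          print(row)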

  5. Book Review: The Software IP Detective's Handbook: Measurement, Comparison, and Infringement Detections

    Diane Barrett

    2012-03-01

    Zeidman, B. (2011). The Software IP Detective's Handbook: Measurement, Comparison, and Infringement Detection. Boston, MA: Pearson Education, Inc. 480 pages, ISBN-10: 0137035330; ISBN-13: 978-0137035335, US$49.99. Reviewed by Diane Barrett, American Military University. Do not let the book title fool you into thinking that the book is only for those looking to detect software infringement. It is a comprehensive look at software intellectual property. The book covers a wide range of topics and has something to offer for just about everyone from lawyers to programmers. (See PDF for full review.)

  6. A handbook of software and systems engineering empirical observations, laws and theories

    Endres, Albert

    2003-01-01

    This book is intended as a handbook for students and practitioners alike. The book is structured around the type of tasks that practitioners are confronted with, beginning with requirements definition and concluding with maintenance and withdrawal. It identifies and discusses existing laws that have a significant impact on the software engineering field. These laws are largely independent of the technologies involved, which allows students to learn the principles underlying software engineering. It also guides students toward best practice when implementing software engineering techniques.

  7. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena...

  8. System safety engineering analysis handbook

    Ijams, T. E.

    1972-01-01

    The basic requirements and guidelines for the preparation of System Safety Engineering Analysis are presented. The philosophy of System Safety and the various analytic methods available to the engineering profession are discussed. A textbook description of each method is included.

  9. Handbook on data envelopment analysis

    Cooper, William W; Zhu, Joe

    2011-01-01

    Focusing on extensively used Data Envelopment Analysis topics, this volume aims to both describe the state of the field and extend the frontier of DEA research. New chapters include DEA models for DMUs, network DEA, models for supply chain operations and applications, and new developments.
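
    For orientation, the input-oriented CCR envelopment model that underlies many of the DEA variants such handbooks survey (the standard textbook formulation, not reproduced from this volume); DMU o, with inputs x_{io} and outputs y_{ro}, is efficient when the optimal theta equals 1 with zero slacks:

      \min_{\theta,\,\lambda}\ \theta
      \quad\text{s.t.}\quad
      \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io}, \quad i = 1,\dots,m;
      \qquad
      \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro}, \quad r = 1,\dots,s;
      \qquad
      \lambda_j \ge 0, \quad j = 1,\dots,n.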

  10. Handbook of mathematical analysis in mechanics of viscous fluids

    Novotný, Antonín

    2018-01-01

    Mathematics has always played a key role in research on fluid mechanics. The purpose of this handbook is to give an overview of items that are key to handling problems in fluid mechanics. Since the field of fluid mechanics is huge, it is almost impossible to cover all topics. In this handbook, we focus on mathematical analysis of viscous Newtonian fluids. The first part is devoted to mathematical analysis of incompressible fluids, while the second part is devoted to compressible fluids.

  11. Software safety hazard analysis

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  12. A handbook of silicate rock analysis

    Potts, P.J.

    1986-01-01

    The topics covered in this handbook include the following: concepts in analytical chemistry; wet chemical and optical spectroscopy methods; X-ray and micro-beam methods; nuclear methods; pre-concentration techniques; and mass spectrometry

  13. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  14. Nuclear fuel cycle facility accident analysis handbook

    NONE

    1998-03-01

    The purpose of this Handbook is to provide guidance on how to calculate the characteristics of releases of radioactive materials and/or hazardous chemicals from nonreactor nuclear facilities. In addition, the Handbook provides guidance on how to calculate the consequences of those releases. There are four major chapters: Hazard Evaluation and Scenario Development; Source Term Determination; Transport Within Containment/Confinement; and Atmospheric Dispersion and Consequences Modeling. These chapters are supported by Appendices, including: a summary of chemical and nuclear information that contains descriptions of various fuel cycle facilities; details on how to calculate the characteristics of source terms for releases of hazardous chemicals; a comparison of NRC, EPA, and OSHA programs that address chemical safety; a summary of the performance of HEPA and other filters; and a discussion of uncertainties. Several sample problems are presented: a free-fall spill of powder; an explosion with radioactive release; a fire with radioactive release; filter failure; hydrogen fluoride release from a tank car; a uranium hexafluoride cylinder rupture; a liquid spill in a vitrification plant; and a criticality incident. Finally, this Handbook includes a computer model, LPF No.1B, that is intended for use in calculating Leak Path Factors. A list of contributors to the Handbook is presented in Chapter 6. 39 figs., 35 tabs.

  15. Nuclear fuel cycle facility accident analysis handbook

    1998-03-01

    The purpose of this Handbook is to provide guidance on how to calculate the characteristics of releases of radioactive materials and/or hazardous chemicals from nonreactor nuclear facilities. In addition, the Handbook provides guidance on how to calculate the consequences of those releases. There are four major chapters: Hazard Evaluation and Scenario Development; Source Term Determination; Transport Within Containment/Confinement; and Atmospheric Dispersion and Consequences Modeling. These chapters are supported by Appendices, including: a summary of chemical and nuclear information that contains descriptions of various fuel cycle facilities; details on how to calculate the characteristics of source terms for releases of hazardous chemicals; a comparison of NRC, EPA, and OSHA programs that address chemical safety; a summary of the performance of HEPA and other filters; and a discussion of uncertainties. Several sample problems are presented: a free-fall spill of powder; an explosion with radioactive release; a fire with radioactive release; filter failure; hydrogen fluoride release from a tank car; a uranium hexafluoride cylinder rupture; a liquid spill in a vitrification plant; and a criticality incident. Finally, this Handbook includes a computer model, LPF No.1B, that is intended for use in calculating Leak Path Factors. A list of contributors to the Handbook is presented in Chapter 6. 39 figs., 35 tabs.

  16. Automated Software Vulnerability Analysis

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  17. Book review: Handbook of cyanobacterial monitoring and cyanotoxin analysis

    Graham, Jennifer L.; Loftin, Keith A.

    2018-01-01

    Review of Meriluoto, Jussi, Lisa Spoof, and Geoffrey A. Codd [eds.]. 2017. Handbook of Cyanobacterial Monitoring and Cyanotoxin Analysis. John Wiley & Sons, Ltd.: Chichester, West Sussex, UK, ISBN 978‐1‐119‐06868‐6 (978‐1‐119‐06876‐1 eBook), DOI 10.1002/9781119068761.

  18. Safety Justification of Software Systems. Software Based Safety Systems. Regulatory Inspection Handbook

    Dahll, Gustav; Liwang, Bo; Wainwright, Norman

    2006-01-01

    The introduction of new software based technology in the safety systems of nuclear power plants makes it necessary to develop new strategies for regulatory review and assessment of these systems, strategies more focused on reviewing the processes in the different design phases of the system life cycle. It is a general requirement that the licensee shall perform different kinds of reviews. From a regulatory point of view it is more cost effective to assess whether the design activities at the suppliers and the review activities within the development project are performed with good quality. But the change from technical reviews to a development-process oriented approach also causes problems. When reviewing development and quality aspects there are no 'hard facts' that can be judged against specified criteria; the issues are 'softer', and the task is more one of building up a structure of arguments and evidence that the requirements are met. The regulatory review strategy must therefore change to follow the development process over the whole life cycle, from the concept phase until installation and operation. Even if we know which factors are of interest, we need guidance on how to interpret and judge the information. For that purpose SKI started research activities in this area at the end of the 1990s. In the first phase, in co-operation with Gustav Dahll at the Halden project, a life cycle model was selected. For the different phases a qualitative influence net of the type used in Bayesian Belief Networks was constructed, together with a discussion of the different issues involved. In the second phase of the research work, in co-operation with Norman Wainwright, a former NII inspector, information from a selection of the most important sources, such as guidelines and IAEA and EC reports, was mapped into the influence net structure (the full list of sources used is in the report). The result is presented in the form of questions (Q) and a...

  19. Safety Justification of Software Systems. Software Based Safety Systems. Regulatory Inspection Handbook

    Dahll, Gustav (OECD Halden Project, Halden (NO)); Liwaang, Bo (Swedish Nuclear Power Inspectorate, Stockholm (Sweden)); Wainwright, Norman (Wainwright Safety Advice (GB))

    2006-07-01

    The introduction of new software based technology in the safety systems of nuclear power plants makes it necessary to develop new strategies for regulatory review and assessment of these systems, strategies more focused on reviewing the processes in the different design phases of the system life cycle. It is a general requirement that the licensee shall perform different kinds of reviews. From a regulatory point of view it is more cost effective to assess whether the design activities at the suppliers and the review activities within the development project are performed with good quality. But the change from technical reviews to a development-process oriented approach also causes problems. When reviewing development and quality aspects there are no 'hard facts' that can be judged against specified criteria; the issues are 'softer', and the task is more one of building up a structure of arguments and evidence that the requirements are met. The regulatory review strategy must therefore change to follow the development process over the whole life cycle, from the concept phase until installation and operation. Even if we know which factors are of interest, we need guidance on how to interpret and judge the information. For that purpose SKI started research activities in this area at the end of the 1990s. In the first phase, in co-operation with Gustav Dahll at the Halden project, a life cycle model was selected. For the different phases a qualitative influence net of the type used in Bayesian Belief Networks was constructed, together with a discussion of the different issues involved. In the second phase of the research work, in co-operation with Norman Wainwright, a former NII inspector, information from a selection of the most important sources, such as guidelines and IAEA and EC reports, was mapped into the influence net structure (the full list of sources used is in the report). The result is presented in the form of...

  20. Software quality testing process analysis

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, reviewing, and analyzing books, journals and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  1. Software FMEA analysis for safety-related application software

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: (1) We develop a modified FMEA analysis suited to software architecture. (2) A template for failure modes in a specific software language is established. (3) A detailed-level software FMEA analysis of nuclear safety software is presented. Abstract: A method of software safety analysis for safety-related application software is described in this paper. The target software system is the software code installed in the Automatic Test and Interface Processor (ATIP) of a digital reactor protection system (DRPS). For the ATIP software safety analysis, an overall safety or hazard analysis is first performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA is carried out based on a failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. Applied to the ATIP software code, which had been integrated and had passed through a very rigorous system test procedure, the software FMEA analysis proved able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.
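
    To illustrate the failure-mode-template idea this record describes, here is a hypothetical sketch; the block types, failure modes and names are invented, not the actual ATIP templates:

      # Hypothetical failure-mode templates keyed by FBD function-block type.
      FAILURE_MODE_TEMPLATE = {
          "AND": ["output stuck TRUE", "output stuck FALSE"],
          "CMP": ["comparison inverted", "output stuck FALSE"],
          "TON": ["timer never expires", "timer expires early"],
      }

      def fmea_worksheet(blocks):
          """Expand (block id, block type, postulated effect) triples into FMEA rows."""
          rows = []
          for block_id, block_type, effect in blocks:
              for mode in FAILURE_MODE_TEMPLATE.get(block_type, ["unspecified failure"]):
                  rows.append({"block": block_id, "type": block_type,
                               "failure_mode": mode, "system_effect": effect})
          return rows

      # Toy trip-logic module with two blocks (names invented).
      for row in fmea_worksheet([("B1", "CMP", "missed trip demand"),
                                 ("B2", "AND", "spurious trip")]):
          print(row)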

  2. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de...

  3. Handbook of univariate and multivariate data analysis with IBM SPSS

    Ho, Robert

    2013-01-01

    Using the same accessible, hands-on approach as its best-selling predecessor, the Handbook of Univariate and Multivariate Data Analysis with IBM SPSS, Second Edition explains how to apply statistical tests to experimental findings, identify the assumptions underlying the tests, and interpret the findings. This second edition now covers more topics and has been updated with the SPSS statistical package for Windows. New to the Second Edition: three new chapters on multiple discriminant analysis, logistic regression, and canonical correlation; a new section on how to deal with missing data; coverage of te...

  4. Applications of ion beam analysis workshop. Workshop handbook

    1995-01-01

    A workshop on applications of ion beam analysis was held at ANSTO, immediately prior to the IBMM-95 Conference in Canberra. Its aim was to review developments and the current status of the use of ion beams for analysis, emphasizing the following aspects: fundamental ion beam research and secondary effects of ion beams; materials science, geological, life sciences, environmental and industrial applications; computing codes for use in accelerator research; high energy heavy ion scattering and recoil; and recent technological developments using ion beams. The handbook contains the workshop's program, 29 abstracts and a list of participants.

  5. On-Orbit Software Analysis

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: Discovery and verification of software program properties and dependencies, Detection and isolation of software defects across different versions of software, and Compilation of historical data and technical expertise for future applications

  6. Software architecture analysis tool : software architecture metrics collection

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a...
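
    A sketch of the kind of metric such a tool collects; fan-in/fan-out over a module dependency graph is a common choice for spotting bottlenecks (the tool's actual metric set is not specified in this abstract):

      from collections import defaultdict

      # Toy module-dependency edges: (user, used).
      deps = [("ui", "core"), ("ui", "net"), ("net", "core"), ("tools", "core")]

      fan_out = defaultdict(int)  # how many modules this module uses
      fan_in = defaultdict(int)   # how many modules use this module
      for user, used in deps:
          fan_out[user] += 1
          fan_in[used] += 1

      # A module with very high fan-in is a candidate bottleneck.
      for mod in sorted({m for edge in deps for m in edge}):
          print(f"{mod}: fan-in={fan_in[mod]}, fan-out={fan_out[mod]}")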

  7. Software development for teleroentgenogram analysis

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method that was developed in this medical department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software; this will make the process of calculating teleroentgenograms easier because methodological points will be placed automatically.

  8. Diversion Path Analysis Handbook. Volume 1. Methodology

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool which is used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) Subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable Insider. The DPA team should consist of two individuals who have technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation

  9. Summary of project to develop handbook of human reliability analysis for nuclear power plant operations

    Swain, A.D.

    1978-01-01

    For the past two years Alan Swain and Henry E. Guttmann, of the Statistics, Computing, and Human Factors Division, Sandia Laboratories, have been developing a handbook to aid qualified persons to evaluate the effect of human error on the availability of engineered safety systems and features in nuclear power plants. The handbook includes a mathematical model, procedures, derived human failure data, and principles of human behavior and ergonomics. The handbook expands upon the human error analyses presented in WASH-1400. The work, under the sponsorship of the Probabilistic Analysis Staff, NRC Office of Nuclear Regulatory Research (Dr. M.C. Cullingford, NRC Program Manager), is about half completed. An outline of the handbook contents is given in copies of vugraphs (attached), followed by copies of human performance model abstracts (also attached). A first draft of the handbook is scheduled for NRC review by July 1, 1979.
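
    As a simplified illustration of the kind of quantification such a handbook supports (assuming independent task steps; the handbook's own models add dependence and performance-shaping factors):

      def p_task_failure(heps):
          """P(at least one error) across independent steps with the given HEPs."""
          p_success = 1.0
          for hep in heps:
              p_success *= (1.0 - hep)
          return 1.0 - p_success

      # Hypothetical three-step procedure with per-step human error
      # probabilities of 0.003, 0.001 and 0.01 (illustrative values only).
      print(f"{p_task_failure([0.003, 0.001, 0.01]):.4f}")  # 0.0140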

  10. Diversion path analysis handbook. Volume 2 (of 4 volumes). Example

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Volume 2 of the Handbook is divided into two parts, the workpaper documentation and the summary documentation. The former sets forth, in terms of the hypothetical process, the analysis guidelines, the information gathered, the characterization of the process, the specific diversion paths related to the process, and, finally, the results and findings of the Diversion Path Analysis (DPA). The summary documentation, made up of portions of sections already prepared for the workpapers, is a concise statement of results and recommendations for management use. Most of the details available in the workpapers are not used, or are held to a minimum, in this report. Also, some rearrangement of the excerpted sections has been made in order to permit rapid comprehension by a manager having only limited time to devote to study and review of the analysis

  11. Diversion path analysis handbook. Volume I. Methodology

    Maltese, M.D.K.; Goodwin, K.E.; Schleter, J.C.

    1976-10-01

    Diversion Path Analysis (DPA) is a procedure for analyzing internal controls of a facility in order to identify vulnerabilities to successful diversion of material by an adversary. The internal covert threat is addressed but the results are also applicable to the external overt threat. The diversion paths are identified. Complexity parameters include records alteration or falsification, multiple removals of sub-threshold quantities, collusion, and access authorization of the individual. Indicators, or data elements and information of significance to detection of unprevented theft, are identified by means of DPA. Indicator sensitivity is developed in terms of the threshold quantity, the elapsed time between removal and indication and the degree of localization of facility area and personnel given by the indicator. Evaluation of facility internal controls in light of these sensitivities defines the capability of interrupting identified adversary action sequences related to acquisition of material at fixed sites associated with the identified potential vulnerabilities. Corrective measures can, in many cases, also be prescribed for management consideration and action. DPA theory and concepts have been developing over the last several years, and initial field testing proved both the feasibility and practicality of the procedure. Follow-on implementation testing verified the ability of facility personnel to perform DPA

  12. Software Design for Smile Analysis

    A. Sarkhosh

    2010-12-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a "posed smile" as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After loading records into the software, the operator marks the points and lines which are displayed in the system's guide and also defines the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress.
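
    A sketch of the core measurement step the abstract describes: scale calibration from a reference length, followed by landmark-to-landmark distances (the landmark names and values are invented, not the software's):

      import math

      def distance(p, q):
          return math.hypot(p[0] - q[0], p[1] - q[1])

      # The operator marks points in pixel coordinates; a reference object of
      # known length in the photo fixes the mm-per-pixel scale (values invented).
      ruler_px = (100, 50), (300, 50)              # a 10 mm reference object
      mm_per_px = 10.0 / distance(*ruler_px)

      left_commissure, right_commissure = (210, 420), (470, 415)
      smile_width_mm = distance(left_commissure, right_commissure) * mm_per_px
      print(f"smile width: {smile_width_mm:.1f} mm")  # 13.0 mm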

  13. NASA Accident Precursor Analysis Handbook, Version 1.0

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur". At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators". These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take steps that are necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate...

  14. Handbook of soil analysis. Mineralogical, organic and inorganic methods

    Pansu, M. [Centre IRD, 34 - Montpellier (France); Gautheyrou, J.

    2006-07-01

    This handbook is a reference guide for selecting and carrying out numerous methods of soil analysis. It is written in accordance with analytical standards and quality control approaches. It covers a large body of technical information including protocols, tables, formulae, spectrum models, chromatograms and additional analytical diagrams. The approaches are diverse, from the simplest tests to the most sophisticated determination methods in the physical chemistry of mineralogical and organic structures, available and total elements, soil exchange complex, pesticides and contaminants, trace elements and isotopes. As a basic reference, it will be particularly useful to scientists, engineers, technicians, professors and students, in the areas of soil science, agronomy, earth and environmental sciences as well as in related fields such as analytical chemistry, geology, hydrology, ecology, climatology, civil engineering and industrial activities associated with soil. (orig.)

  15. Handbook of nuclear data for neutron activation analysis. Vol. I

    Hnatowicz, V.

    1986-01-01

    The first part of a two-volume book which is meant for experimentalists working in instrumental activation analysis and related fields, such as nuclear metrology, materials testing and environmental studies. The volume describes the basic processes of gamma-ray interaction with matter as well as the important phenomena affecting gamma-ray spectra formation in semiconductor spectrometers. A brief account is also given of computation methods commonly employed for spectra evaluation. The results rather than detailed derivations are stressed. A great deal of material is divided into five chapters and nine appendices. The inclusion of many tables of significant spectroscopic data should make the text a useful handbook for those dealing with multi-channel gamma-ray spectra. (author) 26 figs., 82 tabs., 334 refs

  16. Engineer’s Handbook. Central Archive for Reusable Defense Software (CARDS)

    1994-02-28

    This handbook supports the Central Archive for Reusable Defense Software (CARDS) franchise concept: a franchise is an instance of a domain-specific infrastructure built utilizing the CARDS Concept of Operations, and a franchisee is the group to whom a franchise is granted. Franchisees develop and operate a domain-specific library system and necessary tools, and develop a Franchise Plan which provides a blueprint for... Related materials include Adoption Handbooks, CARDS Command Center Library development and operation related documents, and training and educational material.

  17. Handbook of human-reliability analysis with emphasis on nuclear power plant applications. Final report

    Swain, A D; Guttmann, H E

    1983-08-01

    The primary purpose of the Handbook is to present methods, models, and estimated human error probabilities (HEPs) to enable qualified analysts to make quantitative or qualitative assessments of occurrences of human errors in nuclear power plants (NPPs) that affect the availability or operational reliability of engineered safety features and components. The Handbook is intended to provide much of the modeling and information necessary for the performance of human reliability analysis (HRA) as a part of probabilistic risk assessment (PRA) of NPPs. Although not a design guide, a second purpose of the Handbook is to enable the user to recognize error-likely equipment design, plant policies and practices, written procedures, and other human factors problems so that improvements can be considered. The Handbook provides the methodology to identify and quantify the potential for human error in NPP tasks.

  18. Handbook of human-reliability analysis with emphasis on nuclear power plant applications. Final report

    Swain, A.D.; Guttmann, H.E.

    1983-08-01

    The primary purpose of the Handbook is to present methods, models, and estimated human error probabilities (HEPs) to enable qualified analysts to make quantitative or qualitative assessments of occurrences of human errors in nuclear power plants (NPPs) that affect the availability or operational reliability of engineered safety features and components. The Handbook is intended to provide much of the modeling and information necessary for the performance of human reliability analysis (HRA) as a part of probabilistic risk assessment (PRA) of NPPs. Although not a design guide, a second purpose of the Handbook is to enable the user to recognize error-likely equipment design, plant policies and practices, written procedures, and other human factors problems so that improvements can be considered. The Handbook provides the methodology to identify and quantify the potential for human error in NPP tasks

  19. Application of Software Safety Analysis Methods

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)
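
    A toy sketch of the bistable/coincidence split described above. The two-out-of-four voting threshold and all numbers are assumptions for illustration; the abstract does not give the actual trip logic.

        # Sketch: per-channel bistable comparison feeding a coincidence vote.
        def bistable(process_value, setpoint):
            """One channel's partial trip: the variable exceeds its setpoint."""
            return process_value > setpoint

        def coincidence(partial_trips, need=2):
            """Final trip when at least `need` of the channels demand it."""
            return sum(partial_trips) >= need

        readings = [15.9, 16.3, 16.2, 15.8]  # hypothetical four-channel readings
        setpoint = 16.0
        partial = [bistable(v, setpoint) for v in readings]
        print("trip" if coincidence(partial) else "no trip")  # two channels exceed -> trip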

  19. MAUS: MICE Analysis User Software

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
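
    A minimal sketch of the Map-Reduce structure the abstract describes, with JSON as the event format. The map_reconstruct and reduce_hit_count components are hypothetical stand-ins, not actual MAUS modules.

        # Sketch: mappers transform one event at a time; a reducer aggregates.
        import json
        from functools import reduce

        def map_reconstruct(event):
            """Hypothetical map step: derive a per-event quantity."""
            event["n_hits"] = len(event.get("hits", []))
            return event

        def reduce_hit_count(total, event):
            """Hypothetical reduce step: accumulate across events."""
            return total + event["n_hits"]

        raw = ['{"hits": [101, 102, 103]}', '{"hits": [104]}']
        events = [map_reconstruct(json.loads(line)) for line in raw]
        print(reduce(reduce_hit_count, events, 0))  # -> 4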

  1. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Christopher Carling

    2006-03-01

    DESCRIPTION This book addresses and clearly explains soccer match analysis, looks at the very latest in match analysis research, and at the innovative technologies used by professional clubs. This handbook also bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a prepared manual on soccer match analysis in general for coaches and sport scientists. Thus, the professionals in this field can gather objective data on the players and the team, which in turn can be used by coaches and players to learn more about performance as a whole and gain a competitive advantage as a result. The book efficiently meets these objectives. AUDIENCE The book is targeted at the athlete, the coach, the sports science professional or any sport-conscious person who wishes to analyze relevant soccer performance. The editors and the contributors are authorities in their respective fields, and this handbook depends on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive, as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  2. HANFORD SAFETY ANALYSIS and RISK ASSESSMENT HANDBOOK (SARAH)

    EVANS, C.B.

    2004-01-01

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S and M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites are also subject to the hazard analysis methodologies of this standard.

  3. Software Performs Complex Design Analysis

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and to predict failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing CFD designers to freely create their own shape parameters, thereby eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  4. Intercomparison of gamma ray analysis software packages

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma-ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given, without any recommendation of a particular software package or method for gamma-ray spectra analysis.

  5. Infusing Reliability Techniques into Software Safety Analysis

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  6. Fault tree analysis of KNICS RPS software

    Park, Gee Yong; Kwon, Kee Choon; Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun; Lee, Dae Hyung

    2008-01-01

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis
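
    A sketch of how such a fault tree is quantified once the templates have been instantiated, assuming independent basic events: AND gates multiply probabilities, OR gates combine them as 1 - prod(1 - p). The tree and all probabilities below are invented, not taken from the KNICS RPS analysis.

        # Sketch: quantifying a two-level fault tree with independent events.
        from math import prod

        def gate(kind, inputs):
            if kind == "AND":
                return prod(inputs)
            if kind == "OR":
                return 1 - prod(1 - p for p in inputs)
            raise ValueError(kind)

        comparison_fb_fails = 1e-4  # hypothetical basic-event probabilities
        setpoint_corrupted = 5e-5
        vote_logic_fails = 2e-5

        no_trip_from_channel = gate("OR", [comparison_fb_fails, setpoint_corrupted])
        top = gate("AND", [no_trip_from_channel, vote_logic_fails])
        print(f"top event probability ~ {top:.2e}")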

  7. Dependability Analysis Methods For Configurable Software

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems built up from standard software components, in the same way as a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, including in safety-related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is, and what is particular about the reliability assessment of such software. Two very commonly used techniques in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of software reliability in such systems are discussed. Finally, some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)

  8. Reliability analysis of software based safety functions

    Pulkkinen, U.

    1993-05-01

    The methods applicable in the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn. Among these are the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)
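
    As a concrete instance of the quantitative models such reports survey, a sketch of the Goel-Okumoto reliability growth model, in which the expected number of failures observed by test time t is m(t) = a(1 - e^(-bt)). The parameter values are illustrative, not fitted to real failure data.

        # Sketch: Goel-Okumoto NHPP software reliability growth model.
        import math

        def expected_failures(t, a=100.0, b=0.05):
            """Expected cumulative failures by test time t; a = total faults."""
            return a * (1 - math.exp(-b * t))

        def failure_intensity(t, a=100.0, b=0.05):
            """Instantaneous failure rate at time t."""
            return a * b * math.exp(-b * t)

        for t in (10, 50, 100):
            print(f"t={t:3d}: m(t)={expected_failures(t):6.1f}, "
                  f"lambda(t)={failure_intensity(t):.3f}")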

  9. Software architecture analysis of usability

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  10. International Reactor Physics Handbook Database and Analysis Tool (IDAT) - IDAT user manual

    2013-01-01

    The IRPhEP Database and Analysis Tool (IDAT) was first released in 2013 and is included on the DVD. This database and corresponding user interface allow easy access to handbook information. Selected information from each configuration was entered into IDAT, such as the measurements performed, benchmark values, calculated values and materials specifications of the benchmark. In many cases this is supplemented with calculated data such as neutron balance data, spectra data, k-eff nuclear data sensitivities, and spatial reaction rate plots. IDAT accomplishes two main objectives: 1. Allow users to search the handbook for experimental configurations that satisfy their input criteria. 2. Allow users to trend results and identify suitable benchmark experiments for their application. IDAT provides the user with access to several categories of calculated data, including: 1-group neutron balance data for each configuration, with individual isotope contributions in the reactor system; flux and other reaction rate spectra in a 299-group energy scheme, with plotting capabilities implemented into IDAT allowing the user to compare the spectra of selected configurations in the original fine energy structure or on any user-defined broader energy structure; and sensitivity coefficients (percent changes of k-effective due to an elementary change of basic nuclear data) for the major nuclides and nuclear processes in a 238-group energy structure. IDAT is actively being developed. Those approved to access the online version of the handbook will also have access to an online version of IDAT. As May 2013 marks the first release, IDAT may contain data entry errors and omissions. The handbook remains the primary source of reactor physics benchmark data. A copy of the IDAT user's manual is attached to this document. A copy of the IRPhE Handbook can be obtained on request at http://www.oecd-nea.org/science/wprs/irphe/irphe-handbook/form.html
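
    The sensitivity coefficients stored in IDAT follow the usual definition, the relative change in k-effective per relative change in a nuclear datum, S = (dk/k)/(dσ/σ); a worked sketch with invented numbers:

        # Sketch: k-effective sensitivity to a +1% change in one nuclear datum.
        k_base, k_perturbed = 1.00250, 1.00310
        sigma_rel_change = 0.01  # hypothetical +1% change in, e.g., a fission cross-section

        sensitivity = ((k_perturbed - k_base) / k_base) / sigma_rel_change
        print(f"sensitivity coefficient ~ {sensitivity:.3f}")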

  11. Analysis of open source GIS software

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and demanding of advanced skills. There is a lot of commercial GIS software which is well advertised and whose functionality is well known, while open source software is often overlooked. In this diploma work, an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  12. Component Provider’s and Tool Developer’s Handbook. Central Archive for Reusable Defense Software (CARDS)

    1994-03-25

    metrics [DISA93b]. The Software Engineering Institute (SEI) has developed a domain analysis process (Feature-Oriented Domain Analysis, FODA) and is ... and expresses the range of variability of these decisions. 3.2.2.3 Feature-Oriented Domain Analysis. Feature-Oriented Domain Analysis (FODA) is a domain ... documents created in this phase. From a purely profit-oriented business point of view, a company may develop its own analysis of a government or commercial

  13. Gamma-Ray Spectrum Analysis Software GDA

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as a software package, version 1.02, named GDA, an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch diskettes for setup and a user's manual. GDA can be installed for use on a personal computer with the Windows 95 or Windows NT 4.0 operating system. A computer with an 80486 CPU and 8 megabytes of memory is sufficient.

  14. Software architecture analysis of usability

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability related change requests are made after its deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that

  15. Handbook of univariate and multivariate data analysis and interpretation with SPSS

    Ho, Robert

    2006-01-01

    Many statistics texts tend to focus more on the theory and mathematics underlying statistical tests than on their applications and interpretation. This can leave readers with little understanding of how to apply statistical tests or how to interpret their findings. While the SPSS statistical software has done much to alleviate the frustrations of social science professionals and students who must analyze data, they still face daunting challenges in selecting the proper tests, executing the tests, and interpreting the test results.With emphasis firmly on such practical matters, this handbook se

  16. Development of a fatigue analysis software system

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software system to predict the fatigue lives of mechanical components and structures was developed. This software has some characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system which can assist users in obtaining more appropriate results. This software can be used in an environment consisting of commercial finite element packages. Using the software, fatigue analyses for a SAE keyhole specimen and an automobile knuckle were carried out. It was observed that the results agreed well with those from commercial packages.

  17. Numerical methods in software and analysis

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  18. Software Process Improvement Using Force Field Analysis ...

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  19. Software safety analysis practice in installation phase

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  1. Computer-assisted qualitative data analysis software.

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  2. Software safety analysis application in installation phase

    Huang, H. W.; Yih, S.; Wang, L. H.; Liao, B. C.; Lin, J. M.; Kao, T. M.

    2010-01-01

    This work performed a software safety analysis (SSA) in the installation phase of the Lungmen nuclear power plant (LMNPP) in Taiwan, under the cooperation of INER and TPC. The US Nuclear Regulatory Commission (USNRC) requests licensees to perform software safety analysis (SSA) and software verification and validation (SV and V) in each phase of the software development life cycle, per Branch Technical Position (BTP) 7-14. In this work, 37 safety-grade digital instrumentation and control (I and C) systems were analyzed by Failure Mode and Effects Analysis (FMEA), as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The FMEA showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (authors)

  3. A Poetry Handbook.

    Oliver, Mary

    Intended to impart the basic ways a poem is constructed, this concise handbook is a prose guide to writing poetry. The handbook talks about meter and rhyme, form and diction, sound and sense, iambs and trochees, couplets and sonnets, and how and why this should matter to any person writing or reading poetry. Interspersing history and analysis with…

  4. Handbook of methods for risk-based analysis of technical specification requirements

    Samanta, P.K.; Vesely, W.E.

    1994-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk and reliability-based methods to improve TS requirements has gained wide interest because these methods can quantitatively evaluate risk and justify changes based on objective risk arguments, and can provide a defensible basis for these requirements in regulatory applications. The US NRC Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.
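
    As a sketch of the AOT evaluations such handbooks summarize, the risk of a single downtime is commonly expressed as an incremental core damage probability: (conditional CDF with the component out of service - baseline CDF) multiplied by the downtime. All numbers below are illustrative, not handbook values.

        # Sketch: incremental core damage probability for one allowed outage.
        baseline_cdf = 5.0e-5      # per year, all equipment available (hypothetical)
        conditional_cdf = 8.0e-4   # per year, component out of service (hypothetical)
        aot_hours = 72.0

        icdp = (conditional_cdf - baseline_cdf) * aot_hours / 8760.0
        print(f"single-downtime risk over a {aot_hours:.0f} h AOT: {icdp:.2e}")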

  5. Handbook of methods for risk-based analysis of Technical Specification requirements

    Samanta, P.K.; Vesely, W.E.

    1993-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk and reliability-based methods to improve TS requirements has gained wide interest because these methods can quantitatively evaluate the risk impact and justify changes based on objective risk arguments, and can provide a defensible basis for these requirements in regulatory applications. The United States Nuclear Regulatory Commission (USNRC) Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.

  6. Software criticality analysis of COTS/SOUP

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  7. Software

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed a software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis core photos and images; waveforms and NMR; and external files documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed a software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes an emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  8. Operations planning and analysis handbook for NASA/MSFC phase B development projects

    Batson, Robert C.

    1986-01-01

    Current operations planning and analysis practices on NASA/MSFC Phase B projects were investigated with the objectives of (1) formalizing these practices into a handbook and (2) suggesting improvements. The study focused on how Science and Engineering (S&E) operational personnel support Program Development (PD) task teams. The intimate relationship between systems engineering and operations analysis was examined. Methods identified for use by operations analysts during Phase B include functional analysis, interface analysis, and methods to calculate/allocate such criteria as reliability, maintainability, and operations and support cost.

  9. Acoustic Emission Analysis Applet (AEAA) Software

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected with Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure. AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact on missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  10. Introducing a New Software for Geodetic Analysis

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  11. Software for Graph Analysis and Visualization

    M. I. Kolomeychenko

    2014-01-01

    This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for the analysis and visualization of graphs, describes the overall architecture of the application, and explains the basic principles of construction and operation of its main modules. Furthermore, a description of the developed graph storage, oriented to the storage and processing of large-scale graphs, is presented. The developed algorithm for finding communities and the implemented graph auto-layout algorithms are the main functionality of the product. The main advantage of the developed software is high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating with big graphs and have high productivity.
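
    A sketch of the adjacency-list representation that graph storage of this kind is typically built on; the described product's actual storage engine is not public, so this shows only the general idea.

        # Sketch: adjacency-list storage keeps degree queries and traversals cheap.
        from collections import defaultdict

        edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
        adjacency = defaultdict(set)
        for u, v in edges:
            adjacency[u].add(v)
            adjacency[v].add(u)  # undirected graph

        degrees = {node: len(nbrs) for node, nbrs in adjacency.items()}
        print(max(degrees, key=degrees.get), degrees)  # highest-degree node, all degrees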

  12. Power and performance software analysis and optimization

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  13. PIV/HPIV Film Analysis Software Package

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.

  14. Software for computerised analysis of cardiotocographic traces.

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, a new software package for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. On artificial signals the software showed preliminary performance we judged satisfactory, in that all simulated events were detected. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, less satisfactory: they yielded a sensitivity of 93%, a positive predictive value of 82% and an accuracy of 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
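
    The three performance indexes quoted above are standard confusion-matrix quantities; a sketch with hypothetical event counts (not the study's data):

        # Sketch: sensitivity, positive predictive value and accuracy.
        tp, fp, fn, tn = 93, 20, 7, 30  # invented counts of detected/missed events

        sensitivity = tp / (tp + fn)                # fraction of true events detected
        ppv = tp / (tp + fp)                        # fraction of detections that are real
        accuracy = (tp + tn) / (tp + fp + fn + tn)

        print(f"sensitivity={sensitivity:.0%}, PPV={ppv:.0%}, accuracy={accuracy:.0%}")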

  15. GRACAT, Software for grounding and collision analysis

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...

  16. Residence time distribution software analysis. User's manual

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, such as the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for data processing for radiotracer experiments.
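
    A sketch of the basic RTD computation such software automates: normalize the measured tracer response C(t) to the E-curve and take its moments. The data points are invented.

        # Sketch: mean residence time and variance from a tracer response curve.
        import numpy as np

        t = np.array([0, 1, 2, 3, 4, 5, 6], dtype=float)  # minutes
        c = np.array([0, 3, 8, 6, 3, 1, 0], dtype=float)  # detector counts

        e = c / np.trapz(c, t)  # E(t): RTD normalized to unit area
        mean_residence = np.trapz(t * e, t)
        variance = np.trapz((t - mean_residence) ** 2 * e, t)
        print(f"mean residence time = {mean_residence:.2f} min, variance = {variance:.2f} min^2")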

  17. Improving Software Systems By Flow Control Analysis

    Piotr Poznanski

    2012-01-01

    Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of the artifacts that bridge software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big picture model". Flow diagram analysis helped to avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the big picture model improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  18. Intraprocedural dataflow analysis for software product lines

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP...... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach....

  19. Automating risk analysis of software design models.

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  1. Brazing handbook

    American Welding Society

    2007-01-01

    By agreement between the American Welding Society C3 Committee on Brazing and Soldering and the ASM Handbook Committee, the AWS Brazing Handbook has been formally adopted as part of the ASM Handbook Series. Through this agreement, the brazing content in the ASM Handbook is significantly updated and expanded. The AWS Brazing Handbook, 5th Edition provides a comprehensive, organized survey of the basics of brazing, processes, and applications. Addresses the fundamentals of brazing, brazement design, brazing filler metals and fluxes, safety and health, and many other topics. Includes new chapters on induction brazing and diamond brazing.

  2. Exploratory analysis of textual data from the Mother and Child Handbook using the text-mining method: Relationships with maternal traits and post-partum depression.

    Matsuda, Yoshio; Manaka, Tomoko; Kobayashi, Makiko; Sato, Shuhei; Ohwada, Michitaka

    2016-06-01

    The aim of the present study was to examine the possibility of screening apprehensive pregnant women and mothers at risk for post-partum depression from an analysis of the textual data in the Mother and Child Handbook by using the text-mining method. Uncomplicated pregnant women (n = 58) were divided into two groups according to State-Trait Anxiety Inventory grade (high trait [group I, n = 21] and low trait [group II, n = 37]) or Edinburgh Postnatal Depression Scale score (high score [group III, n = 15] and low score [group IV, n = 43]). An exploratory analysis of the textual data from the Maternal and Child Handbook was conducted using the text-mining method with the Word Miner software program. A comparison of the 'structure elements' was made between the two groups. The number of structure elements extracted by separated words from text data was 20 004 and the number of structure elements with a threshold of 2 or more as an initial value was 1168. Fifteen key words related to maternal anxiety, and six key words related to post-partum depression were extracted. The text-mining method is useful for the exploratory analysis of textual data obtained from pregnant woman, and this screening method has been suggested to be useful for apprehensive pregnant women and mothers at risk for post-partum depression. © 2016 Japan Society of Obstetrics and Gynecology.

  3. Development of default uncertainties for the value/benefit attributes in the regulatory analysis technical evaluation handbook

    Gallucci, Raymond H.V.

    2016-01-01

    Highlights: uncertainties for values/benefits; upper bound four times higher than the mean; distributional histograms. Abstract: NUREG/BR-0184, Regulatory Analysis Technical Evaluation (RATE) Handbook, was produced in 1997 as an update to the original NUREG/CR-3568, A Handbook for Value-Impact Assessment (1983). Both documents, especially the later RATE Handbook, have been used extensively by the USNRC and its contractors not only for regulatory analyses to support backfit considerations but also for similar applications, such as Severe Accident Management Alternative (SAMA) analyses as part of license renewals. While both provided high-level guidance on the performance of uncertainty analyses for the various value/benefit attributes, detailed quantification was not of prime interest at the times of the Handbooks’ development, defaulting only to best estimates with low and high bounds on these attributes. As the USNRC examines the possibility of updating the RATE Handbook, renewed interest in a more quantitative approach to uncertainty analyses for the attributes has surfaced. As the result of an effort to enhance the RATE Handbook to permit at least default uncertainty analyses for the value/benefit attributes, it has proven feasible to assign default uncertainties in terms of 95th %ile upper bounds (and absolute lower bounds) on the five dominant value/benefit attributes, and their sum, when performing a regulatory analysis via the RATE Handbook. Appropriate default lower bounds of zero (no value/benefit) and an upper bound (95th %ile) that is four times higher than the mean (for individual value/benefit attributes) or three times higher (for their summation) can be recommended. Distributions in the form of histograms on the summed value/benefit attributes are also provided, which could be combined, after appropriate scaling and most likely via simulation, with their counterpart(s) from the impact/cost analysis to yield a final distribution on the net
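
    A sketch of the combination-by-simulation step the abstract points to: sample each value/benefit attribute from its assigned distribution, sum the samples, and read percentiles off the summed distribution. The exponential marginals and attribute means below are stand-ins, not the Handbook's defaults.

        # Sketch: Monte Carlo summation of per-attribute benefit distributions.
        import numpy as np

        rng = np.random.default_rng(42)
        attribute_means = {"public_health": 2.0e5, "occupational": 3.0e4,
                           "offsite_property": 8.0e4}  # dollars, hypothetical

        n = 100_000
        total = sum(rng.exponential(mean, n) for mean in attribute_means.values())

        print(f"mean total benefit: ${total.mean():,.0f}")
        print(f"95th percentile   : ${np.percentile(total, 95):,.0f}")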

  4. Development of neutron activation analysis software

    Wang Liyu

    1987-10-01

    The software for quantitative neutron activation analysis was developed to run under the MS/DOS operating system. The programmes of the IBM/SPAN include: spectra file transfer from and to a Canberra Series 35 multichannel analyzer, spectrum evaluation routines, calibration subprogrammes, and quantitative analysis. The programmes for spectrum analysis include a fitting routine for the separation of multiple lines, which reproduces the peak shape with a combination of Gaussian and exponential terms. The programmes were tested on an IBM/AT-compatible computer. The programmes and the sources are available cost-free for the IAEA projects of Technical Cooperation. 7 refs, 3 figs

  5. IUE Data Analysis Software for Personal Computers

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  6. Application of Metric-based Software Reliability Analysis to Example Software

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed by using two metric-based software reliability analysis methods: a state transition diagram-based method and a test coverage-based method. The procedures for the software reliability analysis using the two methods, and the analysis results, are provided in this report. It is found that the two methods complement each other, and further research on combining the two methods, to exploit this complementary effect in software reliability analysis, is therefore recommended.

  7. Digital PIV (DPIV) Software Analysis System

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitutes a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the windows environment.

  8. Specdata: Automated Analysis Software for Broadband Spectra

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving considerable flexibility to assign, manipulate, interpret and export the analysis. The automated, and key, component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
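
    A sketch of the core assignment step described above: matching observed line frequencies against catalogued transitions within a tolerance. The tolerance and line list are illustrative; the rest frequencies are approximately right but should be taken from a real catalogue in practice.

        # Sketch: assign observed lines to catalogued transitions within a window.
        observed_mhz = [9098.41, 23694.37, 1665.35]
        catalogue = {9098.332: "HC3N J=1-0", 23694.495: "NH3 (1,1)", 1665.402: "OH 1665"}

        tolerance_mhz = 0.25
        for line in observed_mhz:
            hits = [name for freq, name in catalogue.items()
                    if abs(freq - line) <= tolerance_mhz]
            print(f"{line:10.3f} MHz -> {hits if hits else 'unassigned'}")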

  9. Software development processes and analysis software: a mismatch and a novel framework

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  10. CMS Computing Software and Analysis Challenge 2006

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain: the prompt reconstruction, the data streaming, iterative calibration and alignment executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  11. Engineering Design Handbook. Army Weapon Systems Analysis. Part 2

    1979-10-01

    detecting the target with a given number of looks. Schweitzer (Ref. 21) also studied this same problem of minimum-search policy. In Ref. 21, Schweitzer ... Stephen M. Pollack, "A Simple Model of Search for a Moving Target", Operations Research 18, pp. 883-903 (September-October 1970). 21. Paul J. Schweitzer ... Albert Shapero and Charles Bates, Jr., A Method for Performing Human Engineering Analysis of Weapon Systems, WADC Technical Report 59-784, Wright

  13. Longitudinal data analysis a handbook of modern statistical methods

    Fitzmaurice, Garrett; Verbeke, Geert; Molenberghs, Geert

    2008-01-01

    Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data. After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint

  14. Handbook of Basic Tables for Chemical Analysis. Final report

    Bruno, T.J.; Svoronos, P.D.N.

    1988-04-01

    This work began as a slim booklet prepared by one of the authors (TJB) to accompany a course on chemical instrumentation presented at the National Bureau of Standards, Boulder Laboratories. The booklet contained tables on chromatography, spectroscopy, and chemical (wet) methods, and was intended to provide the students with enough basic data to design their own analytical methods and procedures. Shortly thereafter, with the co-authorship of Prof. Paris D. N. Svoronos, it was expanded into a more extensive compilation entitled Basic Tables for Chemical Analysis, published as National Bureau of Standards Technical Note 1096. That work has now been expanded and updated into the present body of tables. Although there have been considerable changes since the first version of these tables, the aim has remained essentially the same. The authors have tried to provide a single source of information for those practicing scientists and research students who must use various aspects of chemical analysis in their work. In this respect, it is geared less toward the researcher in analytical chemistry than to those practitioners in other chemical disciplines who must make routine use of chemical analysis.

  15. Decommissioning Handbook

    Cusack, J.G.; Dalfonso, P.H.; Lenyk, R.G.

    1994-01-01

    The Decommissioning Handbook provides technical guidance on conducting decommissioning projects. Information presented ranges from planning logic, regulations affecting decommissioning, technology discussion, and health and safety requirements, to developing a cost estimate. The major focus of the handbook is the technologies -- decontamination technologies, waste treatment, dismantling/segmenting/demolition, and remote operations. Over 90 technologies are discussed in the handbook, with descriptions, applications, and advantages/disadvantages. The handbook was prepared to provide a compendium of available or potentially available technologies in order to aid the planner in meeting the specific needs of each decommissioning project. Other subjects presented in the Decommissioning Handbook include the decommissioning plan, characterization, planning based on final project configuration, environmental protection, and packaging/transportation. These discussions are presented to complement the technologies presented in the handbook

  16. Static analysis of software the abstract interpretation

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  17. Radioactivity Handbook

    Firestone, R.B.; Browne, E.

    1985-01-01

    The Radioactivity Handbook will be published in 1985. This handbook is intended primarily for applied users of nuclear data. It will contain recommended radiation data for all radioactive isotopes. Pages from the Radioactivity Handbook for A = 221 are shown as examples. These have been produced from the LBL Isotopes Project extended ENSDF database. The skeleton schemes have been manually updated from the Table of Isotopes, and the tabular data are prepared using UNIX with a phototypesetter. Some of the features of the Radioactivity Handbook are discussed here

  18. A new paradigm for the development of analysis software

    Kelly, D.; Harauz, J.

    2012-01-01

    For the CANDU industry, analysis software is an important tool for scientists and engineers to examine issues related to safety, operation, and design. However, the software quality assurance approach currently used for these tools assumes the software is the delivered product. In this paper, we present a model that shifts the emphasis from software being the end-product to software being support for the end-product, the science. We describe a novel software development paradigm that supports this shift and provides the groundwork for re-examining the quality assurance practices used for analysis software. (author)

  19. Modularity analysis of automotive control software

    Dajsuren, Y.; van den Brand, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software. For automotive software, modularity is recognized as being a crucial quality a...

  20. Handbook on nuclear data for borehole logging and mineral analysis

    1993-01-01

    In nuclear geophysics, an extension of the nuclear data available for reactor and shielding calculations is required. In general, the problems and the methods of attack are the same, but in nuclear geophysics the environment is earth materials, with virtually all the natural elements in the Periodic Table involved, although not at the same time. In addition, the geometrical configurations encountered in nuclear geophysics are very different from those associated with reactor and shielding design, and they can impose a different demand on the required accuracy of the nuclear data and on the dependence on the calculational approach. Borehole logging is a very good example, since an experimental investigation aimed at varying only one parameter (e.g. moisture content) whilst keeping all the others constant in a geologically complex system that effectively exhibits 'infinite geometry' for neutrons and γ rays is virtually impossible. An increasingly important area of nuclear geophysics is the on-line analysis of natural materials such as coal (e.g. C, H, O, Al, Si, Ca, Fe, Cl, S, N), the raw materials of the cement industry (S, Na, K, Al, Si, Ca, Fe, Mn, Ti, P, Mg, F, O), and mined ores of Fe, Al, Mn, Cu, Ni, Ag and Au, amongst others. Refs, figs and tabs

  1. Development of Image Analysis Software of MAXI

    Eguchi, S.; Ueda, Y.; Hiroi, K.; Isobe, N.; Sugizaki, M.; Suzuki, M.; Tomida, H.; Maxi Team

    2010-12-01

    Monitor of All-sky X-ray Image (MAXI) is an X-ray all-sky monitor, attached to the Japanese experiment module Kibo on the International Space Station. The main scientific goals of the MAXI mission include the discovery of X-ray novae followed by prompt alerts to the community (Negoro et al., in this conference), and production of X-ray all-sky maps and new source catalogs with unprecedented sensitivities. To exploit the full capabilities of the MAXI mission, we are working on the development of detailed image analysis tools. We utilize maximum likelihood fitting to a projected sky image, where we take account of the complicated detector responses, such as the background and point spread functions (PSFs). The modeling of PSFs, which strongly depend on the orbit and attitude of MAXI, is a key element in the image analysis. In this paper, we present the status of our software development.
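
    As a much-reduced illustration of maximum likelihood fitting to counting data, the sketch below fits a single source amplitude on top of a known background with a toy one-dimensional PSF. The real analysis fits projected sky images with orbit- and attitude-dependent PSFs; all numbers here are invented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_likelihood(amp, counts, psf, bkg):
    # Poisson model: expected counts = background + source amplitude x PSF.
    # The constant log(counts!) term is dropped since it does not affect the fit.
    mu = bkg + amp * psf
    return np.sum(mu - counts * np.log(mu))

rng = np.random.default_rng(0)
x = np.arange(-10, 11)
psf = np.exp(-0.5 * (x / 2.0) ** 2)            # toy Gaussian PSF
counts = rng.poisson(3.0 + 50.0 * psf)         # simulated detector counts
res = minimize_scalar(neg_log_likelihood, bounds=(0.0, 500.0),
                      args=(counts, psf, 3.0), method="bounded")
print(f"fitted source amplitude: {res.x:.1f}")  # should be near 50
```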

  2. STAR: Software Toolkit for Analysis Research

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element of nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems

  3. The ESA's Space Trajectory Analysis software suite

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft where orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and
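
    As an illustration of the kind of orbit propagation such a tool performs, here is a minimal two-body propagator in Python. This is a sketch under simplifying assumptions (point-mass Earth, no perturbations), not STA's implementation; the initial orbit is an invented example.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

def two_body(t, state):
    """State is [x, y, z, vx, vy, vz]; returns its time derivative."""
    r = state[:3]
    a = -MU_EARTH * r / np.linalg.norm(r) ** 3   # Newtonian point-mass gravity
    return np.concatenate([state[3:], a])

# Circular low Earth orbit: r = 7000 km, v = sqrt(mu / r)
state0 = [7000.0, 0.0, 0.0, 0.0, np.sqrt(MU_EARTH / 7000.0), 0.0]
sol = solve_ivp(two_body, (0.0, 5400.0), state0, rtol=1e-9)
print(f"position after 90 min: {sol.y[:3, -1]} km")
```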

  4. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Ion Ivan

    2009-10-01

    Full Text Available The concept of similar software is defined. Conditions for archiving software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogeneous software components is analyzed. Groups of similar software products, belonging to the orthogonality intervals, are then built. The results of the analysis are presented in graphical form. Aspects of the functioning of the software product allocated for orthogonality are detailed.

  5. Peer-review study of the draft handbook for human-reliability analysis with emphasis on nuclear-power-plant applications, NUREG/CR-1278

    Brune, R. L.; Weinstein, M.; Fitzwater, M. E.

    1983-01-01

    This report describes a peer review of the draft Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. The purpose of the study was to determine to what extent peers agree with the human behavior models and estimates of human error probabilities (HEPs) contained in the Handbook. Twenty-nine human factors experts participated in the study. Twenty of the participants were Americans; nine were from other countries. The peers performed human reliability analyses of a variety of human performance scenarios describing operator activities in nuclear power plant settings. They also answered questionnaires pertaining to the contents and application of the Handbook. An analysis of peer solutions to the human reliability analysis problems and peer responses to the questionnaire was performed. Recommendations regarding the format and contents of the Handbook were developed from the study findings.

  6. Modularity analysis of automotive control software

    Dajsuren, Y.; van den Brand, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and

  7. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people, involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result either in reliable or unreliable human performance

  8. Elementary study on γ analysis software for low level measurement

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulties in using popular γ analysis software for low-level measurement are discussed. The ROI report file of the ORTEC operation system has been chosen as the interface file for writing γ analysis software for low-level measurement. The authors give the software flowchart and an application example, and discuss the remaining problems

  9. Visual querying and analysis of large software repositories

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  10. Image processing and analysis software development

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss, etc. have been studied. Segmentation techniques including point detection, line detection, and edge detection have been studied, as have some of the smoothing and sharpening filters. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
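
    Two of the enhancement techniques named above, negative imaging and contrast stretching, are easy to sketch with numpy; the percentile limits below are an illustrative choice, not taken from the program described.

```python
import numpy as np

def contrast_stretch(img, lo_pct=2, hi_pct=98):
    """Linearly rescale intensities so the given percentiles map to [0, 255]."""
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    out = (img.astype(float) - lo) / (hi - lo)
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)

def negative(img):
    """Negative imaging: invert every pixel of an 8-bit image."""
    return 255 - img

img = np.random.default_rng(1).integers(60, 180, (4, 4), dtype=np.uint8)
print(contrast_stretch(img))
print(negative(img))
```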

  11. Software applications for flux balance analysis.

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and its growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform the basic features of model creation and FBA simulation. COBRA toolbox, OptFlux and FASIMU are versatile enough to support advanced in silico algorithms for identifying environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux have flexible environments, as their plug-in/add-on features aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and assistance to collaborative efforts, was observed among the web-based applications with the help of advanced web technologies. Furthermore, the most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications is given for the benefit of potential tool developers.
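
    At its core, FBA is a linear program: maximize a cellular objective c·v subject to the steady-state constraint S·v = 0 and flux bounds. A minimal sketch with an invented three-reaction toy network (the matrix, bounds and objective are illustrative, not from any of the tools reviewed):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake v0 -> A, A --v1--> B, B --v2--> biomass.
# Stoichiometric matrix S (metabolites x reactions); steady state: S v = 0.
S = np.array([[1, -1,  0],    # metabolite A
              [0,  1, -1]])   # metabolite B
c = np.array([0, 0, -1])      # linprog minimizes, so negate biomass flux v2
bounds = [(0, 10), (0, None), (0, None)]  # substrate uptake capped at 10

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(f"optimal fluxes: {res.x}, max biomass flux: {-res.fun:.1f}")
```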

  12. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementing this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
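
    The concentration calculation such a program automates is typically the relative method: compare the sample's peak area with that of a co-irradiated standard and propagate the counting uncertainties. A minimal sketch; the numbers and the simple two-term error model are illustrative assumptions, not taken from the program described.

```python
import math

def concentration_with_uncertainty(a, sig_a, a_std, sig_std, c_std):
    """Relative-method activation analysis: element concentration from sample
    and standard peak areas, with first-order propagation of the counting
    uncertainties (identical irradiation and counting geometry assumed)."""
    c = c_std * a / a_std
    rel = math.sqrt((sig_a / a) ** 2 + (sig_std / a_std) ** 2)
    return c, c * rel

c, sig = concentration_with_uncertainty(12500, 180, 9800, 150, 41.2)
print(f"concentration: {c:.1f} +/- {sig:.1f} ppm")
```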

  13. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementing this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  14. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.
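
    One of the simplest such data products, a slope-based hazard map derived from a DTM, can be sketched as follows. This is not the LandSAfe implementation; the elevation grid and the 15-degree threshold are illustrative assumptions.

```python
import numpy as np

def slope_map(dtm, cell_size):
    """Per-cell surface slope (degrees) from a digital terrain model."""
    dz_dy, dz_dx = np.gradient(dtm, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

dtm = np.array([[0.0, 0.5, 1.0],
                [0.0, 0.6, 1.3],
                [0.1, 0.7, 1.5]])               # elevations in metres
hazard = slope_map(dtm, cell_size=1.0) > 15.0   # flag slopes above 15 degrees
print(hazard)
```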

  15. Handbook of networking & connectivity

    McClain, Gary R

    1994-01-01

    Handbook of Networking & Connectivity focuses on connectivity standards in use, including hardware and software options. The book serves as a guide for solving specific problems that arise in designing and maintaining organizational networks.The selection first tackles open systems interconnection, guide to digital communications, and implementing TCP/IP in an SNA environment. Discussions focus on elimination of the SNA backbone, routing SNA over internets, connectionless versus connection-oriented networks, internet concepts, application program interfaces, basic principles of layering, proto

  16. Software patterns, knowledge maps, and domain analysis

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface. Acknowledgments. Authors. INTRODUCTION: An Overview of Knowledge Maps. Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects). The Motivation. The Problem. The Objectives. Overview of Software Stability Concepts. Overview of Knowledge Maps. Pattern Languages versus Knowledge Maps: A Brief Comparison. The Solution. Knowledge Maps Methodology or Concurrent Software Development Model. Why Knowledge Maps? Research Methodology Undertaken. Research Verification and Validation. The Stratification of This Book. Summary

  17. Crane handbook

    Dickie, D E

    1975-01-01

    Crane Handbook offers extensive advice on how to properly handle a crane. The handbook highlights various safety requirements and rules. The aim of the book is to improve the readers' crane operating skills, which could eventually make the book a standard working guide for training operators. The handbook first reminds the readers that the machine should be carefully tested by a regulatory board before use. The text then notes that choosing the right crane for a particular job is vital and explains why this is the case. It then discusses how well-equipped and durable the crane should be. T

  18. The SAVI Vulnerability Analysis Software Package

    Mc Aniff, R.J.; Paulus, W.K.; Key, B.; Simpkins, B.

    1987-01-01

    SAVI (Systematic Analysis of Vulnerability to Intrusion) is a new PC-based software package for modeling Physical Protection Systems (PPS). SAVI utilizes a path analysis approach based on the Adversary Sequence Diagram (ASD) methodology. A highly interactive interface allows the user to accurately model complex facilities, maintain a library of these models on disk, and calculate the most vulnerable paths through any facility. Recommendations are provided to help the user choose facility upgrades which should reduce identified path vulnerabilities. Pop-up windows throughout SAVI are used for the input and display of information. A menu at the top of the screen presents all options to the user. These options are further explained on a message line directly below the menu. A diagram on the screen graphically represents the current protection system model. All input is checked for errors, and data are presented in a logical and clear manner. Print utilities provide the user with hard copies of all information and calculated results
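
    As a simplification of the path analysis idea (SAVI's actual ASD methodology accounts for timely detection and delay along each path segment), the sketch below treats the facility as a graph with per-segment detection probabilities and finds the adversary path with the highest probability of remaining undetected, via Dijkstra on -log(1 - p). The graph and probabilities are invented.

```python
import heapq, math

def most_vulnerable_path(graph, start, goal):
    """Dijkstra over a simplified adversary sequence diagram. Edge weights are
    detection probabilities; minimizing sum(-log(1 - p)) maximizes the
    adversary's overall probability of remaining undetected."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, math.inf):
            continue  # stale heap entry
        for nxt, p_detect in graph.get(node, []):
            nd = d - math.log(1.0 - p_detect)
            if nd < dist.get(nxt, math.inf):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], math.exp(-dist[goal])

graph = {"offsite": [("fence", 0.2), ("gate", 0.6)],
         "fence": [("vault", 0.7)],
         "gate": [("vault", 0.3)]}
path, p_undetected = most_vulnerable_path(graph, "offsite", "vault")
print(path, f"P(undetected) = {p_undetected:.2f}")
```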

  19. Social network analysis in software process improvement

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes, we need to understand how small organisations communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  20. Data envelopment analysis a handbook of modeling internal structure and network

    Cook, Wade D

    2014-01-01

    This comprehensive handbook on state-of-the-art topics in DEA modeling of internal structures and networks presents work by leading researchers who share their results on subjects including additive efficiency decomposition and slacks-based network DEA.

  1. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Ion Ivan; Daniel Milodin

    2009-01-01

    The concept of similar software is defined. Conditions for archiving software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogeneous software components is analyzed. Groups of similar software products, belonging to the orthogonality intervals, are then built. The results of the analysis are presented in graphical form. There are detailed aspects of the functioning o...

  2. Security Testing Handbook for Banking Applications

    Doraiswamy, Arvind; Kapoor, Nilesh

    2009-01-01

    Security Testing Handbook for Banking Applications is a specialised guide to testing a wide range of banking applications. The book is intended as a companion to security professionals, software developers and QA professionals who work with banking applications.

  3. Decommissioning Handbook

    1994-03-01

    The Decommissioning Handbook is a technical guide for the decommissioning of nuclear facilities. The decommissioning of a nuclear facility involves the removal of the radioactive and, for practical reasons, hazardous materials to enable the facility to be released and not represent a further risk to human health and the environment. This handbook identifies the technologies and techniques that will accomplish these objectives. The emphasis in this handbook is on characterization; waste treatment; decontamination; dismantling, segmenting, demolition; and remote technologies. Other aspects that are discussed in some detail include the regulations governing decommissioning, worker and environmental protection, and packaging and transportation of the waste materials. The handbook describes in general terms the overall decommissioning project, including planning, cost estimating, and operating practices that would ease preparation of the Decommissioning Plan and the decommissioning itself. The reader is referred to other documents for more detailed information. This Decommissioning Handbook has been prepared by Enserch Environmental Corporation for the US Department of Energy and is a complete restructuring of the original handbook developed in 1980 by Nuclear Energy Services. The significant changes between the two documents are the addition of current and the deletion of obsolete technologies, and the addition of chapters on project planning and the Decommissioning Plan, regulatory requirements, characterization, remote technology, and packaging and transportation of the waste materials.

  4. Criticality handbook. Pt. 1

    Heinicke, W.; Krug, H.; Thomas, W.; Weber, W.; Gmal, B.

    1985-12-01

    The GRS Criticality Handbook is intended as a source of information on criticality problems for the persons concerned in industry, authorities, or research laboratories. It is to serve as a guide allowing quick and appropriate evaluation of criticality problems during the design or erection of nuclear installations. This issue replaces the one published in 1979, presenting revised and new data in a modified arrangement, but within the framework of the Handbook's proven basic structure. Some fundamental knowledge of criticality problems and of the relevant terms and definitions of nuclear safety is required in order to make full use of the information given. Part 1 of the Handbook therefore first introduces terminology and definitions, followed by experimental methods and calculation models for criticality calculations. The next chapters deal with the function and efficiency of neutron reflectors and neutron absorbers, measuring methods for criticality monitoring, organisational safety measures, and criticality accidents and their subsequent analysis. (orig./HP) [de

  5. COMPARATIVE ANALYSIS OF SOFTWARE DEVELOPMENT MODELS

    Sandeep Kaur*

    2017-01-01

    No geek is unfamiliar with the concept of the software development life cycle (SDLC). This research deals with the various SDLC models, covering the waterfall, spiral, iterative, agile, V-shaped and prototype models. In the modern era, all software systems are fallible, as none can be guaranteed with certainty. Therefore, all aspects of the various models, with their pros and cons, are compared, so that it is easy to choose a particular model at the time of need

  6. CAX a software for automated spectrum analysis

    Zahn, Guilherme S.; Genezini, Frederico A.

    2017-01-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as a decimal indicator and semicolons as field separators. This software is already used in the daily routines in IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)

  7. CAX a software for automated spectrum analysis

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as a decimal indicator and semicolons as field separators. This software is already used in the daily routines in IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
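
    The Brazilian-standard CSV convention described above (comma as decimal indicator, semicolon as field separator) can be sketched in a few lines of Python. This is an illustration, not the DAT2CSV implementation, and the nuclide rows are invented.

```python
import csv

def write_brazilian_csv(path, rows, header):
    """Write rows with semicolon field separators and comma decimal marks."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, delimiter=";")
        writer.writerow(header)
        for row in rows:
            # Replace the decimal point only in floating-point fields
            writer.writerow([str(v).replace(".", ",") if isinstance(v, float)
                             else v for v in row])

write_brazilian_csv("results.csv",
                    [("Cs-137", 661.66, 1234.5), ("Co-60", 1332.49, 87.3)],
                    ("nuclide", "energy_keV", "activity_Bq"))
```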

  8. Development of a New VLBI Data Analysis Software

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  9. Continuous software quality analysis for the ATLAS experiment

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres, it is essential to maintain a high level of software quality standards. Methods to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure are proposed.

  10. Software Piracy in Research: A Moral Analysis.

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  11. A proposal for performing software safety hazard analysis

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  12. Shallow land burial handbook

    Stinton, L.H.

    1983-01-01

    The facility development phases (preliminary analysis, site selection, facility design and construction, facility operation, and facility closure/post-closure) are systematically integrated into a logical plan for developing near surface disposal plans. The Shallow Land Burial Handbook provides initial guidance and concepts for understanding the magnitude and the complexity of developing new low-level radioactive waste disposal facilities

  13. PASHTO INSTRUCTOR'S HANDBOOK.

    CHAVARRIA-AGUILAR, O.L.

    THE MATERIALS IN THIS HANDBOOK CONSIST OF 64 PRONUNCIATION DRILLS FOR THE "PASHTO BASIC COURSE." THESE DRILLS ARE BASED ON A CONTRASTIVE ANALYSIS OF PASHTO AND ENGLISH PHONOLOGY AND ARE TO BE ADMINISTERED BY A NATIVE SPEAKER. SIXTY PASHTO ITEMS ARE INCLUDED IN EACH DRILL, 30 CONTAINING THE PARTICULAR SOUND BEING TAUGHT AND 30 CONTAINING…

  14. Exploratory analysis of textual data from the Mother and Child Handbook using a text mining method (II): Monthly changes in the words recorded by mothers.

    Tagawa, Miki; Matsuda, Yoshio; Manaka, Tomoko; Kobayashi, Makiko; Ohwada, Michitaka; Matsubara, Shigeki

    2017-01-01

    The aim of the study was to examine the possibility of converting subjective textual data written in the free column space of the Mother and Child Handbook (MCH) into objective information using text mining and to compare any monthly changes in the words written by the mothers. Pregnant women without complications (n = 60) were divided into two groups according to State-Trait Anxiety Inventory grade: low trait anxiety (group I, n = 39) and high trait anxiety (group II, n = 21). Exploratory analysis of the textual data from the MCH was conducted by text mining using the Word Miner software program. Using 1203 structural elements extracted after processing, a comparison of monthly changes in the words used in the mothers' comments was made between the two groups. The data was mainly analyzed by a correspondence analysis. The structural elements in groups I and II were divided into seven and six clusters, respectively, by cluster analysis. Correspondence analysis revealed clear monthly changes in the words used in the mothers' comments as the pregnancy progressed in group I, whereas the association was not clear in group II. The text mining method was useful for exploratory analysis of the textual data obtained from pregnant women, and the monthly change in the words used in the mothers' comments as pregnancy progressed differed according to their degree of unease. © 2016 Japan Society of Obstetrics and Gynecology.

  15. INFOS: spectrum fitting software for NMR analysis

    Smith, Albert A., E-mail: alsi@nmr.phys.chem.ethz.ch [ETH Zürich, Physical Chemistry (Switzerland)

    2017-02-15

    Software for fitting of NMR spectra in MATLAB is presented. Spectra are fitted in the frequency domain, using Fourier transformed lineshapes, which are derived using the experimental acquisition and processing parameters. This yields more accurate fits compared to common fitting methods that use Lorentzian or Gaussian functions. Furthermore, a very time-efficient algorithm for calculating and fitting spectra has been developed. The software also performs initial peak picking, followed by subsequent fitting and refinement of the peak list, by iteratively adding and removing peaks to improve the overall fit. Estimation of error on fitting parameters is performed using a Monte-Carlo approach. Many fitting options allow the software to be flexible enough for a wide array of applications, while still being straightforward to set up with minimal user input.
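
    As a simplified illustration of frequency-domain lineshape fitting, the sketch below fits a plain Lorentzian with scipy and reads parameter errors from the fit covariance. Note that INFOS itself derives its lineshapes from the acquisition and processing parameters and estimates errors by Monte-Carlo, neither of which this sketch reproduces; the synthetic data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, amp, f0, width):
    # Absorption-mode Lorentzian lineshape in the frequency domain
    return amp * (width / 2) ** 2 / ((f - f0) ** 2 + (width / 2) ** 2)

f = np.linspace(-50, 50, 501)                      # Hz offset axis
rng = np.random.default_rng(2)
data = lorentzian(f, 1.0, 3.2, 8.0) + rng.normal(0, 0.02, f.size)

popt, pcov = curve_fit(lorentzian, f, data, p0=(0.8, 0.0, 5.0))
perr = np.sqrt(np.diag(pcov))                      # 1-sigma parameter errors
print(f"f0 = {popt[1]:.2f} +/- {perr[1]:.2f} Hz")
```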

  16. Software Design Reviews Using the Software Architecture Analysis Method: A Case Study

    2000-02-01

    Management, pp. 323-356, JAI Press Inc. Stake, R. E. (1994). Case Studies. In Handbook of Qualitative Research (N. K. Denzin and Y. S. Lincoln, Eds.), pp. 236...Henderson and Gabb 1997). References Adler, P. A. and P. Adler (1994). Observational Techniques. In Handbook of Qualitative Research (N. K. Denzin and Y. S. Lincoln, Eds.), pp. 377-392, SAGE Publications, Thousand Oaks, California. ADO (1998). Electronic Systems Acquisition Division Compendium

  17. Handbook of methods for risk-based analysis of technical specifications

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance

  18. Research and Development on Food Nutrition Statistical Analysis Software System

    Du Li; Ke Yun

    2013-01-01

    Designing and developing food nutrition component statistical analysis software can realize the automation of nutrition calculation, improve the nutrition professional's working efficiency and achieve the informatization of nutrition propaganda and education. In the software development process, software engineering methods and database technology are used to calculate the human daily nutritional intake, and an intelligent system is used to evaluate the user's hea...

  19. Cross-instrument Analysis Correlation Software

    2017-06-28

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another, such as SEM, microscopes, micro X-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed for easy entry of the positions of fiducials and locations of interest, such that in a future session on the same or a different instrument the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform the points into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based extensible markup language (XML) files.
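
    The fiducial-based transformation the description outlines can be sketched as a least-squares 2-D affine fit: the fiducial coordinates recorded in a reference session are mapped onto the same fiducials measured in the current session, and the fitted transform then re-locates any stored point of interest. This is not the program's implementation, and the coordinates below are invented.

```python
import numpy as np

def fit_affine(ref_pts, cur_pts):
    """Least-squares 2-D affine transform mapping reference-session fiducial
    positions onto the same fiducials in the current session."""
    ref = np.hstack([ref_pts, np.ones((len(ref_pts), 1))])  # homogeneous coords
    params, *_ = np.linalg.lstsq(ref, cur_pts, rcond=None)
    return params                                           # 3x2 matrix

def apply_affine(params, pts):
    return np.hstack([pts, np.ones((len(pts), 1))]) @ params

ref_fids = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
cur_fids = np.array([[2.0, 1.0], [11.9, 1.7], [1.3, 10.8]])  # rotated/shifted
T = fit_affine(ref_fids, cur_fids)
print(apply_affine(T, np.array([[5.0, 5.0]])))  # re-found point of interest
```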

  20. Nuclear Fuel Depletion Analysis Using Matlab Software

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order IVPs are frequently used in many parts of engineering and the sciences. In this article, we present a code comprising three computer programs, working jointly with the Matlab software, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Nuclear Reactor (PWR) is computed by the present code.
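
    A sketch of the kind of coupled IVP such a code solves, written here in Python with scipy's LSODA solver rather than Matlab. The production and burnup rates are illustrative assumptions; only the 239Np half-life is a physical constant.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy coupled IVP in the spirit of the abstract: constant production of 239Np
# from 238U neutron capture, beta decay 239Np -> 239Pu, and 239Pu removal.
PROD = 1.0e14                            # 239Np production rate (atoms/s), assumed
LAM_NP = np.log(2) / (2.356 * 86400)     # 239Np decay constant (T1/2 = 2.356 d)
BURN_PU = 1.0e-9                         # effective 239Pu removal rate (1/s), assumed

def rhs(t, n):
    n_np, n_pu = n
    return [PROD - LAM_NP * n_np,
            LAM_NP * n_np - BURN_PU * n_pu]

sol = solve_ivp(rhs, (0.0, 90 * 86400), [0.0, 0.0], method="LSODA", rtol=1e-8)
print(f"239Pu inventory after 90 days: {sol.y[1, -1]:.3e} atoms")
```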

  1. Employee Handbook

    Bello, Madelyn

    2008-09-05

    Welcome to Berkeley Lab. You are joining or are already a part of a laboratory with a sterling tradition of scientific achievement, including eleven Nobel Laureates and thirteen National Medal of Science winners. No matter what job you do, you make Berkeley Lab the outstanding organization that it is. Without your hard work and dedication, we could not achieve all that we have. We value you and thank you for choosing to be part of our community. This Employee Handbook is designed to help you navigate the Lab. With over 3,000 employees, an additional 3,000 guests visiting from countries around the world, a 200-acre campus and many policies and procedures, learning all the ins and outs may seem overwhelming, especially if you're a new employee. However, even if you have been here for a while, this Handbook should be a useful reference tool. It is meant to serve as a guide, highlighting and summarizing what you need to know and informing you where you can go for more detailed information. The general information provided in this Handbook serves only as a brief description of many of the Lab's policies. Policies, procedures and information are found in the Lab's Regulations and Procedures Manual (RPM), Summary Plan Descriptions, University of California policies, and provisions of Contract 31 between the Regents of the University and the U.S. Department of Energy. In addition, specific terms and conditions for represented employees are found in applicable collective bargaining agreements. Nothing in this Handbook is intended to supplant, change or conflict with the previously mentioned documents. In addition, the information in this Handbook does not constitute a contract or a promise of continued employment and may be changed at any time by the Lab. We believe employees are happier and more productive if they know what they can expect from their organization and what their organization expects from them. The Handbook will familiarize you with the

  2. Analysis of MOZART critical experiment using the IRPhEP handbook data

    Chiba, Go

    2010-12-01

    Using the experimental data described in the IRPhEP handbook, an experimental analysis of the MOZART experiment is carried out with the nuclear data JENDL-4.0, and the reactor physics codes SLAROM-UF and CBG. The following results are obtained: -The C/E values for criticality are 0.9981 for the small-sized core MZA and 1.0006 for the middle-sized core MZB. Good agreement between calculation and experimental values has been observed similarly in the analyses for criticality of other MOX-fueled fast reactors. Hence, consistency between the present analysis and the others is confirmed. -In reaction rate ratios at the core center, calculation values agree with experimental values within 1.0% for F25/F49 and C28/F49, and within 4.0% for F28/F25, F40/F49 and F41/F49. -In sodium void reactivity worths, calculation values are about 10% larger than experimental values for the non-leakage-dominated data. For the data to which the leakage component largely contributes, absolute differences normalized by the leakage component are less than 10%. -In material worths, calculation values are about 5% larger than experimental values for plutonium. Calculation values agree with experimental values within 10% differences for uranium and SS. -In control rod worths, calculation values are 2% to 5% larger than experimental values. -In reaction rate distributions, calculation values agree well with experimental values in core regions. On the other hand, underestimation is observed systematically in calculation values of threshold reactions in blanket regions. For the reactivity characteristic, overestimation is systematically observed in calculations. While the reason has not yet been investigated, the result suggests underestimation of βeff. (author)

  3. A study of software safety analysis system for safety-critical software

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) for the system has been performed. The feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. The check list for management control has been produced via a walk-through technique. Based on the evaluation of the check list, activities to be performed in the requirement phase have been determined. In the design phase, hazard analysis has been performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on FMEA have been checked for fitness, guided by an accident scenario. The pressurizer low pressure trip algorithm has been selected as a sample for applying the FTA method to software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system has been enhanced during all software life cycle phases

  4. Change Impact Analysis of Crosscutting in Software Architectural Design

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  5. Integrated analysis software for bulk power system stability

    Tanaka, T.; Nagao, T.; Takahashi, K. [Central Research Inst. of Electric Power Industry, Tokyo (Japan)]

    1994-12-31

    This paper presents three software packages developed by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with the user support system which arranges, easily and with high reliability, the tremendous amount of data necessary for these packages. (author) 3 refs., 7 figs., 2 tabs.

  6. Continuous Software Quality analysis for the ATLAS experiment

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identifying defects, there remains a non-trivial sociological challenge to resolve them in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  7. The development of a packaging handbook

    Shappert, L.B.

    1994-01-01

    The Packaging Handbook, dealing with the development of packagings designed to carry radioactive material, is being written for DOE's Transportation and Packaging Safety Division. The primary goal of the Handbook is to provide sufficient technical information and guidance to improve the quality of Safety Analysis Reports on Type B Packagings (SARPs) that are submitted to DOE for certification. This paper provides an update on the status of the Handbook

  8. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual

  9. Fault tree handbook

    Haasl, D.F.; Roberts, N.H.; Vesely, W.E.; Goldberg, F.F.

    1981-01-01

    This handbook describes a methodology for reliability analysis of complex systems such as those which comprise the engineered safety features of nuclear power generating stations. After an initial overview of the available system analysis approaches, the handbook focuses on a description of the deductive method known as fault tree analysis. The following aspects of fault tree analysis are covered: basic concepts for fault tree analysis; basic elements of a fault tree; fault tree construction; probability, statistics, and Boolean algebra for the fault tree analyst; qualitative and quantitative fault tree evaluation techniques; and computer codes for fault tree evaluation. Also discussed are several example problems illustrating the basic concepts of fault tree construction and evaluation
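
    For independent basic events, the quantitative evaluation the handbook covers reduces to simple gate algebra: an AND gate multiplies event probabilities, and an OR gate combines them as 1 - prod(1 - p). A toy illustration in Python; the event probabilities and the tree structure are invented.

```python
def or_gate(*p):
    """P(output) for an OR gate with independent basic events."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """P(output) for an AND gate with independent basic events."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Top event = OR(pump fails, AND(valve A fails, valve B fails))
p_top = or_gate(1e-3, and_gate(5e-2, 5e-2))
print(f"top event probability: {p_top:.2e}")   # about 3.5e-3
```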

  10. Energy analysis handbook. CAC document 214. [Combining process analysis with input-output analysis

    Bullard, C. W.; Penner, P. S.; Pilati, D. A.

    1976-10-01

    Methods are presented for calculating the energy required, directly and indirectly, to produce all types of goods and services. Procedures for combining process analysis with input-output analysis are described. This enables the analyst to focus data acquisition cost-effectively, and to achieve a specified degree of accuracy in the results. The report presents sample calculations and provides the tables and charts needed to perform most energy cost calculations, including the cost of systems for producing or conserving energy.
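    The input-output step of the combined method propagates direct energy coefficients through the Leontief inverse. The following is a minimal sketch of that calculation with an invented three-sector economy; the matrix and coefficient values are illustrative, not data from the report.

```python
import numpy as np

# Hypothetical technical coefficients: A[i, j] is the input from sector i
# required per unit output of sector j.
A = np.array([
    [0.10, 0.20, 0.05],
    [0.15, 0.05, 0.10],
    [0.05, 0.10, 0.15],
])

# Direct energy use per unit of output for each sector (e.g., MJ per $)
e_direct = np.array([2.0, 0.5, 1.2])

# Total (direct + indirect) energy intensities: eps = e (I - A)^-1
eps_total = e_direct @ np.linalg.inv(np.eye(3) - A)
print("total energy intensities:", eps_total)

# Energy embodied in a final-demand bundle y
y = np.array([100.0, 50.0, 25.0])
print("embodied energy:", eps_total @ y)
```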

  11. GWAMA: software for genome-wide association meta-analysis

    Mägi Reedik

    2010-05-01

    Abstract Background Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error-trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files is freely available online at http://www.well.ox.ac.uk/GWAMA.
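    As an illustration of the underlying arithmetic (not GWAMA's actual code), the sketch below computes a fixed-effect inverse-variance pooled estimate and Cochran's Q for a single variant; the per-study effect sizes are invented.

```python
import numpy as np

# Per-study effect estimates (beta) and standard errors for one SNP;
# the values are illustrative, not GWAMA output.
beta = np.array([0.12, 0.08, 0.15, 0.10])
se = np.array([0.05, 0.04, 0.07, 0.06])

w = 1.0 / se**2                          # inverse-variance weights
beta_meta = np.sum(w * beta) / np.sum(w) # pooled effect
se_meta = np.sqrt(1.0 / np.sum(w))       # pooled standard error
z = beta_meta / se_meta                  # association test statistic

# Cochran's Q for between-study heterogeneity
Q = np.sum(w * (beta - beta_meta)**2)
print(f"beta={beta_meta:.4f}, se={se_meta:.4f}, z={z:.2f}, Q={Q:.2f}")
```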

  12. Long-term preservation of analysis software environment

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on CERN virtual machine (CernVM). Further, an HTTP-based file system, CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud-computing on private and public clouds, for both legacy and contemporary experiments.

  13. Russian system of computerized analysis for licensing at atomic industry (SCALA) and its validation on ICSBEP handbook data and on some burnup calculations

    Ivanova, T.; Polyakov, A.; Saraeva, T.; Tsiboulia, A.

    2001-01-01

    Validation of criticality calculations using SCALA was performed using data presented in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. This paper contains the results of statistical analysis of discrepancies between calculated and benchmark-model k-eff and conclusions about uncertainties of criticality prediction for different types of multiplying systems following from this analysis. (authors)

  14. Handbook for sound engineers

    Ballou, Glen

    2013-01-01

    Handbook for Sound Engineers is the most comprehensive reference available for audio engineers. All audio topics are explored: if you work on anything related to audio you should not be without this book! The 4th edition of this trusted reference has been updated to reflect changes in the industry since the publication of the 3rd edition in 2002 -- including new technologies like software-based recording systems such as Pro Tools and Sound Forge; digital recording using MP3, wave files and others; mobile audio devices such as iPods and MP3 players. Over 40 topic

  15. Handbook of linear algebra

    Hogben, Leslie

    2013-01-01

    With a substantial amount of new material, the Handbook of Linear Algebra, Second Edition provides comprehensive coverage of linear algebra concepts, applications, and computational software packages in an easy-to-use format. It guides you from the very elementary aspects of the subject to the frontiers of current research. Along with revisions and updates throughout, the second edition of this bestseller includes 20 new chapters.New to the Second EditionSeparate chapters on Schur complements, additional types of canonical forms, tensors, matrix polynomials, matrix equations, special types of

  16. AVR RISC microcontroller handbook

    Kuhnel, Claus

    1998-01-01

    The AVR RISC Microcontroller Handbook is a comprehensive guide to designing with Atmel's new controller family, which is designed to offer high speed and low power consumption at a lower cost. The main text is divided into three sections: hardware, which covers all internal peripherals; software, which covers programming and the instruction set; and tools, which explains using Atmel's Assembler and Simulator (available on the Web) as well as IAR's C compiler. Practical guide for advanced hobbyists or design professionals. Development tools and code available on the Web.

  17. Development of design and analysis software for advanced nuclear system

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, which are necessary software tools and data libraries for advanced nuclear system design and analysis, were developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and the related database management. The development of this software series was proposed as an exercise in the development of nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  18. Planetary Geologic Mapping Handbook - 2009

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Terrestrial geologic maps published by the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well (e.g., Hare and others, 2009). Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically (e.g., Wilhelms, 1972, 1990; Tanaka and others, 1994). As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  19. Saving time with a computerised handbook

    Henrie, D.K.

    1993-01-01

    The DE/CAASE computerised engineering handbook (Desktop Engineering, Mahwah, NJ, USA) is a software tool designed to automate a wide variety of engineering tasks that are typically performed with an engineering handbook and hand calculator. It significantly reduces the time taken to perform these tasks. For example, instead of spending 60 minutes on determining section properties of composite sections in control room panels and other equipment by hand, it might take less than 5 minutes by using the computerised handbook. Similarly, mode shapes and frequencies of simple structures may take less than 10 minutes to calculate, compared with the hours it used to take. (author)
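    As an illustration of the kind of hand calculation such a tool automates (not the DE/CAASE software itself), the sketch below computes the centroid and second moment of area of a composite section built from rectangles via the parallel-axis theorem; the T-section dimensions are invented.

```python
# Composite-section properties from rectangular parts; units are mm.

def composite_section(rects):
    """rects: list of (width b, height h, y of rectangle centroid)."""
    A = sum(b * h for b, h, _ in rects)                      # total area
    ybar = sum(b * h * y for b, h, y in rects) / A           # section centroid
    # Parallel-axis theorem: I = sum(b*h^3/12 + A_i*d_i^2)
    I = sum(b * h**3 / 12 + b * h * (y - ybar)**2 for b, h, y in rects)
    return A, ybar, I

# Hypothetical T-section: 100x10 flange on top of a 10x90 web
rects = [(100.0, 10.0, 95.0),   # flange centroid at y = 95 mm
         (10.0, 90.0, 45.0)]    # web centroid at y = 45 mm
A, ybar, I = composite_section(rects)
print(f"A={A:.0f} mm^2  ybar={ybar:.1f} mm  I={I:.3e} mm^4")
```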

  20. Handbook Of X-ray Astronomy

    Arnaud, Keith A.; Smith, R. K.; Siemiginowska, A.; Edgar, R. J.; Grant, C. E.; Kuntz, K. D.; Schwartz, D. A.

    2011-09-01

    This poster advertises a book to be published in September 2011 by Cambridge University Press. Written for graduate students, professional astronomers and researchers who want to start working in this field, this book is a practical guide to x-ray astronomy. The handbook begins with x-ray optics, basic detector physics and CCDs, before focussing on data analysis. It introduces the reduction and calibration of x-ray data, scientific analysis, archives, statistical issues and the particular problems of highly extended sources. The book describes the main hardware used in x-ray astronomy, emphasizing the implications for data analysis. The concepts behind common x-ray astronomy data analysis software are explained. The appendices present reference material often required during data analysis.

  1. Change impact analysis for software product lines

    Jihen Maâzoun

    2016-10-01

    A software product line (SPL) represents a family of products in a given application domain. Each SPL is constructed to provide for the derivation of new products by covering a wide range of features in its domain. Nevertheless, over time, some domain features may become obsolete with the appearance of new features, while others may become refined. Accordingly, the SPL must be maintained to account for the domain evolution. Such evolution requires a means for managing the impact of changes on the SPL models, including the feature model and design. This paper presents an automated method that analyzes feature model evolution, traces its impact on the SPL design, and offers a set of recommendations to ensure the consistency of both models. The proposed method defines a set of new metrics adapted to SPL evolution to identify the effort needed to maintain the SPL models consistently and with a quality as good as that of the original models. The method and its tool are illustrated through an example of an SPL in the Text Editing domain. In addition, they are experimentally evaluated in terms of both the quality of the maintained SPL models and the precision of the impact change management.

  2. Dispersion analysis of biotoxins using HPAC software

    Wu, A.; Nurthen, N.; Horstman, A.; Watson, R.; Phillips, M.

    2009-01-01

    Biotoxins are emerging threat agents produced by living organisms: bacteria, plants, or animals. Biotoxins are generally classified as cyanotoxins, hemotoxins, necrotoxins, neurotoxins, and cytotoxins. The application of classical biotoxins as weapons of terror has been realized because of extreme potency and lethality; ease of production, transport, and misuse; and the need for prolonged intensive care among affected persons. Recently, emerging biotoxins, such as ricin and T-2 mycotoxin, have been used clandestinely by terrorist groups or in military combat operations. It is thus highly desirable to have a modeling system to simulate dispersions of biotoxins in a terrorist attack scenario in order to provide prompt technical support and casualty estimation to the first responders and military rescuers. The Hazard Prediction and Assessment Capability (HPAC) automated software system provides the means to accurately predict the effects of hazardous material released into the atmosphere and its impact on civilian and military populations. The system uses integrated source terms, high-resolution weather forecasts and atmospheric transport and dispersion analyses to model hazard areas produced by military or terrorist incidents and industrial accidents. We have successfully incorporated physical, chemical, epidemiological and biological characteristics of a variety of biotoxins into the HPAC system and have conducted numerous analyses for our emergency responders. The health effects caused by these hazards are closely reflected in HPAC output results. (author)

  3. OST: analysis tool for real time software by simulation of material and software environments

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor-based systems for the control of nuclear installations demands a high level of operational safety, both in the operation of the installation and in the protection of the environment. For the safety analysis of these installations, the Institute for Protection and Nuclear Safety (IPSN) will have at its disposal tools that permit checks to be made throughout the life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It runs on VAX computers and is easily portable to other machines [fr]

  4. Power Analysis Software for Educational Researchers

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…

  5. JEM-X science analysis software

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables and the running of the complete analysis...

  6. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  7. The analysis of subsidence associated with geothermal development. Volume 1. Handbook

    Atherton, R.W.; Finnemore, E.J.; Gillam, M.L.

    1976-09-01

    This study evaluates the state of knowledge of subsidence associated with geothermal development, and provides preliminary methods to assess the potential of land subsidence for any specific geothermal site. The results of this study are presented in three volumes. Volume 1 is designed to serve as a concise reference, a handbook, for the evaluation of the potential for land subsidence from the development of geothermal resources.

  8. Adapted wavelet analysis from theory to software

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients

  9. Development of Emittance Analysis Software for Ion Beam Characterization

    Padilla, M.J.; Liu, Yuan

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally, a high-quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate
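    The root-mean-square emittance and Twiss parameters named above follow directly from second moments of the measured (x, x') distribution. The sketch below is a minimal illustration of that calculation on a synthetic correlated beam, not the HRIBF software itself.

```python
import numpy as np

def rms_emittance(x, xp):
    """RMS emittance and Twiss parameters from particle coordinates.

    x  : transverse positions (e.g., mm)
    xp : divergences (e.g., mrad)
    """
    x = x - np.mean(x)
    xp = xp - np.mean(xp)
    x2, xp2, xxp = np.mean(x**2), np.mean(xp**2), np.mean(x * xp)
    eps = np.sqrt(x2 * xp2 - xxp**2)          # rms emittance
    alpha, beta, gamma = -xxp / eps, x2 / eps, xp2 / eps
    return eps, alpha, beta, gamma

# Illustrative correlated Gaussian beam
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100_000)
xp = 0.5 * x + rng.normal(0.0, 0.3, 100_000)
print(rms_emittance(x, xp))
```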

  10. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally, a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.

  11. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained using manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
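    As a rough illustration of the morphometric parameters named above (not the published software), the sketch below derives cell density, the coefficient of variation of cell area, and hexagonality from hypothetical segmented-cell measurements.

```python
import numpy as np

def endothelium_metrics(cell_areas_um2, neighbor_counts):
    """Morphometry from segmented cell borders (illustrative definitions).

    cell_areas_um2  : area of each recognized cell, in um^2
    neighbor_counts : number of sides/neighbors per cell
    """
    areas = np.asarray(cell_areas_um2, dtype=float)
    density = 1e6 / np.mean(areas)              # cells per mm^2
    cv = np.std(areas) / np.mean(areas)         # coefficient of variation
    hexagonality = np.mean(np.asarray(neighbor_counts) == 6) * 100  # % 6-sided
    return density, cv, hexagonality

areas = [380, 420, 350, 410, 520, 390, 460]     # hypothetical cell areas
sides = [6, 6, 5, 6, 7, 6, 5]                   # hypothetical side counts
print(endothelium_metrics(areas, sides))
```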

  12. Loudspeaker handbook

    Eargle, John

    2003-01-01

    The second edition of Loudspeaker Handbook follows the same general outlines as the highly successful first edition and has been augmented and updated in many areas of technology. Most notable are the developments in large-scale, programmable line arrays, distributed mode loudspeakers, and ultrasonic-based audio transduction. Additionally, the core chapters on low frequency systems, system concepts, and horn systems have been expanded to include both more analytical material and a richer array of examples. Much of the success of the first edition has been due to its accessibility both to loudspeaker engineers and to lay technicians working in the field - a point of view the author maintains in the present work. A full understanding of the underlying technology requires a fairly rigorous engineering background through the second year of professional study. At the same time, the generous use of graphs, with their intuitive thrust, will be useful to all readers. Loudspeaker Handbook, Second Edition continues to ...

  13. Nanobiomaterials handbook

    Sitharaman, Balaji

    2011-01-01

    Nanobiomaterials exhibit distinctive characteristics, including mechanical, electrical, and optical properties, which make them suitable for a variety of biological applications. Because of their versatility, they are poised to play a central role in nanobiotechnology and make significant contributions to biomedical research and healthcare. Nanobiomaterials Handbook provides a comprehensive overview of the field, offering a broad introduction for those new to the subject as well as a useful reference for advanced professionals.Analyzing major topics and disciplines in this arena, this volume:

  14. Automated Freedom from Interference Analysis for Automotive Software

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety-critical software component will not lead to a fault in a more safety-critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  15. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity analysis guidelines.

  17. Development of output user interface software to support analysis

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to present informative results. This research develops user interface software for the output of VSOP and MCNPX. VSOP program output is used to support neutronic analysis and MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis used the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.
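    As an illustration of the kind of post-processing described, a minimal Python sketch that extracts step, burn-up and k-eff values from a text output with a regular expression; the output layout shown is invented, and the real VSOP and MCNPX formats differ.

```python
import re

# Hypothetical layout of a burn-up summary in a code's text output;
# the pattern below is illustrative only.
SAMPLE = """
step  1  burnup =   1250.0 MWd/t   k-eff = 1.10523
step  2  burnup =   2500.0 MWd/t   k-eff = 1.09841
"""

PATTERN = re.compile(
    r"step\s+(\d+)\s+burnup =\s+([\d.]+) MWd/t\s+k-eff =\s+([\d.]+)"
)

records = [
    (int(step), float(bu), float(keff))
    for step, bu, keff in PATTERN.findall(SAMPLE)
]
for step, bu, keff in records:
    print(f"step {step}: burnup {bu:8.1f} MWd/t, k-eff {keff:.5f}")
```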

  18. Development of output user interface software to support analysis

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to present informative results. This research develops user interface software for the output of VSOP and MCNPX. VSOP program output is used to support neutronic analysis and MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis used the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.

  19. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Schmidt, Ralph; Bostelmann, Jonas; Cornet, Yves; Heipke, Christian; Philippe, Christian; Poncelet, Nadia; de Rosa, Diego; Vandeloise, Yannick

    2012-01-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk assoc...

  20. Development of evaluation method for software safety analysis techniques

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can then choose an appropriate technique combination to perform analysis on the basis of resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity.
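    A minimal sketch of how such indexes could be combined into a weighted score for ranking SSA technique combinations; the index names follow the abstract, but the weights and scores below are invented for illustration.

```python
# Weighted-sum scoring of SSA technique combinations (illustrative only).
indexes = ["dynamic capability", "completeness", "achievability", "detail",
           "signal/noise ratio", "complexity", "implementation cost"]

# Hypothetical 1-5 scores per combination; complexity and cost are scored
# so that higher means better (i.e., lower complexity/cost).
combos = {
    "PHA+FMEA+FTA+Markov": [2, 4, 2, 4, 3, 2, 2],
    "DFM":                 [5, 3, 4, 4, 4, 2, 3],
    "Simulation-based":    [5, 3, 4, 5, 4, 2, 2],
}
weights = [0.2, 0.2, 0.15, 0.15, 0.1, 0.1, 0.1]  # invented priorities

for name, scores in combos.items():
    total = sum(w * s for w, s in zip(weights, scores))
    print(f"{name:22s} weighted score = {total:.2f}")
```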

  1. How to do Meta-Analysis using HLM software

    Petscher, Yaacov

    2013-01-01

    This is a step-by-step presentation of how to run a meta-analysis using HLM software. Because it is a variance-known model, it is run not through the GUI but in batch mode. These slides show how to prepare the data and run the analysis.

  2. A relational approach to support software architecture analysis

    Feijs, L.M.G.; Krikhaar, R.L.; van Ommering, R.C.

    1998-01-01

    This paper reports on our experience with a relational approach to support the analysis of existing software architectures. The analysis options provide for visualization and view calculation. The approach has been applied for reverse engineering. It is also possible to check concrete designs

  3. Development of data acquisition and analysis software for multichannel detectors

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With the help of outstanding graphics capabilities, easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and the efficiency of data analysis. 2 refs., 6 figs

  4. Handbook of satellite applications

    Madry, Scott; Camacho-Lara, Sergio

    2013-01-01

    Top space experts from around the world have collaborated to produce this comprehensive, authoritative, and clearly illustrated reference guide to the fast growing, multi-billion dollar field of satellite applications and space communications. This handbook, done under the auspices of the International Space University based in France, addresses not only system technologies but also examines market dynamics, technical standards and regulatory constraints. The handbook is a completely multi-disciplinary reference book that covers, in an in-depth fashion, the fields of satellite telecommunications, Earth observation, remote sensing, satellite navigation, geographical information systems, and geosynchronous meteorological systems. It covers current practices and designs as well as advanced concepts and future systems. It provides a comparative analysis of the common technologies and design elements for satellite application bus structures, thermal controls, power systems, stabilization techniques, telemetry, com...

  5. Handbook of Brain Connectivity

    Jirsa, Viktor K

    2007-01-01

    Our contemporary understanding of brain function is deeply rooted in the ideas of the nonlinear dynamics of distributed networks. Cognition and motor coordination seem to arise from the interactions of local neuronal networks, which themselves are connected in large scales across the entire brain. The spatial architectures between various scales inevitably influence the dynamics of the brain and thereby its function. But how can we integrate brain connectivity amongst these structural and functional domains? Our Handbook provides an account of the current knowledge on the measurement, analysis and theory of the anatomical and functional connectivity of the brain. All contributors are leading experts in various fields concerning structural and functional brain connectivity. In the first part of the Handbook, the chapters focus on an introduction and discussion of the principles underlying connected neural systems. The second part introduces the currently available non-invasive technologies for measuring struct...

  6. NASA systems engineering handbook

    Shishko, Robert; Aster, Robert; Chamberlain, Robert G.; McDuffee, Patrick; Pieniazek, Les; Rowell, Tom; Bain, Beth; Cox, Renee I.; Mooz, Harold; Polaski, Lou

    1995-06-01

    This handbook brings the fundamental concepts and techniques of systems engineering to NASA personnel in a way that recognizes the nature of NASA systems and environment. It is intended to accompany formal NASA training courses on systems engineering and project management when appropriate, and is designed to be a top-level overview. The concepts were drawn from NASA field center handbooks, NMI's/NHB's, the work of the NASA-wide Systems Engineering Working Group and the Systems Engineering Process Improvement Task team, several non-NASA textbooks and guides, and material from independent systems engineering courses taught to NASA personnel. Five core chapters cover systems engineering fundamentals, the NASA Project Cycle, management issues in systems engineering, systems analysis and modeling, and specialty engineering integration. It is not intended as a directive.

  7. Radio-science performance analysis software

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
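    The frequency-stability measure named above, the Allan deviation, is straightforward to compute from fractional-frequency samples. The following is a minimal sketch of the non-overlapping estimator, not the STBLTY program set itself.

```python
import numpy as np

def allan_deviation(y, tau0, m):
    """Non-overlapping Allan deviation of fractional-frequency data.

    y    : fractional frequency samples taken at interval tau0 (seconds)
    m    : averaging factor; the deviation is evaluated at tau = m * tau0
    """
    n = len(y) // m
    ybar = np.mean(y[: n * m].reshape(n, m), axis=1)  # averages over tau
    diffs = np.diff(ybar)                             # adjacent-interval differences
    avar = 0.5 * np.mean(diffs**2)                    # Allan variance
    return np.sqrt(avar), m * tau0

# White frequency noise example: sigma_y(tau) should fall as tau**-0.5
rng = np.random.default_rng(1)
y = rng.normal(0.0, 1e-12, 100_000)
for m in (1, 10, 100):
    adev, tau = allan_deviation(y, 1.0, m)
    print(f"tau={tau:6.0f} s  sigma_y={adev:.2e}")
```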

  8. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  9. A 'Toolbox' Equivalent Process for Safety Analysis Software

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed "toolbox-equivalent". The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis "toolbox" codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plan/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  10. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
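    As a rough illustration of automated puncta quantification of the kind described (not IFDOTMETER itself, which is written in Java), the Python sketch below thresholds an image and counts connected bright objects.

```python
import numpy as np
from scipy import ndimage

def count_puncta(image, threshold, min_pixels=4):
    """Count bright puncta (e.g., LC3 dots) in a fluorescence image."""
    mask = image > threshold
    labels, n = ndimage.label(mask)                     # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))  # pixels per object
    keep = sizes >= min_pixels                          # drop single-pixel noise
    return int(np.count_nonzero(keep)), sizes[keep]

# Synthetic image with two Gaussian spots on background noise
rng = np.random.default_rng(2)
img = rng.normal(10.0, 2.0, (128, 128))
yy, xx = np.mgrid[0:128, 0:128]
for cy, cx in [(40, 40), (90, 70)]:
    img += 60.0 * np.exp(-((yy - cy)**2 + (xx - cx)**2) / 8.0)
n, sizes = count_puncta(img, threshold=30.0)
print(n, sizes)
```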

  11. FORECAST: Regulatory effects cost analysis software annual

    Lopez, B.; Sciacca, F.W.

    1991-11-01

    Over the past several years the NRC has developed a generic cost methodology for the quantification of cost/economic impacts associated with a wide range of new or revised regulatory requirements. This methodology has been developed to aid the NRC in preparing Regulatory Impact Analyses (RIAs). These generic costing methods can be useful in quantifying impacts both to industry and to the NRC. The FORECAST program was developed to facilitate the use of the generic costing methodology. This PC program integrates the major cost considerations that may be required because of a regulatory change. FORECAST automates many of the calculations typically needed in an RIA and thus reduces the time and labor required to perform these analyses. More importantly, its integrated and consistent treatment of the different cost elements should help assure comprehensiveness, uniformity, and accuracy in the preparation of needed cost estimates.

  12. Equipment Obsolescence Analysis and Management Software

    Redmond, J.; Carret, L.; Shaon, S.; Schultz, C.

    2015-07-01

    The procurement engineering resources at Nuclear Power Plants (NPPs) are experiencing an increasing backlog of procurement items, primarily due to the inability to order the original replacement parts. The level of effort and time required to prepare procurement packages is increasing since the number of obsolete parts is increasing exponentially. Procurement packages for obsolete components and parts are much more complex and take more time to prepare because of the need to perform equivalency evaluations, develop testing requirements and test acceptance criteria, carry out commercial grade dedication or equipment qualification, and make increasing efforts to verify that no fraudulent or counterfeit parts are procured. This problem will be further compounded when NPPs pursue license renewal and approval for plant-life extension. Advance planning and advance knowledge of equipment obsolescence are required to allow sufficient time to properly procure replacement parts for obsolete items. The uncertain supply chain capability due to obsolescence is a real problem and poses a risk to reliable plant operations due to the potential lack of available spare parts and replacement components to support outages and unplanned component failures. Advance notification of obsolescence is increasingly important to ensure that adequate time and planning are scheduled to procure the proper replacement parts. A thorough analysis of Original Equipment Manufacturer (OEM) availability and inventory, as well as an analysis of failure rates and usage rates, is required to predict critical part needs and allow for early identification of obsolescence issues so that a planned and controlled strategy to qualify replacement equipment can be implemented. (Author)
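    A minimal sketch of the failure-rate side of such an analysis: assuming Poisson demand, it computes the smallest spare-parts stock meeting a target service level over a procurement horizon. The rates, horizon, and service level are invented for illustration.

```python
import math

def stock_for_service_level(failure_rate_per_year, horizon_years,
                            service_level=0.95):
    """Smallest stock s with P(demand <= s) >= service_level, Poisson demand."""
    mean = failure_rate_per_year * horizon_years
    s = 0
    term = math.exp(-mean)       # P(demand = 0)
    cumulative = term
    while cumulative < service_level:
        s += 1
        term *= mean / s         # P(demand = s) from the previous term
        cumulative += term
    return s

# Hypothetical: 12 installed units, each failing at 0.05/year, 10-year buy
print(stock_for_service_level(12 * 0.05, 10))
```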

  13. The software analysis project for the Office of Human Resources

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow, with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required, given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  14. Development of interactive software for fuel management analysis

    Graves, H.W. Jr.

    1986-01-01

    Electronic computation plays a central part in engineering analysis of all types. Utilization of microcomputers for calculations that were formerly carried out on large mainframe computers presents a unique opportunity to develop software that not only takes advantage of the lower cost of using these machines, but also increases the efficiency of the engineers performing these calculations. This paper reviews the use of electronic computers in engineering analysis, discusses the potential for microcomputer utilization in this area, and describes a series of steps to be followed in software development that can yield significant gains in engineering design efficiency

  15. A software package for biomedical image processing and analysis

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people to get adapted to the system, and for standardizing and exchanging software, yet preserving flexibility and allowing for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  16. Applications of the BEam Cross section Analysis Software (BECAS)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three...

  17. Handbook on modelling for discrete optimization

    Pitsoulis, Leonidas; Williams, H

    2006-01-01

    The primary objective underlying the Handbook on Modelling for Discrete Optimization is to demonstrate and detail the pervasive nature of Discrete Optimization. While its applications cut across an incredibly wide range of activities, many of the applications are only known to specialists. It is the aim of this handbook to correct this. It has long been recognized that "modelling" is a critically important mathematical activity in designing algorithms for solving these discrete optimization problems. Nevertheless solving the resultant models is also often far from straightforward. In recent years it has become possible to solve many large-scale discrete optimization problems. However, some problems remain a challenge, even though advances in mathematical methods, hardware, and software technology have pushed the frontiers forward. This handbook couples the difficult, critical-thinking aspects of mathematical modeling with the hot area of discrete optimization. It will be done in an academic handbook treatment...

  18. Integrated Circuit Electromagnetic Immunity Handbook

    Sketoe, J. G.

    2000-08-01

    This handbook presents the results of the Boeing Company effort for NASA under contract NAS8-98217. Immunity level data for certain integrated circuit parts are discussed herein, along with analytical techniques for applying the data to electronics systems. This handbook is built heavily on the one produced in the seventies by McDonnell Douglas Astronautics Company (MDAC, MDC Report E1929 of 1 August 1978, entitled Integrated Circuit Electromagnetic Susceptibility Handbook, known commonly as the ICES Handbook, which has served countless systems designers for over 20 years). Sections 2 and 3 supplement the device susceptibility data presented in section 4 by presenting information on related material required to use the IC susceptibility information. Section 2 concerns itself with electromagnetic susceptibility analysis and serves as a guide to using the information contained in the rest of the handbook. A suggested set of system hardening requirements is presented in this chapter. Section 3 briefly discusses coupling and shielding considerations. For conservatism and simplicity, a worst-case approach is advocated to determine the maximum amount of RF power picked up from a given field. This handbook expands the scope of the immunity data to 10 MHz to 10 GHz. However, the analytical techniques provided are applicable to much higher frequencies as well. It is expected, however, that the upper frequency limit of concern is near 10 GHz. This is due to two factors: the pickup of microwave energy on system cables and wiring falls off as the square of the wavelength, and component response falls off at a rapid rate due to the effects of parasitic shunt paths for the RF energy. It should be noted also that the pickup on wires and cables does not approach infinity as the frequency decreases (as would be expected by extrapolating the square-law dependence of the high-frequency roll-off to lower frequencies) but levels off due to mismatch effects.
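    The worst-case pickup argument can be made concrete by treating the cable as a matched antenna with effective aperture G·λ²/(4π), which reproduces the λ² roll-off noted above. The sketch below applies that generic antenna-theory bound to an illustrative 10 V/m field; it is not a formula quoted from the handbook.

```python
import math

C = 2.998e8      # speed of light, m/s
ETA0 = 377.0     # free-space wave impedance, ohms

def worst_case_pickup_w(field_v_per_m, freq_hz, gain=1.0):
    """Worst-case RF power coupled from a plane wave into a matched load.

    Models the cable as an antenna of effective aperture G * lambda^2 / (4*pi);
    the lambda^2 dependence is the high-frequency roll-off discussed above.
    """
    wavelength = C / freq_hz
    power_density = field_v_per_m**2 / ETA0          # W/m^2
    aperture = gain * wavelength**2 / (4 * math.pi)  # m^2
    return power_density * aperture

# Illustrative threat level: 10 V/m field at 1 GHz
print(f"{worst_case_pickup_w(10.0, 1e9):.2e} W")
```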

  19. Safety handbook

    1990-01-01

    The purpose of the Australian Nuclear Science and Technology Organization's Safety Handbook is to outline simply the fundamental procedures and safety precautions which provide an appropriate framework for safe working with any potential hazards, such as fire and explosion, welding, cutting, brazing and soldering, compressed gases, cryogenic liquids, chemicals, ionizing radiations, non-ionising radiations, sound and vibration, as well as safety in the office. It also specifies the organisation for safety at the Lucas Heights Research Laboratories and the responsibilities of individuals and committees. It also defines the procedures for the scrutiny and review of all operations and the resultant setting of safety rules for them. ills

  20. Development of Software for Measurement and Analysis of Solar Radiation

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software is under development in LabVIEW for use with a StellarNet spectrometer system that communicates with the computer over USB. LabVIEW provides capabilities for hardware interfacing, graphical user interfaces, and mathematical calculation, including array manipulation and processing. The software reads data from the StellarNet spectrometer in real time and then processes it for analysis. Several measurements and analyses of solar radiation have been performed. Solar radiation comprises mainly infrared, visible light, and ultraviolet. From solar radiation spectrum data, information on weather and plant suitability can be gathered and analyzed; furthermore, optimization of the utilization of solar radiation, and the associated safety precautions, can be planned. Using this software, more research and development in the utilization and safety of solar radiation can be explored. (author)
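
    The post-acquisition processing described, splitting a measured spectrum into ultraviolet, visible and infrared contributions, reduces to band integration; a NumPy sketch on a synthetic spectrum (not the LabVIEW code itself) follows:

```python
# Illustrative sketch (not the LabVIEW application described above): integrating a
# measured solar spectrum into UV / visible / IR band irradiances with NumPy.
import numpy as np

def band_irradiance(wavelength_nm, irradiance, lo, hi):
    """Integrate spectral irradiance (W m^-2 nm^-1) over the band [lo, hi] nm."""
    mask = (wavelength_nm >= lo) & (wavelength_nm <= hi)
    return np.trapz(irradiance[mask], wavelength_nm[mask])

# Synthetic example spectrum from 280 to 1100 nm (placeholder shape, not real data)
wl = np.linspace(280, 1100, 821)
spec = np.exp(-((wl - 550) / 250.0) ** 2)

print("UV  (280-400 nm):", band_irradiance(wl, spec, 280, 400))
print("VIS (400-700 nm):", band_irradiance(wl, spec, 400, 700))
print("IR  (700-1100 nm):", band_irradiance(wl, spec, 700, 1100))
```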

  1. Evaluation of peak-fitting software for gamma spectrum analysis

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width is associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work automatically locate and fit the peaks. The fit can be made in several different ways: the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go further and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of 137Cs, 60Co, 133Ba and 152Eu. The results show that all of the automatic programs can be properly used in the task of finding and fitting peaks, with the exception of GammaVision; it was also possible to verify that the automatic peak-fitting programs performed as well as, and sometimes even better than, a manual peak-fitting program. (author)
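
    The basic fit that all of the compared programs perform can be sketched with SciPy: a Gaussian on a linear background, fitted to a synthetic peak (real packages add tail and step corrections on top of this):

```python
# Minimal sketch of a Gaussian peak fit on a linear background; the synthetic
# 662 keV-like peak is illustrative, not data from the paper.
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, area, centroid, sigma, a, b):
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
    return gauss + a + b * ch                      # linear background

ch = np.arange(640, 700, dtype=float)
true = peak_model(ch, 5000, 662, 2.5, 30, 0.01)    # synthetic "true" spectrum
counts = np.random.poisson(true).astype(float)     # Poisson counting statistics

p0 = [counts.sum(), ch[np.argmax(counts)], 2.0, counts.min(), 0.0]
popt, pcov = curve_fit(peak_model, ch, counts, p0=p0, sigma=np.sqrt(counts + 1))
area, centroid, sigma = popt[:3]
print(f"area={area:.0f}  centroid={centroid:.2f}  FWHM={2.355 * sigma:.2f}")
```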

  2. Handbook of smoke control engineering

    Klote, John H; Turnbull, Paul G; Kashef, Ahmed; Ferreira, Michael J

    2012-01-01

    The Handbook of Smoke Control Engineering extends the tradition of the comprehensive treatment of smoke control technology, including fundamental concepts, smoke control systems, and methods of analysis. The handbook provides information needed for the analysis of design fires, including considerations of sprinklers, shielded fires, and transient fuels. It is also extremely useful for practicing engineers, architects, code officials, researchers, and students. Following the success of Principles of Smoke Management in 2002, this new book incorporates the latest research and advances in smoke control practice. New topics in the handbook are: controls, fire and smoke control in transport tunnels, and full-scale fire testing. For those getting started with the computer models CONTAM and CFAST, there are simplified instructions with examples. This is the first smoke control book with climatic data so that users will have easy-to-use weather data specifically for smoke control design for locations in the U.S., Can...

  3. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  4. Using Business Analysis Software in a Business Intelligence Course

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  5. Application of software technology to automatic test data analysis

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  6. WinDAM C earthen embankment internal erosion analysis software

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing earthen embankment dams and proposed earthen embankment dams, Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting f...

  7. ANALYSIS OF CONTEMPORARY SOFTWARE BEING USED FOR FORWARDING SERVICES

    Naumov, V.

    2013-01-01

    The role of information technologies in forwarding services has been specified. The typical structure of logistic sites providing search of the requests of freight owners and carriers has been described. An analysis of the software used by transportation companies was conducted, and promising directions for improving the forwarding services process have been identified.

  8. A Knowledge-based Environment for Software Process Performance Analysis

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We implemented the SPEAKER environment, integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and we present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we have only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. The environment as a whole will therefore be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  9. Synchronized analysis of testbeam data with the Judith software

    McGoldrick, Garrin; Gorišek, Andrej

    2014-01-01

    The Judith software performs pixel detector analysis tasks utilizing two different data streams such as those produced by the reference and tested devices typically found in a testbeam. This software addresses and fixes problems arising from the desynchronization of the two simultaneously triggered data streams by detecting missed triggers in either of the streams. The software can perform all tasks required to generate particle tracks using multiple detector planes: it can align the planes, cluster hits and generate tracks from these clusters. This information can then be used to measure the properties of a particle detector with very fine spatial resolution. It was tested at DESY in the Kartel telescope, a silicon tracking detector, with ATLAS Diamond Beam Monitor modules as a device under test.
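
    The core idea of recovering synchronization can be pictured as walking two trigger-number sequences in step and skipping events missing from either side; this is a hedged illustration of that generic technique, not Judith's actual implementation (the data are invented):

```python
# Hedged sketch of desynchronization recovery between two triggered data streams:
# match trigger numbers and drop events whose counterpart was missed.
def synchronize(ref_triggers, dut_triggers):
    """Yield (ref_index, dut_index) pairs for events present in both streams."""
    i = j = 0
    while i < len(ref_triggers) and j < len(dut_triggers):
        if ref_triggers[i] == dut_triggers[j]:
            yield i, j
            i += 1
            j += 1
        elif ref_triggers[i] < dut_triggers[j]:
            i += 1            # DUT missed this trigger; skip the reference event
        else:
            j += 1            # reference missed this trigger; skip the DUT event

ref = [0, 1, 2, 3, 5, 6]      # reference plane trigger numbers (invented)
dut = [0, 1, 3, 4, 5, 6]      # device under test: trigger 2 missed, 4 spurious
print(list(synchronize(ref, dut)))   # [(0, 0), (1, 1), (3, 2), (4, 4), (5, 5)]
```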

  10. One-Click Data Analysis Software for Science Operations

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, DAS is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, funds are very limited and long-term preservation becomes more and more difficult. Building on virtual machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of data analysis software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), Hipe for Herschel (2009) and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  11. Application of econometric and ecology analysis methods in physics software

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.
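
    Two of the measures named above, inequality and evenness, have compact standard definitions; a NumPy sketch of the generic textbook formulas (not the paper's code) follows:

```python
# Sketch of two measures of the kind discussed: the Gini coefficient (inequality)
# and Pielou's evenness based on Shannon entropy (diversity/evenness).
import numpy as np

def gini(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # Standard formula: G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

def pielou_evenness(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    shannon = -np.sum(p * np.log(p))
    return shannon / np.log(p.size)       # 1.0 means perfectly even

print(gini([1, 1, 1, 1]))                 # 0.0: perfect equality
print(gini([0, 0, 0, 10]))                # 0.75: strong inequality
print(pielou_evenness([25, 25, 25, 25]))  # 1.0: perfectly even community
```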

  12. Comparison of two three-dimensional cephalometric analysis computer software.

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for three-dimensional cephalometric analysis of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Before- and after-treatment data were analyzed using t-tests. The interclass correlation coefficient in the reliability test was stronger for InVivoDental5.0 (0.83-0.98) than for 3DCeph™ (0.51-0.90). Paired t-test comparison of the two packages showed no statistically significant difference in linear or angular measurements. InVivoDental5.0 measurements are more reproducible, and the software is more user friendly, than 3DCeph™, which is more time-consuming in performing three-dimensional analysis.

  13. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), for program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), for software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues, and the talk will touch on some of these and on the challenges related to SMT solvers.
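
    A small example of the kind of query such solvers answer, written against the z3py Python bindings (the constraint itself is an invented path condition, as a symbolic executor might collect):

```python
# Checking feasibility of a (made-up) program path condition with Z3's Python API.
from z3 import Ints, Solver, And, sat

x, y = Ints("x y")
s = Solver()
# Path condition: the branch is reachable only if all three constraints hold
s.add(And(x + y == 10, x > 2 * y, y >= 0))

if s.check() == sat:
    m = s.model()
    print("feasible, e.g. x =", m[x], ", y =", m[y])   # one satisfying assignment
else:
    print("path is infeasible")
```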

  14. Handbook of systems toxicology

    Casciano, Daniel A; Sahu, Saura C

    2011-01-01

    "In the first handbook to comprehensively cover the emerging area of systems toxicology, the Handbook of Systems Toxicology provides an authoritative compilation of up-to-date developments presented...

  15. Handbook of thin film technology

    Frey, Hartmut

    2015-01-01

    "Handbook of Thin Film Technology" covers all aspects of coatings preparation, characterization and applications. Different deposition techniques based on vacuum and plasma processes are presented. Methods of surface and thin film analysis, including coating thickness and the structural, optical, electrical, mechanical and magnetic properties of films, are described in detail. Several applications of thin coatings are covered, and a special chapter focuses on nanoparticle-based films. A complete reference for students and professionals interested in the science and technology of thin films.

  16. Knickpoint finder: A software tool that improves neotectonic analysis

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004), applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of persistent intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the speed of identification of deformed areas. This software tool may therefore be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.
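
    The underlying detection idea, flagging points where the local slope of a stream profile changes abruptly, can be sketched as follows; this is a simplified NumPy illustration on a synthetic profile, not the ArcGIS tool itself:

```python
# Simplified sketch of knickpoint detection: flag locations where the slope of an
# elevation profile changes sharply. Threshold and profile are illustrative only.
import numpy as np

def knickpoint_candidates(distance_m, elevation_m, factor=3.0):
    slope = np.gradient(elevation_m, distance_m)        # dz/dx along the profile
    dslope = np.abs(np.gradient(slope, distance_m))     # rate of slope change
    threshold = factor * dslope.mean()
    return np.where(dslope > threshold)[0]              # indices of sharp breaks

# Synthetic profile: gentle upper reach, then a much steeper reach below a break
d = np.linspace(0, 10_000, 501)
z = np.where(d < 5_000, 800 - 0.002 * d, 790 - 0.02 * (d - 5_000))
print(knickpoint_candidates(d, z))   # indices clustered around the 5 km break
```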

  17. Mechanical engineer's handbook

    Marghitu, Dan B

    2001-01-01

    The Mechanical Engineer's Handbook was developed and written specifically to fill a need for mechanical engineers and mechanical engineering students throughout the world. With over 1000 pages, 550 illustrations, and 26 tables, the Mechanical Engineer's Handbook is very comprehensive, yet affordable, compact, and durable. The Handbook covers all major areas of mechanical engineering with succinct coverage of the definitions, formulas, examples, theory, proofs, and explanations of all principal subject areas. The Handbook is an essential, practical companion for all mechanic

  18. Machine Vision Handbook

    2012-01-01

    The automation of visual inspection is becoming more and more important in modern industry as a consistent, reliable means of judging the quality of raw materials and manufactured goods. The Machine Vision Handbook equips the reader with the practical details required to engineer integrated mechanical-optical-electronic-software systems. Machine vision is first set in the context of basic information on light, natural vision, colour sensing and optics. The physical apparatus required for mechanized image capture – lenses, cameras, scanners and light sources – is discussed, followed by detailed treatment of various image-processing methods, including an introduction to the QT image processing system. QT is unique to this book, and provides an example of a practical machine vision system along with extensive libraries of useful commands, functions and images which can be implemented by the reader. The main text of the book is completed by studies of a wide variety of applications of machine vision in insp...

  19. Handbook of Intelligent Vehicles

    2012-01-01

    The Handbook of Intelligent Vehicles provides complete coverage of the fundamentals, new technologies, and sub-areas essential to the development of intelligent vehicles; it also includes advances made to date, challenges, and future trends. Significant strides in the field have been made to date; however, so far there has been no single book or volume which captures these advances in a comprehensive format, addressing all essential components and subspecialties of intelligent vehicles, as this book does. Since the intended users are engineering practitioners, as well as researchers and graduate students, the book chapters do not only cover fundamentals, methods, and algorithms but also include how software/hardware are implemented, and demonstrate the advances along with their present challenges. Research at both the component and systems levels is required to advance the functionality of intelligent vehicles. This volume covers both of these aspects in addition to the fundamentals listed above.

  20. STARS software tool for analysis of reliability and safety

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing such a tool; combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so the analyst can become aware of why and how results are obtained; hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  1. Semiconductors data handbook

    Madelung, Otfried

    2004-01-01

    This volume, Semiconductors: Data Handbook, contains frequently used data from the corresponding larger Landolt-Börnstein handbooks in a low-priced book for the individual scientist working in the laboratory. The Handbook contains important information about a large number of semiconductors.

  2. STAMPS: development and verification of swallowing kinematic analysis software.

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software package, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, one of the most popular platforms for biomedical analysis, and was constructed to acquire, process, and analyze data on swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and with an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00. This software is expected to be useful for researchers interested in swallowing motion analysis.

  3. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis.
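
    For readers unfamiliar with the Kanade-Lucas-Tomasi approach the study builds on, a minimal OpenCV sketch of pyramidal KLT point tracking follows; the video file name and initial marker positions are illustrative assumptions, and this is not the DVP software itself:

```python
# Sketch of KLT (Kanade-Lucas-Tomasi) point tracking with OpenCV.
import cv2
import numpy as np

cap = cv2.VideoCapture("underwater_exercise.avi")       # hypothetical input video
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Initial marker positions (x, y) in pixels, e.g. clicked by an operator
points = np.array([[[320.0, 240.0]], [[400.0, 260.0]]], dtype=np.float32)

lk = dict(winSize=(21, 21), maxLevel=3,
          criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

trajectory = [points.reshape(-1, 2).copy()]
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    points, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None, **lk)
    # In the study, an operator corrects markers drifting >4 px from the reference here
    trajectory.append(points.reshape(-1, 2).copy())
    prev_gray = gray
cap.release()
print(len(trajectory), "frames tracked")
```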

  4. Model Based Analysis and Test Generation for Flight Software

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. Hardware and software constructs for a vibration analysis network

    Cook, S.A.; Crowe, R.D.; Toffer, H.

    1985-01-01

    Vibration level monitoring and analysis has been initiated at N Reactor, the dual purpose reactor operated at Hanford, Washington by UNC Nuclear Industries (UNC) for the Department of Energy (DOE). The machinery to be monitored was located in several buildings scattered over the plant site, necessitating an approach using satellite stations to collect, monitor and temporarily store data. The satellite stations are, in turn, linked to a centralized processing computer for further analysis. The advantages of a networked data analysis system are discussed in this paper along with the hardware and software required to implement such a system

  6. Calibration Analysis Software for the ATLAS Pixel Detector

    AUTHOR|(INSPIRE)INSPIRE-00372086; The ATLAS collaboration

    2016-01-01

    The calibration of the ATLAS Pixel detector at the LHC fulfils two main purposes: to tune front-end configuration parameters for establishing the best operational settings, and to measure the tuning performance through a subset of scans. An analysis framework has been set up to take actions on the detector given the outcome of a calibration scan (e.g., to create a mask for disabling noisy pixels). The software framework that controls all aspects of the Pixel detector scans and analyses is called the Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture; it now handles scans, and scan analyses, applied together to chips with different characteristics. An overview of the newly developed calibration analysis software is presented, together with some preliminary results.

  7. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  8. Spectrum analysis on quality requirements consideration in software design documents.

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, in which quality requirements considerations are represented numerically. We can thus objectively determine whether the quality requirements considered in a requirements document are carried through to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
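
    As a toy illustration only (the keyword list, tokenization and normalization below are invented, and the paper's spectrum derivation is more elaborate), a document "spectrum" can be pictured as a normalized term-count vector that is compared between a requirements document and its design document:

```python
# Toy sketch: a "quality requirements spectrum" as a normalized keyword-count vector.
from collections import Counter

QUALITY_TERMS = ["security", "performance", "usability", "reliability", "portability"]

def spectrum(text):
    words = Counter(text.lower().split())
    counts = [words[t] for t in QUALITY_TERMS]
    total = sum(counts) or 1
    return [c / total for c in counts]

req_doc = "the system shall ensure security and performance performance"
design_doc = "module layering addresses performance only"
print(spectrum(req_doc))     # weight on security and performance
print(spectrum(design_doc))  # performance considered, security dropped
```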

  9. Software for a measuring facility for activation analysis

    De Keyser, A.; De Roost, E.

    1985-01-01

    A software package has been developed for an Apple PC. The programs are intended to control an automated measuring station for photon activation analysis at GELINA, the linear accelerator of C.B.N.M. at Geel (Belgium). They allow the user to set up a measuring scheme, to execute it under computer control, to accumulate and store 2K spectra using a built-in ADC, and to output the results as listings, plots or evaluated reports.

  10. Phenomenology and Qualitative Data Analysis Software (QDAS): A Careful Reconciliation

    Brian Kelleher Sohn

    2017-01-01

    An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform p...

  11. Handbook of electromagnetic compatibility

    1995-01-01

    This""know-how""book gives readers a concise understanding of the fundamentals of EMC, from basic mathematical and physical concepts through present, computer-age methods used in analysis, design, and tests. With contributions from leading experts in their fields, the text provides a comprehensive overview. Fortified with information on how to solve potential electromagnetic interference (EMI) problems that may arise in electronic design, practitioners will be betterable to grasp the latest techniques, trends, and applications of this increasingly important engineering discipline.Handbook of E

  12. Induction machine handbook

    Boldea, Ion

    2002-01-01

    Often called the workhorse of industry, the advent of power electronics and advances in digital control are transforming the induction motor into the racehorse of industrial motion control. Now, the classic texts on induction machines are nearly three decades old, while more recent books on electric motors lack the necessary depth and detail on induction machines.The Induction Machine Handbook fills industry's long-standing need for a comprehensive treatise embracing the many intricate facets of induction machine analysis and design. Moving gradually from simple to complex and from standard to

  13. Handbook of purified gases

    Schoen, Helmut

    2015-01-01

    Technical gases are used in almost every field of industry, science and medicine and also as a means of control by government authorities and institutions and are regarded as indispensable means of assistance. In this complete handbook of purified gases the physical foundations of purified gases and mixtures as well as their manufacturing, purification, analysis, storage, handling and transport are presented in a comprehensive way. This important reference work is accompanied with a large number of Data Sheets dedicated to the most important purified gases.  

  14. Circuits and filters handbook

    Chen, Wai-Kai

    2003-01-01

    A bestseller in its first edition, The Circuits and Filters Handbook has been thoroughly updated to provide the most current, most comprehensive information available in both the classical and emerging fields of circuits and filters, both analog and digital. This edition contains 29 new chapters, with significant additions in the areas of computer-aided design, circuit simulation, VLSI circuits, design automation, and active and digital filters. It will undoubtedly take its place as the engineer's first choice in looking for solutions to problems encountered in the design, analysis, and behavi

  15. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Patrick eKaifosh

    2014-09-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
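
    The final signal-extraction step can be pictured with a generic NumPy sketch (masks and data are synthetic, and this is not SIMA's actual API): average the fluorescence inside each ROI mask, frame by frame.

```python
# Generic sketch of ROI signal extraction from an imaging movie.
import numpy as np

def extract_signals(movie, roi_masks):
    """movie: (frames, y, x) array; roi_masks: list of boolean (y, x) masks."""
    return np.stack([movie[:, m].mean(axis=1) for m in roi_masks])

frames = np.random.rand(100, 64, 64)     # placeholder imaging data
mask = np.zeros((64, 64), dtype=bool)
mask[20:30, 20:30] = True                # one square ROI
traces = extract_signals(frames, [mask])
print(traces.shape)                      # (1, 100): one trace per ROI
```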

  16. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    1991-01-01

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one of a family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and a similar structure. The intention was to provide the user with maximum flexibility, together with a simple and logical organization that requires minimum digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases; those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme.

  17. Gen IV Materials Handbook Implementation Plan

    Rittenhouse, P.; Ren, W.

    2005-01-01

    A Gen IV Materials Handbook is being developed to provide an authoritative single source of highly qualified structural materials information and materials property data for use in design and analyses of all Generation IV reactor systems. The Handbook will be responsive to the needs expressed by all of the principal government, national laboratory, and private company stakeholders of Gen IV reactor systems. The Gen IV Materials Handbook Implementation Plan provided here addresses the purpose, rationale, attributes, and benefits of the Handbook and details its content, format, quality assurance, applicability, and access. Structural materials, both metallic and ceramic, for all Gen IV reactor types currently supported by the Department of Energy (DOE) will be included in the Gen IV Materials Handbook; however, initial emphasis will be on materials for the Very High Temperature Reactor (VHTR). Descriptive information (e.g., chemical composition and applicable technical specifications and codes) will be provided for each material, along with an extensive presentation of mechanical and physical property data, including the effects of temperature, irradiation, environment, etc. on properties. Access to the Gen IV Materials Handbook will be internet-based with appropriate levels of control. Information and data in the Handbook will be configured to allow search by material classes, specific materials, specific information or property class, specific property, data parameters, and individual data points identified with materials parameters, test conditions, and data source. Details on all of these, as well as proposed applicability and consideration of data quality classes, are provided in the Implementation Plan. Website development for the Handbook is divided into six phases including (1) detailed product analysis and specification, (2) simulation and design, (3) implementation and testing, (4) product release, (5) project/product evaluation, and (6) product

  18. Freud: a software suite for high-throughput simulation analysis

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
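
    For orientation, the radial distribution function mentioned above can be computed in plain NumPy as follows; this is a minimal sketch for a cubic periodic box, not Freud's API (whose parallel C++ routines are far faster):

```python
# Minimal radial distribution function g(r) for points in a cubic periodic box.
import numpy as np

def rdf(points, box_length, r_max, bins=50):
    n = len(points)
    d = points[:, None, :] - points[None, :, :]
    d -= box_length * np.round(d / box_length)          # minimum-image convention
    r = np.sqrt((d ** 2).sum(axis=-1))[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=bins, range=(0, r_max))
    centers = 0.5 * (edges[1:] + edges[:-1])
    shell = 4 * np.pi * centers ** 2 * np.diff(edges)   # spherical shell volumes
    density = n / box_length ** 3
    return centers, hist / (shell * density * n / 2)    # normalize to ideal gas

pts = np.random.rand(200, 3) * 10.0                     # random (ideal-gas-like) points
r, g = rdf(pts, box_length=10.0, r_max=4.0)
print(g.round(2))                                       # fluctuates around 1.0
```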

  19. Handbook for Policymakers

    Jensen, Ulla Højmark

    2016-01-01

    The purpose of the handbook is to support policy makers and decision makers by enabling them to make informed decisions about funding for second chance education or the adoption of its methods into mainstream education. The Handbook makes policy recommendations and provides guidelines for structuring a quality assurance system that evidences the success factors of second chance education and the value of informal learning. The handbook draws on the results of the literature review, the development of the quality assurance system (SMS system), the Teachers Handbook and the Organisational Handbook...

  20. IMMAN: free software for information theory-based chemometric analysis.

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis, named IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis), are presented. This is multi-platform software developed in the Java programming language, designed with a user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. In addition, a generalization scheme for the previously defined differential Shannon entropy is discussed, and the Jeffreys information measure is introduced for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
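
    For comparison with the information-theoretic ranking functions IMMAN provides, the general idea of supervised mutual-information feature ranking can be sketched with scikit-learn; this uses an independent estimator on synthetic data, not IMMAN's code:

```python
# Supervised feature ranking by estimated mutual information with the class label.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=300)
X = np.column_stack([
    y + rng.normal(0, 0.3, 300),        # strongly informative feature
    rng.normal(0, 1.0, 300),            # pure noise
    y * 2 + rng.normal(0, 1.0, 300),    # weakly informative feature
])

mi = mutual_info_classif(X, y, random_state=0)
ranking = np.argsort(mi)[::-1]
print("MI scores:", mi.round(3))
print("feature ranking (best first):", ranking)
```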

  1. An ion beam analysis software based on ImageJ

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF, …) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are faced with the task of extracting relevant information or presenting the data in a format with the greatest impact; this process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community could benefit from such tools; specifically, from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of ready-made tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab, which has the virtue of making ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ 'ion beam' plugin are: (1) reading list-mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real-time map updating, (6) real-time colour updating and (7) median and average map creation.

  3. Handbook of nuclear engineering: vol 1: nuclear engineering fundamentals; vol 2: reactor design; vol 3: reactor analysis; vol 4: reactors of waste disposal and safeguards

    2013-01-01

    The Handbook of Nuclear Engineering is an authoritative compilation of information regarding methods and data used in all phases of nuclear engineering. Addressing nuclear engineers and scientists at all academic levels, this five-volume set provides the latest findings in nuclear data and experimental techniques, reactor physics, kinetics, dynamics and control. Readers will also find a detailed description of data assimilation, model validation and calibration, sensitivity and uncertainty analysis, fuel management and cycles, nuclear reactor types and radiation shielding. A discussion of radioactive waste disposal, safeguards and non-proliferation, and fuel processing with partitioning and transmutation is also included. As nuclear technology becomes an important resource of non-polluting sustainable energy in the future, the Handbook of Nuclear Engineering is an excellent reference for practicing engineers, researchers and professionals.

  4. Analysis of signal acquisition in GPS receiver software

    Vlada S. Sokolović

    2011-01-01

    This paper presents a critical analysis of the signal-processing flow carried out in GPS receiver software, which serves as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and reduced commercial costs of GPS devices, including mobile devices, can be achieved by using software defined radio (SDR) technology. The SDR application can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (digital signal processing) or FPGA (field programmable gate array) circuit, which allows a simple change of digital signal processing algorithms and of the receiver parameters. The starting point of the research is the signal generated on the satellite, the structure of which is shown in the paper. Based on the GPS signal structure, a receiver is realized whose task is to extract the appropriate signal from the spectrum and detect it. Based on the collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the L1 or L2 carrier frequency. Since the SPS is used in the civilian service, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated with spread spectrum technology and lies below the noise level. Such signals often interfere with signals from the environment, which makes proper detection and signal processing difficult; therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post-processing, i.e.
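
    The acquisition stage analyzed in the paper is commonly implemented in software receivers as a parallel code-phase search: circular correlation of the received samples with a local PRN replica via FFT, repeated over Doppler bins. A hedged NumPy sketch of that generic technique (with a toy random "PRN", not a real GPS C/A code) follows:

```python
# Parallel code-phase search acquisition via FFT circular correlation.
import numpy as np

def acquire(samples, prn_replica, doppler_bins, fs):
    """Return (best_doppler_hz, best_code_phase, peak_metric)."""
    n = len(samples)
    t = np.arange(n) / fs
    prn_fft = np.conj(np.fft.fft(prn_replica))
    best = (0.0, 0, 0.0)
    for fd in doppler_bins:
        wiped = samples * np.exp(-2j * np.pi * fd * t)   # remove carrier + Doppler
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * prn_fft)) ** 2
        k = int(np.argmax(corr))
        if corr[k] > best[2]:
            best = (fd, k, float(corr[k]))
    return best

# Toy self-test: a random +/-1 "PRN" delayed by 123 samples, no noise
rng = np.random.default_rng(1)
prn = rng.choice([-1.0, 1.0], size=4096)
rx = np.roll(prn, 123).astype(complex)
print(acquire(rx, prn, doppler_bins=[0.0], fs=4.096e6))  # code phase 123 recovered
```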

  5. Open source software and crowdsourcing for energy analysis

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained—and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  6. PuMA: the Porous Microstructure Analysis software

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualizations. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as NASA software under a US & Foreign release.
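
    The simplest of the listed computations, porosity and volume fractions on a segmented voxel grid, can be sketched in a few lines of NumPy; the synthetic labels below are illustrative, and this is not PuMA's interface:

```python
# Porosity and volume fractions of a segmented 3D voxel grid.
import numpy as np

rng = np.random.default_rng(42)
# Segmented microstructure: 0 = void, 1 = fiber, 2 = matrix (synthetic example)
voxels = rng.choice([0, 1, 2], size=(100, 100, 100), p=[0.6, 0.25, 0.15])

porosity = np.mean(voxels == 0)
fiber_fraction = np.mean(voxels == 1)
matrix_fraction = np.mean(voxels == 2)
print(f"porosity={porosity:.3f}  fiber={fiber_fraction:.3f}  matrix={matrix_fraction:.3f}")
```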

  7. Uses of software in digital image analysis: a forensic report

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. Through this paper the authors explain these tasks, which fall into three categories: image compression, image enhancement and restoration, and measurement extraction. Examples include signature comparison, counterfeit currency comparison and footwear sole impressions, using the software Canvas and Corel Draw.
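
    A typical enhancement step of the kind described, bringing out faint patterns such as a partial fingerprint, can be sketched with OpenCV's contrast-limited adaptive histogram equalization; the file name and parameter values are hypothetical:

```python
# Contrast-limited adaptive histogram equalization (CLAHE) for faint patterns.
import cv2

img = cv2.imread("partial_fingerprint.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))        # illustrative params
enhanced = clahe.apply(img)
cv2.imwrite("partial_fingerprint_enhanced.png", enhanced)
```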

  8. A software platform for the analysis of dermatology images

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform, developed in the Python programming environment, that can be used for the processing and analysis of dermatology images. The platform provides the capability of reading a file that contains a dermatology image and supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics, and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image; the automated selection of a ROI includes filtering for smoothing the image, followed by thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as the breast or lung, after proper re-training of the classification algorithms.
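
    The automated ROI selection described (smoothing followed by thresholding) might look like the following scikit-image sketch; the image and parameter values are placeholders, not the platform's actual code:

```python
# Automated ROI selection: Gaussian smoothing followed by Otsu thresholding.
import numpy as np
from skimage import filters

image = np.random.rand(256, 256)                 # placeholder for a loaded lesion image
smoothed = filters.gaussian(image, sigma=2.0)    # suppress noise before thresholding
threshold = filters.threshold_otsu(smoothed)     # automatic global threshold
roi_mask = smoothed > threshold                  # boolean region of interest
print(roi_mask.sum(), "pixels selected")
```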

  9. New analysis software for Viking Lander meteorological data

    O. Kemppinen

    2013-02-01

    We have developed a set of tools that enables us to process Viking Lander meteorological data beyond what has previously been publicly available. Besides providing data for new periods of time, the tools significantly enhance the resolution of the existing data periods. This was accomplished by first transferring the original Prime computer version of the data analysis software to a standard Linux platform, then modifying the software to process the data despite irregularities in the original raw data, and reverse engineering various parameter files. In addition, the processing pipeline has been streamlined, making processing the data faster and easier. As an example of the new data, freshly processed Viking Lander 1 and 2 temperature records are described and briefly analyzed in ways that were not previously possible due to the lack of data.

  10. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.
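
    The quantification step at the heart of PRA tools of this kind can be sketched with the standard minimal-cut-set upper bound; this is a generic textbook approximation with invented event data, not SAPHIRE's actual solving algorithm:

```python
# Top-event probability from minimal cut sets via the min-cut-set upper bound:
# P(top) <= 1 - prod_i (1 - P(cut_set_i)). Event probabilities are invented.
import math

basic_events = {"PUMP_A": 1e-3, "PUMP_B": 1e-3, "VALVE": 5e-4, "POWER": 1e-4}
minimal_cut_sets = [{"POWER"}, {"PUMP_A", "PUMP_B"}, {"PUMP_A", "VALVE"}]

def cut_set_prob(cs):
    # Assumes independent basic events within a cut set
    return math.prod(basic_events[e] for e in cs)

p_top = 1.0 - math.prod(1.0 - cut_set_prob(cs) for cs in minimal_cut_sets)
print(f"top event probability ~ {p_top:.3e}")   # dominated by the POWER cut set
```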

  11. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  12. A software architectural framework specification for neutron activation analysis

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health-related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact with existing or shrinking budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers and sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with the existing system components of a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current systems, while providing a solid path for development into the future. (author)
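
    The paper's point is architectural: with agreed component interfaces, sample changers, loaders and acquisition systems from different vendors become interchangeable. The OpenNAA API itself is not given in the abstract, so the following Python sketch of such an interface is speculative (all names are ours):

      from abc import ABC, abstractmethod

      class Spectrum:
          def __init__(self, counts, live_time_s):
              self.counts, self.live_time_s = counts, live_time_s

      class SampleChanger(ABC):
          @abstractmethod
          def load(self, sample_id: str) -> None: ...
          @abstractmethod
          def unload(self) -> None: ...

      class Acquisition(ABC):
          @abstractmethod
          def count(self, live_time_s: float) -> Spectrum: ...

      def run_batch(changer: SampleChanger, daq: Acquisition, samples, live_time_s):
          """Drive any conforming hardware through a measurement batch."""
          results = {}
          for sid in samples:
              changer.load(sid)
              results[sid] = daq.count(live_time_s)
              changer.unload()
          return results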

  13. Cookstove handbook

    1985-01-01

    Firewood has become a matter of serious concern due to dwindling forest resources coupled with excessive consumption resulting from wasteful and inefficient methods of cooking. Efforts are now being made by various researchers to design more efficient and less expensive cooking stoves. This handbook is a compendium of more than 40 different types of solid-fuel cooking stoves which can be manufactured using locally available materials and skills. The evolution of cooking stoves from open-fire cooking arrangements is described. One chapter is devoted to cooking-stove design considerations, including performance, combustion, oxygen requirements, surrounding air effects, nearby solid-surface effects, the excess-air concept, heat transfer, geometrical efficiency and stove components. The bulk of the document introduces traditionally used and improved cooking-stove designs for wood and charcoal burning in detail, providing information on fuels, stove materials, and advantages and disadvantages. A description of 14 different laboratory investigations on efficiency aspects of cooking stoves is presented. Annexes are devoted to the composition and properties of wood, a summary of characteristics of the stoves described in the text, the efficiencies of commonly used cooking stoves with different fuels and a glossary. 82 references.

  14. Decommissioning handbook

    Manion, W.J.; LaGuardia, T.S.

    1980-11-01

    This document is a compilation of information pertinent to the decommissioning of surplus nuclear facilities. This handbook is intended to describe all stages of the decommissioning process including selection of the end product, estimation of the radioactive inventory, estimation of occupational exposures, description of the state-of-the-art in decontamination, remote disposition of wastes, and estimation of program costs. Presentation of state-of-the-art technology and data related to decommissioning will aid in consistent and efficient program planning and performance. Particular attention is focused on available technology applicable to those decommissioning activities that have not been accomplished before, such as remote segmenting and handling of highly activated 1100 MW(e) light water reactor vessel internals and thick-walled reactor vessels. A summary of available information associated with the planning and estimating of a decommissioning program is also presented. Summarized in particular are the methodologies associated with the calculation and measurement of activated material inventory, distribution, and surface dose level, system contamination inventory and distribution, and work area dose levels. Cost estimating techniques are also presented and the manner in which to account for variations in labor costs as impacting labor-intensive work activities is explained.

  16. BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and different tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  17. UPVapor: Cofrentes nuclear power plant production results analysis software

    Curiel, M.; Palomo, M. J.; Baraza, A.; Vaquer, J.

    2010-10-01

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools that simplify the work, as well as a friendly, easy-to-use environment with many configuration possibilities. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first analyse the evolution of up to twenty plant variables over a user-defined time period, based on historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load the preselection directly and quickly monitor a group of preselected plant variables. In X-Y graphs, it is possible to plot one variable against another over a defined time. As an option, users can filter the data by restricting a variable to a certain range, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time in their daily work and, as it is easy to use, it allows other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)
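
    The two graph types map naturally onto common dataframe operations. A hedged sketch in Python/pandas of an evolution graph with a moving average and an X-Y graph with one range filter; the CSV file and the column names are invented for illustration and are not part of UPVapor:

      import pandas as pd

      # Hypothetical export of SIEC data: timestamped plant-variable values.
      df = pd.read_csv("siec_export.csv", parse_dates=["timestamp"])
      period = df[(df.timestamp >= "2009-06-01") & (df.timestamp < "2009-07-01")]
      period = period.sort_values("timestamp").set_index("timestamp")

      # Evolution graph: one variable over a user-defined period, plus a
      # moving average (one of the tools the abstract mentions).
      ax = period["reactor_power"].plot()
      period["reactor_power"].rolling("1h").mean().plot(ax=ax)

      # X-Y graph with a range filter on a third variable (UPVapor allows
      # up to five such filters).
      mask = period["steam_flow"].between(1000, 1200)
      period[mask].plot.scatter(x="feedwater_temp", y="reactor_power")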

  19. International Atomic Energy Agency intercomparison of ion beam analysis software

    Barradas, N.P. [Instituto Tecnologico e Nuclear, Estrada Nacional No. 10, Apartado 21, 2686-953 Sacavem (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Avenida do Professor Gama Pinto 2, 1649-003 Lisboa (Portugal)], E-mail: nunoni@itn.pt; Arstila, K. [K.U. Leuven, Instituut voor Kern-en Stralingsfysica, Celestijnenlaan 200D, B-3001 Leuven (Belgium); Battistig, G. [MFA Research Institute for Technical Physics and Materials Science, P.O. Box 49, H-1525 Budapest (Hungary); Bianconi, M. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Dytlewski, N. [International Atomic Energy Agency, Wagramer Strasse 5, P.O. Box 100, A-1400 Vienna (Austria); Jeynes, C. [Surrey Ion Beam Centre, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Kotai, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Lulli, G. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Mayer, M. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Rauhala, E. [Accelerator Laboratory, Department of Physics, University of Helsinki, P.O. Box 43, FIN-00014 Helsinki (Finland); Szilagyi, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Thompson, M. [Department of MS and E/Bard Hall 328, Cornell University, Ithaca, NY 14853 (United States)

    2007-09-15

    Ion beam analysis (IBA) includes a group of techniques for the determination of elemental concentration depth profiles of thin film materials. Often the final results rely on simulations, fits and calculations, made by dedicated codes written for specific techniques. Here we evaluate numerical codes dedicated to the analysis of Rutherford backscattering spectrometry, non-Rutherford elastic backscattering spectrometry, elastic recoil detection analysis and non-resonant nuclear reaction analysis data. Several software packages have been presented and made available to the community. New codes regularly appear, and old codes continue to be used and occasionally updated and expanded. However, those codes have to date not been validated, or even compared to each other. Consequently, IBA practitioners use codes whose validity, correctness and accuracy have never been validated beyond the authors' efforts. In this work, we present the results of an IBA software intercomparison exercise, where seven different packages participated. These were DEPTH, GISA, DataFurnace (NDF), RBX, RUMP, SIMNRA (all analytical codes) and MCERD (a Monte Carlo code). In a first step, a series of simulations were defined, testing different capabilities of the codes, for fixed conditions. In a second step, a set of real experimental data were analysed. The main conclusion is that the codes perform well within the limits of their design, and that the largest differences in the results obtained are due to differences in the fundamental databases used (stopping power and scattering cross section). In particular, spectra can be calculated including Rutherford cross sections with screening, energy resolution convolutions including energy straggling, and pileup effects, with agreement between the codes available at the 0.1% level. This same agreement is also available for the non-RBS techniques. This agreement is not limited to calculation of spectra from particular structures with predetermined

  20. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S (browser/server) structured software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  1. A Parallel Software Pipeline for DMET Microarray Genotyping Data Analysis

    Giuseppe Agapito

    2018-06-01

    Full Text Available Personalized medicine is an aspect of P4 medicine (predictive, preventive, personalized and participatory) based precisely on the customization of all medical characteristics of each subject. In personalized medicine, the development of medical treatments and drugs is tailored to the individual characteristics and needs of each subject, according to the study of diseases at different scales, from the genotype to the phenotype scale. To make the goal of personalized medicine concrete, it is necessary to employ high-throughput methodologies such as Next Generation Sequencing (NGS), Genome-Wide Association Studies (GWAS), Mass Spectrometry or Microarrays, which are able to investigate a single disease from a broader perspective. A side effect of high-throughput methodologies is the massive amount of data produced for each single experiment, which poses several challenges (e.g., high execution time and required memory) to bioinformatics software. Thus a main requirement of modern bioinformatics software is the use of good software engineering methods and efficient programming techniques able to face those challenges, including the use of parallel programming and of efficient and compact data structures. This paper presents the design and the experimentation of a comprehensive software pipeline, named microPipe, for the preprocessing, annotation and analysis of microarray-based Single Nucleotide Polymorphism (SNP) genotyping data. A use case in pharmacogenomics is presented. The main advantages of using microPipe are: the reduction of errors that may happen when trying to make data compatible among different tools; the possibility to analyze huge datasets in parallel; the easy annotation and integration of data. microPipe is available under a Creative Commons license and is freely downloadable for academic and not-for-profit institutions.
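
    The abstract does not show how the parallelism is organized; as a hedged illustration of the general idea (one worker per sample file), not of microPipe's actual code, a Python sketch might look like this, with invented file names and a placeholder file format:

      from multiprocessing import Pool

      def preprocess(raw_path):
          """Parse one genotyping file into (path, {probe: call}).
          The tab-separated format assumed here is a placeholder."""
          calls = {}
          with open(raw_path) as fh:
              for line in fh:
                  probe, call = line.rstrip("\n").split("\t")[:2]
                  calls[probe] = call
          return raw_path, calls

      if __name__ == "__main__":
          files = ["sample_%03d.txt" % i for i in range(1, 101)]
          with Pool() as pool:              # one worker per CPU core
              results = dict(pool.map(preprocess, files))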

  2. Comparison of two software versions for assessment of body-composition analysis by DXA

    Vozarova, B; Wang, J; Weyer, C

    2001-01-01

    To compare two software versions provided by Lunar Co. for the assessment of body composition analysis by DXA.

  3. Graph based communication analysis for hardware/software codesign

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows for accurate estimation of communication overhead between nodes mapped to different processors. In particular, we demonstrate how various transformations of control structures can lead to a more accurate communication analysis and more efficient implementations. The purpose of the transformations is to obtain...

  4. Development of RCM analysis software for Korean nuclear power plants

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet a part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  6. Gamma camera image processing and graphical analysis mutual software system

    Wang Zhiqian; Chen Yongming; Ding Ailian; Ling Zhiye; Jin Yongjie

    1992-01-01

    The GCCS gamma camera image processing and graphical analysis system is a specialized interactive software system. It is mainly used to analyse various patient data acquired from a gamma camera. The system runs on an IBM PC, PC/XT or PC/AT and consists of several parts: system management, data management, device management, a program package and user programs. The system provides two kinds of user interface: command menus and command characters. It is easy to modify and extend this system because it is highly modularized. The user programs include almost all the clinical protocols in use today.

  7. Discriminant Analysis of the Effects of Software Cost Drivers on ...

    This paper investigates the effect of software cost drivers on the project schedule estimation of software development projects in Nigeria. Specifically, the paper determines the extent to which software cost variables affect software project time schedules in our environment. Such studies are lacking in the recent ...

  8. Review: Reiner Keller, Andreas Hirseland, Werner Schneider & Willy Viehöver (Eds.) (2006). Handbuch Sozialwissenschaftliche Diskursanalyse. Band I: Theorien und Methoden [Handbook on Discourse Analysis in Social Sciences. Theories and Methods]

    Steffen Großkopf

    2008-01-01

    Full Text Available The handbook gives a general overview of the established—but increasingly complex—theoretical and methodological practice of discourse analysis. The attempt is made to systematize the diversity of discourse analysis, with its different historical roots and fields of research in the social sciences, even though the focus is mainly on FOUCAULT's approach. Thus, the handbook has to be understood as an important step towards the insertion of discourse analysis into social science. Although there are unavoidable contradictions between the articles—not least because of the open theoretical foundations—the handbook as a whole characterizes discourse analysis as a heterogeneous research strategy rather than as a stringent method. With this approach, it can be a helpful guide for the reader's own research projects. However, if one is looking for a precise method, the time invested in discourse analysis could be wasted. URN: urn:nbn:de:0114-fqs0801143

  9. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
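
    As a rough illustration of the case-generation idea, n-factor combinatorial extremes combined with Monte Carlo draws for the remaining parameters (this is our sketch, not the NASA tool), for n = 2:

      import itertools, random

      # Hypothetical parameter ranges; the real tool reads these from the
      # simulation configuration.
      params = {"mass": (900.0, 1100.0), "thrust": (0.8, 1.2), "drag": (1.9, 2.3)}

      def pairwise_extreme_cases(params, rng=random.Random(0)):
          """All 2-factor combinations at their low/high extremes, with the
          remaining parameters filled by random (Monte Carlo) draws."""
          names = list(params)
          for a, b in itertools.combinations(names, 2):
              for la, lb in itertools.product((0, 1), repeat=2):
                  case = {n: rng.uniform(*params[n]) for n in names}
                  case[a] = params[a][la]   # force parameter a to an extreme
                  case[b] = params[b][lb]   # force parameter b to an extreme
                  yield case

      for case in pairwise_extreme_cases(params):
          print(case)  # in practice: launch one simulation run per case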

  10. Development of a software for INAA analysis automation

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, a software to automate the post-counting tasks in comparative INAA has been developed that aims to become more flexible than the available options, integrating itself with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully-automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later on if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
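
    Three of the four consistency checks named in the abstract have standard textbook forms; a short numpy sketch of them follows (the Rajeval technique is omitted, and the normalized residual is written in the common Rajput-MacMahon form, which may differ in detail from the software's implementation):

      import numpy as np

      def combine(x, s):
          """Unweighted and weighted means of replicate results x with
          uncertainties s, plus normalized residuals."""
          x, s = np.asarray(x, float), np.asarray(s, float)
          w = 1.0 / s**2
          W = w.sum()
          xw = (w * x).sum() / W                   # weighted mean
          r = np.sqrt(w * W / (W - w)) * (x - xw)  # normalized residuals
          return x.mean(), xw, np.sqrt(1.0 / W), r

      # |r_i| greater than about 2 flags a discrepant replicate for review.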

  11. Models for composing software : an analysis of software composition and objects

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  12. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future. However there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practises. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  13. Risk assessment handbook

    Farmer, F.G.; Jones, J.L.; Hunt, R.N.; Roush, M.L.; Wierman, T.E.

    1990-09-01

    The Probabilistic Risk Assessment Unit at EG&G Idaho has developed this handbook to provide guidance to a facility manager exploring the potential benefit to be gained by performance of a risk assessment properly scoped to meet local needs. This document is designed to help the manager control the resources expended commensurate with the risks being managed and to assure that the products can be used programmatically to support future needs, in order to derive maximum benefit from the resources expended. We present a logical and functional mapping scheme between several discrete phases of project definition to ensure that a potential customer, working with an analyst, is able to define the areas of interest and that appropriate methods are employed in the analysis. In addition, the handbook is written to provide a high-level perspective for the analyst. Previously, the needed information was either scattered or existed only in the minds of experienced analysts. By compiling this information and exploring the breadth of knowledge which exists within the members of the PRA Unit, the functional relationships between the customers' needs and the product have been established.

  14. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in the PLCs and FPGAs that are used to develop I&C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and notes that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is a useful technique for using guide phrases. HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in Korean nuclear power plants. In those studies, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words (GW), and we also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is sufficiently applicable to analyzing the software requirements specification of an FPGA.

  15. Statistical Analysis Software for the TRS-80 Microcomputer.

    1981-09-01

  16. Don't Blame the Software: Using Qualitative Data Analysis Software Successfully in Doctoral Research

    Michelle Salmona

    2016-07-01

    Full Text Available In this article, we explore the learning experiences of doctoral candidates as they use qualitative data analysis software (QDAS). Of particular interest is the process of adopting technology during the development of research methodology. Using an action research approach, data were gathered over five years from advanced doctoral research candidates and supervisors. The technology acceptance model (TAM) was then applied as a theoretical analytic lens for better understanding how students interact with new technology. Findings relate to two significant barriers which doctoral students confront: 1. aligning perceptions of ease of use and usefulness is essential in overcoming resistance to technological change; 2. transparency into the research process through technology promotes insights into methodological challenges. Transitioning through both barriers requires a competent foundation in qualitative research. The study acknowledges the importance of higher degree research, curriculum reform and doctoral supervision in post-graduate research training together with their interconnected relationships in support of high-quality inquiry. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1603117

  17. eXtended CASA Line Analysis Software Suite (XCLASS)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, accessed via the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
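
    For orientation, a stripped-down version of the kind of calculation myXCLASS performs: the one-dimensional radiative transfer solution for a single isothermal component with a Gaussian opacity profile. The parameter values below are invented; the real program additionally handles full molecular catalogs, beam filling and dust attenuation.

      import numpy as np

      H, K, C = 6.626e-34, 1.381e-23, 2.998e8   # SI constants

      def planck(nu, T):
          """Planck brightness B_nu(T) in W m^-2 Hz^-1 sr^-1."""
          return 2 * H * nu**3 / C**2 / np.expm1(H * nu / (K * T))

      nu = np.linspace(230.530e9, 230.545e9, 500)  # frequency grid [Hz]
      nu0, dnu = 230.538e9, 1.0e6                  # line centre and width [Hz]
      tau0, T_ex, T_bg = 2.0, 80.0, 2.73           # peak opacity, temperatures [K]

      tau = tau0 * np.exp(-0.5 * ((nu - nu0) / dnu) ** 2)
      # Isothermal solution of the 1-D radiative transfer equation:
      intensity = (planck(nu, T_ex) - planck(nu, T_bg)) * (1.0 - np.exp(-tau))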

  18. FINODEX Handbook for Entrepreneurs

    The handbook provides short introductions to necessary knowledge for applicants in the two calls in October 2014 and June 2015, where they can present an idea for product development and apply for up to 10,000 Euro. Furthermore, the handbook is relevant for the next phase, where the selected approx. 50 projects elaborate detailed technical and market plans. Last, the handbook provides links to further study and information about how to get help in later phases.

  19. Energy Efficiency Governance: Handbook

    NONE

    2010-07-01

    This handbook has been written to assist EE practitioners, government officials and stakeholders to establish effective EE governance structures for their country. The handbook provides readers with relevant information in an accessible format that will help develop comprehensive and effective governance mechanisms. For each of the specific topics dealt with (see Figure 1 in the Handbook), the IEA offers guidelines for addressing issues, or directs readers to examples of how such issues have been dealt with by specific countries.

  20. Planetary Geologic Mapping Handbook - 2010. Appendix

    Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.

    2010-01-01

    the USGS now are primarily digital products using geographic information system (GIS) software and file formats. GIS mapping tools permit easy spatial comparison, generation, importation, manipulation, and analysis of multiple raster image, gridded, and vector data sets. GIS software has also permitted the development of project-specific tools and the sharing of geospatial products among researchers. GIS approaches are now being used in planetary geologic mapping as well. Guidelines or handbooks on techniques in planetary geologic mapping have been developed periodically. As records of the heritage of mapping methods and data, these remain extremely useful guides. However, many of the fundamental aspects of earlier mapping handbooks have evolved significantly, and a comprehensive review of currently accepted mapping methodologies is now warranted. As documented in this handbook, such a review incorporates additional guidelines developed in recent years for planetary geologic mapping by the NASA Planetary Geology and Geophysics (PGG) Program's Planetary Cartography and Geologic Mapping Working Group's (PCGMWG) Geologic Mapping Subcommittee (GEMS) on the selection and use of map bases as well as map preparation, review, publication, and distribution. In light of the current boom in planetary exploration and the ongoing rapid evolution of available data for planetary mapping, this handbook is especially timely.

  1. System and software safety analysis for the ERA control computer

    Beerthuizen, P.G.; Kruidhof, W.

    2001-01-01

    The European Robotic Arm (ERA) is a seven-degrees-of-freedom relocatable anthropomorphic robotic manipulator system, to be used in manned space operation on the International Space Station, supporting the assembly and external servicing of the Russian segment. The safety design concept and implementation of the ERA are described, in particular with respect to the central computer's software design. A top-down analysis and specification process is used to flow down the safety aspects of the ERA system towards the subsystems, which are produced by a consortium of companies in many countries. The user requirements documents and the critical function list are the key documents in this process. Bottom-up analysis (FMECA) and testing, on both subsystem and system levels, are the basis for safety verification. A number of examples show the approach and methods used.

  2. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
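
    A toy version of the underlying quantification problem, for intuition only: estimate by plain hit-or-miss Monte Carlo the fraction of a bounded floating-point box that satisfies a nonlinear constraint. The paper's actual contribution, using interval constraint propagation to focus the sampling, is not reproduced here.

      import random

      def estimate_fraction(constraint, box, n=100_000, rng=random.Random(0)):
          """Hit-or-miss Monte Carlo estimate of the fraction of the box
          [(lo, hi), ...] whose points satisfy `constraint`."""
          hits = 0
          for _ in range(n):
              point = [rng.uniform(lo, hi) for lo, hi in box]
              hits += constraint(point)
          return hits / n

      # Toy target event: a nonlinear condition over two bounded floats.
      frac = estimate_fraction(lambda p: p[0]**2 + p[1]**2 < 1.0,
                               box=[(-1.0, 1.0), (-1.0, 1.0)])
      print(frac)  # about 0.785 (= pi/4) under uniform inputs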

  3. HistFitter software framework for statistical data analysis

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  4. The biodiesel handbook

    Knothe, Gerhard; Krahl, Jurgen; Van Gerpen, Jon Harlan

    2010-01-01

    .... The Biodiesel Handbook delivers solutions to issues associated with biodiesel feedstocks, production issues, quality control, viscosity, stability, applications, emissions, and other environmental...

  5. An Analysis of Related Software Cycles Among Organizations, People and the Software Industry

    Moore, Robert; Adams, Brady

    2008-01-01

    .... This thesis intends to explore the moderating factors of these three distinct and disjointed cycles and propose courses of action towards mitigating various issues and problems inherent in the software upgrade process...

  6. Geothermal handbook

    1976-01-01

    The Bureau of Land Management offered over 400,000 hectares (one million acres) for geothermal exploration and development in 1975, and that figure is expected to double this year. The Energy Research and Development Administration hopes for 10-15,000 megawatts of geothermal energy by 1985, which would require leasing over 16.3 million hectares (37 million acres) of land, at least half of which is federal land. Since there is an 8 to 8-1/2 year time lag between initial exploration and full field development, there would have to be a ten-fold increase in the amount of federal land leased within the next three years. Seventy percent of geothermal potential, 22.3 million hectares (55 million acres), is on federal lands in the west. The implications for the Service are enormous and the problems immediate. Geothermal resources are so widespread that they are found to some extent in most biomes and ecosystems in the western United States. In most cases exploitation and production of geothermal resources can be made compatible with fish and wildlife management without damage, if probable impacts are clearly understood and provided for before damage has unwittingly been allowed to occur. Planning for site suitability and concern with specific operating techniques are crucial factors. There will be opportunities for enhancement: during exploration and testing many shallow groundwater bodies may be penetrated which might be developed for wildlife use. Construction equipment and materials needed for enhancement projects will be available in areas heretofore considered remote by land managers. A comprehensive knowledge of geothermal development is necessary to avoid dangers and seize opportunities. This handbook is intended to serve as a working tool in the field. It anticipates where geothermal resource development will occur in the western United States in the near future. A set of environmental assessment procedures are

  7. Using CASE Software to Teach Undergraduates Systems Analysis and Design.

    Wilcox, Russell E.

    1988-01-01

    Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)

  8. Software for 3D diagnostic image reconstruction and analysis

    Taton, G.; Rokita, E.; Sierzega, M.; Klek, S.; Kulig, J.; Urbanik, A.

    2005-01-01

    Recent advances in computer technologies have opened new frontiers in medical diagnostics. Interesting possibilities are the use of three-dimensional (3D) imaging and the combination of images from different modalities. Software prepared in our laboratories devoted to 3D image reconstruction and analysis from computed tomography and ultrasonography is presented. In developing our software it was assumed that it should be applicable in standard medical practice, i.e. it should work effectively with a PC. An additional feature is the possibility of combining 3D images from different modalities. The reconstruction and data processing can be conducted using a standard PC, so low investment costs result in the introduction of advanced and useful diagnostic possibilities. The program was tested on a PC using DICOM data from computed tomography and TIFF files obtained from a 3D ultrasound system. The results of the anthropomorphic phantom and patient data were taken into consideration. A new approach was used to achieve spatial correlation of two independently obtained 3D images. The method relies on the use of four pairs of markers within the regions under consideration. The user selects the markers manually and the computer calculates the transformations necessary for coupling the images. The main software feature is the possibility of 3D image reconstruction from a series of two-dimensional (2D) images. The reconstructed 3D image can be: (1) viewed with the most popular methods of 3D image viewing, (2) filtered and processed to improve image quality, (3) analyzed quantitatively (geometrical measurements), and (4) coupled with another, independently acquired 3D image. The reconstructed and processed 3D image can be stored at every stage of image processing. The overall software performance was good considering the relatively low costs of the hardware used and the huge data sets processed. The program can be freely used and tested (source code and program available at
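
    The abstract does not say how the coupling transformation is computed from the four marker pairs; one standard choice is a least-squares rigid fit (the Kabsch algorithm), sketched below with invented marker coordinates:

      import numpy as np

      def rigid_transform(src, dst):
          """Least-squares rotation R and translation t with dst ~ R @ src + t,
          given corresponding (N, 3) marker arrays (Kabsch algorithm)."""
          src, dst = np.asarray(src, float), np.asarray(dst, float)
          cs, cd = src.mean(axis=0), dst.mean(axis=0)
          H = (src - cs).T @ (dst - cd)              # cross-covariance
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
          R = Vt.T @ D @ U.T                         # proper rotation, det = +1
          t = cd - R @ cs
          return R, t

      # Four hypothetical marker pairs picked by the user in both modalities:
      ct = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]]
      us = [[1, 2, 3], [11, 2, 3], [1, 12, 3], [1, 2, 13]]
      R, t = rigid_transform(ct, us)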

  9. ATLAS tile calorimeter cesium calibration control and analysis software

    Solovyanov, O; Solodkov, A; Starchenko, E; Karyukhin, A; Isaev, A; Shalanda, N

    2008-01-01

    An online control system to calibrate and monitor ATLAS Barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system an online software has been developed, using ATLAS TDAQ components like DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and other components like DDC (DCS to DAQ Connection), to connect to PVSS-based slow control systems of Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on Python language, is used to handle all the calibration and monitoring processes from hardware perspective to final data storage, including various abnormal situations. A QT based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. Performance of the system and first experience from the ATLAS pit are presented

  10. Phenomenology and Qualitative Data Analysis Software (QDAS: A Careful Reconciliation

    Brian Kelleher Sohn

    2017-01-01

    Full Text Available An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform processes aided by QDAS: disaggregation and recontextualization of texts. Other phenomenologists exemplify DAVIDSON and DI GREGORIO's observation that arguments against QDAS often identify problems more closely related to the researchers than to QDAS. But the concerns about technology of McLUHAN (2003 [1964]), HEIDEGGER (2008 [1977]), and FLUSSER (2013) cannot be ignored. In this conceptual article I answer the questions of phenomenologists and the call of QDAS methodologists to describe how I used QDAS to carry out a phenomenological study, in order to guide others who choose to reconcile the use of software to assist their research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1701142

  12. Experimental software for modeling and interpreting educational data analysis processes

    Natalya V. Zorina

    2017-12-01

    Full Text Available Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new information system for the university using the results of educational data analysis. One of the functions of such a system is knowledge extraction from the data accumulated during operation. The creation of a national system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. Because such systems are novel, there is a lack of established development methodology, and for this purpose a number of experiments were carried out in order to collect data, choose appropriate methods for the study and interpret them. As a result of the experiments, the authors had access to the data sources available in the information environment of the home university. The data were taken from semester performance records obtained from the information system of the training department of the Institute of IT, MTU MIREA, from the results of the students' independent work, and from specially designed Google Forms. To automate the collection of information and the analysis of educational data, an experimental software package was created. As the methodology for developing the experimental software complex, a decision was made to use the methodologies of rational-empirical complexes (REX) and single-experiment program technologies (TPEI). The details of the program implementation of the complex are described, conclusions are given about the availability of the data sources used, and conclusions are drawn about the prospects for further development.

  13. Visual data mining and analysis of software repositories

    Voinea, S.L.; Telea, A.C.

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  14. PSA Review Handbook

    Hallman, Anders; Nyman, Ralph; Knochenhauer, Michael

    2004-05-01

    The Swedish Nuclear Power Inspectorate (SKI) expresses requirements on the performance of PSAs, as well as on PSA activities in general, in the regulatory document 'Regulations Concerning Safety in Certain Nuclear Facilities', SKIFS 1998:1. The follow-up of these activities is part of the inspection tasks of the SKI. In view of this, there is a need for documented guidelines on how to perform these inspections and reviews. The SKI PSA Review Handbook is intended to be a support in the SKI inspection and control of the PSA activities of the licensees. These PSA activities include both the organisation and working procedures of the licensee, the layout and contents of the PSA, and its areas of application. Using the regulation SKIFS 1998:1 as a starting point, the review handbook presents important aspects to be considered when judging whether a licensee fulfils the requirements on PSA activities, including the performance of PSAs or PSA applications. The handbook shall also be a guidance for the review of PSAs. However, the intention of the PSA Review Handbook is not to be a handbook for how a PSA is performed. The PSA Review Handbook is applicable to all types of initiating events and all operating conditions, and has been structured in a way which stresses the integrated characteristics of PSA in the creation of the risk picture of a plant. The PSA Review Handbook has been based on the requirements for PSAs of nuclear power plants, as this is the most extensive application. However, the relevant parts of it are also applicable when analysing other nuclear installations. The PSA Review Handbook is published as a research report as its contents are judged to be of general interest, and the SKI welcomes comments on the handbook. An update of the PSA Review Handbook may be required as experience with the use of the handbook is acquired and if general PSA requirements change.

  15. Automated software analysis of nuclear core discharge data

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and of monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies to gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort.

  16. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D and Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.

  17. Cost Analysis of Poor Quality Using a Software Simulation

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their joint impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced here. One takes a retrospective point of view, where the cost of poor quality and the production process are calculated on the basis of historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation, and represents a prospective view of the costs of poor quality. Simulation output in the form of tornado and sensitivity charts complements the risk analysis.
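
    A minimal sketch of the prospective (simulation-based) approach, with toy input distributions standing in for the case study's actual data:

      import random

      rng = random.Random(42)
      N = 100_000                                      # simulated production periods

      costs = []
      for _ in range(N):
          units = rng.gauss(10_000, 500)               # units produced
          defect_p = rng.triangular(0.01, 0.06, 0.03)  # defect rate
          unit_cost = rng.uniform(4.0, 6.0)            # rework/scrap cost per defect
          costs.append(units * defect_p * unit_cost)

      costs.sort()
      mean = sum(costs) / N
      p95 = costs[int(0.95 * N)]
      print(f"expected cost of poor quality: {mean:,.0f}; 95th percentile: {p95:,.0f}")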

  18. Knowledge-based requirements analysis for automating software development

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  19. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Marković Nemanja

    2014-01-01

    Full Text Available Reinforced concrete (RC) is characterized by strong inhomogeneity, resulting from the material characteristics of concrete, and by quasi-brittle behavior at failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is presented of methods such as Concrete Damage Plasticity (CDP), Concrete Smeared Cracking (CSC), Cap Plasticity (CP) and the Drucker-Prager model (DPM). We performed a nonlinear analysis of a two-storey reinforced concrete frame by applying the CDP method for modeling the material nonlinearity of concrete, and analyzed damage zones, crack propagation and the load-deflection relationship.

  20. The Database and Data Analysis Software of Radiation Monitoring System

    Wang Weizhen; Li Jianmin; Wang Xiaobing; Hua Zhengdong; Xu Xunjiang

    2009-01-01

    Shanghai Synchrotron Radiation Facility (SSRF for short) is a third-generation light source being built in China, comprising a 150 MeV injector, a 3.5 GeV booster, a 3.5 GeV storage ring and a number of beamline stations. The data are fetched by the monitoring computer from the collecting modules at the front end and saved in a MySQL database on the managing computer. The data analysis software is written in Python, a scripting language, to query, summarize and plot the data of a given monitoring channel during a given period and export it to an external file. In addition, warning events can be queried separately. The website for historical and real-time data inquiry and plotting is written in PHP. (authors)
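
    The query-summarize-plot workflow might look like the sketch below; SQLite stands in for MySQL so the example is self-contained, and the table and column names are assumptions rather than SSRF's actual schema:

        import sqlite3
        import matplotlib.pyplot as plt

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE dose (channel TEXT, t REAL, value REAL)")
        con.executemany("INSERT INTO dose VALUES (?, ?, ?)",
                        [("CH01", t, 0.1 + 0.01 * (t % 24)) for t in range(240)])

        rows = con.execute(
            "SELECT t, value FROM dose WHERE channel = ? AND t BETWEEN ? AND ?",
            ("CH01", 0, 120)).fetchall()
        t, v = zip(*rows)
        plt.plot(t, v)
        plt.xlabel("time (h)"); plt.ylabel("dose rate (uSv/h)")
        plt.savefig("ch01.png")   # export to an external file, as the tool does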

  1. Image analysis software for following progression of peripheral neuropathy

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1 - 4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, including diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out on a database of subjects with varying levels of severity of diabetic neuropathy, as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.
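
    The reported statistical step amounts to a simple regression of fiber measurements against EMG grade; the numbers below are synthetic stand-ins, not the study's data:

        import numpy as np
        from scipy import stats

        severity = np.array([0, 0, 1, 1, 2, 2, 3, 3, 4, 4])           # EMG-determined grade
        density = np.array([28, 26, 24, 23, 20, 19, 16, 15, 12, 11])  # fibers per frame (invented)

        fit = stats.linregress(severity, density)
        print(f"slope={fit.slope:.2f}, r={fit.rvalue:.3f}, p={fit.pvalue:.4f}")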

  2. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future, with the aim of setting a community-driven gold standard for data handling, reporting and sharing. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  3. Development of Spectrometer Software for Electromagnetic Radiation Measurement and Analysis

    Mohd Idris Taib; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2013-01-01

    This software is under development in LabVIEW for use with the StellarNet spectrometer system. The StellarNet spectrometer is supplied with the SpectraWiz operating software, which can measure spectral data for real-time spectroscopy. The LabVIEW software accesses real-time data through the SpectraWiz dynamic link library, which serves as the hardware interface, and acquires the amplitude of every electromagnetic wavelength at periodic intervals. In addition to hardware interfacing, the user interface capabilities of the software include plotting of spectral data in various modes, including scope, absorbance, transmission and irradiance modes. The software can be used for research and development in the application, utilization and safety of electromagnetic radiation, especially solar, laser and ultraviolet radiation. Off-line capabilities of the software are almost unlimited owing to the mathematical and signal processing functions available in the LabVIEW add-on libraries. (author)

  4. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation must be performed in the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems; it also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, the safety of software can be analyzed on the basis of fault tree synthesis. (authors)

  5. Measurement, instrumentation, and sensors handbook

    Eren, Halit

    2014-01-01

    The Second Edition of the bestselling Measurement, Instrumentation, and Sensors Handbook brings together all aspects of the design and implementation of measurement, instrumentation, and sensors. Reflecting the current state of the art, it describes the use of instruments and techniques for performing practical measurements in engineering, physics, chemistry, and the life sciences and discusses processing systems, automatic data acquisition, reduction and analysis, operation characteristics, accuracy, errors, calibrations, and the incorporation of standards for control purposes. Organized acco

  6. Procurement Career Management Handbook.

    Department of the Treasury, Washington, DC.

    This handbook is the result of the Treasury Department's efforts to increase professionalism among its procurement employees nationwide through its Procurement Career Management Program. First, the scope and objectives of the Procurement Career Management Program are discussed. The remaining sections of the handbook deal with the following program…

  7. Warehouse Sanitation Workshop Handbook.

    Food and Drug Administration (DHHS/PHS), Washington, DC.

    This workshop handbook contains information and reference materials on proper food warehouse sanitation. The materials have been used at Food and Drug Administration (FDA) food warehouse sanitation workshops, and are selected by the FDA for use by food warehouse operators and for training warehouse sanitation employees. The handbook is divided…

  8. Photovoltaic engineering handbook

    Lasnier, F; Ang, T G [Asian Institute of Technology, Bangkok (TH)]

    1990-01-01

    The Photovoltaic Engineering Handbook is a comprehensive 'nuts and bolts' guide to photovoltaic technology and systems engineering, aimed at engineers and designers in the field. It is the first book to look closely at the practical problems involved in evaluating and setting up a PV power system, and the authors provide comprehensive insight into the different procedures and decisions that a designer needs to make. The book is unique in its coverage, and the technical information is presented in a concise and simple way to enable engineers from a wide range of backgrounds to initiate, assess, analyse and design a PV system. Energy planners making decisions on the most appropriate system for specific needs will also benefit from reading this book. Topics covered include technological processes, including solar cell technology, the photovoltaic generator and photovoltaic systems engineering; characterization and testing methods; sizing procedure; economic analysis and instrumentation. (author).

  9. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is only beginning to accumulate. This is a delicate situation, because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes may, in general, be used by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  10. Applied statistics a handbook of BMDP analyses

    Snell, E J

    1987-01-01

    This handbook is a realization of a long-term goal of BMDP Statistical Software. As the software supporting statistical analysis has grown in breadth and depth, to the point where it can serve many of the needs of accomplished statisticians, it can also serve as an essential support to those needing to expand their knowledge of statistical applications. Statisticians should not be handicapped by heavy computation or by the lack of needed options. When Applied Statistics: Principles and Examples by Cox and Snell appeared, we at BMDP were impressed with the scope of the applications discussed and felt that many statisticians eager to expand their capabilities in handling such problems could profit from having the solutions carried further, to get them started and guided to a more advanced level in problem solving. Who would be better to undertake that task than the authors of Applied Statistics? A year or two later discussions with David Cox and Joyce Snell at Imperial College indicated that a wedding of the proble...

  11. Software hazard analysis for nuclear digital protection system by Colored Petri Net

    Bai, Tao; Chen, Wei-Hua; Liu, Zhen; Gao, Feng

    2017-01-01

    Highlights: •A dynamic hazard analysis method is proposed for safety-critical software. •The mechanism relies on Colored Petri Nets. •Complex interactions between software and hardware are captured properly. •Common failure modes in software are identified effectively. -- Abstract: The software safety of a nuclear digital protection system is critical for the safety of nuclear power plants, as any software defect may result in severe damage. In order to ensure the safety and reliability of safety-critical digital system products and their applications, software hazard analysis is required to be performed during the software development lifecycle. A dynamic software hazard modeling and analysis method based on Colored Petri Nets is proposed and applied to the safety-critical control software of a nuclear digital protection system in this paper. The analysis results show that the proposed method can explain the complex interactions between software and hardware and identify potential common cause failures in software properly and effectively. Moreover, the method can find the dominant software-induced hazards to safety control actions, which aids in increasing software quality.
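
    For readers unfamiliar with the formalism, the sketch below plays an ordinary (uncoloured) Petri net token game in a few lines; the places and transition are invented, and a real CPN hazard model would attach typed data to tokens:

        # A transition fires only when all of its input places hold tokens;
        # hazard analysis looks for markings where a needed transition is dead.
        marking = {"sensor_ok": 1, "cmd_pending": 1, "actuated": 0}
        transitions = {"actuate": ({"sensor_ok": 1, "cmd_pending": 1}, {"actuated": 1})}

        def fire(name):
            pre, post = transitions[name]
            if all(marking[p] >= n for p, n in pre.items()):
                for p, n in pre.items():
                    marking[p] -= n
                for p, n in post.items():
                    marking[p] += n
                return True
            return False

        print(fire("actuate"), marking)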

  12. Economic Consequence Analysis of Disasters: The E-CAT Software Tool

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. The resulting software tool, E-CAT (Economic Consequence Analysis Tool), is intended for use by decision makers and analysts who need such estimates quickly; it is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use, and it accounts for the cumulative direct and indirect impacts on the U.S. economy, including resilience and behavioral factors that significantly affect base estimates. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid-turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
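
    The "reduced form" idea can be pictured as fitting one regression to many simulated outcomes; the synthetic data below merely stand in for CGE runs, and the explanatory variables are assumptions:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        X = np.column_stack([np.ones(n),
                             rng.uniform(0, 10, n),   # e.g. duration of disruption (assumed)
                             rng.uniform(0, 1, n)])   # e.g. resilience factor (assumed)
        gdp_loss = X @ np.array([1.0, 2.5, -3.0]) + rng.normal(0, 0.5, n)  # synthetic CGE output

        beta, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)
        print("reduced-form coefficients:", beta.round(2))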

  13. Software use cases to elicit the software requirements analysis within the ASTRI project

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project, whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI mini-array operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  14. Software requirements definition Shipping Cask Analysis System (SCANS)

    Johnson, G.L.; Serbin, R.

    1985-01-01

    The US Nuclear Regulatory Commission (NRC) staff reviews the technical adequacy of applications for certification of designs of shipping casks for spent nuclear fuel. In order to confirm an acceptable design, the NRC staff may perform independent calculations. The current NRC procedure for confirming cask design analyses is laborious and tedious. Most of the work is currently done by hand or through the use of a remote computer network. The time required to certify a cask can be long. The review process may vary somewhat with the engineer doing the reviewing. Similarly, the documentation on the results of the review can also vary with the reviewer. To increase the efficiency of this certification process, LLNL was requested to design and write an integrated set of user-oriented, interactive computer programs for a personal microcomputer. The system is known as the NRC Shipping Cask Analysis System (SCANS). The computer codes and the software system supporting these codes are being developed and maintained for the NRC by LLNL. The objective of this system is generally to lessen the time and effort needed to review an application. Additionally, an objective of the system is to assure standardized methods and documentation of the confirmatory analyses used in the review of these cask designs. A software system should be designed based on NRC-defined requirements contained in a requirements document. The requirements document is a statement of a project's wants and needs as the users and implementers jointly understand them. The requirements document states the desired end products (i.e. WHAT's) of the project, not HOW the project provides them. This document describes the wants and needs for the SCANS system. 1 fig., 3 tabs

  15. Study of gamma ray analysis software's. Application to activation analysis of geological samples

    Silva, Luiz Roberto Nogueira da

    1998-01-01

    A comparative evaluation of the gamma-ray analysis software VISPECT, in relation to two commercial gamma-ray analysis software packages, OMNIGAM (EG and G Ortec) and SAMPO 90 (Canberra), was performed. For this evaluation, artificial gamma-ray spectra were created, presenting peaks of different intensities located at four different regions of the spectrum. Multiplet peaks with equal and different intensities, but with different channel separations, were also created. The results obtained showed a good performance of VISPECT in detecting and analysing single and multiplet peaks of different intensities in the gamma-ray spectrum. Neutron activation analysis of the geological reference material GS-N (IWG-GIT) and of the granite G-94, used in a Proficiency Testing Trial of Analytical Geochemistry Laboratories, was also performed, in order to evaluate the VISPECT software in the analysis of real samples. The results obtained using VISPECT were as good as or better than the ones obtained using the other programs. (author)
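
    A rough illustration of the peak-search task such software performs, on a synthetic spectrum (the photopeak positions and the smoothing and prominence settings are arbitrary choices, not VISPECT's algorithm):

        import numpy as np
        from scipy.signal import find_peaks

        ch = np.arange(2048)
        truth = (2000 * np.exp(-ch / 600)                           # decaying continuum
                 + 500 * np.exp(-0.5 * ((ch - 662) / 3.0) ** 2)    # e.g. a Cs-137-like peak
                 + 300 * np.exp(-0.5 * ((ch - 1332) / 4.0) ** 2))  # e.g. a Co-60-like peak
        spectrum = np.random.default_rng(0).poisson(truth)

        smooth = np.convolve(spectrum, np.ones(5) / 5, mode="same")
        peaks, _ = find_peaks(smooth, prominence=100)
        print("peak channels:", peaks)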

  16. The Software Therapist: Usability Problem Diagnosis Through Latent Semantic Analysis

    Sparks, Randall; Hartson, Rex

    2006-01-01

    The work we report on here addresses the problem of low return on investment in software usability engineering and offers support for usability practitioners in identifying, understanding, documenting...

  17. HistFitter software framework for statistical data analysis

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  18. Development and applications of Kramers-Kronig PEELS analysis software

    Fan, X. D.; Peng, J.L.; Bursill, L.A.

    1997-01-01

    A Kramers-Kronig analysis program was developed as a custom function for the GATAN parallel electron energy loss spectroscopy (PEELS) software package EL/P. When used with a JEOL 4000EX high-resolution transmission electron microscope, this program allows the dielectric functions of materials to be measured with an energy resolution of approximately 1.4 eV. The imaginary part of the dielectric function is particularly useful, since it allows the magnitude of the band gap to be determined for relatively wide-gap materials. More importantly, changes in the gap may be monitored at high spatial resolution when used in conjunction with HRTEM images. The principles of the method are described and applications are presented for Type-1a gem-quality diamond, before and after neutron irradiation. The former shows a band gap of about 5.8 eV, as expected, whereas for the latter the gap appears to be effectively collapsed. The core-loss spectra confirm that Type-1a diamond has pure sp3 tetrahedral bonding, whereas the neutron-irradiated diamond has mixed sp2/sp3 bonding. Analysis of the low-loss spectra for the neutron-irradiated specimen yielded a density of 1.6 g/cm3, approximately half that of diamond. 10 refs., 2 figs

  20. EURANOS. Generic handbook for assisting in the management of contaminated inhabited areas in Europe following a radiological emergency

    Nisbet, A.F.; Andersson, Kasper Grann; Brown, J.

    ... industry and others who may be affected. The handbook includes management options for application ... and plants; trees and shrubs. The handbook is divided into several independent sections comprising: supporting scientific and technical information; an analysis of the factors influencing recovery; compendia of comprehensive, state-of-the-art datasheets for more than 50 management options; guidance ... The handbook is a living document that requires updating from time to time to remain state-of-the-art, and customisation of the generic handbook is an essential part of its use within individual countries. The handbook for inhabited areas complements the two other handbooks for food production systems and drinking water supplies.

  1. An analysis software of tritium distribution in food and environmental water in China

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data, to analyze the data automatically, statistically and graphically, and to study and share the data. Methods: Based on the data obtained previously, the analysis software was written using VC++.NET as the development tool. The software first transfers data from EXCEL into a database, and it includes a data-append function so that operators can easily incorporate new monitoring data. Results: After the monitoring data, saved as EXCEL files by the original researchers, are turned into a database, they can be accessed easily. The software provides a tool for analyzing the distribution of tritium. Conclusion: This software is a first attempt at analyzing data on tritium levels in food and environmental water in China. With the software, data retrieval, searching and analysis become easy and direct. (authors)
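
    The Excel-to-database step might look like the following pandas sketch (the original tool was written in VC++.NET; the file, table and column names here are assumptions):

        import sqlite3
        import pandas as pd

        df = pd.read_excel("tritium_monitoring.xlsx")        # hypothetical input file
        con = sqlite3.connect("tritium.db")
        df.to_sql("measurements", con, if_exists="append", index=False)

        # Simple distribution analysis, assuming 'region' and 'tritium_bq_per_l' columns.
        summary = pd.read_sql(
            "SELECT region, AVG(tritium_bq_per_l) AS mean_level "
            "FROM measurements GROUP BY region", con)
        print(summary)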

  2. Software selection based on analysis and forecasting methods, practised in 1C

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the “1C: Enterprise 8” platform for data analysis and forecasting. It is important to evaluate and select proper software in order to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow new forecast models to be created for scheduling further software distribution.

  3. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  4. Empirical analysis of change metrics for software fault prediction

    Choudhary, Garvit Rajesh; Kumar, Sandeep; Kumar, Kuldeep; Mishra, Alok; Catal, Cagatay

    2018-01-01

    A quality assurance activity, known as software fault prediction, can reduce development costs and improve software quality. The objective of this study is to investigate change metrics in conjunction with code metrics to improve the performance of fault prediction models. Experimental studies are

  5. An Analysis of Open Source Security Software Products Downloads

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  6. Multi-criteria decision analysis methods and software

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  7. A pattern framework for software quality assessment and tradeoff analysis

    Folmer, Eelke; Boscht, Jan

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  8. Swallowing quantitative analysis software

    André Augusto Spadotto

    2008-02-01

    Full Text Available OBJECTIVE: The present paper is aimed at introducing software to allow a detailed analysis of the swallowing dynamics. MATERIALS AND METHODS: The sample included ten stroke patients (six male and four female), with a mean age of 57.6 years. Swallowing videofluoroscopy was performed and the images were digitized on a microcomputer for posterior analysis of the pharyngeal transit time with the aid of a chronometer and of the software. RESULTS: Differences were observed in the average pharyngeal swallowing transit time as a result of measurements with the chronometer and with the software. CONCLUSION: This software is a useful tool for the analysis of parameters such as swallowing time and speed, allowing a better understanding of the swallowing dynamics, both in the clinical approach of patients with oropharyngeal dysphagia and for scientific research purposes.

  9. RNAstructure: software for RNA secondary structure prediction and analysis.

    Reuter, Jessica S; Mathews, David H

    2010-03-15

    To understand an RNA sequence's mechanism of action, the structure must be known. Furthermore, target RNA structure is an important consideration in the design of small interfering RNAs and antisense DNA oligonucleotides. RNA secondary structure prediction, using thermodynamics, can be used to develop hypotheses about the structure of an RNA sequence. RNAstructure is a software package for RNA secondary structure prediction and analysis. It uses thermodynamics and utilizes the most recent set of nearest neighbor parameters from the Turner group. It includes methods for secondary structure prediction (using several algorithms), prediction of base pair probabilities, bimolecular structure prediction, and prediction of a structure common to two sequences. This contribution describes new extensions to the package, including a library of C++ classes for incorporation into other programs, a user-friendly graphical user interface written in JAVA, and new Unix-style text interfaces. The original graphical user interface for Microsoft Windows is still maintained. The extensions to RNAstructure serve to make RNA secondary structure prediction user-friendly. The package is available for download from the Mathews lab homepage at http://rna.urmc.rochester.edu/RNAstructure.html.
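
    For orientation only, the classic Nussinov base-pair-maximization recursion below shows the general shape of a secondary-structure dynamic program; RNAstructure itself uses nearest-neighbor thermodynamic parameters, which are considerably more involved:

        def nussinov(seq, min_loop=3):
            """Maximum number of nested base pairs (illustrative, not thermodynamic)."""
            pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
                     ("G", "U"), ("U", "G")}
            n = len(seq)
            dp = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = dp[i][j - 1]                      # j unpaired
                    for k in range(i, j - min_loop):         # j paired with k
                        if (seq[k], seq[j]) in pairs:
                            left = dp[i][k - 1] if k > i else 0
                            best = max(best, left + 1 + dp[k + 1][j - 1])
                    dp[i][j] = best
            return dp[0][n - 1]

        print(nussinov("GGGAAAUCC"))   # -> 3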

  10. Analysis on Influential Functions in the Weighted Software Network

    Haitao He

    2018-01-01

    Full Text Available Identifying influential nodes is important for software, in terms of understanding design patterns and controlling the development and maintenance process. However, no efficient methods to discover them exist so far. Based on the invoking dependency relationships between nodes, this paper proposes a novel approach to defining node importance for mining influential software nodes. First, according to information from multiple executions, we construct a weighted software network (WSN) to denote the software execution dependency structure. Second, considering the invocation counts and out-degree of software nodes, we improve the PageRank method and put forward a targeted algorithm, FunctionRank, to evaluate node importance (NI) in the weighted software network: a node has higher influence when it has a larger NI value. Finally, by comparing the NI of nodes, we can obtain the most influential nodes in the software network. In addition, the experimental results show that the proposed approach performs well in identifying influential nodes.
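
    A FunctionRank-style score can be sketched as PageRank over a call graph weighted by invocation counts; the graph, damping factor and handling of dangling nodes below are illustrative choices, not the paper's exact algorithm:

        import numpy as np

        nodes = ["init", "parse", "eval", "log"]
        edges = {("init", "parse"): 10, ("parse", "eval"): 7,
                 ("eval", "log"): 3, ("init", "log"): 1, ("eval", "parse"): 2}

        idx = {u: i for i, u in enumerate(nodes)}
        W = np.zeros((len(nodes), len(nodes)))
        for (u, v), w in edges.items():
            W[idx[u], idx[v]] = w
        out = W.sum(axis=1, keepdims=True)
        # Row-normalise; nodes with no outgoing calls spread rank uniformly.
        P = np.divide(W, out, out=np.full_like(W, 1.0 / len(nodes)), where=out > 0)

        d, r = 0.85, np.full(len(nodes), 1.0 / len(nodes))
        for _ in range(100):
            r = (1 - d) / len(nodes) + d * P.T @ r
        print(dict(zip(nodes, r.round(3))))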

  11. Java EE 7 handbook

    Pilgrim, Peter A

    2013-01-01

    Java EE 7 Handbook is an example-based tutorial with descriptions and explanations. It is for the developer, designer, and architect aiming to get acquainted with the Java EE platform in its newest edition. This guide will enhance your knowledge about the Java EE 7 platform. Whether you are a long-term Java EE (J2EE) developer or an intermediate-level engineer on the JVM with just Java SE behind you, this handbook is for you, the new contemporary Java EE 7 developer!

  12. Handbook of energy

    Cleveland, Cutler J

    2013-01-01

    Handbook of Energy, Volume II: Chronologies, Top Ten Lists, and Word Clouds draws together a comprehensive account of the energy field from the prestigious and award-winning authors of the Encyclopedia of Energy (2004), The Dictionary of Energy, Expanded Edition (2009), and the Handbook of Energy, Volume I (2013). Handbook of Energy, Volume II takes the wealth of information about historical aspects of energy spread across many books, journals, websites, disciplines, ideologies, and user communities and synthesizes the information in one central repository. This book meets the needs of a di

  13. Experimental analysis of specification language impact on NPP software diversity

    Yoo, Chang Sik; Seong, Poong Hyun

    1998-01-01

    When redundancy and diversity are applied in an NPP digital computer system, diversification of the system software may be critical to the dependability of the entire system. As a means of enhancing software diversity, specification language diversity is suggested in this study. We set up a simple hypothesis about the impact of the specification language on common errors, and an experiment based on an NPP protection system application was performed. The experimental results showed that this hypothesis is justified and that specification language diversity is effective in overcoming the software common-mode failure problem.

  14. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  15. Development of the software dead time methodology for the 4πβ-γ software coincidence system analysis program

    Toledo, Fabio de; Brancaccio, Franco; Dias, Mauro da Silva

    2009-01-01

    The Laboratorio de Metrologia Nuclear (LMN, Nuclear Metrology Laboratory) at IPEN-CNEN/SP, Sao Paulo, Brazil, developed a new Software Coincidence System (SCS) for 4πβ-γ radioisotope standardization. The SCS is composed of data acquisition hardware, which records the coincidence data, and a coincidence data analysis program that calculates the radioactive activity of the target sample. Owing to the intrinsic signal-sampling characteristics of the hardware, a single saturated pulse can produce multiple undesired recordings, and pulse pile-up also leads to bad recordings. As the beta counting rates are much greater than the gamma ones, due to the high beta detection efficiencies of the 4π geometry, the beta counts increase significantly because of multiple pulse recordings, resulting in a corresponding increase in the calculated activity value. To minimize the effect of such bad recordings, a software dead time was introduced into the coincidence analysis program, under development at LMN, discarding the multiple recordings caused by pile-up or saturation. This work presents the methodology developed to determine the optimal software dead time for attaining more accurate results, discusses the results, and points out possibilities for software improvement. (author)
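
    A minimal sketch of a non-paralyzable software dead time over a list of event timestamps: after an accepted event, anything arriving within the dead-time window is discarded, suppressing the multiple recordings produced by pile-up or saturation (the window and event times below are invented):

        def apply_dead_time(timestamps, dead_time):
            accepted, last = [], None
            for t in sorted(timestamps):
                if last is None or t - last >= dead_time:
                    accepted.append(t)
                    last = t
            return accepted

        events = [0.0, 0.4, 0.45, 0.47, 1.2, 1.21, 2.0]   # invented times (us)
        print(apply_dead_time(events, dead_time=0.1))      # -> [0.0, 0.4, 1.2, 2.0]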

  16. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process, and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the Symbolic Model Verifier (SMV) is proposed. Hazards are discovered by hazard analysis and, in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL)

  17. BEANS - a software package for distributed Big Data analysis

    Hypki, Arkadiusz

    2018-03-01

    BEANS is a new web-based, easy-to-install-and-maintain software tool to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software is an answer to the growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or of star clusters with the Gaia archive). However, the software was built in a general form and is ready for use in any other research field. It can also be used as a building block for other open source software.

  18. User-friendly software for SANS data reduction and analysis

    Biemann, P.; Haese-Seiller, M.; Staron, P.

    1999-01-01

    At the Geesthacht Neutron Facility (GeNF), new software is being developed for the reduction of two-dimensional small-angle neutron scattering (SANS) data. The main motivation for this work was to create software for users of our SANS facilities that is easy to use. Another motivation was to provide users with software they can also use at their home institutes. Therefore, the software is implemented on a personal computer running WINDOWS. The program reads raw data from an area detector in binary or ASCII format and produces ASCII files containing the scattering curve. The cross section can be averaged over the whole area of the detector or over user-defined sectors only. Scripts can be created for processing large numbers of files. (author)
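
    Sector averaging of a 2D detector image reduces to binning pixel intensities by radius inside an azimuthal mask; the detector geometry, beam centre and sector below are invented:

        import numpy as np

        img = np.random.default_rng(0).poisson(50, size=(128, 128)).astype(float)
        cy, cx = 64.0, 64.0                       # assumed beam centre
        y, x = np.indices(img.shape)
        r = np.hypot(y - cy, x - cx)
        phi = np.degrees(np.arctan2(y - cy, x - cx))

        sector = (phi > -30) & (phi < 30)         # a user-defined sector
        r_bins = np.arange(0, 64, 2.0)
        which = np.digitize(r[sector], r_bins)
        curve = np.array([img[sector][which == i].mean()
                          for i in range(1, len(r_bins))])
        print(curve[:5])   # I(q) follows once radius is converted to q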

  19. Multi-channel software defined radio experimental evaluation and analysis

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  20. An online database for plant image analysis software tools

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

    Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is...

  1. Prototype Software for Automated Structural Analysis of Systems

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  3. Network-based analysis of software change propagation.

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes helps testers and system designers improve software quality, and identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. According to the dependency relationships and the number of co-changes among classes, the scope of change propagation is calculated, and Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open source projects FindBugs, Hibernate, and Spring are conducted to investigate the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system.

  4. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  5. The Fuel Handbook 2012; Braenslehandboken 2012

    Stroemberg, Birgitta; Herstad Svaerd, Solvie

    2012-04-15

    This fuel handbook provides, from a plant owner's perspective, a method of evaluating different fuels on the market. The handbook concerns renewable fuels (not including household waste) that are available on the Swedish market today or that are judged to have the potential to be available within the next ten years; it covers 31 different fuels. Analysis data, special properties, operating experience, recommendations, risks when using the fuels, and literature references are outlined for each fuel. The handbook also contains: 1. A proposed route to follow when one plans to introduce a new fuel, with the analyses and tests one should perform to reduce the risk of encountering problems. 2. A summary of relevant laws and taxes for energy production, with directions as to where the relevant documentation can be found. 3. Theory and background for judging a fuel's combustion, ash and corrosion properties, and different methods that can be used for such judgement. 4. A summary of standards, databases and handbooks for biomass fuels and other solid fuels, and details on where further information on the fuels can be found, with the help of links to different web sites. Included in the annexes are: 1. A proposal for a standard procedure for test burning of fuel. 2. Calculation procedures for, amongst others, heating value, flue gas composition and key numbers. In addition, calculation routines for different units are provided for a number of different applications.

  7. Introduction: Green Building Handbook

    Van Wyk, Llewellyn V

    2010-08-01

    Full Text Available By recognising the specific environmental challenges facing South Africa, mindful of the government‘s commitment to reducing South Africa‘s Greenhouse gas emissions, and acknowledging the need to build social cohesion, the Green Building Handbook...

  8. Metallic Fuels Handbook

    Janney, Dawn E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Papesch, Cynthia A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Burkes, Douglas E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cole, James I. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fielding, Randall S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Frank, Steven M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hartmann, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hyde, Timothy A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Keiser, Jr., Dennis D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kennedy, J. Rory [Idaho National Lab. (INL), Idaho Falls, ID (United States); Maddison, Andrew [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mariani, Robert D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Middlemas, Scott C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); O' Holleran, Thomas P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sencer, Bulent H. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Squires, Leah N. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-08-07

    This is not a typical External Report--It is a Handbook. No Abstract is involved. This includes both Parts 1 and 2. The Metallic Fuels Handbook summarizes currently available information about phases and phase diagrams, heat capacity, thermal expansion, and thermal conductivity of elements and alloys in the U-Pu-Zr-Np-Am-La-Ce-Pr-Nd system. Although many sections are reviews and updates of material in previous versions of the Handbook [1, 2], this revision is the first to include alloys with four or more elements. In addition to presenting information about materials properties, the handbook attempts to provide information about how well each property is known and how much variation exists between measurements. Although it includes some results from models, its primary focus is experimental data.

  9. Construction project management handbook.

    2012-03-01

    The purpose of the FTA Construction Project Management Handbook is to provide guidelines for use by public transit agencies (Agencies) undertaking substantial construction projects, either for the first time or with little prior experience with cons...

  10. Handbook of nanomedicine

    Jain, Kewal K

    2012-01-01

    In its updated and reorganized second edition, this handbook captures the latest advances in nanomedicine applied to researching the pathomechanism of disease, refining molecular diagnostics, and aiding in the discovery, development, and delivery of drugs.

  11. Handbook of antenna technologies

    Liu, Duixian; Nakano, Hisamatsu; Qing, Xianming; Zwick, Thomas

    2016-01-01

    The Handbook of Antenna Technologies aims to present the rapid development of antenna technologies, particularly over the past two decades, and also to showcase newly developed technologies and the latest applications. The handbook provides readers with comprehensive, updated reference information covering theory, modeling and optimization methods, design and measurement, new electromagnetic materials, and applications of antennas. It covers not only all key antenna design issues but also the fundamentals and issues related to antennas (transmission, propagation, feeding structure, materials, fabrication, measurement, system, and unique design challenges in specific applications). This handbook will benefit readers as a full and quick technical reference, with a high-level historic review of technology, detailed technical descriptions and the latest practical applications.

  12. Toxic substances handbook

    Junod, T. L.

    1979-01-01

    This handbook, published in conjunction with the Toxic Substances Alert Program at NASA Lewis Research Center, profiles 187 toxic chemicals in their relatively pure states and includes 27 known or suspected carcinogens.

  13. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still only a small number of studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information that has been gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from the repositories of 263 Apache projects (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.
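
    The distributional claim can be illustrated with a heavy-tailed synthetic sample standing in for per-contributor commit counts; a power law appears as a roughly straight line on a log-log rank plot:

        import numpy as np

        rng = np.random.default_rng(7)
        commits = np.sort(rng.pareto(1.2, 500) * 10 + 1)[::-1]   # synthetic, not the Apache data

        rank = np.arange(1, len(commits) + 1)
        slope, _ = np.polyfit(np.log(rank), np.log(commits), 1)
        print(f"log-log slope: {slope:.2f}")
        top10 = commits[: len(commits) // 10].sum() / commits.sum()
        print(f"share of commits by top 10% of contributors: {top10:.0%}")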

  14. Diversion Path Analysis handbook. Volume 3 (of 4 volumes). Computer Program 1

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 1 (DPACP-1), is used to assemble and tabulate the data for Specific Diversion Paths (SDPs) identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 255498 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-1 are used by the DPA team to assist in analyzing vulnerabilities, in a plant's material control and material accounting subsystems, to diversion of special nuclear material (SNM) by a knowledgeable insider. Based on this analysis, the DPA team can identify, and propose to plant management, modifications to the plant's safeguards system that would eliminate, or reduce the severity of, the identified vulnerabilities. The data are also used by plant supervision when investigating a potential diversion.

  15. Accident Damage Analysis Module (ADAM) – Technical Guidance, Software tool for Consequence Analysis calculations

    FABBRI LUCIANO; BINDA MASSIMO; BRUINEN DE BRUIN YURI

    2017-01-01

    This report provides a technical description of the modelling and assumptions of the Accident Damage Analysis Module (ADAM) software application, which has been recently developed by the Joint Research Centre (JRC) of the European Commission (EC) to assess the physical effects of an industrial accident resulting from an unintended release of a dangerous substance.

  16. Evaluation of Distribution Analysis Software for DER Applications

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  17. Handbook on Decision Making

    Jain, Lakhmi C

    2010-01-01

    The present "Volume 1: Techniques and Applications" of the "Handbook on Decision Making" presents a useful collection of AI techniques, as well as other complementary methodologies, that are useful for the design and development of intelligent decision support systems. Application examples of how these intelligent decision support systems can be utilized to help tackle a variety of real-world problems in different domains, such as business, management, manufacturing, transportation and food industries, and biomedicine, are presented. The handbook includes twenty condensed c

  18. Soviet Space Program Handbook.

    1988-04-01

    in advance and some events were even broadcast live. Immediately following the first successful launch of their new Energia space launch vehicle in...early 1988. Just as a handbook written a couple of years ago would need updating with Mir, Energia, and the SL-16, this handbook will one day need up...1986. Johnson, Nicholas L. The Soviet Year in Space 1983. Colorado Springs, CO: Teledyne Brown Engineering, 1984. Lawton, A. "Energia - Soviet Super

  19. Knowledge Service Engineering Handbook

    Kantola, Jussi

    2012-01-01

    Covering the emerging field of knowledge service engineering, this groundbreaking handbook outlines how to acquire and utilize knowledge in the 21st century. Drawing on the expertise of the founding faculty member of the world's first university knowledge engineering service department, this book describes what knowledge services engineering means and how it is different from service engineering and service production. Presenting multiple cultural aspects including US, Finnish, and Korean, this handbook provides engineering, systemic, industry, and consumer use viewpoints to knowledge service sy

  20. DOE handbook electrical safety

    NONE

    1998-01-01

    Electrical Safety Handbook presents the Department of Energy (DOE) safety standards for DOE field offices or facilities involved in the use of electrical energy. It has been prepared to provide a uniform set of electrical safety guidance and information for DOE installations to effect a reduction or elimination of risks associated with the use of electrical energy. The objectives of this handbook are to enhance electrical safety awareness and mitigate electrical hazards to employees, the public, and the environment.

  1. Handbook of Social Capital

    The Handbook of Social Capital balances the ‘troika' of sociology, political science and economics by offering important contributions to the study of bonding and bridging social capital networks. This inter-disciplinary Handbook intends to serve as a bridge for students and scholars within all the social sciences. The contributors explore the different scientific approaches that are all needed if international research is to embrace both the bright and the more shadowy aspects of social capital.

  2. Harwell emergency handbook

    1986-12-01

    The Harwell Laboratory Emergency Handbook 1987 contains emergency procedures to deal with any incident which might occur at AERE Harwell involving radioactive or toxic material releases. The Handbook gives details of the duties of members of the Site Emergency Organization and other key members of staff, the methods by which incidents are controlled, the communication links and liaison arrangements with other organizations and the possible consequences and actions that may be needed following an emergency. (UK)

  3. Newnes electronics assembly handbook

    Brindley, Keith

    2013-01-01

    Newnes Electronics Assembly Handbook: Techniques, Standards and Quality Assurance focuses on the aspects of electronic assembling. The handbook first looks at the printed circuit board (PCB). Base materials, basic mechanical properties, cleaning of assemblies, design, and PCB manufacturing processes are then explained. The text also discusses surface mounted assemblies and packaging of electromechanical assemblies, as well as the soldering process. Requirements for the soldering process; solderability and protective coatings; cleaning of PCBs; and mass solder/component reflow soldering are des

  4. Handbook of Technical Communication

    Mehler, Alexander; Romary, Laurent; Gibbon, Dafydd

    2012-01-01

    The handbook "Technical Communication" brings together a variety of topics which range from the role of technical media in human communication to the linguistic, multimodal enhancement of present-day technologies. It covers the area of computer-mediated text, voice and multimedia communication as well as of technical documentation. In doing so, the handbook takes professional and private communication into account. Special emphasis is put on technical communication bas...

  5. Rechargeable batteries applications handbook

    1998-01-01

    Represents the first widely available compendium of the information needed by those design professionals responsible for using rechargeable batteries. This handbook introduces the most common forms of rechargeable batteries, including their history, the basic chemistry that governs their operation, and common design approaches. The introduction also exposes the reader to common battery design terms and concepts. Two sections of the handbook provide performance information on two principal types of rechargeable batteries commonly found in consumer and industrial products: sealed nickel-cad

  6. Diversion Path Analysis handbook. Volume 4 (of 4 volumes). Computer Program 2

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 2 (DPACP-2), is used to produce tables and statistics on modifications identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 259088 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-2 assist the DPA team in analyzing and evaluating modifications to the plant's safeguards system that would eliminate, or reduce the severity of, vulnerabilities identified by means of the DPA. These vulnerabilities relate to the capability of the plant's material control and material accounting subsystems to indicate diversion of special nuclear material (SNM) by a knowledgeable insider.

  7. NASA Systems Engineering Handbook

    Hirshorn, Steven R.; Voss, Linda D.; Bromley, Linda K.

    2017-01-01

    The update of this handbook continues the methodology of the previous revision: a top-down compatibility with higher level Agency policy and a bottom-up infusion of guidance from the NASA practitioners in the field. This approach provides the opportunity to obtain best practices from across NASA and bridge the information to the established NASA systems engineering processes and to communicate principles of good practice as well as alternative approaches rather than specify a particular way to accomplish a task. The result embodied in this handbook is a top-level implementation approach on the practice of systems engineering unique to NASA. Material used for updating this handbook has been drawn from many sources, including NPRs, Center systems engineering handbooks and processes, other Agency best practices, and external systems engineering textbooks and guides. This handbook consists of six chapters: (1) an introduction, (2) a systems engineering fundamentals discussion, (3) the NASA program project life cycles, (4) systems engineering processes to get from a concept to a design, (5) systems engineering processes to get from a design to a final product, and (6) crosscutting management processes in systems engineering. The chapters are supplemented by appendices that provide outlines, examples, and further information to illustrate topics in the chapters. The handbook makes extensive use of boxes and figures to define, refine, illustrate, and extend concepts in the chapters.

  8. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA), PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used each to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, spanning programming and non-programming tools and developed mainly on Bayesian or frequentist theory. Most are easy to operate and master and offer exact calculation or excellent graphing; however, no single software performed accurate calculations with superior graphing, which could only be achieved by combining two or more of them. This study suggests that users should choose software according to their programming background, operational habits, and budget, and then consider combining BUGS with R (or Stata) to perform the NMA.
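
    One elementary calculation that sits underneath every NMA package is the adjusted indirect comparison. The hedged sketch below shows the Bucher version of that step in plain Python; the effect sizes, standard errors, and treatment labels are invented, and none of the compared packages is being reproduced here.

```python
# Illustrative Bucher adjusted indirect comparison: estimate A vs B
# through a common comparator C. Effect sizes are log odds ratios;
# all numbers are invented for illustration.
import math

d_AC, se_AC = -0.30, 0.12   # treatment A vs comparator C
d_BC, se_BC = -0.10, 0.15   # treatment B vs comparator C

d_AB = d_AC - d_BC                       # indirect A-vs-B estimate
se_AB = math.sqrt(se_AC**2 + se_BC**2)   # variances add
lo, hi = d_AB - 1.96 * se_AB, d_AB + 1.96 * se_AB
print(f"indirect logOR(A vs B) = {d_AB:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```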

  9. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict system failures through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. The quality and cost of a software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist safety analysis experts. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate the fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make further use of these formulae through formal verification techniques.
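
    To make the fault-tree side concrete, here is a minimal sketch of how a tree's top-event probability can be evaluated under the usual assumptions of independent basic events and pure AND/OR gates; the event names and probabilities are hypothetical, and this is not NuFTA's algorithm.

```python
# Minimal fault-tree arithmetic, assuming independent basic events.
def and_gate(*probs):
    # all inputs must fail
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    # at least one input fails
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# hypothetical basic-event probabilities
p_sensor = 1e-4
p_filter = 5e-5
p_logic = 2e-6

top = or_gate(and_gate(p_sensor, p_filter), p_logic)
print(f"top-event probability ~ {top:.3e}")
```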

  10. Mobile retroreflectivity best practices handbook.

    2009-07-01

    This handbook documents best practices related to proper use of the mobile retroreflectometer, sampling of sites for data collection, and handling of mobile retroreflectivity data. The best practices described in this handbook are derived from th...

  11. Travel time data collection handbook

    1998-03-01

    This Travel Time Data Collection Handbook provides guidance to transportation professionals and practitioners for the collection, reduction, and presentation of travel time data. The handbook should be a useful reference for designing travel ti...

  12. Analyzing the State of Static Analysis : A Large-Scale Evaluation in Open Source Software

    Beller, M.; Bholanath, R.; McIntosh, S.; Zaidman, A.E.

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use

  13. A Method for Software Requirement Volatility Analysis Using QFD

    Yunarso Anang

    2016-10-01

    Changes to software requirements are inevitable during the development life cycle. Rather than avoiding this circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method to analyze the volatility of requirements by using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for change of the design elements, the degree of volatility of each software requirement is calculated. In this paper the method is described using a flow diagram, illustrated using a simple example, and evaluated using a case study.
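
    A minimal sketch of the propagation idea, assuming simple QFD relationship matrices on the usual 0/1/3/9 scale: requirements are mapped to functions, functions to design elements, and each design element carries an invented potential-for-change score. The matrices and the weighting are illustrative, not the paper's case-study data.

```python
# Illustrative QFD-style propagation of requirements volatility.
import numpy as np

R1 = np.array([[9, 3, 0],    # requirements x functions (QFD 0/1/3/9 scale)
               [1, 9, 3]])
R2 = np.array([[9, 1],       # functions x design elements
               [3, 9],
               [0, 3]])
change_potential = np.array([0.8, 0.2])  # per design element, 0..1 (made up)

influence = R1 @ R2                      # requirements x design elements
volatility = influence @ change_potential
print("degree of volatility per requirement:", volatility)
```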

  14. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Jyri Pakarinen

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
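
    As a small, hedged companion to the abstract, the following sketch computes one classic distortion figure, total harmonic distortion (THD), from a sine-wave response; the tanh stand-in for a distorting device and all parameters are invented, and this is not the toolkit's own analysis code.

```python
# Illustrative THD measurement of a nonlinear system from a test tone.
import numpy as np

fs, f0, n = 48000, 1000.0, 48000          # sample rate, test tone, length
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
y = np.tanh(3 * x)                        # made-up stand-in for a device

spec = np.abs(np.fft.rfft(y * np.hanning(n)))
bins = [int(round(k * f0 * n / fs)) for k in range(1, 6)]
amps = [spec[b - 2:b + 3].max() for b in bins]  # peak near each harmonic
thd = np.sqrt(sum(a**2 for a in amps[1:])) / amps[0]
print(f"THD ~ {100 * thd:.1f} %")
```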

  15. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    VINCENT, ANDREW

    2005-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments and, in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve the resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  16. An effective technique for the software requirements analysis of NPP safety-critical systems, based on software inspection, requirements traceability, and formal specification

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Junbeom; Cha, Sung Deok; Yoo, Yeong Jae

    2005-01-01

    A thorough requirements analysis is indispensable for developing and implementing safety-critical software systems such as nuclear power plant (NPP) software systems, because a single error in the requirements can generate serious software faults. However, it is very difficult to completely analyze system requirements. In this paper, an effective technique for software requirements analysis is suggested. For requirements verification and validation (V and V) tasks, our technique uses software inspection, requirements traceability, and formal specification with structural decomposition. Software inspection and requirements traceability analysis are widely considered the most effective software V and V methods. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear field, as in other fields, because of their mathematical nature. In this work, we propose an integrated environment (IE) approach for requirements, which enables easy inspection by combining requirements traceability with effective use of a formal method. The paper also introduces computer-aided tools supporting the IE approach. Called the nuclear software inspection support and requirements traceability (NuSISRT) tool, it incorporates software inspection, requirements traceability, and formal specification capabilities. We designed NuSISRT to partially automate software inspection and the analysis of requirements traceability. In addition, for formal specification and analysis, we used the formal requirements specification and analysis tool for nuclear engineering (NuSRS).

  17. Using recurrence plot analysis for software execution interpretation and fault detection

    Mosdorf, M.

    2015-09-01

    This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are further processed with PCA (Principal Component Analysis), which reduces the number of coefficients used for software execution classification. The method was used to analyze five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
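
    A minimal sketch of the core construction, assuming a scalar trace, time-delay embedding, and a fixed distance threshold: the recurrence matrix marks which pairs of embedded points fall within epsilon of each other. The synthetic trace below stands in for the instruction-level traces used in the paper.

```python
# Illustrative recurrence plot of a synthetic scalar trace.
import numpy as np

trace = np.sin(0.3 * np.arange(200)) + 0.05 * np.random.randn(200)

dim, delay, eps = 3, 2, 0.25             # embedding and threshold (made up)
m = len(trace) - (dim - 1) * delay
emb = np.column_stack([trace[i * delay:i * delay + m] for i in range(dim)])

dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
rp = (dist < eps).astype(int)            # recurrence matrix
print("recurrence rate:", rp.mean())
```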

  18. GROMOS++ Software for the Analysis of Biomolecular Simulation Trajectories

    Eichenberger, A.P.; Allison, J.R.; Dolenc, J.; Geerke, D.P.; Horta, B.A.C.; Meier, K.; Oostenbrink, B.C.; Schmid, N.; Steiner, D.; Wang, D.; van Gunsteren, W.F.

    2011-01-01

    GROMOS++ is a set of C++ programs for pre- and postprocessing of molecular dynamics simulation trajectories and as such is part of the GROningen MOlecular Simulation software for (bio)molecular simulation. It contains more than 70 programs that can be used to prepare data for the production of

  19. Algebraic software analysis and embedded simulation of a driving robot

    Merkx, L.L.F.; Duringhof, H.M.; Cuijpers, P.J.L.

    2007-01-01

    At TNO Automotive the Generic Driving Actuator (GDA) is developed. The GDA is a device capable of driving a vehicle fully automatically using the same interface as a human driver does. In this paper, the design of the GDA is discussed. The software and hardware of the GDA and its effect on vehicle

  20. Program spectra analysis in embedded software : A case study

    Abreu, R.; Zoeteweij, P.; Van Gemund, A.J.C.

    2006-01-01

    Because of constraints imposed by the market, embedded software in consumer electronics is almost inevitably shipped with faults and the goal is just to reduce the inherent unreliability to an acceptable level before a product has to be released. Automatic fault diagnosis is a valuable tool to

  1. Image analysis software versus direct anthropometry for breast measurements.

    Quieregatto, Paulo Rogério; Hochman, Bernardo; Furtado, Fabianne; Machado, Aline Fernanda Perez; Sabino Neto, Miguel; Ferreira, Lydia Masako

    2014-10-01

    To compare breast measurements performed using the software packages ImageTool®, AutoCAD® and Adobe Photoshop® with direct anthropometric measurements. Points were marked on the breasts and arms of 40 volunteer women aged between 18 and 60 years. When connecting the points, seven linear segments and one angular measurement on each half of the body, and one medial segment common to both body halves were defined. The volunteers were photographed in a standardized manner. Photogrammetric measurements were performed by three independent observers using the three software packages and compared to direct anthropometric measurements made with calipers and a protractor. Measurements obtained with AutoCAD® were the most reproducible and those made with ImageTool® were the most similar to direct anthropometry, while measurements with Adobe Photoshop® showed the largest differences. Except for angular measurements, significant differences were found between measurements of line segments made using the three software packages and those obtained by direct anthropometry. AutoCAD® provided the highest precision and intermediate accuracy; ImageTool® had the highest accuracy and lowest precision; and Adobe Photoshop® showed intermediate precision and the worst accuracy among the three software packages.

  2. Software Graphical User Interface For Analysis Of Images

    Leonard, Desiree M.; Nolf, Scott R.; Avis, Elizabeth L.; Stacy, Kathryn

    1992-01-01

    CAMTOOL software provides a graphical interface between a Sun Microsystems workstation and an Eikonix Model 1412 digitizing camera system. The camera scans and digitizes images, halftones, reflectives, transmissives, rigid or flexible flat material, or three-dimensional objects. Users digitize images and select from three destinations: workstation display screen, magnetic-tape drive, or hard disk. Written in C.

  3. Control and analysis software for a laser scanning microdensitometer

    A PC-based control software and data acquisition system is developed for an existing ... Description of the system: Figure 1 shows a schematic diagram of the microdensitometer and the data acquisition system. ... programming language with very strong library functions and it also supports direct input/output programming.

  4. Prospects for Evidence -Based Software Assurance: Models and Analysis

    2015-09-01

    Languages and Systems (TOPLAS) 28, 1 (2006), 175–205. [27] Higgins, K. Spear-phishing attacks out of China targeted source code, intellectual property... Spinuzzi. Building more usable APIs. IEEE Software, 15(3):78–86, 1998. [23] Chris Parnin and Christoph Treude. Measuring API documentation on the web. In

  5. Software engineering article types: An analysis of the literature

    Montesi, M.; Lago, P.

    2008-01-01

    The software engineering (SE) community has recently recognized that the field lacks well-established research paradigms and clear guidance on how to write good research reports. With no comprehensive guide to the different article types in the field, article writing and reviewing heavily depends on

  6. Software package for analysis of completely randomized block design

    This study designs and develops a statistical software package, OYSP 1.0, which conveniently accommodates and analyzes the large masses of data emanating from experimental designs, in particular the completely randomized block design. Visual Basic programming is used in the design. The statistical package OYSP 1.0 ...

  7. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
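
    Of the tools listed, the gambler's-ruin analysis is the easiest to illustrate. The hedged sketch below estimates the ruin probability by Monte Carlo for a venture with invented parameters; it shows the concept only and is not the DOE package's implementation.

```python
# Illustrative Monte Carlo estimate of a gambler's-ruin probability:
# with success probability p per unit bet, how often does a venture
# go broke before reaching its target? Parameters are made up.
import random

def ruin_probability(p=0.45, start=10, target=30, trials=20_000):
    ruins = 0
    for _ in range(trials):
        capital = start
        while 0 < capital < target:
            capital += 1 if random.random() < p else -1
        ruins += capital == 0
    return ruins / trials

print(f"estimated ruin probability: {ruin_probability():.3f}")
```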

  9. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    We use the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a complete system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: “The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results.” From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects are, to date, still a research problem. Since the success of software development rests on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing in terms of efficiency. Lessons learned and experience from similar systems are important for hazard analysis work. No major hazard has been issued for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  10. A tool to include gamma analysis software into a quality assurance program.

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool that enables gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions.
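
    For orientation, the sketch below computes a 1-D global gamma index with the two criteria the study's geometric images isolate, dose difference (DD) and distance to agreement (DTA); the Gaussian profiles, grid spacing, and 3%/3 mm settings are invented, and no commercial algorithm is reproduced.

```python
# Illustrative 1-D global gamma index with DD/DTA criteria.
import numpy as np

dx = 1.0                                  # grid spacing in mm (made up)
ref = np.exp(-0.5 * ((np.arange(100) - 50) / 10) ** 2)   # reference dose
ev = np.exp(-0.5 * ((np.arange(100) - 51) / 10) ** 2)    # evaluated, 1 mm off

dd, dta = 0.03, 3.0                       # 3% / 3 mm criteria
pos = np.arange(100) * dx

gammas = []
for i, d_e in enumerate(ev):
    dose_term = (d_e - ref) / (dd * ref.max())   # global normalisation
    dist_term = (pos[i] - pos) / dta
    gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
gamma = np.array(gammas)
print(f"gamma pass rate (gamma <= 1): {100 * (gamma <= 1).mean():.1f} %")
```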

  11. Design and validation of Segment - freely available software for cardiovascular image analysis

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-01

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page (http://segment.heiberg.se). Segment

  12. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  13. Russian system of computerized analysis for licensing at atomic industry (SCALA) and its validation on ICSBEP handbook data and some burnup calculations

    Ivanova, T.; Nikolaev, M.; Polyakov, A.; Saraeva, T.; Tsiboulia, A.

    2000-01-01

    The System of Computerized Analysis for Licensing at Atomic industry (SCALA) is a Russian analogue of the well-known SCALE system. For criticality evaluations, the ABBN-93 data system is used with TWODANT and with MMKKENO, a Monte Carlo code joining the American KENO code and the Russian MMK code. Using the same cross sections and input models, all these codes give results that coincide within the statistical uncertainties (for the Monte Carlo codes). Validation of criticality calculations using SCALA was performed using data presented in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Another aim of the work was to test the burnup capability of the SCALA system in complex geometry in comparison with other codes. Benchmark models of VVER-type reactor assemblies with UO2 and MOX fuel, including cases with burnable gadolinium absorbers, were calculated. The KENO-VI and MMK codes were used for power distribution calculations, and the ORIGEN code was used for the isotopic kinetics calculations. (authors)

  14. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    Nurhayati Ai; Gautama Aditya; Naseer Muchammad

    2018-01-01

    Virus spread increased significantly through the internet in 2017. One protection method is using antivirus software. The wide variety of antivirus software on the market tends to create confusion among consumers, and selecting the right antivirus according to their needs has become difficult. This is the reason we conducted our research. We formulate a decision-making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we spread que...
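
    The AHP half of such a model reduces to a small linear-algebra step. The hedged sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks consistency; the judgments and the three unnamed criteria are invented, not the paper's survey data.

```python
# Illustrative AHP priority vector and consistency ratio.
import numpy as np

# pairwise comparisons of 3 hypothetical criteria (Saaty 1-9 scale)
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                              # priority vector (weights)

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)         # consistency index
cr = ci / 0.58                            # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```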

  15. The Oxford handbook of economic inequality

    Salverda, W.; Nolan, B.; Smeeding, T.M.

    2009-01-01

    The essential guide for students and researchers interested in economic inequality Contains 27 original research contributions from the top names in economic inequality. The Oxford Handbook of Economic Inequality presents a new and challenging analysis of economic inequality, focusing primarily on

  16. Handbook for Trade and Industrial Shop Teachers.

    Texas A and M Univ., College Station. Vocational Instructional Services.

    This handbook is intended to help teachers of pre-employment shop courses in organizing and delivering instruction in both the shop and classroom. Addressed in the guide are the following topics: the instructor's place in the local school organization; the instructor's job (objectives, advisory committees, occupational analysis, shop/classroom and…

  17. Handbook of methods for the analysis of the various parameters of the carbon dioxide system in sea water. Version 2

    Dickson, A.G.; Goyet, C. [eds.]

    1994-09-01

    The collection of extensive, reliable, oceanic carbon data is a key component of the Joint Global Ocean Flux Study (JGOFS). A portion of the US JGOFS oceanic carbon dioxide measurements will be made during the World Ocean Circulation Experiment Hydrographic Program. A science team has been formed to plan and coordinate the various activities needed to produce high quality oceanic carbon dioxide measurements under this program. This handbook was prepared at the request of, and with the active participation of, that science team. The procedures have been agreed on by the members of the science team and describe well tested methods. They are intended to provide standard operating procedures, together with an appropriate quality control plan, for measurements made as part of this survey. These are not the only measurement techniques in use for the parameters of the oceanic carbon system; however, they do represent the current state-of-the-art for ship-board measurements. In the end, the editors hope that this handbook can serve widely as a clear and unambiguous guide to other investigators who are setting up to analyze the various parameters of the carbon dioxide system in sea water.

  18. Handbook of Collaborative Management Research

    Shani, A B Rami B; Pasmore, William A A; Stymne, Dr Bengt; Adler, Niclas

    2007-01-01

    This handbook provides the latest thinking, methodologies and cases in the rapidly growing area of collaborative management research. What makes collaborative management research different is its emphasis on creating a close partnership between scholars and practitioners in the search for knowledge concerning organizations and complex systems. In the ideal situation, scholars and their managerial partners would work together to define the research focus, develop the methods to be used for data collection, participate equally in the analysis of data, and work together in the application and dis

  19. Failure Analysis Handbook

    1989-08-18

    [The record text consists of OCR fragments from the handbook's figures and photographic tables; the recoverable caption text includes "extent of the fatigue is shown by brackets" and "FIGURE 8-111: overall photograph of the..."]

  20. Handbook of multilevel analysis

    Leeuw, Jan de; Meijer, Erik

    2008-01-01

    ... appropriate and efficient model-based methods have become available to deal with this issue, that we have come to appreciate the power that more complex models provide for describing the world and providing new insights. This book sets out to present some of the most recent developments in what has come to be known as multilevel modelling. An...

  1. Handbook of numerical analysis

    Ciarlet, Philippe G

    Mathematical finance is a prolific scientific domain in which there exists a particular characteristic of developing both advanced theories and practical techniques simultaneously. Mathematical Modelling and Numerical Methods in Finance addresses the three most important aspects in the field: mathematical models, computational methods, and applications, and provides a solid overview of major new ideas and results in the three domains. Coverage of all aspects of quantitative finance including models, computational methods and applications. Provides an overview of new ideas an

  2. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ~10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) to ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) to assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  3. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Andreas Markus Loening

    2003-07-01

    Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  4. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Marković Nemanja; Stojić Dragoslav; Cvetković Radovan

    2014-01-01

    Reinforced concrete (RC) is characterized by large inhomogeneity resulting from the material characteristics of the concrete and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is presented of methods such as Concrete Damage Plasticity (CDP), Smeared Concrete Cr...

  5. Comparison of two three-dimensional cephalometric analysis computer software

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-01-01

    Background: Three-dimensional cephalometric analyses are attracting more attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...

  6. Security Analysis of a Software Defined Wide Area Network Solution

    Rajendran, Ashok

    2016-01-01

    An enterprise wide area network (WAN) is a private network that connects the computers and other devices across an organisation's branch locations and data centers, forming the backbone of enterprise communication. Currently, multiprotocol label switching (MPLS) is commonly used to provide this service. As a recent alternative to MPLS, software-defined wide area networking (SD-WAN) solutions are being introduced as an IP based cloud-networking service for enterprises. SD-WAN virtualizes the n...

  7. Software for muscle fibre type classification and analysis

    Karen, Petr; Števanec, M.; Smerdu, V.; Cvetko, E.; Kubínová, Lucie; Eržen, I.

    2009-01-01

    Vol. 53, No. 2 (2009), pp. 87-95. ISSN 1121-760X. R&D Projects: GA MŠk(CZ) LC06063; GA MŠk(CZ) MEB090910. Institutional research plan: CEZ:AV0Z50110509. Keywords: muscle fiber types; myosin heavy chain isoforms; image processing. Subject RIV: JC - Computer Hardware; Software. Impact factor: 0.886, year: 2009

  8. The analysis of software system in SOPHY SPECT

    Xu Chikang

    1993-01-01

    The FORTH software system of the Single Photon Emission Computed Tomography (SPECT) system made by the French SOPHA MEDICAL Corp. is analysed. On the basis of a brief introduction to the construction principles and programming methods of the FORTH language, the overall structure and layout of the Sophy system are described. With the help of some figures, the modular structure, the allocation of the hard disk and internal storage, as well as the running procedure of the system are introduced in detail.

  9. Potku – New analysis software for heavy ion elastic recoil detection analysis

    Arstila, K.; Julin, J.; Laitinen, M.I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-01-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.
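
    The kinematic core of a ToF-E setup is easy to state: a recoil of mass m traversing a flight path L in time t has energy E = (m/2)(L/t)^2. The sketch below evaluates this for an invented silicon recoil; the flight-path length and timing are illustrative and this is not Potku's calibration procedure.

```python
# Illustrative recoil-energy calculation from a time-of-flight
# measurement, E = m/2 * (L/t)^2. All numbers are made up.
AMU = 1.66053907e-27   # kg per atomic mass unit
EV = 1.602176634e-19   # joules per electronvolt

def recoil_energy_mev(mass_amu, flight_path_m, tof_ns):
    v = flight_path_m / (tof_ns * 1e-9)            # speed in m/s
    e_joule = 0.5 * mass_amu * AMU * v * v
    return e_joule / EV / 1e6                      # MeV

# e.g. a hypothetical 28Si recoil over a 0.68 m path arriving after 70 ns
print(f"{recoil_energy_mev(28.0, 0.68, 70.0):.2f} MeV")
```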

  11. Software design specification and analysis (NuFDS) approach for the safety critical software based on programmable logic controller (PLC)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on a Programmable Logic Controller (PLC). Among the software development phases, the design phase plays an important role in connecting the requirements phase and the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is suggested in a straightforward manner. It consists of four major specifications: database, software architecture, system behavior, and PLC hardware configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for the formal design analysis in the NuFDS approach. In addition, for tool support, we are developing the NuSDS tool based on the NuFDS approach, a tool designed especially for software design specification in nuclear fields.

  12. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...
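
    A hedged sketch of the typical experimental setup in such comparisons: train one of the compared learners on static code metrics labelled defective/clean and report a cross-validated score. The feature matrix here is randomly generated; it is not one of the public defect data sets used in the paper.

```python
# Illustrative bug-prediction setup on synthetic "code metrics".
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))            # stand-ins for LOC, complexity, churn...
y = (X[:, 0] + X[:, 2] + rng.normal(scale=0.5, size=500) > 1).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
print("mean F1 over 5 folds:", scores.mean().round(3))
```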

  13. Choosing your weapons : on sentiment analysis tools for software engineering research

    Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.

    2015-01-01

    Recent years have seen an increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by the software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these

  14. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data

    Oostenveld, R.; Fries, P.; Maris, E.G.G.; Schoffelen, J.M.

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow

  15. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    Background: ...application stove-pipes that inhibit the adoption of telemedical solutions. To which extent can a software ecosystem approach to telemedicine alleviate this? Objective: In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements... Method: For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem; for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results: We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization... ...experience in creating and evolving the 4S telemedicine ecosystem. Conclusion: The concept of software ecosystem architecture can be used analytically and constructively in, respectively, the analysis and design of software ecosystems.

  16. The software safety analysis based on SFTA for reactor power regulating system in nuclear power plant

    Liu Zhaohui; Yang Xiaohua; Liao Longtao; Wu Zhiqiang

    2015-01-01

    The digitalized Instrumentation and Control (I and C) systems of nuclear power plants provide many advantages. However, digital control systems introduce new failure modes that differ from those of analog control systems. While the cost effectiveness and flexibility of software are widely recognized, it is very difficult to achieve and prove high levels of dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. Software safety analysis (SSA) is one way to improve software safety by identifying the system hazards caused by software failure. This paper describes the application of software fault tree analysis (SFTA) in the software design phase. First, we evaluated all the software modules of the reactor power regulating system in a nuclear power plant and identified various hazards. SFTA was then applied to critical modules selected in the previous step. Finally, we identified new hazards, not found during the earlier document evaluation, that were helpful for our design. (author)
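
    As a hedged sketch of the underlying idea, a software fault tree can be modeled as boolean gates over basic failure events; the event and gate names below are invented for illustration and are not taken from the paper.

        # Toy software fault tree: gates are boolean functions over basic events.
        def OR(*inputs):  return any(inputs)
        def AND(*inputs): return all(inputs)

        def power_regulation_failure(events):
            """Top event: erroneous power demand output (illustrative only)."""
            bad_input = OR(events["sensor_driver_fault"], events["range_check_missed"])
            bad_calc = AND(events["setpoint_overflow"], events["exception_unhandled"])
            return OR(bad_input, bad_calc)

        basic_events = {
            "sensor_driver_fault": False,
            "range_check_missed": False,
            "setpoint_overflow": True,
            "exception_unhandled": True,   # together with overflow -> top event
        }
        print(power_regulation_failure(basic_events))  # True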

  17. Springer Handbook of Acoustics

    Rossing, Thomas D

    2007-01-01

    Acoustics, the science of sound, has developed into a broad interdisciplinary field encompassing the academic disciplines of physics, engineering, psychology, speech, audiology, music, architecture, physiology, neuroscience, and others. The Springer Handbook of Acoustics is an unparalleled modern handbook reflecting this richly interdisciplinary nature edited by one of the acknowledged masters in the field, Thomas Rossing. Researchers and students benefit from the comprehensive contents spanning: animal acoustics including infrasound and ultrasound, environmental noise control, music and human speech and singing, physiological and psychological acoustics, architectural acoustics, physical and engineering acoustics, signal processing, medical acoustics, and ocean acoustics. This handbook reviews the most important areas of acoustics, with emphasis on current research. The authors of the various chapters are all experts in their fields. Each chapter is richly illustrated with figures and tables. The latest rese...

  18. Springer handbook of robotics

    Khatib, Oussama

    2016-01-01

    The second edition of this handbook provides a state-of-the-art overview of the various aspects of the rapidly developing field of robotics. Reaching for the human frontier, robotics is vigorously engaged in the growing challenges of new emerging domains. Interacting, exploring, and working with humans, the new generation of robots will increasingly touch people and their lives. The credible prospect of practical robots among humans is the result of the scientific endeavour of half a century of robotic developments that established robotics as a modern scientific discipline. The ongoing vibrant expansion and strong growth of the field during the last decade has fueled this second edition of the Springer Handbook of Robotics. The first edition of the handbook soon became a landmark in robotics publishing and won the American Association of Publishers PROSE Award for Excellence in Physical Sciences & Mathematics as well as the organization’s Award for Engineering & Technology. The second edition o...

  19. Springer handbook of acoustics

    2014-01-01

    Acoustics, the science of sound, has developed into a broad interdisciplinary field encompassing the academic disciplines of physics, engineering, psychology, speech, audiology, music, architecture, physiology, neuroscience, and electronics. The Springer Handbook of Acoustics is, also in its 2nd edition, an unparalleled modern handbook reflecting this richly interdisciplinary nature, edited by one of the acknowledged masters in the field, Thomas Rossing. Researchers and students benefit from the comprehensive contents. This new edition of the Handbook features over 11 revised and expanded chapters, new illustrations, and 2 new chapters covering microphone arrays and acoustic emission. Updated chapters contain the latest research and applications in, e.g., sound propagation in the atmosphere, nonlinear acoustics in fluids, building and concert hall acoustics, signal processing, psychoacoustics, computer music, animal bioacoustics, sound intensity, and modal acoustics, as well as new chapters on microphone arrays an...

  20. Verification of LRFD Bridge Design and Analysis Software for INDOT

    Varma, Amit H.; Seo, Jungil

    2009-01-01

    NCHRP Process 12-50 was implemented to evaluate and verify composite steel I-girder bridge design software used commonly in Indiana. A test-bed of twenty-one bridges was developed with guidance from an Indiana Department of Transportation-appointed research advisory panel (RAP). The test-bed included five simple-span and sixteen multi-span bridge superstructures. More than 80 parameters were required to define a bridge, including bridge span, girder spacing, number of beams, section...

  1. Application of Gaia Analysis Software AGIS to Nano-JASMINE

    Yamada, Y.; Lammers, U.; Gouda, N.

    2011-07-01

    The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra-small (35 kg) satellite for astrometry observations in Japan, and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to that of the Hipparcos mission, Gaia's predecessor some 20 years ago. It is a challenge for such a small satellite to perform real scientific observations. The collaboration for sharing software started in 2007. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the necessary practical steps to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.
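
    The global iterative character of AGIS can be suggested by a toy block-iterative least-squares problem in which each observation mixes a source parameter and an attitude parameter, and the two blocks are solved alternately until convergence. This is only a schematic analogy, not the actual AGIS formulation.

        # Toy block-iterative solver: obs[i, j] = source[i] + attitude[j] + noise.
        import numpy as np

        rng = np.random.default_rng(0)
        true_src, true_att = rng.normal(size=5), rng.normal(size=8)
        true_att -= true_att.mean()                  # fix the gauge freedom
        obs = (true_src[:, None] + true_att[None, :]
               + rng.normal(scale=1e-3, size=(5, 8)))

        src, att = np.zeros(5), np.zeros(8)
        for _ in range(50):                          # alternate the two blocks
            src = (obs - att[None, :]).mean(axis=1)
            att = (obs - src[:, None]).mean(axis=0)
            att -= att.mean()                        # re-impose the gauge
        print(np.abs(src - true_src).max(), np.abs(att - true_att).max())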

  2. SEDA: A software package for the Statistical Earthquake Data Analysis

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less attention has been paid to graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences, and forecast calculation. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
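
    The ETAS model at the core of SEDA has a standard conditional intensity, lambda(t) = mu + sum over past events of K*exp(alpha*(m_i - M0))/(t - t_i + c)^p. The Python sketch below evaluates it for a toy catalog; the parameter values are made up, and SEDA itself is Matlab/Fortran, so this is purely illustrative.

        # Evaluate the ETAS conditional intensity for a tiny invented catalog.
        import numpy as np

        def etas_intensity(t, times, mags,
                           mu=0.2, K=0.05, alpha=1.2, c=0.01, p=1.1, M0=3.0):
            """Conditional intensity at time t, given past events (times < t)."""
            past = times < t
            trig = (K * np.exp(alpha * (mags[past] - M0))
                    / (t - times[past] + c) ** p)
            return mu + trig.sum()

        times = np.array([0.0, 1.5, 1.6])   # days
        mags = np.array([5.0, 4.2, 3.8])
        print(etas_intensity(2.0, times, mags))  # events/day at t = 2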

  3. The digital media handbook

    Dewdney, Andrew

    2013-01-01

    The new edition of The Digital Media Handbook presents an essential guide to the historical and theoretical development of digital media, emphasising cultural continuity alongside technological change, and highlighting the emergence of new forms of communication in contemporary networked culture. Andrew Dewdney and Peter Ride present detailed critical commentary and descriptive historical accounts, as well as a series of interviews from a range of digital media practitioners, including producers, developers, curators and artists. The Digital Media Handbook highlights key concerns of today's prac

  4. Practical electronics handbook

    Sinclair, Ian R

    1988-01-01

    Practical Electronics Handbook, Second Edition covers information useful in electronics, with focus on mathematical conventions. The handbook discusses the passive (resistors, capacitors, band coding, and inductors) and active discrete (diodes, transistors and negative feedback) components; discrete component circuits; and transferring digital data. Linear I.C.s, which are the single-chip arrangements of amplifier circuits that are intended to be biased and operated in a linear way, and digital I.C.s, which process signals and consist of two significant voltage levels, are also considered. T

  5. Federal environmental inspections handbook

    1991-10-01

    This Federal Environmental Inspection Handbook has been prepared by the Department of Energy (DOE), Office of Environmental Guidance, RCRA/CERCLA Division (EH-231). It is designed to provide DOE personnel with an easily accessible compilation of the environmental inspection requirements under Federal environmental statutes which may impact DOE operations and activities. DOE personnel are reminded that this Handbook is intended to be used in concert with, and not as a substitute for, the Code of Federal Regulations (CFR), Federal Register (FR), and other applicable regulatory documents.

  6. Handbook of optical design

    Malacara-Hernández, Daniel

    2013-01-01

    Handbook of Optical Design, Third Edition covers the fundamental principles of geometric optics and their application to lens design in one volume. It incorporates classic aspects of lens design along with important modern methods, tools, and instruments, including contemporary astronomical telescopes, Gaussian beams, and computer lens design. Written by respected researchers, the book has been extensively classroom-tested and developed in their lens design courses. This well-illustrated handbook clearly and concisely explains the intricacies of optical system design and evaluation. It also di

  7. The French criticality handbook

    Maubert, L.; Puit, J.C.

    1987-01-01

    The French criticality handbook, published in 1978 by the Commissariat a l'Energie Atomique, is presented along with the author's main objectives and the main choices made relating to fissile media, reflection conditions, and dilution curves. The validation of the critical values is presented as one of the most important aspects of this handbook, which is mainly intended, in the author's mind, for specialists well versed in the field of criticality. The complements which have been introduced since 1978 and those which are foreseen in the near future are also detailed. (author)

  8. Handbook of biomedical optics

    Boas, David A

    2011-01-01

    Biomedical optics holds tremendous promise to deliver effective, safe, non- or minimally invasive diagnostics and targeted, customizable therapeutics. Handbook of Biomedical Optics provides an in-depth treatment of the field, including coverage of applications for biomedical research, diagnosis, and therapy. It introduces the theory and fundamentals of each subject, ensuring accessibility to a wide multidisciplinary readership. It also offers a view of the state of the art and discusses advantages and disadvantages of various techniques.Organized into six sections, this handbook: Contains intr

  9. Information security management handbook

    2002-01-01

    The Information Security Management Handbook continues its tradition of consistently communicating the fundamental concepts of security needed to be a true CISSP. In response to new developments, Volume 4 supplements the previous volumes with new information covering topics such as wireless, HIPAA, the latest hacker attacks and defenses, intrusion detection, and provides expanded coverage on security management issues and applications security. Even those that don't plan on sitting for the CISSP exam will find that this handbook is a great information security reference. The changes in the tech

  10. Handbook of vacuum physics

    1964-01-01

    Handbook of Vacuum Physics, Volume 3: Technology is a handbook of vacuum physics, with emphasis on the properties of miscellaneous materials such as mica, oils, greases, waxes, and rubber. Accurate modern tables of physical constants, properties of materials, laboratory techniques, and properties of commercial pumps, gauges, and leak detectors are presented. This volume is comprised of 12 chapters and begins with a discussion on pump oils, divided into rotary pump oils and vapor pump oils. The next chapter deals with the properties and applications of greases, including outgassing and vapor pr

  11. Handbook of interatomic potentials

    Stoneham, A.M.; Taylor, R.

    1981-08-01

    This Handbook collects together interatomic potentials for a large number of metals. Most of the potentials describe the interactions of host metal atoms with each other, and these, in some cases, may be applied to solid and liquid metals. In addition, there are potentials (a) for a metallic impurity alloyed with the host, (b) for a small number of chemical impurities in the metal (e.g. H, O), and (c) for rare-gas impurities, notably He. The Handbook is intended to be a convenient source of potentials for bulk, surface and defect calculations, both static and dynamic. (author)

  12. Nanoelectronic device applications handbook

    Morris, James E

    2013-01-01

    Nanoelectronic Device Applications Handbook gives a comprehensive snapshot of the state of the art in nanodevices for nanoelectronics applications. Combining breadth and depth, the book includes 68 chapters on topics that range from nano-scaled complementary metal-oxide-semiconductor (CMOS) devices through recent developments in nano capacitors and AlGaAs/GaAs devices. The contributors are world-renowned experts from academia and industry from around the globe. The handbook explores current research into potentially disruptive technologies for a post-CMOS world. These include: Nanoscale advance

  13. Information security management handbook

    Tipton, Harold F

    2003-01-01

    Since 1993, the Information Security Management Handbook has served not only as an everyday reference for information security practitioners but also as an important document for conducting the intense review necessary to prepare for the Certified Information System Security Professional (CISSP) examination. Now completely revised and updated and in its fifth edition, the handbook maps the ten domains of the Information Security Common Body of Knowledge and provides a complete understanding of all the items in it. This is a "must have" book, both for preparing for the CISSP exam and as a c

  14. Handbook of probability

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  15. DOE fundamentals handbook: Chemistry

    1993-01-01

    The Chemistry Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. The handbook includes information on the atomic structure of matter; chemical bonding; chemical equations; chemical interactions involved with corrosion processes; water chemistry control, including the principles of water treatment; the hazards of chemicals and gases; and basic gaseous diffusion processes. This information will provide personnel with a foundation for understanding the chemical properties of materials and the way these properties can impose limitations on the operation of equipment and systems.

  16. Handbook of surveillance technologies

    Petersen, JK

    2012-01-01

    From officially sanctioned, high-tech operations to budget spy cameras and cell phone video, this updated and expanded edition of a bestselling handbook reflects the rapid and significant growth of the surveillance industry. The Handbook of Surveillance Technologies, Third Edition is the only comprehensive work to chronicle the background and current applications of the full range of surveillance technologies--offering the latest in surveillance and privacy issues. Cutting-Edge--updates its bestselling predecessor with discussions on social media, GPS circuits in cell phones and PDAs, new GIS s

  17. Handbook of radioimmunoassay

    Abraham, G.E.

    1977-01-01

    This handbook provides clear, detailed descriptions of ways to set up radioimmunoassay procedures for a variety of polypeptides and low molecular weight compounds. It covers extensively the subjects of statistical evaluation of radioimmunoassay, instrumentation in radioimmunoassay, and radiation safety in the performance of radioimmunoassay. Related nonconventional methods are also discussed. Contributors to this handbook have presented their own procedures for performing the radioimmunoassay and their rationale for choosing particular reagents and conditions. Emphasis is on providing sufficient information to enable relatively inexperienced immunoassayists to set up assay systems with a minimum of difficulty

  18. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    Reznik, L.

    1994-01-01

    Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as functional methods especially suitable for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for functional/procedural software specification, while entity-relationship diagrams have proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs.
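
    The two functional techniques the survey singles out are easy to illustrate. For a numeric input with a specified valid range, boundary-value analysis picks test points at and around the limits, while equivalence-class partitioning picks one representative per class; the input name and range below are invented for illustration.

        # Generate test inputs for a hypothetical code input specified as 0..5.
        def boundary_values(lo, hi):
            """Classic boundary-value picks for a valid input range [lo, hi]."""
            return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

        def equivalence_classes(lo, hi):
            """One representative per class: below range, in range, above range."""
            return {"below": lo - 10, "valid": (lo + hi) // 2, "above": hi + 10}

        print(boundary_values(0, 5))        # [-1, 0, 1, 2, 4, 5, 6]
        print(equivalence_classes(0, 5))    # {'below': -10, 'valid': 2, 'above': 15}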

  19. Software for tomographic analysis: application in ceramic filters

    Figuerola, W.B.; Assis, J.T.; Oliveira, L.F.; Lopes, R.T.

    2001-01-01

    New methods for acquiring data have been developed with technological advances. With this, it has been possible to obtain more precise data and, consequently, produce results with greater reliability. Among the variety of acquisition methods available, those that give a volume description, such as CT (Computerized Tomography) and NMR (Nuclear Magnetic Resonance), stand out. Models of volumetric data (groups of data that describe a solid object in three-dimensional space) are widely used in a diversity of areas as a way of inspecting, modeling and simulating objects in three-dimensional space. Applications of this model are already found in Mechanical Engineering, Geosciences, Medicine and other areas. In engineering it is sometimes necessary to use industrial CT as the only non-invasive way of inspecting the interior of pieces without destroying them. 3D micro-focus X-ray tomography is a non-destructive testing technique used in the most diverse areas of science and technology, given its capacity to generate clean images (practically free of the unsharpness effect) and high-resolution reconstructions. The minimization of the unsharpness effect and the improvement in spatial resolution are consequences of reducing the focal spot size of the micro-focus X-ray tube to dimensions smaller than 50 μm. Ceramic filters are widely used in the metallurgical industry, particularly in cast aluminum, where they are used to remove waste carried in the liquid aluminum. The ceramic filters used in this work are manufactured by FUSICO (a German company) and are constructed from foams. They are manufactured in three models: 10, 20 and 30 ppi (pores per inch). In this paper we present the development of software to analyze and characterize ceramic filters, which can be divided into four stages. This software was developed in the C++ language, using object-oriented programming. It is also capable of being executed on multiple platforms (Windows
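
    One analysis such software typically performs is estimating filter porosity from a segmented CT volume. The sketch below (in Python rather than the authors' C++, with a synthetic volume and an assumed grey-value threshold) shows the core arithmetic.

        # Estimate porosity of a filter from a reconstructed CT volume.
        import numpy as np

        rng = np.random.default_rng(1)
        volume = rng.random((64, 64, 64))   # synthetic grey values in [0, 1)
        material = volume > 0.35            # assumed segmentation threshold

        porosity = 1.0 - material.mean()    # void fraction of the volume
        print(f"porosity = {porosity:.1%}")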

  20. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    Balikuddembe, JK

    2009-04-01

    Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects requires extensive application of sophisticated techniques of analysis and interpretation. Informative...

  1. Development of the Free-space Optical Communications Analysis Software (FOCAS)

    Jeganathan, M.; Mecherle, G.; Lesh, J.

    1998-01-01

    The Free-space Optical Communications Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze optical communication links.
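
    The heart of any such link-analysis tool is a link budget. A back-of-envelope version, with placeholder numbers rather than mission values, computes received power as transmit power plus gains minus losses minus free-space loss.

        # Simple free-space optical link budget in dB terms (illustrative numbers).
        import math

        def received_power_dbm(p_tx_dbm, gains_db, losses_db, wavelength_m, range_m):
            """Pr = Pt + gains - losses - free-space loss (all in dB)."""
            fsl_db = 20 * math.log10(4 * math.pi * range_m / wavelength_m)
            return p_tx_dbm + gains_db - losses_db - fsl_db

        print(received_power_dbm(p_tx_dbm=30,          # 1 W laser
                                 gains_db=115 + 110,   # tx + rx aperture gains
                                 losses_db=6,          # pointing, optics, atmosphere
                                 wavelength_m=1064e-9,
                                 range_m=4e11))        # a deep-space range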

  2. Development of tools for safety analysis of control software in advanced reactors

    Guarro, S.; Yau, M.; Motamed, M. (Advanced Systems Concepts Associates, El Segundo, CA, United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  3. Development of tools for safety analysis of control software in advanced reactors

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  4. Handbook of Partial Least Squares Concepts, Methods and Applications

    Vinzi, Vincenzo Esposito; Henseler, Jörg

    2010-01-01

    This handbook provides a comprehensive overview of Partial Least Squares (PLS) methods with specific reference to their use in marketing and with a discussion of the directions of current research and perspectives. It covers the broad area of PLS methods, from regression to structural equation modeling applications, software and interpretation of results. The handbook serves both as an introduction for those without prior knowledge of PLS and as a comprehensive reference for researchers and practitioners interested in the most recent advances in PLS methodology.

  5. Handbook of power systems engineering with power electronics applications

    Hase, Yoshihide

    2012-01-01

    Formerly known as Handbook of Power System Engineering, this second edition provides rigorous revisions to the original treatment of systems analysis together with a substantial new four-chapter section on power electronics applications. Encompassing a whole range of equipment, phenomena, and analytical approaches, this handbook offers a complete overview of power systems and their power electronics applications, and presents a thorough examination of the fundamental principles, combining theories and technologies that are usually treated in separate specialised fields, in a single u

  6. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Delphine Ribes

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.
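
    The voxel-wise statistical stage can be suggested by a t-test at every voxel across registered volumes. The sketch below uses synthetic NumPy arrays in place of reconstructed autoradiograph volumes (JULIDE itself is built on the ITK framework, so this is an analogy, not its code).

        # Voxel-wise two-group comparison on synthetic registered volumes.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        group_a = rng.normal(size=(10, 32, 32, 16))    # 10 registered volumes
        group_b = rng.normal(size=(10, 32, 32, 16))
        group_b[:, 10:14, 10:14, 4:8] += 1.0           # implanted group effect

        t_map, p_map = stats.ttest_ind(group_a, group_b, axis=0)
        print((p_map < 0.001).sum(), "voxels below p = 0.001 (uncorrected)")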

  7. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  8. An assessment of software for flow cytometry analysis in banana plants

    Renata Alves Lara Silva

    2014-02-01

    Flow cytometry is a technique that yields rapid results in analyses of cell properties such as volume, morphological complexity and quantitative DNA content, and it is considered more convenient than other techniques. However, the analysis usually generates histograms marked by variations that can be produced by many factors, including differences between the software packages that capture the data generated by the flow cytometer. The objective of the present work was to evaluate the performance of four software products commonly used in flow cytometry, based on quantifications of DNA content and analyses of the coefficients of variation associated with the software outputs. Readings were obtained from 25 ‘NBA’ (AA) banana leaf samples using the FACSCalibur (BD) flow cytometer, and 25 histograms from each software product (CellQuest™, WinMDI™, FlowJo™ and FCS Express™) were analyzed to obtain the estimated DNA content and the coefficient of variation (CV) of the estimates. The values of DNA content obtained from the software did not differ significantly. However, the CV analysis showed that the precision of the WinMDI™ software was low and that the CV values were underestimated, whereas the remaining software showed CV values that were in relatively close agreement with those found in the literature. The CellQuest™ software is recommended because it was developed by the same company that produces the flow cytometer used in the present study.
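
    The statistic at the center of the comparison is simple: the coefficient of variation of a fluorescence peak, CV = sigma/mean. A minimal sketch, with invented readings:

        # Compute the coefficient of variation of a fluorescence peak.
        import numpy as np

        def coefficient_of_variation(peak_values):
            values = np.asarray(peak_values, dtype=float)
            return values.std(ddof=1) / values.mean()

        peak = [198, 203, 201, 197, 205, 199, 202]   # invented G1-peak readings
        print(f"CV = {coefficient_of_variation(peak):.2%}")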

  9. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  10. Development of computer software for pavement life cycle cost analysis.

    1988-01-01

    The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...
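
    Life-cycle costing of this kind rests on discounting each future activity to present worth, PW = C/(1+i)^n. A minimal sketch with hypothetical costs and an assumed discount rate:

        # Discount a hypothetical pavement activity schedule to present worth.
        def present_worth(cost, rate, year):
            return cost / (1.0 + rate) ** year

        activities = [(0, 1_000_000),   # initial construction
                      (12, 250_000),    # overlay
                      (20, 250_000)]    # second overlay
        i = 0.04                        # assumed discount rate
        lcc = sum(present_worth(c, i, n) for n, c in activities)
        print(f"life-cycle cost = ${lcc:,.0f}")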

  11. Materials handbook for fusion energy systems

    Davis, J. W.; Marchbanks, M. F.

    A materials data book for use in the design and analysis of components and systems in near-term experimental and commercial reactor concepts has been created by the Office of Fusion Energy. The handbook is known as the Materials Handbook for Fusion Energy Systems (MHFES) and is available to all organizations actively involved in fusion-related research or system designs. Distribution of the MHFES and its data pages is handled by the Hanford Engineering Development Laboratory (HEDL), while its direction and content are handled by McDonnell Douglas Astronautics Company — St. Louis (MDAC-STL). The MHFES differs from other handbooks in that its format is geared more to the designer and structural analyst than to the materials scientist or materials engineer. The format that is used organizes the handbook by subsystems or components rather than material. Within each subsystem is information pertaining to material selection, specific material properties, and comments or recommendations on treatment of data. Since its inception a little more than a year ago, over 80 copies have been distributed to over 28 organizations consisting of national laboratories, universities, and private industries.

  12. The international handbook of space technology

    Badescu, Viorel

    2014-01-01

    This comprehensive handbook provides an overview of space technology and a holistic understanding of the system-of-systems that is a modern spacecraft. With a foreword by Elon Musk, CEO and CTO of SpaceX, and contributions from globally leading agency experts from NASA, ESA, JAXA, and CNES, as well as European and North American academics and industrialists, this handbook, as well as giving an interdisciplinary overview, offers, through individual self-contained chapters, more detailed understanding of specific fields, ranging through: launch systems, structures, power, thermal, communications, propulsion, and software; entry, descent and landing, ground segment, robotics, and data systems; and technology management, legal and regulatory issues, and project management. This handbook is an equally invaluable asset to those on a career path towards the space industry as it is to those already within the industry.

  13. Reference handbook: Level detectors

    1990-01-01

    The purpose of this handbook is to provide Rocky Flats personnel with the information necessary to understand level measurement and detection. Upon completion of this handbook you should be able to do the following: list three reasons for measuring level; describe the basic operating principles of the sight glass; demonstrate proper techniques for reading a sight glass; describe the basic operating principles of a float level detector; describe the basic operating principles of a bubbler level indicating system; and explain the differences between a wet and a dry reference leg indicating system, and describe how each functions. This handbook is designed for use by experienced Rocky Flats operators to reinforce and improve their current knowledge level, and by entry-level operators to ensure that they possess a minimum level of fundamental knowledge. Level Detectors is applicable to many job classifications and can be used as a reference for classroom work or for self-study. Although this reference handbook is by no means all-encompassing, you will gain enough information about this subject area to assist you in contributing to the safe operation of Rocky Flats Plant.
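
    The bubbler and reference-leg methods mentioned above both rest on the same hydrostatic relation, h = P/(rho*g). A worked example (water near room temperature; the pressure reading is illustrative):

        # Infer liquid level from a hydrostatic pressure measurement.
        RHO_WATER = 998.0   # kg/m^3 at ~20 C
        G = 9.81            # m/s^2

        def level_from_pressure(pressure_pa, rho=RHO_WATER):
            """Liquid height (m) producing the measured hydrostatic pressure."""
            return pressure_pa / (rho * G)

        print(f"{level_from_pressure(24_500):.2f} m")  # 24.5 kPa -> ~2.50 m of water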

  14. Rain Gauges Handbook

    Bartholomew, M. J. (Brookhaven National Lab. (BNL), Upton, NY, United States)

    2016-01-01

    To improve the quantitative description of precipitation processes in climate models, the Atmospheric Radiation Measurement (ARM) Climate Research Facility deployed rain gauges located near disdrometers (DISD and VDIS data streams). This handbook deals specifically with the rain gauges that make the observations for the RAIN data stream. Other precipitation observations are made by the surface meteorology instrument suite (i.e., MET data stream).

  15. Area Handbook for Uruguay.

    Weil, Thomas E.; And Others

    This volume is one of 62 in a series of handbooks designed to be useful to military and other personnel who need a convenient compilation of basic facts about the social, economic, political, and military institutions and practices of various countries. The emphasis is on objective description of the nation's present society and the kinds of…

  16. Area Handbook for Syria.

    Nyrop, Richard; And Others

    This volume on Syria is one of a series of handbooks prepared by the Foreign Area Studies (FAS) of the American University, designed to be useful to military and other personnel who need a convenient compilation of basic facts about the social, economic, political, and military institutions and practices of various countries. The emphasis is on…

  17. Transit manager's handbook.

    2011-09-01

    This handbook provides an overview of public transit in Iowa and how to do business with the Iowa Department of Transportation (Iowa DOT) Office of Public Transit (OPT). It is intended to be a tool to assist transit managers in navigating the many...

  18. Referencing handbook: OSCOLA

    Williams, Helen

    2016-01-01

    University of Lincoln approved guide to OSCOLA referencing. It provides guidelines on how to reference UK, EU and International primary sources of law as well as secondary sources. The handbook provides examples and annotated diagrams to help you reference sources in the OSCOLA style.

  19. Oil economists' handbook 1984

    Jenkins, G. (ed.)

    1983-01-01

    This handbook lists statistics on energy resources, production and consumption; petroleum refining, products, storage, quality and prices; shipping; pipelines; and energy companies. Conversion factors, a dictionary of terms, and a chronology of major events (1919-1938) are included. The data given runs up to 1982/83.

  20. Wildlife Habitat Evaluation Handbook.

    Neilson, Edward L., Jr.; Benson, Delwin E.

    The National 4-H Wildlife Invitational is a competitive event to teach youth about the fundamentals of wildlife management. Youth learn that management for wildlife means management of wildlife habitat and providing for the needs of wildlife. This handbook provides information about wildlife habitat management concepts in both urban and rural…

  1. Capability Handbook- offline metrology

    Islam, Aminul; Marhöfer, David Maximilian; Tosello, Guido

    This offline metrological capability handbook has been made in relation to HiMicro Task 3.3. The purpose of this document is to assess the metrological capability of the HiMicro partners and to gather the information of all available metrological instruments in the one single document. It provides...

  2. Radiation'96. Conference handbook


    1996-12-31

    The conference program includes eight invited lectures which cover a range of contemporary topics in radiation science and technology. In addition, thirty-two oral papers were presented, along with forty-five posters. The conference handbook contains one-page precis or extended abstracts of all presentations, and is a substantial compendium of current radiation research in Australia.

  3. Radiation'96. Conference handbook


    1997-12-31

    The conference program includes eight invited lectures which cover a range of contemporary topics in radiation science and technology. In addition, thirty-two oral papers were presented, along with forty-five posters. The conference handbook contains one-page precis or extended abstracts of all presentations, and is a substantial compendium of current radiation research in Australia.

  4. The Video Handbook.

    1972

    In order to provide basic technical and production information for closed-circuit television, the editors have assembled this series of papers. Designed as an introductory guide for those entering the field, the handbook covers the basic areas of non-broadcast television. Starting with facilities and equipment, the guide outlines the planning and…

  5. European food law handbook

    Meulen, van der B.M.J.; Velde, van der M.; Szajkowska, A.; Verbruggen, R.

    2008-01-01

    This handbook analyses and explains the institutional, substantive and procedural elements of EU food law, taking the General Food Law as a focus point. Principles are discussed as well as specific rules addressing food as a product, the processes related to food and communication about food through

  6. Trustee Handbook. Fourth Edition.

    Johnson, Eric W.

    The principles of sound governance and administration of independent schools are discussed in this handbook for private school trustees. The nature and responsibilities of school boards are presented, along with a description of the functions of various types of board committees. A chapter on the duties of school board members provides orientation…

  7. The Population Activist's Handbook.

    Population Inst., Washington, DC.

    This handbook is a guide to effective action strategies on dealing with overpopulation. Divided into five sections, the book outlines programs, suggests references, and lists resources that are helpful for thinking and for planning action on population issues. Section one focuses on strategies to change the current population policy choices made…

  8. Energy manager's handbook

    Payne, G A

    1977-01-01

    The handbook provides sufficient guidance on the principles involved for readers to tailor a program to meet their own requirements. The following chapters are included: Energy Conservation; Management of Energy; Delivery, Storage, and Handling of Fuels; Boilers; Furnaces; Heat Distribution and Utilization; Industrial Space Heating; Electricity; Services; and Road Transport. (MCW)

  9. DOE handbook: Design considerations

    1999-04-01

    The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline.

  10. Faculty Handbook. Regis College.

    Regis Coll., Weston, MA.

    Regis College policies and procedures are described in this 1976 faculty handbook. Chapter 1 covers college organization and governance, including roles of academic officers and committees. Specific faculty data are presented in Chapter 2, such as definition of academic ranks and titles, recruitment and appointment, promotion, tenure, review,…

  11. Area Handbook for Ghana.

    Kaplan, Irving; And Others

    The dominant social, political, and economic aspects of Ghanaian society are described in this handbook. Changes and developments in Ghana in the past 10 years, highlighted by the 1966 overthrow and widespread repudiation of Kwame Nkrumah and his policies and practices, have created a need for this revision of the 1962 edition. The purpose of…

  12. Handbook of nanomaterials properties

    Luo, Dan; Schricker, Scott R; Sigmund, Wolfgang; Zauscher, Stefan

    2014-01-01

    Nanomaterials attract tremendous attention in recent research. Although extensive research has been done in this field, it still lacks a comprehensive reference work that presents data on the properties of different nanomaterials. This Handbook of Nanomaterials Properties will be the first single reference work that brings together the various properties with wide breadth and scope.

  13. Animal damage management handbook.

    Hugh C. Black

    1994-01-01

    This handbook treats animal damage management (ADM) in the West in relation to forest, range, and recreation resources; predator management is not addressed. It provides a comprehensive reference of safe, effective, and practical methods for managing animal damage on National Forest System lands. Supporting information is included in references after each chapter and...

  14. Parent Involvement Handbook.

    Caplan, Arna

    This handbook on parent involvement, designed to be used with preschool programs, was developed by the Jefferson County Public Schools in Lakewood, Colorado. Included are: (1) a general statement about parent involvement in an early childhood program, (2) a description of the Jefferson County Early Childhood Program, (3) a description of the…

  15. Nuclear safeguards technology handbook

    1977-12-01

    The purpose of this handbook is to present to United States industrial organizations the Department of Energy's (DOE) Safeguards Technology Program. The roles and missions for safeguards in the U.S. government and application of the DOE technology program to industry safeguards planning are discussed. A guide to sources and products is included. (LK)

  16. Handbook on Peace Education

    Salomon, Gavriel, Ed.; Cairns, Ed, Ed.

    2009-01-01

    This handbook encompasses a range of disciplines that underlie the field of peace education and provides the rationales for the ways it is actually carried out. The discipline is a composite of contributions from a variety of disciplines ranging from social psychology to philosophy and from communication to political science. That is, peace…

  17. Nuclear safeguards technology handbook

    1977-12-01

    The purpose of this handbook is to present to United States industrial organizations the Department of Energy's (DOE) Safeguards Technology Program. The roles and missions for safeguards in the U.S. government and application of the DOE technology program to industry safeguards planning are discussed. A guide to sources and products is included

  18. Radiation'96. Conference handbook

    1996-01-01

    The conference program includes eight invited lectures which cover a range of contemporary topics in radiation science and technology. In addition, thirty-two oral papers were presented, along with forty-five posters. The conference handbook contains one-page precis or extended abstracts of all presentations, and is a substantial compendium of current radiation research in Australia

  19. Humane Education Projects Handbook.

    Junior League of Ogden, UT.

    This handbook was developed to promote interest in humane education and to encourage the adoption of humane education projects. Although specifically designed to assist Junior Leagues in developing such projects, the content should prove valuable to animal welfare organizations, zoos, aquariums, nature centers, and other project-oriented groups…

  20. DOE handbook: Design considerations


    1999-04-01

    The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline.