WorldWideScience

Sample records for machine scientific software

  1. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J. [ORNL]; Symons, Christopher T. [ORNL]

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.
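
    The position paper itself includes no code, but the proposed combination can be sketched loosely in Python: static-analysis features of each routine feed a machine-learning classifier that flags routines whose results may not be repeatable. All feature names and the labeling rule below are hypothetical; this is an illustration, not the authors' method.

        # Illustrative sketch only. Hypothetical static-analysis features
        # (e.g. counts of parallel floating-point reductions) train a
        # classifier to flag potentially non-repeatable routines.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        # Synthetic stand-in data: 200 routines x 3 static-analysis features.
        X = rng.integers(0, 10, size=(200, 3))
        # Toy labeling rule; real labels would come from rerun experiments
        # testing whether a routine reproduces its results bit-for-bit.
        y = (X[:, 0] + rng.integers(0, 3, size=200) > 8).astype(int)

        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
        print(clf.predict([[9, 1, 0]]))  # flag a reduction-heavy routine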

  2. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  3. Fostering successful scientific software communities

    Science.gov (United States)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also by the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that the technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and with frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough in a software package only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  4. Testing Scientific Software: A Systematic Literature Review

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context: Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective: This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method: We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results: We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions: Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798
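
    One commonly proposed mitigation for the oracle problem is metamorphic testing: even when the exact expected output is unknown, relations between the outputs of related runs can be checked automatically. The sketch below uses a toy integrator of our own, not code from the study.

        # Metamorphic test: the integral over [a, b] must equal the sum of
        # the integrals over [a, m] and [m, b] for any split point m, even
        # though the exact value of the integral is not known in advance.
        import math

        def trapezoid(f, a, b, n=1000):
            # Toy numerical integrator standing in for code under test.
            h = (b - a) / n
            return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

        a, b, m = 0.0, math.pi, 1.0
        whole = trapezoid(math.sin, a, b)
        parts = trapezoid(math.sin, a, m) + trapezoid(math.sin, m, b)
        assert abs(whole - parts) < 1e-5, "metamorphic relation violated"
        print("metamorphic check passed")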

  5. Testing Scientific Software: A Systematic Literature Review.

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.

  6. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and particularly no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly named variables contribute more to high-quality software than limiting code size. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)
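
    The paper's point about aptly named variables is easy to illustrate; the example below is invented, not taken from the nuclear code studied. Both functions are equally small by common complexity metrics, yet only the second exposes the intellectual content of the calculation.

        # Identical size and cyclomatic complexity; very different cognitive
        # complexity. The formula is hydrostatic force F = rho * g * h * A.
        def f(a, b, c):
            return a * b * c * 9.81

        def hydrostatic_force(density_kg_m3, area_m2, depth_m, g_m_s2=9.81):
            return density_kg_m3 * g_m_s2 * depth_m * area_m2

        print(f(1000.0, 2.0, 3.0))                  # 58860.0, but of what?
        print(hydrostatic_force(1000.0, 2.0, 3.0))  # 58860.0 N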

  7. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Damevski, Kostadin [Virginia State Univ., Petersburg, VA (United States)]

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  8. 2006 XSD Scientific Software Workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Evans, K., Jr.; De Carlo, F.; Jemian, P.; Lang, J.; Lienert, U.; Maclean, J.; Newville, M.; Tieman, B.; Toby, B.; van Veenendaal, B.; Univ. of Chicago

    2006-01-22

    In May of 2006, a committee was formed to assess the fundamental needs and opportunities in scientific software for x-ray data reduction, analysis, modeling, and simulation. This committee held a series of discussions throughout the summer, conducted a poll of the members of the x-ray community, and held a workshop. This report details the findings and recommendations of the committee. Each experiment performed at the APS requires three crucial ingredients: the powerful x-ray source, an optimized instrument to perform measurements, and computer software to acquire, visualize, and analyze the experimental observations. While the APS has invested significant resources in the accelerator, investment in other areas such as scientific software for data analysis and visualization has lagged behind. This has led to the adoption of a wide variety of software with variable levels of usability. In order to maximize the scientific output of the APS, it is essential to support the broad development of real-time analysis and data visualization software. As scientists attack problems of increasing sophistication and deal with larger and more complex data sets, software is playing an ever more important role. Furthermore, our need for excellent and flexible scientific software can only be expected to increase, as the upgrade of the APS facility and the implementation of advanced detectors create a host of new measurement capabilities. New software analysis tools must be developed to take full advantage of these capabilities. It is critical that the APS take the lead in software development and in the implementation of theory into software to ensure the continued success of this facility. The topics described in this report are relevant to the APS today and critical for the APS upgrade plan. Implementing these recommendations will have a positive impact on the scientific productivity of the APS today and will be even more critical in the future.

  9. Human-machine interface software package

    International Nuclear Information System (INIS)

    Liu, D.K.; Zhang, C.Z.

    1992-01-01

    The Man-Machine Interface Software Package (MMISP) is designed to configure the console software of the PLS 60 MeV LINAC control system. The control system of the PLS 60 MeV LINAC is a distributed control system which includes the main computer (Intel 310), four local stations, and two sets of industrial-level console computers. The MMISP provides the operator with a display page editor and various I/O configurations, such as digital signals in/out, analog signals in/out, and waveform TV graphic display, and interacts with the operator through graphic picture display, voice explanation, and a touch panel. This paper describes its function and application. (author)

  10. Reliable Software Development for Machine Protection Systems

    CERN Document Server

    Anderson, D; Dragu, M; Fuchsberger, K; Garnier, JC; Gorzawski, AA; Koza, M; Krol, K; Misiowiec, K; Stamos, K; Zerlauth, M

    2014-01-01

    The controls software for the Large Hadron Collider (LHC) at CERN, with more than 150 million lines of code, is among the largest known code bases in the world. Industry has been applying Agile software engineering techniques for more than two decades now, and the advantages of these techniques can no longer be ignored for managing the code base of large projects within the accelerator community. Furthermore, CERN is a particular environment due to its high personnel turnover and manpower limitations, where applying Agile processes can improve both codebase management and code quality. This paper presents the successful application of the Agile software development process Scrum for machine protection systems at CERN, the quality standards and infrastructure introduced together with the Agile process, as well as the challenges encountered in adapting it to the CERN environment.

  11. Highly parallel machines and future of scientific computing

    International Nuclear Information System (INIS)

    Singh, G.S.

    1992-01-01

    The computing requirements of large-scale scientific computing have always been ahead of what the state-of-the-art hardware of the day could supply in the form of supercomputers. For any single-processor system, the limit to growth in computing power was recognized some years ago. Now, with the advent of parallel computing systems, the availability of machines with the required computing power seems a reality. In this paper the author tries to visualize the future of large-scale scientific computing in the penultimate decade of the present century. The author summarizes trends in parallel computers and emphasizes the need for a better programming environment and software tools for optimal performance. The paper concludes with a critique of parallel architectures, software tools and algorithms. (author). 10 refs., 2 tabs

  12. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
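
    The "gold standard" comparison against analytical solutions mentioned at the end of the abstract is the core of such continuous verification. Below is a self-contained sketch of that kind of automated test; the heat-equation example and tolerance are ours, not taken from the Fluidity test suite.

        # Automated verification test: a finite-difference solution of the
        # heat equation u_t = alpha * u_xx is compared with the analytical
        # decay of a sine mode; run on every commit, not once at the end.
        import numpy as np

        def solve_heat_explicit(u0, alpha, dx, dt, steps):
            u = u0.copy()
            r = alpha * dt / dx**2
            for _ in range(steps):
                u[1:-1] += r * (u[2:] - 2 * u[1:-1] + u[:-2])
            return u

        def test_decaying_sine_mode():
            n, alpha, t_end = 101, 1.0, 0.01
            x = np.linspace(0.0, 1.0, n)
            dx = x[1] - x[0]
            dt = 0.25 * dx**2 / alpha          # stable explicit step
            steps = int(round(t_end / dt))
            numeric = solve_heat_explicit(np.sin(np.pi * x), alpha, dx, dt, steps)
            exact = np.sin(np.pi * x) * np.exp(-alpha * np.pi**2 * steps * dt)
            assert np.max(np.abs(numeric - exact)) < 1e-3

        test_decaying_sine_mode()
        print("verification test passed")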

  13. Software Defects, Scientific Computation and the Scientific Method

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation independent behaviour and supports the widely observed phenomenon that defects clust...

  14. Writing software or writing scientific articles?

    CERN Document Server

    Basaglia, Tullio; Dressendorfer, P V; Larkin, A; Pia, M G

    2008-01-01

    An analysis of publications related to high energy physics computing in refereed journals is presented. The distribution of papers associated with various fields of computing relevant to high energy physics is critically analyzed. The relative publication rate of software papers is evaluated in comparison to other closely related physics disciplines, such as nuclear physics, radiation protection and medical physics, and to hardware publications. The results hint at the fact that, in spite of the significant effort invested in high energy physics computing and its fundamental role in the experiments, this research area is underrepresented in the scientific literature; nevertheless the analysis of citations highlights the significant impact of software publications in experimental research.

  15. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    , algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single to multi and many-core architectures, require software developers to identify and properly implement methods that both exploit...... makes parallel software design applicable, but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDEs solvers based on flexible-order finite difference approximations on structured regular grids. The library...... is designed with a high abstraction interface to improve developer productivity. The library is based on modern template-based design concepts as described in Glimberg, Engsig-Karup, Nielsen & Dammann (2013). The library utilizes heterogeneous CPU/GPU environments in order to maximize computational throughput...

  16. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  17. Performance Engineering Technology for Scientific Component Software

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years, we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress
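
    The TAU and PDT tools instrument compiled components; as a loose, language-level analogy (not the TAU API), the following Python decorator shows the basic pattern of intra-component performance measurement: wrap an entry point, record wall time, and accumulate per-function statistics.

        # Minimal instrumentation sketch: timings are collected per function,
        # mirroring what component-level measurement infrastructure gathers.
        import time
        from collections import defaultdict

        timings = defaultdict(list)

        def instrument(fn):
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                try:
                    return fn(*args, **kwargs)
                finally:
                    timings[fn.__name__].append(time.perf_counter() - start)
            return wrapper

        @instrument
        def assemble_matrix(n):
            return sum(i * i for i in range(n))  # stand-in for component work

        assemble_matrix(100_000)
        assemble_matrix(200_000)
        for name, samples in timings.items():
            print(name, len(samples), "calls,", f"{sum(samples):.4f}", "s total")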

  18. BENCHMARKING MACHINE LEARNING TECHNIQUES FOR SOFTWARE DEFECT DETECTION

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning approaches are good at solving problems for which little information is available. In most cases, software domain problems can be characterized as a learning process that depends on various circumstances and changes accordingly. A predictive model is constructed using machine learning approaches to classify modules into defective and non-defective. Machine learning techniques help developers to retrieve useful information after the classification and enable them to analyse data...
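
    A minimal version of such a benchmark can be written with scikit-learn; the data below are synthetic stand-ins (the study's own public defect datasets and feature sets are not reproduced here), but the comparison loop is the essential pattern.

        # Compare several classifiers on the same module-level metrics for
        # defective / non-defective prediction. Data are synthetic.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(1)
        # Hypothetical module metrics: lines of code, complexity, churn.
        X = rng.normal(size=(500, 3))
        y = (X @ np.array([0.8, 1.2, 0.5])
             + rng.normal(scale=0.5, size=500) > 0.5).astype(int)

        for model in (GaussianNB(), LogisticRegression(), DecisionTreeClassifier()):
            scores = cross_val_score(model, X, y, cv=5, scoring="f1")
            print(type(model).__name__, round(scores.mean(), 3))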

  19. Agile Scientists? : Investigating Agile Practices in Scientific Software Development

    OpenAIRE

    Sletholt, Magnus Thorstein

    2011-01-01

    The topic of this master thesis is development of scientific software. The research questions put forth are oriented towards specific agile practices and whether these are present in the development processes of scientific software projects. Moreover, the effects of applying such agile practices, particularly pertaining to the handling of requirements and testing, in scientific software projects are addressed in the thesis. In order to answer the proposed research questions a table consisting...

  20. The need for scientific software engineering in the pharmaceutical industry.

    Science.gov (United States)

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  1. The need for scientific software engineering in the pharmaceutical industry

    Science.gov (United States)

    Luty, Brock; Rose, Peter W.

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  2. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  3. Design of instrumentation and software for precise laser machining

    Science.gov (United States)

    Wyszyński, D.; Grabowski, Marcin; Lipiec, Piotr

    2017-10-01

    The paper concerns the design of instrumentation and software for precise laser machining. The application of advanced laser beam manipulation instrumentation enables noticeable improvement in cut quality and reduction in material loss. These factors have a significant impact on process efficiency and cut edge quality, in terms of machined part size and shape accuracy, wall taper, material loss reduction (e.g. in diamond) and time effectiveness. The goal can be reached by integrating the laser drive, an observation and optical measurement system, a beam manipulation system and five-axis mechanical instrumentation with advanced tailored software enabling full control and monitoring of the laser cutting process.

  4. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Damevski, Kostadin [Virginia State Univ., Petersburg, VA (United States)]

    2015-01-25

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  5. Center for Technology for Advanced Scientific Component Software (TASCS) Consolidated Progress Report July 2006 - March 2009

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, D E; McInnes, L C; Govindaraju, M; Bramley, R; Epperly, T; Kohl, J A; Nieplocha, J; Armstrong, R; Shasharina, S; Sussman, A L; Sottile, M; Damevski, K

    2009-04-14

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  6. Scientific Software - the role of best practices and recommendations

    Science.gov (United States)

    Fritzsch, Bernadette; Bernstein, Erik; Castell, Wolfgang zu; Diesmann, Markus; Haas, Holger; Hammitzsch, Martin; Konrad, Uwe; Lähnemann, David; McHardy, Alice; Pampel, Heinz; Scheliga, Kaja; Schreiber, Andreas; Steglich, Dirk

    2017-04-01

    In Geosciences - like in most other communities - scientific work strongly depends on software. For big data analysis, existing (closed or open source) program packages are often mixed with newly developed codes. Different versions of software components and varying configurations can influence the result of data analysis. This often makes reproducibility of results and reuse of codes very difficult. Policies for publication and documentation of used and newly developed software, along with best practices, can help tackle this problem. Within the Helmholtz Association a Task Group "Access to and Re-use of scientific software" was implemented by the Open Science Working Group in 2016. The aim of the Task Group is to foster the discussion about scientific software in the Open Science context and to formulate recommendations for the production and publication of scientific software, ensuring open access to it. As a first step, a workshop gathered interested scientists from institutions across Germany. The workshop brought together various existing initiatives from different scientific communities to analyse current problems, share established best practices and come up with possible solutions. The subjects in the working groups covered a broad range of themes, including technical infrastructures, standards and quality assurance, citation of software and reproducibility. Initial recommendations are presented and discussed in the talk. They are the foundation for further discussions in the Helmholtz Association and the Priority Initiative "Digital Information" of the Alliance of Science Organisations in Germany. The talk aims to inform about the activities and to link with other initiatives on the national or international level.

  7. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  8. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers....... Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which...... then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort....
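
    The synthesized emulators are, at bottom, dispatch loops over the legacy instruction set. The toy stack-machine interpreter below (our own, in Python rather than the authors' Java-hosted setting) makes the idea of a specialized virtual-machine interpreter concrete.

        # Minimal stack-machine interpreter: one dispatch loop over opcodes,
        # the same shape a generated emulator for a legacy ISA would have.
        def run(program):
            stack, pc = [], 0
            while pc < len(program):
                op, *args = program[pc]
                if op == "push":
                    stack.append(args[0])
                elif op == "add":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
                elif op == "print":
                    print(stack[-1])
                pc += 1
            return stack

        # (push 2) (push 3) (add) (print) -> prints 5
        run([("push", 2), ("push", 3), ("add",), ("print",)])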

  9. Teaching Radiology Physics Interactively with Scientific Notebook Software.

    Science.gov (United States)

    Richardson, Michael L; Amini, Behrang

    2018-06-01

    The goal of this study is to demonstrate how the teaching of radiology physics can be enhanced with the use of interactive scientific notebook software. We used the scientific notebook software known as Project Jupyter, which is free, open-source, and available for the Macintosh, Windows, and Linux operating systems. We have created a scientific notebook that demonstrates multiple interactive teaching modules we have written for our residents using the Jupyter notebook system. Scientific notebook software allows educators to create teaching modules in a form that combines text, graphics, images, data, interactive calculations, and image analysis within a single document. These notebooks can be used to build interactive teaching modules, which can help explain complex topics in imaging physics to residents.
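
    A flavor of such a module, runnable in a Jupyter notebook with ipywidgets installed: the attenuation example below is our own illustration, not one of the authors' teaching modules.

        # Interactive teaching sketch: sliders let a resident explore how
        # transmitted X-ray intensity falls with attenuation coefficient and
        # thickness (Beer-Lambert law, I/I0 = exp(-mu * x)).
        import numpy as np
        from ipywidgets import interact

        def transmitted_fraction(mu_per_cm=0.2, thickness_cm=5.0):
            fraction = np.exp(-mu_per_cm * thickness_cm)
            print(f"Transmitted fraction: {fraction:.3f}")

        interact(transmitted_fraction,
                 mu_per_cm=(0.01, 1.0, 0.01),
                 thickness_cm=(0.5, 20.0, 0.5))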

  10. Criteria and tools for scientific software quality measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tseng, M Y [Previse Inc., Willowdale ON (Canada)]

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety-critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes by which the quality of a software product is described and evaluated must be established. These attributes must be relevant to the application domain of the software under evaluation. To effectively assess the quality of software, metrics that define a quantitative scale and a method for determining the value of each attribute need to be applied. To perform the evaluation cost-effectively, the use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs.
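
    The attribute-metric-tool chain the report describes can be illustrated with a crude, self-contained metric calculator (QA Fortran's own metrics are not reproduced here); comment density is a simple proxy metric for an understandability attribute.

        # Toy metric: fraction of non-blank lines that are comments in a
        # fixed-form Fortran source (comment indicator in column 1).
        SAMPLE = (
            "c compute hydrostatic pressure\n"
            "      p = rho * g * h\n"
            "      print *, p\n"
        )

        def comment_density(source):
            lines = [l for l in source.splitlines() if l.strip()]
            comments = [l for l in lines if l[:1].lower() in ("c", "*", "!")]
            return len(comments) / len(lines)

        print(f"comment density: {comment_density(SAMPLE):.2f}")  # 0.33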

  11. Criteria and tools for scientific software quality measurements

    International Nuclear Information System (INIS)

    Tseng, M.Y.

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety-critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes by which the quality of a software product is described and evaluated must be established. These attributes must be relevant to the application domain of the software under evaluation. To effectively assess the quality of software, metrics that define a quantitative scale and a method for determining the value of each attribute need to be applied. To perform the evaluation cost-effectively, the use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs

  12. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.
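
    The provides/uses port discipline at the heart of the CCA can be rendered in miniature (in Python, not the SIDL/Babel machinery the CCA actually uses): components declare the interfaces they offer and are wired together through them, instead of calling each other's internals.

        # Toy component wiring: Driver 'uses' an IntegratorPort; any
        # component that 'provides' it can be plugged in unchanged.
        from abc import ABC, abstractmethod

        class IntegratorPort(ABC):
            @abstractmethod
            def integrate(self, f, a, b): ...

        class MidpointIntegrator(IntegratorPort):
            def integrate(self, f, a, b, n=1000):
                h = (b - a) / n
                return h * sum(f(a + (i + 0.5) * h) for i in range(n))

        class Driver:
            def __init__(self, integrator: IntegratorPort):
                self.integrator = integrator

            def run(self):
                print(self.integrator.integrate(lambda x: x * x, 0.0, 1.0))

        Driver(MidpointIntegrator()).run()  # ~0.3333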

  13. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  14. USAGE OF CONSTRUCTION-ORIENTED SOFTWARE SCAD FOR ANALYSIS OF WORK OF MACHINE-BUILDING STRUCTURES

    Directory of Open Access Journals (Sweden)

    D. О. Bannikov

    2018-02-01

    Purpose. In the analysis of the work of machine-building structures, the direct usage of construction-oriented software developments is impossible, since the ideology and methodology for solving various tasks in construction and in machine-building are different. Therefore, in conducting practical calculations, there is a need for a certain adjustment of the approaches embodied in the program complexes and their adaptation to the engineering industry. The presentation of the author's experience of using the construction-oriented software SCAD for Windows for analyzing the work of various machine-building structures, their components and assemblies is the immediate purpose of the publication. Methodology. Over a long period of time the author was engaged in analyzing the work of building, mainly thin-walled, steel structures using the Finite Element Method based on the SCAD for Windows software package. At the same time, a considerable number of machine-building structures were considered, including railroad rolling stock units. Most of these tasks grew into a scientific and research problem that needed to be thoroughly researched and analyzed before design recommendations could be given. Findings. The publication presents more than a dozen different tasks, typical for the machine-building industry, which the author had to deal with. Among them are static and quasi-static problems, the problem of motion in time, the contact problem, the problem of crack development, and physical and geometric non-linearity. Accordingly, for each of these problems the main challenges, features and practical techniques developed during the work are presented, and the constructed finite element models are shown as an illustration. Originality. The experience of using a construction-oriented software product based on the Finite Element Method for analyzing the work of machine-building structures is generalized. A number of practical methods and

  15. Managing Scientific Software Complexity with Bocca and CCA

    Directory of Open Access Journals (Sweden)

    Benjamin A. Allan

    2008-01-01

    In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable for HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  16. Software architecture for time-constrained machine vision applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
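
    The topic-based publish/subscribe routing described above reduces to a small broker; the sketch below illustrates the pattern (it is not the authors' implementation).

        # Subscribers register per topic; publishers never reference the
        # modules that consume their messages, keeping layers decoupled.
        from collections import defaultdict

        class Broker:
            def __init__(self):
                self.subscribers = defaultdict(list)

            def subscribe(self, topic, callback):
                self.subscribers[topic].append(callback)

            def publish(self, topic, message):
                for callback in self.subscribers[topic]:
                    callback(message)

        broker = Broker()
        broker.subscribe("camera/frames", lambda m: print("detector got", m))
        broker.publish("camera/frames", {"frame_id": 1})
        broker.publish("status/led", "on")  # no subscriber, silently dropped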

  17. Continuous integration and quality control for scientific software

    Science.gov (United States)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. One regular customer is already the developer group of the DiFX software correlator project.
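
    A nightly-build driver of the kind described can be assembled from ordinary scripting. The sketch below is hypothetical (the Wettzell scripts are not reproduced in the abstract), and the tool commands are placeholders for the site's real build, analysis and documentation tools.

        # Hypothetical nightly pipeline: build, test, static analysis, docs,
        # then publish a pass/fail summary as an HTML page.
        import datetime
        import pathlib
        import subprocess

        STEPS = [
            ["make", "all"],                      # central Makefile build
            ["make", "test"],                     # regression tests
            ["cppcheck", "--enable=all", "src"],  # static code analysis
            ["doxygen", "Doxyfile"],              # docs from inline comments
        ]

        def nightly(root="."):
            results = []
            for cmd in STEPS:
                proc = subprocess.run(cmd, cwd=root, capture_output=True)
                results.append((" ".join(cmd), proc.returncode))
            stamp = datetime.date.today().isoformat()
            rows = "\n".join(f"{name}: {'OK' if rc == 0 else 'FAILED'}"
                             for name, rc in results)
            pathlib.Path(f"report-{stamp}.html").write_text(f"<pre>{rows}</pre>")

        nightly()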

  18. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored f or software bug prediction on public available data sets. Results showed most of the mac ...

  19. Voice recognition software can be used for scientific articles

    DEFF Research Database (Denmark)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob

    2015-01-01

    INTRODUCTION: Dictation of scientific articles has been recognised as an efficient method for producing high-quality, first article drafts. However, standardised transcription service by a secretary may not be available for all researchers and voice recognition software (VRS) may therefore...... with a median score of five (range: 3-9), which was improved with the addition of 5,000 words. CONCLUSION: The out-of-the-box performance of VRS was acceptable and improved after additional words were added. Further studies are needed to investigate the effect of additional software accuracy training....

  20. Nurturing reliable and robust open-source scientific software

    Science.gov (United States)

    Uieda, L.; Wessel, P.

    2017-12-01

    Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo

  1. Voice recognition software can be used for scientific articles

    DEFF Research Database (Denmark)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob

    2015-01-01

    INTRODUCTION: Dictation of scientific articles has been recognised as an efficient method for producing high-quality, first article drafts. However, standardised transcription service by a secretary may not be available for all researchers and voice recognition software (VRS) may therefore...... be an alternative. The purpose of this study was to evaluate the out-of-the-box accuracy of VRS. METHODS: Eleven young researchers without dictation experience dictated the first draft of their own scientific article after thorough preparation according to a pre-defined schedule. The dictate transcribed by VRS...

  2. eSciMart: Web Platform for Scientific Software Marketplace

    Science.gov (United States)

    Kryukov, A. P.; Demichev, A. P.

    2016-10-01

    In this paper we suggest a design for a web marketplace where users of scientific application software and databases, presented in the form of web services, as well as their providers, have a presence simultaneously. The model that forms the basis for the web marketplace is close to the customer-to-customer (C2C) model, which has been used successfully, for example, on auction sites such as eBay (ebay.com). Unlike the classical C2C model, the suggested marketplace focuses on application software in the form of web services, and on standardization of the API through which application software will be integrated into the web marketplace. A prototype of such a platform, entitled eSciMart, is currently being developed at SINP MSU.

  3. Sardana, the software for building SCADAS in scientific environments

    International Nuclear Information System (INIS)

    Coutinho, T.; Cuni, G.; Fernandez-Carreiras, D.; Klora, J.; Pascual-Izarra, C.; Reszela, Z.; Sune, R.; Homs, A.; Taurel, E.; Rey, V.

    2012-01-01

    Sardana is a software package for Supervision, Control and Data Acquisition in scientific installations. It delivers important cost and time reductions associated with the design, development and support of control and data acquisition systems. It enhances TANGO with capabilities for building graphical interfaces without writing code, a powerful Python-based macro environment for building sequences and complex macros, and comprehensive access to the hardware. Like Tango, Sardana is open source and its development model is open to collaboration, which provides a free platform that scales well to small laboratories as well as to large scientific institutions. The first beta version has been commissioned for the control system of accelerators and beamlines at the Alba Synchrotron. Furthermore, there is a collaboration in place comprising Desy, MaxIV and Solaris, and several other potential users are evaluating it. (authors)

  4. Voice recognition software can be used for scientific articles.

    Science.gov (United States)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob; Rosenberg, Jacob

    2015-02-01

    Dictation of scientific articles has been recognised as an efficient method for producing high-quality first article drafts. However, a standardised transcription service by a secretary may not be available to all researchers, and voice recognition software (VRS) may therefore be an alternative. The purpose of this study was to evaluate the out-of-the-box accuracy of VRS. Eleven young researchers without dictation experience dictated the first draft of their own scientific article after thorough preparation according to a pre-defined schedule. The dictation transcribed by VRS was compared with the same dictation transcribed by an experienced research secretary, and the effect of adding words to the vocabulary of the VRS was investigated. The number of errors per hundred words was used as the outcome. Furthermore, three experienced researchers assessed the subjective readability using a Likert scale (0-10). Dragon Nuance Premium version 12.5 was used as the VRS. The median number of errors per hundred words was 18 (range: 8.5-24.3), which improved when 15,000 words were added to the vocabulary. Subjective readability assessment showed that the texts were understandable, with a median score of five (range: 3-9), which improved with the addition of 5,000 words. The out-of-the-box performance of VRS was acceptable and improved after additional words were added. Further studies are needed to investigate the effect of additional software accuracy training.
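
    The outcome measure, errors per hundred words, can be computed by aligning the two transcripts; the difflib-based sketch below is our illustration, not the authors' scoring procedure.

        # Word-level alignment of the VRS transcript against the secretary's
        # reference; every non-matching stretch counts as errors.
        import difflib

        def errors_per_hundred_words(reference, candidate):
            ref, cand = reference.split(), candidate.split()
            opcodes = difflib.SequenceMatcher(a=ref, b=cand).get_opcodes()
            errors = sum(max(i2 - i1, j2 - j1)
                         for tag, i1, i2, j1, j2 in opcodes if tag != "equal")
            return 100 * errors / len(ref)

        secretary = "the patients were randomised to open or laparoscopic repair"
        vrs = "the patience were randomised to open laparoscopic repair"
        print(round(errors_per_hundred_words(secretary, vrs), 1))  # 22.2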

  5. Craniux: a LabVIEW-based modular software framework for brain-machine interface research.

    Science.gov (United States)

    Degenhart, Alan D; Kelly, John W; Ashmore, Robin C; Collinger, Jennifer L; Tyler-Kabara, Elizabeth C; Weber, Douglas J; Wang, Wei

    2011-01-01

    This paper presents "Craniux," an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.

  6. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    Directory of Open Access Journals (Sweden)

    Alan D. Degenhart

    2011-01-01

    Full Text Available This paper presents “Craniux,” an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless workflow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.

  7. Choosing between different AI approaches? The scientific benefits of the confrontation, and the new collaborative era between humans and machines

    Directory of Open Access Journals (Sweden)

    Jordi Vallverdú

    2008-07-01

    Full Text Available AI is a multidisciplinary activity that involves specialists from several fields, and we can say that the aim of science, including AI, is solving problems. AI and computer science have been creating a new way of doing science, which we can call in silico science. Both the top-down and bottom-up models are useful for e-scientific research; there is no real controversy between them. Besides, the extended-mind model of human cognition involves human-machine interactions. Huge amounts of data require new ways to make and organize scientific practices: supercomputers, grids, distributed computing, specific software and middleware and, basically, more efficient and visual ways to interact with information. This is one of the key points for understanding contemporary relationships between humans and machines: the usability of scientific data.

  8. Practical guide to machine vision software an introduction with LabVIEW

    CERN Document Server

    Kwon, Kye-Si

    2014-01-01

    For both students and engineers in R&D, this book explains machine vision in a concise, hands-on way, using the Vision Development Module of the LabVIEW software by National Instruments. Following a short introduction to the basics of machine vision and the technical procedures of image acquisition, the book goes on to guide readers in the use of the various software functions of LabVIEW's machine vision module. It covers typical machine vision tasks, including particle analysis, edge detection, pattern and shape matching, and dimension measurements, as well as optical character recognition.
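
    The book's examples use LabVIEW's graphical Vision Development Module, which cannot be reproduced as text. As a rough textual analogue of the tasks listed above, the sketch below performs edge detection and simple particle analysis in Python with OpenCV; the input file name and thresholds are arbitrary assumptions.

        # Edge detection and particle (blob) analysis with OpenCV -- a
        # text-based stand-in for the LabVIEW vision functions described.
        import cv2

        image = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input
        blurred = cv2.GaussianBlur(image, (5, 5), 0)          # suppress sensor noise
        edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

        # Particle analysis: find connected contours and report their areas.
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for i, c in enumerate(contours):
            print(f"particle {i}: area={cv2.contourArea(c):.1f}")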

  9. Learning from open source software projects to improve scientific review.

    Science.gov (United States)

    Ghosh, Satrajit S; Klein, Arno; Avants, Brian; Millman, K Jarrod

    2012-01-01

    Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with many problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are neither identified nor acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time, and once submitted cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on portions of the article that reflect the reviewer's specialty or area of interest and place less of a burden on any one reviewer. Providing reviewers materials and methods to perform comprehensive evaluation would facilitate transparency, greater scrutiny, and replication of results. Acknowledging reviewers makes it possible to quantitatively assess reviewer contributions, which could be used to establish the impact of the reviewer in the scientific community. Quantifying review quality could help establish the importance of individual reviews and reviewers as well as the submitted article. Finally, we recommend expediting post-publication reviews and allowing for the dialog to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be implemented by adapting existing features from open-source software management and social networking technologies. We propose a model of an open, interactive review system that quantifies the significance of articles, the quality of reviews, and the reputation of reviewers.

  10. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. Given the continuing difficulty of quantifying the results of complex computations, it is of increasing importance to understand computation's role in the essentially Popperian scientific method. In this paper, some of the problems with computation are explored with numerous examples, for example the long-term, unquantifiable presence of undiscovered defects, problems with programming languages, and process issues. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, computer science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science this has not been so damaging, in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  11. Final Report for 'Center for Technology for Advanced Scientific Component Software'

    International Nuclear Information System (INIS)

    Shasharina, Svetlana

    2010-01-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.

  12. Software design of the hybrid robot machine for ITER vacuum vessel assembly and maintenance

    International Nuclear Information System (INIS)

    Li, Ming; Wu, Huapeng; Handroos, Heikki; Yang, Guangyou

    2013-01-01

    A specific software design is elaborated in this paper for the hybrid robot machine used for ITER vacuum vessel (VV) assembly and maintenance. In order, on the one hand, to provide multiple machining functions together with the complicated, flexible and customizable GUI design demanded by the non-standardized VV assembly process and, on the other hand, to guarantee the stringent machining precision required in the real-time motion control of the robot machine, a client–server-control software architecture is proposed, which separates the user interaction, data communication and robot control implementation into different software layers. Correspondingly, three particular application protocols upon TCP/IP are designed to transmit the data, commands and status between the client and the server so as to deal with the abundant data streaming in the software. In order not to be affected by future modification of the graphical user interface (GUI) during experiments in the VV assembly working field, the real-time control system is realized as a stand-alone module in the architecture to guarantee the control performance of the robot machine. After completing the software development, a milling operation was tested on the robot machine, and the result demonstrates that both the specific GUI operability and the real-time motion control performance are guaranteed adequately by the software design.
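
    The paper's own protocols are not reproduced in this record. The sketch below merely illustrates the general client-server split it describes: a GUI client sends a command to a control server over TCP and receives a status reply. The JSON framing and field names are invented for illustration.

        # Minimal client-server sketch (hypothetical message format).
        import json
        import socket
        import threading

        srv = socket.create_server(("127.0.0.1", 9000))

        def handle_one_request():
            conn, _ = srv.accept()
            with conn:
                request = json.loads(conn.recv(4096).decode())
                # A real server would pass this to the real-time motion module.
                reply = {"status": "ok", "echo": request["command"]}
                conn.sendall(json.dumps(reply).encode())

        threading.Thread(target=handle_one_request, daemon=True).start()

        with socket.create_connection(("127.0.0.1", 9000)) as client:
            client.sendall(json.dumps({"command": "start_milling"}).encode())
            print(json.loads(client.recv(4096).decode()))
        srv.close()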

  13. Software design of the hybrid robot machine for ITER vacuum vessel assembly and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ming, E-mail: Ming.Li@lut.fi [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Wu, Huapeng; Handroos, Heikki [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Yang, Guangyou [School of Mechanical Engineering, Hubei University of Technology, Wuhan (China)

    2013-10-15

    A specific software design is elaborated in this paper for the hybrid robot machine used for ITER vacuum vessel (VV) assembly and maintenance. In order, on the one hand, to provide multiple machining functions together with the complicated, flexible and customizable GUI design demanded by the non-standardized VV assembly process and, on the other hand, to guarantee the stringent machining precision required in the real-time motion control of the robot machine, a client–server-control software architecture is proposed, which separates the user interaction, data communication and robot control implementation into different software layers. Correspondingly, three particular application protocols upon TCP/IP are designed to transmit the data, commands and status between the client and the server so as to deal with the abundant data streaming in the software. In order not to be affected by future modification of the graphical user interface (GUI) during experiments in the VV assembly working field, the real-time control system is realized as a stand-alone module in the architecture to guarantee the control performance of the robot machine. After completing the software development, a milling operation was tested on the robot machine, and the result demonstrates that both the specific GUI operability and the real-time motion control performance are guaranteed adequately by the software design.

  14. Software architecture standard for simulation virtual machine, version 2.0

    Science.gov (United States)

    Sturtevant, Robert; Wessale, William

    1994-01-01

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all the simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.

  15. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development.

  16. Using Machine Learning for Risky Module Estimation of Safety-Critical Software

    International Nuclear Information System (INIS)

    Kim, Young Mi; Jeong, Choong Heui

    2009-01-01

    With the rapid development of digital computer and information processing technologies, nuclear I and C (Instrumentation and Control) systems, which require safety-critical functions, have adopted digital technologies. Software used in safety-critical systems must be highly dependable, and highly dependable software requires strict software testing and V and V activities. Regulatory demands for nuclear power plants are steadily increasing, but the human resources and time available for regulation are limited, so early prediction of risky software modules is very useful for software testing and regulation activities. An early estimate can be built from a collection of internal metrics during the early development phase. Internal metrics are measures of a product derived from assessment of the product itself, whereas external metrics are derived from assessment of the behavior of the system. Internal metrics can be collected more easily and earlier than external metrics, and they are useful for estimating fault-prone software modules using machine learning. In this paper, we introduce the current research status and techniques related to estimating risky software modules using machine learning. Section 2 describes the overview of the estimation model using machine learning, section 3 describes the processes of the estimation model, section 4 describes several estimation models using machine learning, and section 5 concludes the paper.
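
    As an illustration of the general idea rather than any specific model from the paper, the sketch below trains a classifier on synthetic internal metrics (lines of code, cyclomatic complexity, fan-out) to flag risky modules.

        # Estimating risky modules from internal metrics (synthetic data).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # One row per module: lines of code, cyclomatic complexity, fan-out.
        X = rng.uniform([50, 1, 0], [2000, 40, 25], size=(200, 3))
        # Synthetic ground truth: larger, more complex modules fail more often.
        y = (0.002 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 1, 200) > 4).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(X_train, y_train)
        print("held-out accuracy:", model.score(X_test, y_test))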

  17. Prototyping machine vision software on the World Wide Web

    Science.gov (United States)

    Karantalis, George; Batchelor, Bruce G.

    1998-10-01

    Interactive image processing is a proven technique for analyzing industrial vision applications and building prototype systems. Several of the previous implementations have used dedicated hardware to perform the image processing, with a top layer of software providing a convenient user interface. More recently, self-contained software packages have been devised that run on a standard computer. The advent of the Java programming language has made it possible to write platform-independent software operating over the Internet, or a company-wide Intranet. Thus there arises the possibility of designing at least some shop-floor inspection/control systems without the vision engineer ever entering the factories where they will be used. If successful, this project will have a major impact on the productivity of vision systems designers.

  18. Final Scientific/Technical Report for "Enabling Exascale Hardware and Software Design through Scalable System Virtualization"

    Energy Technology Data Exchange (ETDEWEB)

    Dinda, Peter August [Northwestern Univ., Evanston, IL (United States)

    2015-03-17

    This report describes the activities, findings, and products of the Northwestern University component of the "Enabling Exascale Hardware and Software Design through Scalable System Virtualization" project. The purpose of this project has been to extend the state of the art of systems software for high-end computing (HEC) platforms, and to use systems software to better enable the evaluation of potential future HEC platforms, for example exascale platforms. Such platforms, and their systems software, have the goal of providing scientific computation at new scales, thus enabling new research in the physical sciences and engineering. Over time, the innovations in systems software for such platforms also become applicable to more widely used computing clusters, data centers, and clouds. This was a five-institution project, centered on the Palacios virtual machine monitor (VMM) systems software, a project begun at Northwestern, and originally developed in a previous collaboration between Northwestern University and the University of New Mexico. In this project, Northwestern (including via our subcontract to the University of Pittsburgh) contributed to the continued development of Palacios, along with other team members. We took the leadership role in (1) continued extension of support for emerging Intel and AMD hardware, (2) integration and performance enhancement of overlay networking, (3) connectivity with architectural simulation, (4) binary translation, and (5) support for modern Non-Uniform Memory Access (NUMA) hosts and guests. We also took a supporting role in support for specialized hardware for I/O virtualization, profiling, configurability, and integration with configuration tools. The efforts we led (1-5) were largely successful and executed as expected, with code and papers resulting from them. The project demonstrated the feasibility of a virtualization layer for HEC computing, similar to such layers for cloud or datacenter computing.

  19. Evaluation of Machine Learning Methods for LHC Optics Measurements and Corrections Software

    CERN Document Server

    AUTHOR|(CDS)2206853; Henning, Peter

    The field of artificial intelligence is driven by the goal of providing machines with human-like intelligence. However, modern science currently faces problems of such complexity that humans cannot solve them on the same timescale as machines, so there is a demand for the automation of complex tasks. Identifying the categories of tasks that can be performed by machines in the domain of optics measurements and correction on the Large Hadron Collider (LHC) is one of the central research subjects of this thesis. Applications of machine learning methods and concepts from artificial intelligence can be found in various industrial and scientific branches. In High Energy Physics these concepts are mostly used in offline analysis of experimental data and to perform regression tasks. In Accelerator Physics the machine learning approach has not yet found wide application, so potential tasks for machine learning solutions can be specified in this domain. The appropriate methods and their suitability for...

  20. The Use of Open Source Software for Open Architecture System on CNC Milling Machine

    Directory of Open Access Journals (Sweden)

    Dalmasius Ganjar Subagio

    2012-03-01

    Full Text Available A computer numerical control (CNC) milling machine system cannot be separated from its software, which must meet the provisions of open architecture: portability, extendability, interoperability, and scalability. When the prescribed service period of a CNC milling machine has passed and the manufacturer decides to discontinue it, the user will have problems maintaining the performance of the machine. This paper aims to show that using open source software (OSS) is a way to maintain machine performance. With OSS, users no longer depend on software built by the manufacturer, because OSS is open and can be developed independently. In this paper, USBCNC V.3.42 is used as an alternative OSS. Testing shows that the machined workpiece matches the desired pattern, and that machines using OSS perform similarly to machines using the manufacturer's software.

  1. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  2. Interface unit and software of X-ray television automatic machine

    International Nuclear Information System (INIS)

    Molodykh, V.A.; Yamanaev, M.S.

    1983-01-01

    A description of the interface unit and the specialized software of an X-ray television automatic inspection machine is presented. An algorithm is proposed for automatic defect detection and for measuring defect geometric parameters, with a subsequent estimate of inspection quality in accordance with technical norms. Experimental investigation results on the quality of welded joints of steel tubes obtained using the above system are summarized.

  3. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-74,554] International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, CA; Notice of Affirmative Determination Regarding Application for Reconsideration By application dated November 29, 2010, a worker and a state workforce official...

  4. 76 FR 54800 - International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group, San...

    Science.gov (United States)

    2011-09-02

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-74,554] International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group, San Jose, California; Notice of Negative Determination on Reconsideration On January 21, 2011, the Department of Labor (Department) issued an Affirmative Determination Regarding...

  5. Instrumentino: An Open-Source Software for Scientific Instruments

    OpenAIRE

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C.

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of ...
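
    Instrumentino's own API is not given in this record. As a generic illustration of the kind of Arduino-based acquisition it wraps, the sketch below reads sensor lines over a serial port with pyserial; the port name and message format are assumptions.

        # Read newline-terminated sensor readings from an Arduino (pyserial).
        import serial  # pip install pyserial

        with serial.Serial("/dev/ttyACM0", baudrate=9600, timeout=1.0) as port:
            for _ in range(10):
                line = port.readline().decode(errors="replace").strip()
                if line:  # e.g. the Arduino sketch prints "temp=23.4"
                    print("received:", line)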

  6. Organizing Scientific Information with Mendeley© software

    OpenAIRE

    Vlčková Kateřina; Lojdová Kateřina; Mareš Jan

    2013-01-01

    The main goal of this workshop was to provide academics and doctoral students with the knowledge and skills necessary to efficiently organize and retrieve information using Mendeley© software. During the research process, the next step after finding information is to organize it. In this digital era, the skills we need to find information efficiently differ from those of the print era (Tuominen, 2007). The same applies to the subsequent process of organizing that information. Furthermore,...

  7. From scientific instrument to industrial machine : Coping with architectural stress in embedded systems

    NARCIS (Netherlands)

    Doornbos, R.; Loo, S. van

    2012-01-01

    Architectural stress is the inability of a system design to respond to new market demands. It is an important yet often concealed issue in high tech systems. In From scientific instrument to industrial machine, we look at the phenomenon of architectural stress in embedded systems in the context of a

  8. The Dostoevsky Machine in Georgetown: scientific translation in the Cold War.

    Science.gov (United States)

    Gordin, Michael D

    2016-04-01

    Machine Translation (MT) is now ubiquitous in discussions of translation. The roots of this phenomenon - first publicly unveiled in the so-called 'Georgetown-IBM Experiment' on 7 January 1954 - not only displayed the technological utopianism still associated with dreams of a universal computer translator, but were also deeply enmeshed in the political pressures of the Cold War and a dominant conception of scientific writing as both the goal of machine translation and its method. Machine translation was created, in part, as a solution to a perceived crisis sparked by the massive expansion of Soviet science. Scientific prose was also perceived as linguistically simpler, and so served as the model for how to turn a language into a series of algorithms. This paper follows the rise of the Georgetown program - the largest single program in the world - from 1954 to the (as it turns out, temporary) collapse of MT in 1964.

  9. Subcutaneous ICD screening with the Boston Scientific ZOOM programmer versus a 12-lead ECG machine.

    Science.gov (United States)

    Chang, Shu C; Patton, Kristen K; Robinson, Melissa R; Poole, Jeanne E; Prutkin, Jordan M

    2018-02-24

    The subcutaneous implantable cardioverter-defibrillator (S-ICD) requires preimplant screening to ensure appropriate sensing and reduce the risk of inappropriate shocks. Screening can be performed using either an ICD programmer or a 12-lead electrocardiogram (ECG) machine. It is unclear whether differences in signal filtering and digital sampling change the screening success rate. Subjects were recruited if they had a transvenous single-lead ICD without pacing requirements or were candidates for a new ICD. Screening was performed using both a Boston Scientific ZOOM programmer (Marlborough, MA, USA) and a General Electric MAC 5000 ECG machine (Fairfield, CT, USA). A pass was defined as having at least one lead that fit within the screening template in both supine and sitting positions. A total of 69 subjects were included, and 27 sets of ECG leads (7%) had differing screening results between the two machines. Of these sets, 22 (81%) passed using the ECG machine but failed using the programmer, and five (19%) passed using the programmer but failed using the ECG machine. No subject passed screening with the programmer but failed with the ECG machine. There can be occasional disagreement in S-ICD patient screening between an ICD programmer and an ECG machine; in this study, every subject with discordant results passed with the ECG machine but failed with the programmer. On a per-lead basis, the ECG machine passes more leads. It is unknown what the inappropriate shock rate would be if an S-ICD was implanted. Clinical judgment should be used in borderline cases. © 2018 Wiley Periodicals, Inc.

  10. From scientific instrument to industrial machine coping with architectural stress in embedded systems

    CERN Document Server

    Doornbos, Richard

    2012-01-01

    Architectural stress is the inability of a system design to respond to new market demands. It is an important yet often concealed issue in high tech systems. In From scientific instrument to industrial machine, we look at the phenomenon of architectural stress in embedded systems in the context of a transmission electron microscope system built by FEI Company. Traditionally, transmission electron microscopes are manually operated scientific instruments, but they also have enormous potential for use in industrial applications. However, this new market has quite different characteristics. There are strong demands for cost-effective analysis, accurate and precise measurements, and ease-of-use. These demands can be translated into new system qualities, e.g. reliability, predictability and high throughput, as well as new functions, e.g. automation of electron microscopic analyses, automated focusing and positioning functions. From scientific instrument to industrial machine takes a pragmatic approach to the problem ...

  11. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system that is well suited to solving problems in various complex scientific domains. Starting with a review of the early days of RSYST, we describe its evolution, driven by the need for a software environment that combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific and technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the power and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system used for scientific and technical training and documentation is presented. (orig.) [de]

  12. Software and man-machine interface considerations for a nuclear plant computer replacement and upgrade project

    International Nuclear Information System (INIS)

    Diamond, G.; Robinson, E.

    1984-01-01

    Some of the key software functions and Man-Machine Interface considerations in a computer replacement and upgrade project for a nuclear power plant are described. The project involves the installation of two separate computer systems: an Emergency Response Facilities Computer System (ERFCS) and a Plant Process Computer System (PPCS). These systems employ state-of-the-art computer hardware and software. The ERFCS is a new system intended to provide enhanced functions to meet NRC post-TMI guidelines. The PPCS is intended to replace and upgrade an existing obsolete plant computer system. A general overview of the hardware and software aspects of the replacement and upgrade is presented. The work done to develop the upgraded Man-Machine Interface is described. For the ERFCS, a detailed discussion is presented of the work done to develop logic to evaluate the readiness and performance of safety systems and their supporting functions. The Man-Machine Interface considerations of reporting readiness and performance to the operator are discussed. Finally, the considerations involved in implementing this logic in real-time software are discussed. For the PPCS, a detailed discussion is presented of some new features.

  13. Software support for students engaging in scientific activity and scientific controversy

    Science.gov (United States)

    Cavalli-Sforza, Violetta; Weiner, Arlene W.; Lesgold, Alan M.

    Computer environments could support students in engaging in cognitive activities that are essential to scientific practice and to the understanding of the nature of scientific knowledge, but that are difficult to manage in science classrooms. The authors describe a design for a computer-based environment to assist students in conducting dialectical activities of constructing, comparing, and evaluating arguments for competing scientific theories. Their choice of activities and their design respond to educators' and theorists' criticisms of current science curricula. They give detailed specifications of portions of the environment.

  14. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied, and two machine-learning-based control plane intrusion detection techniques are proposed for SDON, with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulation. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
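
    The paper's exact features and models are not given in this record. As a rough sketch of such a technique, the code below trains a classifier on synthetic control-plane requests described by the two features named above.

        # Flagging suspicious control-plane requests (synthetic data).
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        normal = np.column_stack([rng.normal(10, 2, 300),    # bandwidth (Gb/s)
                                  rng.normal(5, 1.5, 300)])  # route length (hops)
        attack = np.column_stack([rng.normal(40, 5, 30),     # abnormal demands
                                  rng.normal(12, 2, 30)])
        X = np.vstack([normal, attack])
        y = np.array([0] * 300 + [1] * 30)

        clf = make_pipeline(StandardScaler(), SVC()).fit(X, y)
        print(clf.predict([[38.0, 11.0], [9.5, 4.0]]))  # expected: [1 0]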

  15. NetiNeti: discovery of scientific names from text using machine learning methods

    Directory of Open Access Journals (Sweden)

    Akella Lakshmi

    2012-08-01

    Full Text Available Abstract Background A scientific name for an organism can be associated with almost all biological data. Name identification is an important step in many text mining tasks aiming to extract useful information from biological, biomedical and biodiversity text sources. A scientific name acts as an important metadata element to link biological information. Results We present NetiNeti (Name Extraction from Textual Information-Name Extraction for Taxonomic Indexing), a machine learning based approach for recognition of scientific names, including the discovery of new species names, from text, which also handles misspellings, OCR errors and other variations in names. The system generates candidate names using rules for scientific names and applies probabilistic machine learning methods to classify names based on structural features of candidate names and features derived from their contexts. NetiNeti can also disambiguate scientific names from other names using contextual information. We evaluated NetiNeti on legacy biodiversity texts and biomedical literature (MEDLINE). NetiNeti performs better (precision = 98.9% and recall = 70.5%) compared to a popular dictionary based approach (precision = 97.5% and recall = 54.3%) on a 600-page biodiversity book that was manually marked by an annotator. On a small set of PubMed Central’s full text articles annotated with scientific names, the precision and recall values are 98.5% and 96.2% respectively. NetiNeti found more than 190,000 unique binomial and trinomial names in more than 1,880,000 PubMed records when used on the full MEDLINE database. NetiNeti also successfully identifies almost all of the new species names mentioned within web pages. Conclusions We present NetiNeti, a machine learning based approach for identification and discovery of scientific names. The system implementing the approach can be accessed at http://namefinding.ubio.org.
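
    As a toy illustration of the candidate-generation step (NetiNeti's actual rules and classifier are far more elaborate), a simple pattern can propose binomial-name candidates, which a probabilistic classifier would then accept or reject using structural and contextual features.

        # Rule-based candidate generation for binomial names (toy example).
        import re

        # Capitalized genus followed by a lowercase epithet of 3+ letters.
        BINOMIAL = re.compile(r"\b([A-Z][a-z]+)\s([a-z]{3,})\b")

        text = "Specimens of Homo sapiens and Puma concolor were compared."
        candidates = [" ".join(m) for m in BINOMIAL.findall(text)]
        print(candidates)  # candidates only; a classifier must still filter
                           # false positives such as sentence-initial pairs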

  16. Learning from open source software projects to improve scientific review

    Directory of Open Access Journals (Sweden)

    Satrajit S Ghosh

    2012-04-01

    Full Text Available Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with many problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are not acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time, and once submitted cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on portions of the article that reflect the reviewer’s specialty or area of interest and place less of a burden on any one reviewer, enabling a more comprehensive and timely review. Providing reviewers materials and methods to perform comprehensive evaluation would facilitate transparency, replication of results and greater scrutiny by people from different fields using different nomenclature, leading to greater clarity and cross-fertilization of ideas. Acknowledging reviewers makes it possible to quantitatively assess reviewer contributions, which could be integrated with assessments for promotions and grants. Quantifying review quality could help establish the importance of reviewers and of the information generated during a review, and assess the importance of a submitted article. Finally, we recommend expediting post-publication reviews and allowing the dialogue to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be addressed by building upon computer programming code management systems. In this article, we provide examples of current code review systems that offer opportunities for addressing the above problems, and offer suggestions for enhancing code review systems for scientific review.

  17. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  18. An application of machine learning to the organization of institutional software repositories

    Science.gov (United States)

    Bailin, Sidney; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.

  19. Rapid development of scalable scientific software using a process oriented approach

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2011-01-01

    Scientific applications are often not written with multiprocessing, cluster computing or grid computing in mind. This paper suggests using Python and PyCSP to structure scientific software through Communicating Sequential Processes. Three scientific applications are used to demonstrate the features of PyCSP and how networks of processes may easily be mapped into a visual representation for better understanding of the process workflow. We show that for many sequential solutions, the difficulty in implementing a parallel application is removed. The use of standard multi-threading mechanisms ...
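
    PyCSP's own process and channel API is not reproduced here; the sketch below conveys the same CSP-style structuring (processes communicating only over channels) using just the Python standard library.

        # CSP-style producer/consumer with the standard library.
        from multiprocessing import Process, Queue

        def producer(channel: Queue) -> None:
            for i in range(5):
                channel.put(i * i)  # send a result over the channel
            channel.put(None)       # signal end of stream

        def consumer(channel: Queue) -> None:
            while (item := channel.get()) is not None:
                print("received", item)

        if __name__ == "__main__":
            ch = Queue()
            procs = [Process(target=producer, args=(ch,)),
                     Process(target=consumer, args=(ch,))]
            for p in procs:
                p.start()
            for p in procs:
                p.join()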

  20. Flexible software architecture for user-interface and machine control in laboratory automation.

    Science.gov (United States)

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  1. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  2. Machine Learning Approach for Software Reliability Growth Modeling with Infinite Testing Effort Function

    Directory of Open Access Journals (Sweden)

    Subburaj Ramasamy

    2017-01-01

    Full Text Available Reliability is one of the quantifiable software quality attributes. Software Reliability Growth Models (SRGMs) are used to assess the reliability achieved at different times of testing. Traditional time-based SRGMs may not be accurate enough in all situations where test effort varies with time. To overcome this lacuna, test effort has been used instead of time in SRGMs. In the past, finite test effort functions were proposed, which may not be realistic, as at infinite testing time the test effort should also be infinite. Hence in this paper we propose an infinite test effort function (TEF) in conjunction with a classical Nonhomogeneous Poisson Process (NHPP) model. We use an Artificial Neural Network (ANN) for training the proposed model with software failure data. It is possible to obtain a large set of weights for the same model that describe the past failure data equally well; we use a machine learning approach to select the appropriate set of weights that describes both the past and the future data well. We compare the performance of the proposed model with existing models using practical software failure data sets. The proposed log-power TEF-based SRGM describes all types of failure data equally well, improves the accuracy of parameter estimation over existing TEFs, and can be used for software release time determination as well.
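
    The paper's exact log-power TEF and ANN training procedure are not given in this record. As a sketch of the underlying model-fitting idea, the code below fits an NHPP mean-value function m(t) = a(1 - exp(-b W(t))) with an assumed log-power test effort W(t) = (ln(1 + t))^beta to synthetic cumulative failure counts.

        # Fitting an NHPP SRGM with an assumed test-effort function.
        import numpy as np
        from scipy.optimize import curve_fit

        def mean_failures(t, a, b, beta):
            effort = np.log1p(t) ** beta           # assumed log-power TEF
            return a * (1.0 - np.exp(-b * effort))

        t = np.arange(1, 21, dtype=float)          # weeks of testing
        observed = np.array([5, 9, 14, 17, 21, 24, 26, 29, 31, 32,
                             34, 35, 36, 37, 38, 38, 39, 39, 40, 40], float)

        params, _ = curve_fit(mean_failures, t, observed, p0=[45, 0.5, 1.5])
        print(dict(zip(["a", "b", "beta"], params.round(3))))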

  3. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compare their power consumption and performance.

  4. A Collaboration Model for Community-Based Software Development with Social Machines

    Directory of Open Access Journals (Sweden)

    Dave Murray-Rust

    2016-02-01

    Full Text Available Crowdsourcing is generally used for tasks with minimal coordination, providing limited support for dynamic reconfiguration. Modern systems, exemplified by social machines, are subject to continual flux in both the client and development communities and their needs. To support crowdsourcing of open-ended development, systems must dynamically integrate human creativity with machine support. While workflows can be used to handle structured, predictable processes, they are less suitable for social machine development and its attendant uncertainty. We present models and techniques for coordination of human workers in crowdsourced software development environments. We combine the Social Compute Unit—a model of ad-hoc human worker teams—with versatile coordination protocols expressed in the Lightweight Social Calculus. This allows us to combine coordination and quality constraints with dynamic assessments of end-user desires, dynamically discovering and applying development protocols.

  5. Visualization of scientific data for high energy physics: PAW, a general-purpose portable software tool for data analysis and presentation

    International Nuclear Information System (INIS)

    Brun, R.; Couet, O.; Vandoni, C.E.; Zanarini, P.

    1990-01-01

    Visualization of scientific data, although a fashionable term in the world of computer graphics, is not a new invention but is hundreds of years old. With the advent of computer graphics, the visualization of scientific data has become a well-understood and widely used technology, with hundreds of applications in very different fields, ranging from media applications to real scientific ones. In the present paper, we discuss the design concepts of systems for the visualization of scientific data, in the specific field of High Energy Physics. During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to High Energy Physics (HEP). The results of integrating resources from many different laboratories can be expressed in several million lines of code written at CERN during this period, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, in which man-machine interaction and graphics play a key role (PAW, Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years. 6 figs

  6. Techniques and tools for measuring energy efficiency of scientific software applications

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Pestana, Gonçalo; Khan, Kashif; Nurminen, Jukka K; Nyback, Filip; Ou, Zhonghong

    2015-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compare their power consumption and performance. We leverage several profiling tools (both in hardware and software) to extract different characteristics of the power use. We report the results of these measurements and the experience gained in developing a set of measurement techniques and profiling tools to accurately assess the power consumption for scientific workloads. (paper)

  7. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    Science.gov (United States)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software development exists fundamentally for the results it creates; the core research must remain the focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with that of building software sustainably; short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines while building in the support structure necessary for the code to live far beyond its original milestones. Agile and lean software development methodologies, including Scrum, Kanban, Continuous Delivery and Test-Driven Development, have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as on promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how it addresses the short-term-versus-sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.

  8. Turning men into machines? Scientific management, industrial psychology, and the "human factor".

    Science.gov (United States)

    Derksen, Maarten

    2014-01-01

    In the controversy that broke out in 1911 over Frederick W. Taylor's scientific management, many critics contended that it ignored "the human factor" and reduced workers to machines. Psychologists succeeded in positioning themselves as experts of the human factor, and their instruments and expertise as the necessary complement of Taylor's psychologically deficient system. However, the conventional view that the increasing influence of psychologists and other social scientists "humanized" management theory and practice needs to be amended. Taylor's scientific management was not less human than later approaches such as Human Relations, but it articulated the human factor differently, and aligned it to its own instruments and practices in such a way that it was at once external to them and essential to their functioning. Industrial psychologists, on the other hand, at first presented themselves as engineers of the human factor and made the human mind an integral part of management. © 2014 Wiley Periodicals, Inc.

  9. Hardware and software and machine-tool simulation with parallel structures mechanisms

    Directory of Open Access Journals (Sweden)

    Keba P.V.

    2016-12-01

    Full Text Available The range of applications of parallel-structure mechanisms is continually expanding. The mechanisms of machine tools and manipulators are becoming more complicated, and it is necessary to improve their program-controlled modules. Closed-chain mechanisms are most widespread in robotic complexes, where the manipulator performs complicated spatial movements along a given trajectory. The range of uses is very wide, the most popular being sorting, welding and assembling. However, the problem of designing the control programs is still present today, because existing post-processors are created for the equipment currently available, while new machine-tool constructions appear every day and must be controlled. The problems associated with using hardware and software for mechanisms with parallel structure in computer-aided simulation are considered. A program for solving the inverse kinematics problem is designed, and a new method of designing the control programs is found. The kinematic analysis methods and calculated data obtained with computer mathematics systems are shown, with the «Tools Glide» software taken as an example.
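
    As a small worked example of inverse kinematics for a parallel-structure mechanism (the geometry is invented), the pose of a planar 3-RPR platform maps directly to its three actuated leg lengths.

        # Inverse kinematics of a planar 3-RPR parallel mechanism.
        import numpy as np

        base = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.5]])  # fixed anchors
        platform_local = np.array([[-0.5, -0.3], [0.5, -0.3], [0.0, 0.4]])

        def leg_lengths(x, y, phi):
            """Leg lengths for a platform centred at (x, y), rotated by phi."""
            c, s = np.cos(phi), np.sin(phi)
            R = np.array([[c, -s], [s, c]])
            world = platform_local @ R.T + np.array([x, y])  # platform anchors
            return np.linalg.norm(world - base, axis=1)

        print(leg_lengths(2.0, 1.5, 0.1).round(3))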

  10. Provenance tracking for scientific software toolchains through on-demand release and archiving

    Science.gov (United States)

    Ham, David

    2017-04-01

    There is an emerging consensus that published computational science results must be backed by a provenance chain tying results to the exact versions of the input data and the code which generated them. There is also now an impressive range of web services devoted to revision control of software, and to the archiving in citeable form of both software and input data. However, much scientific software itself builds on libraries and toolkits, and these themselves have dependencies. Further, it is common for cutting-edge research to depend on the latest version of software in online repositories, rather than the official release version. This creates a situation in which an author who wishes to follow best practice in recording the provenance chain of their results must archive and cite unreleased versions of a series of dependencies. Here, we present an alternative which toolkit authors can easily implement to provide a semi-automatic mechanism for creating and archiving custom software releases of the precise version of a package used in a particular simulation. This approach leverages the excellent services provided by GitHub and Zenodo to generate a connected set of citeable DOIs for the archived software. We present the integration of this workflow into the Firedrake automated finite element framework as an example of this approach on a complex geoscientific tool chain in practical use.
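
    Independent of the GitHub and Zenodo services named above, the core of the idea can be sketched in a few lines: stamp every result with the exact commit of the driving code and the versions of its dependencies. Paths and package names below are illustrative.

        # Record a provenance stamp next to simulation results.
        import json
        import subprocess
        from importlib.metadata import version

        def provenance_record(packages):
            # Assumes the driving code lives in a git working copy.
            sha = subprocess.check_output(["git", "rev-parse", "HEAD"],
                                          text=True).strip()
            return {"code_commit": sha,
                    "dependencies": {p: version(p) for p in packages}}

        record = provenance_record(["numpy", "scipy"])
        with open("provenance.json", "w") as fh:
            json.dump(record, fh, indent=2)  # archive alongside the results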

  11. Virtual reality devices integration in scientific visualization software in the VtkVRPN framework

    International Nuclear Information System (INIS)

    Journe, G.; Guilbaud, C.

    2005-01-01

    High-quality scientific visualization software relies on ergonomic navigation and exploration, which are essential for efficient data analysis. To help address this issue, management of virtual reality devices has been developed inside the CEA 'VtkVRPN' framework. This framework is based on VTK, a 3D graphics library, and VRPN, a virtual reality device management library. This document describes the developments carried out during a post-graduate training course. (authors)

  12. Software protocol design: Communication and control in a multi-task robot machine for ITER vacuum vessel assembly and maintenance

    International Nuclear Information System (INIS)

    Li, Ming; Wu, Huapeng; Handroos, Heikki; Yang, Guangyou; Wang, Yongbo

    2015-01-01

    Highlights: • A high-level protocol is proposed for data inter-transmission. • The protocol design is task-oriented for robot control in the software system. • The protocol functions as middleware in the software. • The protocol, running stand-alone as an independent process in the software, provides greater security. • A reference protocol design is provided for multi-task robot machines in industry. - Abstract: A specific communication and control protocol for the software design of a multi-task robot machine is proposed. In order to fulfill the requirements of the complicated multi-machining functions and of high-performance motion control, the software design of the robot is divided into two main parts accordingly: the user-oriented HMI part and the robot-control-oriented real-time control system. The two parts of the software are deployed on different hardware for run-time performance, which forms a client–server-control architecture. Therefore a high-level task-oriented protocol is designed for the data intercommunication between the HMI part and the control system part, in which all the data related to a machining task is divided into three categories: trajectory-oriented data, task-control-oriented data and status-monitoring-oriented data. The protocol consists of three corresponding sub-protocols – a trajectory protocol, a task control protocol and a status protocol – which are deployed over Ethernet and run as independent processes in both the client and server computers. The protocols are able to manage the vast amounts of data streaming due to the multiple machining functions in an efficient way. Since the protocol functions as middleware in the software, providing the data interface standards for the two groups developing the two parts of the software, it also permits each group to focus more on its own requirements-oriented design.
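
    As a small illustration of the three-category split described above (the paper's actual message formats are not given in this record), messages can be tagged with their category before being routed to the corresponding sub-protocol; the field names here are assumptions.

        # Tagged messages for the three sub-protocols (hypothetical format).
        import json
        from dataclasses import asdict, dataclass
        from enum import Enum

        class Category(str, Enum):
            TRAJECTORY = "trajectory"  # trajectory-oriented data
            TASK = "task_control"      # task control-oriented data
            STATUS = "status"          # status monitoring-oriented data

        @dataclass
        class Message:
            category: Category
            payload: dict

            def encode(self) -> bytes:
                return json.dumps(asdict(self)).encode()

        msg = Message(Category.TASK, {"action": "start", "task_id": 7})
        print(msg.encode())  # each category travels on its own sub-protocol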

  13. Software protocol design: Communication and control in a multi-task robot machine for ITER vacuum vessel assembly and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ming, E-mail: ming.li@lut.fi [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Wu, Huapeng; Handroos, Heikki [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Yang, Guangyou [School of Mechanical Engineering, Hubei University of Technology, Wuhan (China); Wang, Yongbo [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland)

    2015-10-15

    Highlights: • A high-level protocol is proposed for data interchange between the software subsystems. • The protocol design is task-oriented for robot control in the software system. • The protocol functions as middleware in the software. • Running stand-alone as an independent process, the protocol provides greater security. • It provides a reference protocol design for multi-task robot machines in industry. - Abstract: A specific communication and control protocol for the software design of a multi-task robot machine is proposed. To fulfill the requirements of the complicated multi-machining functions and high-performance motion control, the robot software is divided into two main parts: a user-oriented HMI part and a robot-control-oriented real-time control system. The two parts are deployed on different hardware for run-time performance, forming a client–server control architecture. A high-level task-oriented protocol is therefore designed for data intercommunication between the HMI part and the control system part, in which all data transmitted for a machining task are divided into three categories: trajectory-oriented data, task-control-oriented data and status-monitoring-oriented data. The protocol consists of three corresponding sub-protocols – a trajectory protocol, a task control protocol and a status protocol – which are deployed over Ethernet and run as independent processes on both the client and server computers. The protocols manage the large data streams produced by the multiple machining functions efficiently. Since the protocol functions as middleware in the software and provides the data interface standards for the two development groups, it also permits the developers of each part to focus on their own requirements-oriented design.

  14. The Text in the Machine: American Copyright Law and the Many Natures of Software, 1974-1978.

    Science.gov (United States)

    Díaz, Gerardo Con

    This article is a case study in the history of software copyright in the United States from 1974 to 1978. It focuses on the work of a group called the National Commission on New Technological Uses of Copyrighted Works. CONTU, as this group was known, faced the problem of choosing which ontology of software (by which I mean a conception of the nature of software as an invention) should serve as the conceptual underpinning for the law of software copyright. In particular, the commissioners needed to decide whether computer programs are texts, machines, means to communicate with machines, or many of these things at once. CONTU's history shows how the discursive emergence of software as a new technology has been shaped by the convergence of commercial interests, the transmission of technical knowledge to lay audiences, and idiosyncratic views on the nature of information technology and human creativity.

  15. Developing a software for tracking the memory states of the machines in the LHCb Filter Farm

    CERN Document Server

    Jain, Harshit

    2017-01-01

    The LHCb Event Filter Farm consists of more than 1500 server nodes with roughly 65 TB of operating memory in total. This memory is crucial for the success of the LHCb experiment, since the proton-proton collisions are temporarily stored on these memory modules. Unfortunately, the aging nodes of the server farm occasionally lose memory modules, and the less memory is available, the lower the achievable performance. Relying on users or administrators to notice such losses is inefficient, so the check needs to be automated. The aim of this project was to develop software to monitor a set of test machines. The software first records the data of the memory sticks in a database for future reference, then checks the memory sticks at a later time to find any failures. In the case of such losses, the software looks up the database to determine which memory sticks have been lost and displays all information about those sticks in a log fi...
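
    The report gives no implementation details, but the core check it describes (record a baseline of installed modules, then diff later readings against it) might be sketched as follows; the reliance on dmidecode and the baseline file name are assumptions:

      import json
      import subprocess

      def memory_modules():
          """Map each DIMM slot locator to its reported size, via dmidecode."""
          out = subprocess.run(["dmidecode", "--type", "memory"],
                               capture_output=True, text=True, check=True).stdout
          modules = {}
          for block in out.split("Memory Device")[1:]:
              size = locator = None
              for line in block.splitlines():
                  line = line.strip()
                  if line.startswith("Size:"):
                      size = line.split(":", 1)[1].strip()
                  elif line.startswith("Locator:"):
                      locator = line.split(":", 1)[1].strip()
              if locator:
                  modules[locator] = size
          return modules

      def report_failures(baseline_file="memory_baseline.json"):
          """Compare current readings with the stored baseline and flag changes."""
          with open(baseline_file) as fh:
              baseline = json.load(fh)
          current = memory_modules()
          for slot, size in baseline.items():
              if current.get(slot) != size:
                  print(f"ALERT {slot}: baseline {size}, now {current.get(slot)}")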

  16. Gendermetrics.NET: a novel software for analyzing the gender representation in scientific authoring.

    Science.gov (United States)

    Bendels, Michael H K; Brüggmann, Dörthe; Schöffel, Norman; Groneberg, David A

    2016-01-01

    Imbalances in female career promotion are believed to be strong in the field of academic science. A primary parameter for analyzing gender inequalities is gender authorship in scientific publications. Since the presently available data on gender distribution are largely limited to underpowered studies, we here develop a new approach to analyzing authors' genders in large bibliometric databases. A SQL-Server-based multiuser software suite was developed that serves as an integrative tool for analyzing bibliometric data, with a special emphasis on gender and topographical analysis. The presented system allows seamless integration, inspection, modification, evaluation and visualization of bibliometric data. By providing an adaptive and almost fully automatic integration and analysis process, the inter-individual variability of analysis is kept at a low level. Depending on the scientific question, the system enables the user to perform a scientometric analysis, including its visualization, within a short period of time. In summary, a new software suite for analyzing gender representation in scientific articles was established. The system is suitable for the comparative analysis of scientific structures at the level of continents, countries, cities, city regions, institutions, research fields and journals.

  17. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    International Nuclear Information System (INIS)

    Smith, W. Spencer; Koothoor, Mimitha

    2016-01-01

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows for documenting of numerical algorithms and code together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification

  18. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, W. Spencer; Koothoor, Mimitha [Computing and Software Department, McMaster University, Hamilton (Canada)

    2016-04-15

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows for documenting of numerical algorithms and code together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification.

  19. Software and Human-Machine Interface Development for Environmental Controls Subsystem Support

    Science.gov (United States)

    Dobson, Matthew

    2018-01-01

    The Space Launch System (SLS) is NASA's next premier launch vehicle. It is the next stage of manned space exploration from American soil and will be the platform from which we push further beyond Earth orbit. In preparation for the SLS maiden voyage on Exploration Mission 1 (EM-1), the existing ground support architecture at Kennedy Space Center required a significant overhaul. A comprehensive upgrade of control systems was necessary, including programmable logic controller software as well as Launch Control Center (LCC) firing room and local launch pad displays for technician use. Environmental control is an integral component of these systems, being the foremost system for conditioning the pad and the extremely sensitive launch vehicle until T-0. The Environmental Controls Subsystem (ECS) required testing and modification to meet the requirements of the designed system, as well as the human factors requirements of NASA software Validation and Verification (V&V). This term saw significant strides in the functionality of the human-machine interfaces used at the launch pad and in their integration with the controller code.

  20. Civacuve analysis software for mis machine examination of pressurized water reactor vessels

    International Nuclear Information System (INIS)

    Dubois, Ph.; Gagnor, A.

    2001-01-01

    The CIVACUVE software is used by INTERCONTROLE for the analysis of detection UT examinations performed by the In-Service Inspection Machine (MIS) on the vessels of nuclear power plants. The software is based on an adaptation of a SEGMENTATION algorithm (CEA CEREM), which is applied prior to any analysis, and is equipped with tools adapted to industrial use. It allows the user to: perform image analysis with advanced graphic tools (zooms, true Bscan, 'contour' selection...); back up all data in a database (a complete and transparent backup of all information used and obtained during the different analysis operations); connect PCs to the database (export of reports and even of segmented points); issue examination reports, operating condition sheets, sizing curves...; and, finally, perform graphic and numerical comparisons between different inspections of the same vessel. Used in Belgium and France on different kinds of reactor vessels, CIVACUVE has shown that the principle of SEGMENTATION can be adapted to detection exams. The use of CIVACUVE brings significant time savings as well as improved analysis quality, and its broad openness of data toward PCs allows real flexibility with regard to clients' requirements and concerns.

  1. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    Science.gov (United States)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2014-02-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding
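
    The study's open-source scorer is not reproduced here, but the general technique (supervised text classification over human-labeled explanations) can be illustrated with a small scikit-learn pipeline; the training sentences and labels below are invented for the sketch:

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Invented examples: label 1 if the explanation invokes heritable
      # variation and differential survival, 0 if it is naive/teleological.
      texts = ["variation exists and the fittest survive to reproduce",
               "the animals wanted to change so they did",
               "heritable variation plus differential survival shifts the population",
               "they adapt because they need to"]
      labels = [1, 0, 1, 0]

      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
      model.fit(texts, labels)
      print(model.predict(["individuals with helpful traits leave more offspring"]))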

  2. PRAPRAG: software para planejamento racional de máquinas agrícolas [PRAPRAG: software for rational planning of agricultural machines]

    Directory of Open Access Journals (Sweden)

    Erivelto Mercante

    2010-04-01

    The PRAPRAG software is a tool for choosing the agricultural machines and implements that present the lowest cost per area or per quantity produced, and for planning machine acquisition for a farm from both technical and economic points of view. It was written in the Borland Delphi 3.0 programming language; from manufacturers' brochures, a database was created in which the user can register machines and implements and modify their characteristics of use. The software proved to be a useful and user-friendly tool. It brings speed, safety and reliability to the productive and economic management of farms, in the selection and acquisition of mechanized agricultural sets and in determining the costs of the labor employed.
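
    The article does not give PRAPRAG's cost equations, but the comparison it automates (fixed plus variable machine costs reduced to a cost per worked hectare) can be sketched as follows; the cost model and every parameter value here are illustrative assumptions:

      def hourly_cost(price, salvage, life_h, annual_h, interest_rate,
                      repair_factor, fuel_lph, fuel_price, wage):
          """Rough fixed-plus-variable cost of one machine set per worked hour."""
          depreciation = (price - salvage) / life_h
          interest = (price + salvage) / 2 * interest_rate / annual_h
          repairs = price * repair_factor / life_h
          return depreciation + interest + repairs + fuel_lph * fuel_price + wage

      def cost_per_hectare(hourly, field_capacity_ha_per_h):
          return hourly / field_capacity_ha_per_h

      # Pick the cheaper of two tractor/implement sets for the same operation.
      set_a = cost_per_hectare(hourly_cost(90000, 18000, 12000, 800, 0.06,
                                           0.9, 12, 1.4, 8), 1.6)
      set_b = cost_per_hectare(hourly_cost(70000, 14000, 10000, 800, 0.06,
                                           1.0, 10, 1.4, 8), 1.1)
      print(min(("set A", set_a), ("set B", set_b), key=lambda t: t[1]))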

  3. Enhancing Seismic Calibration Research Through Software Automation and Scientific Information Management

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S D; Dodge, D A; Ganzberger, M D; Harris, D B; Hauk, T F

    2009-07-07

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Development (GNEMRD) Program at LLNL continues to make significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. New information management and analysis tools have resulted in demonstrated gains in the efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. In contrast to previous years, software development work this past year has emphasized automation at the data ingestion level. This change reflects a gradually changing emphasis in our program from processing a few large data sets that result in a single integrated delivery to processing many different data sets from a variety of sources. The increase in the number of sources has resulted in a large increase in the amount of metadata relative to the final volume of research products. Software developed this year addresses the problems of: (1) efficient metadata ingestion and conflict resolution; (2) automated ingestion of bulletin information; (3) automated ingestion of waveform information from global data centers; and (4) the site metadata and response transformations required for certain products. This year, we also made a significant step forward in meeting a long-standing goal of developing and using a waveform correlation framework. Our objective for such a framework is to extract additional calibration data (e.g. mining blasts) and to study the extent to which correlated seismicity can be found in global and regional scale environments.
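
    A waveform-correlation framework of this sort rests on a simple primitive: the peak normalized cross-correlation between equal-length traces recorded at a common station. A numpy sketch (the 0.8 threshold and the traces-as-dictionary layout are assumptions):

      import numpy as np

      def normalized_xcorr_max(a, b):
          """Peak normalized cross-correlation of two equal-length traces."""
          a = (a - a.mean()) / (a.std() * len(a))
          b = (b - b.mean()) / b.std()
          return float(np.abs(np.correlate(a, b, mode="full")).max())

      def correlated_pairs(traces, threshold=0.8):
          """Return event-id pairs whose waveforms correlate above threshold,
          e.g. repeating mining blasts usable as extra calibration data."""
          ids = list(traces)
          return [(i, j) for k, i in enumerate(ids) for j in ids[k + 1:]
                  if normalized_xcorr_max(traces[i], traces[j]) >= threshold]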

  4. nanoHUB.org: Experiences and Challenges in Software Sustainability for a Large Scientific Community

    Directory of Open Access Journals (Sweden)

    Lynn Zentner

    2014-07-01

    The science gateway nanoHUB.org, funded by the National Science Foundation (NSF), serves a large scientific community dedicated to research and education in nanotechnology with community-contributed simulation codes as well as a vast repository of other materials such as recorded presentations, teaching materials, workshops and courses. Nearly 330,000 users annually access over 4400 items of content on nanoHUB, including 343 simulation tools. Arguably the largest nanotechnology facility in the world, nanoHUB has led the way not only in providing open access to scientific code in the nanotechnology community, but also in lowering barriers to the use of that code, by providing a platform where developers can easily and quickly deploy code written in a variety of languages with user-friendly graphical user interfaces, and where users can run the latest versions of codes transparently on the grid or other powerful resources without ever having to download or update code. Being a leader in open-access code deployment provides nanoHUB with opportunities and challenges as it meets the current and future needs of its community. This paper discusses the experiences of nanoHUB in addressing and adapting to the changing landscape of scientific software in ways that best serve its community and meet the needs of the largest portion of its user base.

  5. ENHANCING SEISMIC CALIBRATION RESEARCH THROUGH SOFTWARE AUTOMATION AND SCIENTIFIC INFORMATION MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S; Dodge, D A; Ganzberger, M D; Hauk, T F; Matzel, E M

    2008-07-03

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Development (GNEMRD) Program at LLNL continues to make significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. New information management and analysis tools have resulted in demonstrated gains in the efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. The foundation of a robust, efficient data development and processing environment is composed of many components built upon versatile, engineered libraries. We incorporate proven industry 'best practices' throughout our code and apply source code and bug tracking management, as well as automatic generation and execution of unit tests, to our experimental, development and production lines. Significant software engineering and development efforts have produced an object-oriented framework that provides database-centric coordination between scientific tools, users, and data. Over half a billion parameters, signals, measurements, and metadata entries are stored in a relational database accessed by an extensive object-oriented multi-technology software framework that includes stored procedures, real-time transactional database triggers and constraints, as well as coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable management of processing methods and station parameters, responses and metadata. This allowed for the development of merged ground-truth (GT) data sets compiled by the NNSA labs and AFTAC that include hundreds of thousands of events and tens of millions of arrivals.

  6. 75 FR 60478 - In the Matter of Certain Machine Vision Software, Machine Vision Systems, and Products Containing...

    Science.gov (United States)

    2010-09-30

    ... (``ID'') of the presiding administrative law judge (``ALJ'') finding no violation of section 337 of the..., Virginia; Rasco GmbH (``Rasco'') of Germany; MVTec Software GmbH of Germany and MVTec LLC of Cambridge...

  7. Efficient Machine Learning Approach for Optimizing Scientific Computing Applications on Emerging HPC Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Arumugam, Kamesh [Old Dominion Univ., Norfolk, VA (United States)

    2017-05-01

    Efficient parallel implementation of scientific applications on multi-core CPUs with accelerators such as GPUs and Xeon Phis is challenging. It requires exploiting the data-parallel architecture of the accelerator along with the vector pipelines of modern x86 CPU architectures, load balancing, and efficient memory transfer between different devices. It is relatively easy to meet these requirements for highly structured scientific applications. In contrast, a number of scientific and engineering applications are unstructured. Getting performance on accelerators for these applications is extremely challenging because many of them employ irregular algorithms which exhibit data-dependent control flow and irregular memory accesses. Furthermore, these applications are often iterative with dependencies between steps, making them hard to parallelize across steps; as a result, parallelism is often limited to a single step. Numerical simulation of charged-particle beam dynamics is one such application, where the distribution of work and the memory access pattern at each time step are irregular. Applications with these properties tend to present significant branch and memory divergence, load imbalance between processor cores, and poor compute and memory utilization. Prior research on parallelizing such irregular applications has focused on optimizing the irregular, data-dependent memory accesses and control flow during a single step of the application independently of the other steps, with the assumption that these patterns are completely unpredictable. We observed that the structure of computation leading to control-flow divergence and irregular memory accesses in one step is similar to that in the next step, and that it is possible to predict the structure of the current step by observing the computation structure of previous steps. In this dissertation, we present novel machine-learning-based optimization techniques to address

  8. China's experimental pragmatics of "Scientific development" in wind power: Algorithmic struggles over software in wind turbines

    DEFF Research Database (Denmark)

    Kirkegaard, Julia

    2016-01-01

    This article presents a case study of the development of China's wind power market. As China's wind industry has experienced a quality crisis, the Chinese government has intervened to steer the industry towards a turn to quality, indicating a pragmatist and experimental mode of market development... This increased focus on quality, to ensure the sustainable and scientific development of China's wind energy market, requires improved indigenous Chinese innovation capabilities in wind turbine technology. To shed light on how the turn to quality impacts upon the industry and global competition, this study... unfold over issues associated with intellectual property rights (IPRs), certification and standardisation of software algorithms. The article concludes that the use of this STS lens makes a fresh contribution to the often path-dependent, structuralist and hierarchical China literature, offering instead...

  9. XML as a standard I/O data format in scientific software development

    International Nuclear Information System (INIS)

    Song Tianming; Yang Jiamin; Yi Rongqing

    2010-01-01

    XML is an open standard data format with strict syntax rules that is widely used in large-scale software development. It was adopted as the I/O file format in the development of SpectroSim, a simulation and data-processing system for the soft x-ray spectrometer used in ICF experiments. The XML data that describe spectrometer configurations, the schema that defines the syntax rules for the XML, and the report-generation technique used to visualize XML data are introduced. Characteristics of XML such as its capability to express structured information, its self-descriptive nature, and the automation of visualization are explained with examples, and its feasibility as a standard scientific I/O data file format is discussed. (authors)
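
    The flavor of XML as an I/O format is easy to illustrate: a configuration document is parsed once and its values become typed inputs to the simulation. The element and attribute names below are invented, not SpectroSim's actual schema:

      import xml.etree.ElementTree as ET

      doc = ET.fromstring("""
      <spectrometer name="softxray-1">
        <crystal d2="15.96" unit="angstrom"/>
        <channel id="1" energy="1.25" unit="keV"/>
        <channel id="2" energy="2.50" unit="keV"/>
      </spectrometer>
      """)

      # Parse once; downstream code works with typed values, not raw text.
      d2 = float(doc.find("crystal").get("d2"))
      energies = {c.get("id"): float(c.get("energy")) for c in doc.findall("channel")}
      print(doc.get("name"), d2, energies)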

  10. EXPERIENCE OF USING «OPEN JOURNAL SYSTEMS» SOFTWARE PLATFORM FOR INFORMATION SUPPORT OF SCIENTIFIC AND EDUCATIONAL ACTIVITY

    Directory of Open Access Journals (Sweden)

    Oleg M. Spirin

    2017-10-01

    The article deals with foreign and domestic experience of using the Open Journal Systems (OJS) software platform for the informational support of scientific and educational activities, in particular: (a) as a means of publicizing and disseminating the results of scientific research; (b) for creating and maintaining the repositories of libraries of higher educational establishments; (c) for developing the scientific and educational space of an educational establishment; (d) as a cloud-based service for the preservation of and access to scientific resources; (e) for information support in the organization of student training; and (f) for the deployment of student journals. As a result of an analysis of the scientific periodicals of Ukraine in the field of psychological and pedagogical sciences, the scientific journals based on Open Journal Systems are identified. The experience of supporting the electronic scientific journal «Information Technologies and Learning Tools» (http://journal.iitta.gov.ua) is presented separately.

  11. Towards Analysis-Driven Scientific Software Architecture: The Case for Abstract Data Type Calculus

    Directory of Open Access Journals (Sweden)

    Damian W.I. Rouson

    2008-01-01

    This article approaches scientific software architecture from three analytical paths. Each path examines discrete time advancement of multiphysics phenomena governed by coupled differential equations. The new object-oriented Fortran 2003 constructs provide a formal syntax for an abstract data type (ADT) calculus. The first analysis uses traditional object-oriented software design metrics to demonstrate the high cohesion and low coupling associated with the calculus. A second analysis, from the viewpoint of computational complexity theory, demonstrates that a more representative bug-search strategy than that considered by Rouson et al. (ACM Trans. Math. Soft. 34(1), 2008) reduces the number of lines searched in a code with λ total lines from O(λ²) to O(λ log₂ λ), which in turn becomes nearly independent of the overall code size in the context of ADT calculus. The third analysis derives from information theory an argument that ADT calculus simplifies developer communications in part by minimizing the growth in interface information content as developers add new physics to a multiphysics package.
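
    The article's ADT calculus is formulated in object-oriented Fortran 2003, but the idea (overloading an abstract type's operators so that time advancement reads like the governing mathematics) carries over to any language with operator overloading. A Python analogue, not the authors' code, for forward-Euler heat-equation steps:

      from dataclasses import dataclass
      import numpy as np

      @dataclass
      class Field:
          """A periodic grid field whose operators form a small ADT calculus."""
          u: np.ndarray
          dx: float

          def __add__(self, other):
              return Field(self.u + other.u, self.dx)

          def __mul__(self, scalar):
              return Field(self.u * scalar, self.dx)
          __rmul__ = __mul__

          def laplacian(self):
              """Second-order central difference with periodic wrap-around."""
              lap = (np.roll(self.u, -1) - 2 * self.u + np.roll(self.u, 1)) / self.dx ** 2
              return Field(lap, self.dx)

      # The update line mirrors the mathematics: u <- u + dt * nu * laplacian(u).
      n = 64
      u = Field(np.sin(np.linspace(0, 2 * np.pi, n, endpoint=False)), dx=2 * np.pi / n)
      dt, nu = 1e-3, 1.0
      for _ in range(100):
          u = u + dt * nu * u.laplacian()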

  12. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open-source, web-based software analytical tool generated statistical disproportionality data-mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability-test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drug and adverse event terms, the interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end-user satisfaction.
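
    The disproportionality scores belong to the family of frequentist signal-detection statistics; one common member is the proportional reporting ratio, sketched below from a 2x2 table of citation counts. The counts are invented, and the "PRR >= 2 with at least 3 co-mentions" screen is a conventional heuristic rather than necessarily the authors' exact criterion:

      def proportional_reporting_ratio(a, b, c, d):
          """PRR for one drug-event pair from a 2x2 co-occurrence table.

          a: citations indexed with both the drug and the adverse event
          b: citations with the drug but not the event
          c: citations with the event but not the drug
          d: citations with neither
          """
          return (a / (a + b)) / (c / (c + d))

      a, b, c, d = 12, 488, 40, 9460  # invented counts
      prr = proportional_reporting_ratio(a, b, c, d)
      if prr >= 2 and a >= 3:
          print(f"candidate signal, PRR = {prr:.2f}")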

  13. Fuel-handling machine tests at the Institute for Nuclear Research - Pitesti. Computer and software research and engineering

    International Nuclear Information System (INIS)

    Doca, Cezar; Predescu, Darie; Maiorescu, Oliviu; Dobrescu, Sorin

    2003-01-01

    This poster introduces the fuel-handling machine SCC-MID. The work is part of a very ambitious project accomplished with remarkable investment effort: material and human resources were spent to build a test stand for fuel-handling machines (CANDU system), closely linked to NPP Cernavoda. A challenging goal was also to develop a computer system (hardware/software) designed and engineered to control the test and calibration process of this fuel-handling machine. The design takes care both of the functionality required to correctly control the F/H machine and of the additional functionality required to assist the testing process. How to test the system itself to validate the implemented solutions, how to safely and consistently maintain the data involved, how to manage such a system, and how to gradually integrate the system into the whole stand while preserving work already done and solutions already validated were questions for which we had to find the right answers. We chose modular solutions for both hardware and software, based on recent technologies which also provide the versatility we needed, namely: VME-based hardware systems running OS9/68k (a Unix-like real-time multi-user multitasking OS), ISaGRAF (development and run-time software oriented to process-control applications), Hawk (cross-compiler and IDE software for C/C++ development intended to run on other Motorola-based hardware), and Suretrack (project management software). The system topology implements open-system network concepts that permit communication between different software/hardware platforms (OS9/Motorola and ix86/MS-Windows based systems). We spent major resources to model the technological processes and test tools, like: - real-time simulation of the machine's behavior while responding to operator commands or to state changes of other machine parts (as a result of other commands, mechanical interlinks or technological interlocks), with presentation of results revealing time-related movements; - database

  14. Copyrighted Software | OSTI, US Dept of Energy Office of Scientific and

    Science.gov (United States)

    Copyrighted Software: software for which the originating site HAS asserted copyright is not publicly available; it is distributed under the terms of the developer's contract with DOE. Requests for copyrighted software from those other than DOE contractors or governmental entities are referred by ESTSC to the copyright holder for licensing. Software on

  15. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    Science.gov (United States)

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. A framework for detection of malicious software in Android handheld systems using machine learning techniques

    OpenAIRE

    Torregrosa García, Blas

    2015-01-01

    The present study aims at designing and developing new approaches to detect malicious applications on Android-based devices. More precisely, MaLDroide (Machine Learning-based Detector for Android malware), a framework for the detection of Android malware based on machine learning techniques, is introduced here. It is devised to identify malicious applications.

  17. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    Science.gov (United States)

    2011-01-01

    While open-source BMI software solutions are currently available, we feel that the Craniux software package fills a specific need in the realm of BMI ... data, such as cortical source imaging using EEG or MEG recordings. It is with these characteristics in mind that we feel the Craniux software package ...

  18. From curve fitting to machine learning an illustrative guide to scientific data analysis and computational intelligence

    CERN Document Server

    Zielesny, Achim

    2016-01-01

    This successful book provides, in its second edition, an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way, topics like mathematical optimization and evolutionary algorithms are touched upon. All concepts and ideas are outlined in a clear-cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls, but to address them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions like the relation between machine learning and human intelligence. All topics are completely demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed with M...

  19. An Evaluation of Output Quality of Machine Translation (Padideh Software vs. Google Translate)

    Science.gov (United States)

    Azer, Haniyeh Sadeghi; Aghayi, Mohammad Bagher

    2015-01-01

    This study aims to evaluate the translation quality of two machine translation systems in translating six different text-types, from English to Persian. The evaluation was based on criteria proposed by Van Slype (1979). The proposed model for evaluation is a black-box type, comparative and adequacy-oriented evaluation. To conduct the evaluation, a…

  20. Widening the adoption of workflows to include human and human-machine scientific processes

    Science.gov (United States)

    Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.

    2010-12-01

    Scientific workflows capture knowledge in the form of technical recipes for accessing and manipulating data, helping scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., bioinformatics, which, combined with cyber-infrastructure environments that provide on-demand access to data and tools, result in powerful workbenches for scientists in those communities. The focus in these fields, however, has been more on automating than on documenting scientific processes. As a result, technical barriers have impeded wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs intend to capture a scientist's perspective on the process of how he or she would collect, filter, curate, and manipulate data to create the artifacts relevant to his or her work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist's understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there, SAWs provide the flexibility to adapt to different environments to carry out the recipes or processes. These environments range from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from environmental science and another from geophysics, are presented as illustrative examples.

  1. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data, core photos and images, waveforms and NMR, and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting.

  2. Analisis Sentimen Sosial Media Twitter Dengan Algoritma Machine Learning Menggunakan Software R [Sentiment Analysis of Twitter Social Media with Machine Learning Algorithms Using the R Software]

    Directory of Open Access Journals (Sweden)

    Jaka Aulia Pratama

    2017-10-01

    Social media is a venue for expressing opinions on particular topics. The information and opinions available from social media users form an extremely large collection of text documents, useful both for research and for decision making by interested parties. Text mining can be defined as an information-extraction process in which users interact with a collection of documents over time using analysis tools. Sentiment analysis, or opinion mining, is a field of computational study dealing with public opinions, judgments, attitudes, and emotions. This study applies machine learning methods to sentiment analysis of Twitter users' views of Donald Trump and Barack Obama across 20,000 tweets. The accuracy obtained with the machine learning method is quite high: 87.52% on the training data and 87.4% on the test data.

  3. Automatically quantifying the scientific quality and sensationalism of news records mentioning pandemics: validating a maximum entropy machine-learning model.

    Science.gov (United States)

    Hoffman, Steven J; Justicz, Victoria

    2016-07-01

    To develop and validate a method for automatically quantifying the scientific quality and sensationalism of individual news records. After retrieving 163,433 news records mentioning the Severe Acute Respiratory Syndrome (SARS) and H1N1 pandemics, a maximum entropy model for inductive machine learning was used to identify relationships among 500 randomly sampled news records that correlated with systematic human assessments of their scientific quality and sensationalism. These relationships were then computationally applied to automatically classify 10,000 additional randomly sampled news records. The model was validated by randomly sampling 200 records and comparing human assessments of them to the computer assessments. The computer model correctly assessed the relevance of 86% of news records, the quality of 65% of records, and the sensationalism of 73% of records, as compared to human assessments. Overall, the scientific quality of SARS and H1N1 news media coverage had potentially important shortcomings, but coverage was not too sensationalizing. Coverage slightly improved between the two pandemics. Automated methods can evaluate news records faster, cheaper, and possibly better than humans. The specific procedure implemented in this study can at the very least identify subsets of news records that are far more likely to have particular scientific and discursive qualities. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    International Nuclear Information System (INIS)

    Menezes, Mario Olimpio de

    2005-01-01

    Free/Open Source Software (FOSS) has been used in science since long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the personal computer (PC) boom of the 80s, commercial closed-source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed-source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of simple programming. (author)

  5. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: mario@ipen.br; mo.menezes@gmail.com

    2005-07-01

    Free/Open Source Software (FOSS) has been used in science since long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the personal computer (PC) boom of the 80s, commercial closed-source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed-source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of simple programming. (author)

  6. SHINE Virtual Machine Model for In-flight Updates of Critical Mission Software

    Science.gov (United States)

    Plesea, Lucian

    2008-01-01

    This software is a new target for the Spacecraft Health Inference Engine (SHINE) knowledge base that compiles a knowledge base to a language called Tiny C - an interpreted version of C that can be embedded on flight processors. This new target allows portions of a running SHINE knowledge base to be updated on a "live" system without needing to halt and restart the containing SHINE application. This enhancement will directly provide this capability without the risk of software validation problems and can also enable complete integration of BEAM and SHINE into a single application. This innovation enables SHINE deployment in domains where autonomy is used during flight-critical applications that require updates. This capability eliminates the need for halting the application and performing potentially serious total system uploads before resuming the application with the loss of system integrity. This software enables additional applications at JPL (microsensors, embedded mission hardware) and increases the marketability of these applications outside of JPL.

  7. Charlotte: Scientific Modeling and Simulation Under the Software as a Service Paradigm, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA spends considerable effort supporting the efforts of collaborating researchers. These researchers are interested in interacting with scientific models provided...

  8. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Full Text Available Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  9. Software para el procesamiento de los ensayos de la máquina sincrónica [Software for processing synchronous machine test data]

    Directory of Open Access Journals (Sweden)

    Orlando Lázaro Rodríguez González

    2011-02-01

    Synchronous machine tests such as the no-load, short-circuit and zero-power-factor tests are performed to trace the zero-power-factor, regulation and external load characteristics, as well as the voltage regulation for any given state. These results are traditionally obtained with graphical methods, which are cumbersome and imprecise. This work solves the problem quasi-analytically using the facilities provided by MATLAB. The software follows the same steps that are carried out manually in the traditional method; the magnetomotive force method was implemented. To fit the curve in the saturated region, an arctangent approximation in the least-squares sense is employed, and the interception point between curves is computed with numerical methods to a precision exceeding that of the experimental data. The software is intended for use in industry and for teaching in universities.

  10. The Acquisition and Control Design for Vacuum Unit of an Electron Beam Machine Using Remote Manual, Software and Hardware

    International Nuclear Information System (INIS)

    Sudiyanto; Prajitno

    2002-01-01

    An acquisition and control design for the vacuum unit of an electron beam machine, using remote-manual, software and hardwired methods, has been carried out. In the remote-manual system, the pneumatic valves are opened and closed using 220 Vac/12 Vdc relays connected by long cables to switches on the control panel. An indicator lamp showing the ready/not-ready status of the vacuum unit is the main indicator for deciding whether to open or close the pneumatic valves. In the software method, acquisition and control are performed with the recently developed Distributed Control System; the reference voltages at the vacuum levels of 10⁻² Torr and 10⁻⁶ Torr are processed using the ADC of a PCL-718 card and recorded by the software as a reference database for making the open/close decisions for the pneumatic valves. In the hardwired method, on/off control of the pneumatic valves is done by voltage comparison in logic circuitry, with the reference vacuum levels of 10⁻² Torr and 10⁻⁶ Torr monitored by a Penning gauge. The hardwired method has the fastest response time of the three. (author)
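
    Whatever the implementation (relays, software, or logic circuitry), the decision logic reduces to comparing two gauge readings against the 10⁻² Torr and 10⁻⁶ Torr references. A software rendering of that interlock; the valve names and the exact sequencing are illustrative, since the abstract does not specify them:

      # Thresholds from the text: rough vacuum must reach 1e-2 Torr, and the
      # high-vacuum gauge 1e-6 Torr, before the corresponding valves may open.
      ROUGH_THRESHOLD = 1.0e-2        # Torr
      HIGH_VACUUM_THRESHOLD = 1.0e-6  # Torr

      def valve_commands(rough_torr, high_torr):
          """Map two gauge readings to open/close decisions (illustrative names)."""
          rough_ok = rough_torr <= ROUGH_THRESHOLD
          return {
              "backing_valve_open": rough_ok,
              "main_valve_open": rough_ok and high_torr <= HIGH_VACUUM_THRESHOLD,
          }

      print(valve_commands(rough_torr=5e-3, high_torr=2e-7))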

  11. A neurite quality index and machine vision software for improved quantification of neurodegeneration.

    Science.gov (United States)

    Romero, Peggy; Miller, Ted; Garakani, Arman

    2009-12-01

    Current methods for assessing neurodegeneration in dorsal root ganglion cultures, a model for neurodegenerative diseases, are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for detecting early degeneration events. Neurosight is a machine-vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications with decreased costs.

  12. Harnessing the power of big data: infusing the scientific method with machine learning to transform ecology

    Science.gov (United States)

    Most efforts to harness the power of big data for ecology and environmental sciences focus on data and metadata sharing, standardization, and accuracy. However, many scientists have not accepted the data deluge as an integral part of their research because the current scientific method is not scalab...

  13. Leveraging Open Source Software to Create Technical Animations of Scientific Data

    National Research Council Canada - National Science Library

    Vines, John M

    2006-01-01

    .... Army Research Laboratory spends tens of thousands of dollars annually for software licenses for packages such as Maya, Houdini, and 3D Studio Max, while in many instances, an open source package...

  14. Development of yarn breakage detection software system based on machine vision

    Science.gov (United States)

    Wang, Wenyuan; Zhou, Ping; Lin, Xiangyu

    2017-10-01

    To address the problem that yarn breakage in spinning mills cannot be detected in a timely manner, and to reduce costs for textile enterprises, this paper presents a software system based on computer vision for the real-time detection of yarn breakage. The system uses a Windows 8.1 tablet PC and a cloud server to perform yarn-breakage detection and management. The software running on the tablet PC collects yarn and location information for analysis and processing, and sends the processed information over Wi-Fi via the HTTP protocol to the cloud server, where it is stored in a Microsoft SQL 2008 database for subsequent querying and management of yarn-break information. Results are also shown on the local display in real time to remind the operator to deal with broken yarn. The experimental results show that the system's missed-detection rate does not exceed 5‰, with no false detections.
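
    A minimal sketch of the reporting step described above, assuming a hypothetical endpoint URL and JSON schema; the paper does not specify its message format.

```python
# Hedged sketch: tablet-side software posts a detected yarn-break event over
# HTTP to a cloud server, which would store it in a database.
import json
import urllib.request
from datetime import datetime, timezone

def report_yarn_break(spindle_id: int, camera_id: int) -> int:
    event = {
        "spindle_id": spindle_id,
        "camera_id": camera_id,
        "event": "yarn_break",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    req = urllib.request.Request(
        "http://example-cloud-server/api/yarn-breaks",  # hypothetical endpoint
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200/201 expected on success
```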

  15. An open CAM system for dentistry on the basis of China-made 5-axis simultaneous contouring CNC machine tool and industrial CAM software.

    Science.gov (United States)

    Lu, Li; Liu, Shusheng; Shi, Shenggen; Yang, Jianzhong

    2011-10-01

    A China-made 5-axis simultaneous contouring CNC machine tool and domestically developed industrial computer-aided manufacturing (CAM) technology were used for full crown fabrication and measurement of crown accuracy, with the aim of establishing an open CAM system for dental processing and promoting the adoption of domestic dental computer-aided design (CAD)/CAM systems. Commercially available scanning equipment was used to create a basic digital tooth model after crown preparation, and the CAD software supplied with the scanning device was employed to design the crown. Domestic industrial CAM software processed the crown data to generate a solid model for machining, and the China-made 5-axis simultaneous contouring CNC machine tool then completed machining of the whole crown; the internal accuracy of the crown was measured using 3D-MicroCT. The results showed that the China-made 5-axis simultaneous contouring CNC machine tool in combination with domestic industrial CAM technology can be used for crown making, and that the crown was well positioned on the die. The internal accuracy was successfully measured using 3D-MicroCT. It is concluded that an open CAM system for dentistry on the basis of a China-made 5-axis simultaneous contouring CNC machine tool and domestic industrial CAM software has been established, and that development of the system will promote the adoption of domestically produced dental CAD/CAM systems.

  16. Software maintenance in scientific and engineering environments: An introduction and guide

    Science.gov (United States)

    Wright, David

    1986-01-01

    The purpose of software maintenance techniques is addressed. The aims of perfective, adaptive, and corrective software maintenance are defined and discussed, especially in the NASA research environment. Areas requiring maintenance and the tools available for them are described, and suggestions for their use are made. Stress is placed on the organizational aspect of maintenance at both the individual and group level. Particular emphasis is placed on the use of various forms of documentation as the basis around which to organize. Finally, suggestions are given on how to proceed in the partial or complete absence of such documentation.

  17. Benefits of Record Management For Scientific Writing (Study of Metadata Reception of Zotero Reference Management Software in UIN Malang

    Directory of Open Access Journals (Sweden)

    Moch Fikriansyah Wicaksono

    2018-01-01

    Full Text Available Record creation and management by individuals and organizations is growing rapidly, particularly with the shift from print to electronic records and their smallest components (metadata). There is therefore a need to manage record metadata, particularly for students who must record references and citations. Reference management software (RMS) helps with reference management; one such package is Zotero. The purpose of this article is to describe the benefits of record management for the writing of scientific papers by students, especially in the biology study program at UIN Malik Ibrahim Malang. The research is descriptive with a quantitative approach, supplemented by interviews to deepen respondents' answers. The population comprised 322 students from the classes of 2012 to 2014, sampled randomly; these criteria were chosen because the introduction and use of the reference management software Zotero began three years earlier. The 80 respondents were determined using Yamane's formula. The results showed that 70% agreed that using reference management software saves time and energy in managing digital file metadata, 71% agreed that digital metadata can be quickly stored in the RMS, 65% agreed on the ease of storing metadata in the reference management software, 70% agreed that configuring metadata for citations and bibliographies is easy, 56.6% agreed that metadata stored in the reference management software can be edited, and 73.8% agreed that using metadata makes writing citations and bibliographies easier.

  18. Effective software design and development for the new graph architecture HPC machines.

    Energy Technology Data Exchange (ETDEWEB)

    Dechev, Damian

    2012-03-01

    Software applications need to change and adapt as modern architectures evolve. Nowadays, advancement in chip design translates to increased parallelism. Exploiting such parallelism is a major challenge in modern software engineering. Multicore processors are about to introduce a significant change in the way we design and use fundamental data structures. In this work we describe the design and programming principles of a software library of highly concurrent, scalable, and nonblocking data containers. In this project we have created algorithms and data structures for handling fundamental computations in massively multithreaded contexts, and we have incorporated these into a usable library with a familiar look and feel. In this work we demonstrate the first design and implementation of a wait-free hash table. Our multiprocessor data structure design allows a large number of threads to concurrently insert, remove, and retrieve information. Non-blocking designs alleviate the problems traditionally associated with the use of mutual exclusion, such as bottlenecks and thread-safety. Lock-freedom provides the ability to share data without some of the drawbacks associated with locks; however, these designs remain susceptible to starvation. Furthermore, wait-freedom provides all of the benefits of lock-free synchronization with the added assurance that every thread makes progress in a finite number of steps. This implies deadlock-freedom, livelock-freedom, starvation-freedom, freedom from priority inversion, and thread-safety. The challenges of providing the desirable progress and correctness guarantees of wait-free objects make their design and implementation difficult. There are few wait-free data structures described in the literature. Using only standard atomic operations provided by the hardware, our design is portable; therefore, it is applicable to a variety of data-intensive applications, including the domains of embedded systems and supercomputers. Our experimental
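
    For readers unfamiliar with nonblocking synchronization, a purely conceptual Python sketch of the compare-and-swap (CAS) retry loop that such designs build on; Python exposes no hardware CAS, so atomicity is simulated here with a lock, which a real lock-free implementation would of course not use.

```python
# Conceptual illustration only: the CAS retry loop at the heart of
# nonblocking designs. On real hardware the CAS is a single atomic
# instruction; the lock below merely simulates that atomicity.
import threading

class SimulatedAtomic:
    def __init__(self, value):
        self._value = value
        self._guard = threading.Lock()  # stands in for hardware atomicity

    def compare_and_swap(self, expected, new):
        """Atomically set value to `new` iff it currently equals `expected`."""
        with self._guard:
            if self._value == expected:
                self._value = new
                return True
            return False

    def load(self):
        with self._guard:
            return self._value

def nonblocking_style_increment(counter: SimulatedAtomic) -> None:
    # Classic retry loop: read, compute, attempt to publish; retry on conflict.
    while True:
        current = counter.load()
        if counter.compare_and_swap(current, current + 1):
            return
```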

  19. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    DEFF Research Database (Denmark)

    Marshall, Ian

    2014-01-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work with a wide range of core and section configurations and can thus be used in future drilling projects. Corganiser is written in the Python programming language and is implemented both as a graphical web interface and a command-line interface. It can be accessed online at http://130.226.247.137/.

  20. Astroinformation resource of the Ukrainian virtual observatory: Joint observational data archive, scientific tasks, and software

    Science.gov (United States)

    Vavilova, I. B.; Pakulyak, L. K.; Shlyapnikov, A. A.; Protsyuk, Yu. I.; Savanevich, V. E.; Andronov, I. L.; Andruk, V. N.; Kondrashova, N. N.; Baklanov, A. V.; Golovin, A. V.; Fedorov, P. N.; Akhmetov, V. S.; Isak, I. I.; Mazhaev, A. E.; Golovnya, V. V.; Virun, N. V.; Zolotukhina, A. V.; Kazantseva, L. V.; Virnina, N. A.; Breus, V. V.; Kashuba, S. G.; Chinarova, L. L.; Kudashkina, L. S.; Epishev, V. P.

    2012-04-01

    An overview of the most important components of the national project, the Ukrainian Virtual Observatory (UkrVO), is presented. Among these components is the establishment of a Joint Digital Archive (JDA) of observational data obtained at Ukrainian observatories since 1890, including the astronegative JDA (more than 200 thousand plates). Because this task requires VO-oriented software, issues such as software verification of content integrity and JDA administration, compliance of image formats with IVOA standards, and photometric and astrometric calibration of images are considered. Among other local UkrVO software developments, tools for the automatic registration of moving celestial objects in the night sky, followed by visual inspection of the results, as well as stellar-field image processing software, are described. Research projects that use local UkrVO data archives are discussed, namely the analysis of long observational series of active galactic nuclei, the study of solar flares and solar active regions based on spectral observational archives, the search for and discovery of variable stars, and the study of stellar fields in the vicinity of gamma-ray bursts. Particular attention is paid to the CoLiTec program, which significantly increases the number of registered small Solar System bodies and enables the discovery of new ones; in particular, the comets C/2010 X1 (Elenin) and P/2011 N 01 were discovered with the help of this program at the ISON-NM observatory. The development of the UkrVO JDA prototype, which provides access to the databases of the MAO NAS of Ukraine, the Nikolaev Astronomical Observatory, and the L'viv Astronomical Observatory, is also described.

  1. Software organization for a prolog-based prototyping system for machine vision

    Science.gov (United States)

    Jones, Andrew C.; Hack, Ralf; Batchelor, Bruce G.

    1996-11-01

    We describe PIP (Prolog image processing), a prototype system for interactive image processing using Prolog, implemented on an Apple Macintosh computer. PIP is the latest in a series of products, under the collective title Prolog+, whose implementation the third author has been involved in. PIP differs from our previous systems in two particularly important respects. The first is that whereas we previously required dedicated image processing hardware, the present system implements image processing routines in software. The second difference is that our present system is hierarchical in structure, where the top level of the hierarchy emulates Prolog+, but there is a flexible infrastructure which supports more sophisticated image manipulation that we will be able to exploit in due course. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image processing functions, and the interface between these functions and the Prolog system. We also explain how the existing set of Prolog+ commands has been implemented. PIP is now nearing maturity, and we will make a version of it generally available in the near future. However, although the present version of PIP constitutes a complete image processing tool, there are a number of ways in which we intend to enhance future versions, with a view to added flexibility and efficiency; we discuss these ideas briefly near the end of the present paper.

  2. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code, due to possible unfamiliarity with the language and often also to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities, which include, among others, error detection, error correction, and code modification for the purposes of enhancing its performance or functionality, or of adapting it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper will discuss the role they can play in enhancing the confidence one has in computer codes, and several examples will be provided. Finally, a brief discussion of combining state-of-the-art forward engineering systems with reverse engineering systems will show how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)

  3. Porting the AVS/Express scientific visualization software to Cray XT4.

    Science.gov (United States)

    Leaver, George W; Turner, Martin J; Perrin, James S; Mummery, Paul M; Withers, Philip J

    2011-08-28

    Remote scientific visualization, where rendering services are provided by larger scale systems than are available on the desktop, is becoming increasingly important as dataset sizes increase beyond the capabilities of desktop workstations. Uptake of such services relies on access to suitable visualization applications and the ability to view the resulting visualization in a convenient form. We consider five rules from the e-Science community to meet these goals with the porting of a commercial visualization package to a large-scale system. The application uses message-passing interface (MPI) to distribute data among data processing and rendering processes. The use of MPI in such an interactive application is not compatible with restrictions imposed by the Cray system being considered. We present details, and performance analysis, of a new MPI proxy method that allows the application to run within the Cray environment yet still support MPI communication required by the application. Example use cases from materials science are considered.

  4. The Machine within the Machine

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Although Virtual Machines are widespread across CERN, you probably won't have heard of them unless you work for an experiment. Virtual machines - known as VMs - allow you to create a separate machine within your own, allowing you to run Linux on your Mac, or Windows on your Linux - whatever combination you need.   Using a CERN Virtual Machine, Linux analysis software runs on a MacBook. When it comes to LHC data, one of the primary issues collaborations face is the diversity of computing environments among collaborators spread across the world. What if an institute cannot run the analysis software because they use different operating systems? "That's where the CernVM project comes in," says Gerardo Ganis, PH-SFT staff member and leader of the CernVM project. "We were able to respond to experimentalists' concerns by providing a virtual machine package that could be used to run experiment software. This way, no matter what hardware they have ...

  5. SOFTWARE INSPECTION TECHNIQUE OF SCIENTIFIC AND ACADEMIC WORKS – PANACEA FOR THE PLAGIARISM OR A SPOT OF DIFFICULT PROCEDURE?

    Directory of Open Access Journals (Sweden)

    С. П. Ткаченко

    2017-10-01

    Full Text Available The use of software tools that identify text coincidences enables the analysis of scientific and academic works for evidence of plagiarism. Anti-plagiarism efforts, however, are not just a fight against illegal text borrowing and its consequences; they should be a comprehensive mechanism for dealing with both the consequences and the causes of this phenomenon. Such a mechanism should include instilling a system of values corresponding to the principles of academic integrity, along with comprehensive checking of all works for plagiarism. The resulting exhaustive list of text borrowings is the basis for expert assessors' decision-making on the presence of plagiarism. Moreover, there should be no permissible threshold for plagiarism: all borrowings must be attributed in accordance with the requirements. Therefore, the best results in the fight against plagiarism will be achieved only by a comprehensive approach, in which academic culture and the principles of academic integrity are instilled from school age onward, alongside the checking of all works for plagiarism.

  6. PRAPRAG: software para planejamento racional de máquinas agrícolas PRAPRAG: software for rational planning of agricultural machines

    OpenAIRE

    Erivelto Mercante; Eduardo G. de Souza; Jerry A Johann; Antonio Gabriel Filho; Miguel A Uribe-Opazo

    2010-01-01

    The PRAPRAG software is a tool for choosing the agricultural machines and implements that present the lowest cost per area or per quantity produced, and it also plans the acquisition of machines for a farm from the technical and economic points of view. The Borland Delphi 3.0 programming language was used and, from manufacturers' brochures for the machines and implements, a database was created in which the user can register them and modify their usage characteristics. The software ...

  7. System and Software Design for the Man Machine Interface System for Shin-Hanul Nuclear Power Plant Units 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Woong Seock; Kim, Chang Ho; Lee, Yoon Hee; Sohn, Se Do; Baek, Seung Min [KEPCO E and C, Daejeon (Korea, Republic of)

    2015-10-15

    The safety MMIS (Man-Machine Interface System) has been designed using the POSAFE-Q Programmable Logic Controller (PLC), and the non-safety MMIS using the OPERASYSTEM Distributed Control System (DCS). This paper describes design experiences from the MMIS design work with these new platforms. The SHN 1 and 2 MMIS has been developed on the POSAFE-Q platform for the safety system and on OPERASYSTEM for the non-safety system. Through the use of these standardized platforms, the safety system was developed from the above hardware and software blocks, resulting in efficient safety system development. An integrated CASE tool was set up for reliable software development, and the integrated development environment was established formally, ensuring consistent work. Although an integrated development environment has been set up, an independent verification and validation environment, including a testing environment, still needs to be established; this more advanced environment will be used for future plants.

  8. Objective detection of apoptosis in rat renal tissue sections using light microscopy and free image analysis software with subsequent machine learning: Detection of apoptosis in renal tissue.

    Science.gov (United States)

    Macedo, Nayana Damiani; Buzin, Aline Rodrigues; de Araujo, Isabela Bastos Binotti Abreu; Nogueira, Breno Valentim; de Andrade, Tadeu Uggere; Endringer, Denise Coutinho; Lenz, Dominik

    2017-02-01

    The current study proposes an automated machine learning approach for the quantification of cells in cell death pathways according to DNA fragmentation. A total of 17 images of kidney histological slide samples from male Wistar rats were used. The slides were photographed using an Axio Zeiss Vert.A1 microscope with a 40x objective lens coupled with an Axio Cam MRC Zeiss camera and Zen 2012 software. The images were analyzed using CellProfiler (version 2.1.1) and CellProfiler Analyst open-source software. Out of the 10,378 objects, 4970 (47.9%) were identified as TUNEL positive, and 5408 (52.1%) were identified as TUNEL negative. On average, the sensitivity and specificity values of the machine learning approach were 0.80 and 0.77, respectively. Image cytometry provides a quantitative analytical alternative to the more traditional qualitative methods commonly used in such studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
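
    A small sketch of the evaluation arithmetic reported above (sensitivity and specificity against ground-truth TUNEL labels); the label arrays are invented, and scikit-learn stands in for the CellProfiler Analyst tooling.

```python
# Hedged sketch: compute sensitivity and specificity of binary predictions
# against ground-truth labels using a confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([1, 1, 0, 1, 0, 0, 1, 0])  # 1 = TUNEL positive (ground truth)
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])  # classifier output

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)  # true-positive rate
specificity = tn / (tn + fp)  # true-negative rate
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```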

  9. Software sensors for biomass concentration in a SSC process using artificial neural networks and support vector machine.

    Science.gov (United States)

    Acuña, Gonzalo; Ramirez, Cristian; Curilem, Millaray

    2014-01-01

    The lack of sensors for some relevant state variables in fermentation processes can be addressed by developing appropriate software sensors. In this work, NARX-ANN, NARMAX-ANN, NARX-SVM and NARMAX-SVM models are compared when acting as software sensors of biomass concentration for a solid substrate cultivation (SSC) process. Results show that NARMAX-SVM outperforms the other models, with an SMAPE index under 9 for 20% amplitude noise. In addition, NARMAX models perform better than NARX models under the same noise conditions because of their better predictive capabilities, as they include prediction errors as inputs. In the case of perturbation of the initial conditions of the autoregressive variable, NARX models exhibited better convergence capabilities. This work also confirms that a difficult-to-measure variable, like biomass concentration, can be estimated on-line from easy-to-measure variables like CO₂ and O₂ using an adequate software sensor based on computational intelligence techniques.
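
    A hedged sketch of the SMAPE (symmetric mean absolute percentage error) index used above to compare software-sensor estimates against reference values; the exact variant the authors computed may differ, and the example numbers are invented.

```python
# One common SMAPE definition: 100/n * sum(|p - t| / ((|t| + |p|) / 2)).
import numpy as np

def smape(y_true, y_pred) -> float:
    """Symmetric mean absolute percentage error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    return 100.0 * np.mean(np.abs(y_pred - y_true) / denom)

# Example: measured vs. estimated biomass concentration (invented numbers).
print(smape([1.0, 2.0, 3.5], [1.1, 1.8, 3.6]))
```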

  10. Community Capacity Building as a vital mechanism for enhancing the growth and efficacy of a sustainable scientific software ecosystem: experiences running a real-time bi-coastal "Open Science for Synthesis" Training Institute for young Earth and Environmental scientists

    Science.gov (United States)

    Schildhauer, M.; Jones, M. B.; Bolker, B.; Lenhardt, W. C.; Hampton, S. E.; Idaszak, R.; Rebich Hespanha, S.; Ahalt, S.; Christopherson, L.

    2014-12-01

    Continuing advances in computational capabilities, access to Big Data, and virtual collaboration technologies are creating exciting new opportunities for accomplishing Earth science research at finer resolutions, with much broader scope, using powerful modeling and analytical approaches that were unachievable just a few years ago. Yet there is a perceptible lag in the ability of the research community to capitalize on these new possibilities, due to a lack of the relevant skill-sets, especially with regard to multi-disciplinary and integrative investigations that involve active collaboration. UC Santa Barbara's National Center for Ecological Analysis and Synthesis (NCEAS) and the University of North Carolina's Renaissance Computing Institute (RENCI) were recipients of NSF OCI S2I2 "Conceptualization awards", charged with helping define the needs of the research community relative to enabling science and education through "sustained software infrastructure". Over the course of our activities, a consistent request from Earth scientists was for "better training in software that enables more effective, reproducible research." This community-based feedback led to the creation of an "Open Science for Synthesis" Institute, an innovative, three-week, bi-coastal training program for early-career researchers. We provided a mix of lectures, hands-on exercises, and working-group experience on topics including: data discovery and preservation; code creation, management, sharing, and versioning; scientific workflow documentation and reproducibility; statistical and machine modeling techniques; virtual collaboration mechanisms; and methods for communicating scientific results. All technologies and quantitative tools presented were suitable for advancing open, collaborative, and reproducible synthesis research. In this talk, we will report on the lessons learned from running this ambitious training program, which involved coordinating classrooms among two remote sites, and

  11. Machine translation

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, M

    1982-04-01

    Each language has its own structure. In translating one language into another one, language attributes and grammatical interpretation must be defined in an unambiguous form. In order to parse a sentence, it is necessary to recognize its structure. A so-called context-free grammar can help in this respect for machine translation and machine-aided translation. Problems to be solved in studying machine translation are taken up in the paper, which discusses subjects for semantics and for syntactic analysis and translation software. 14 references.

  12. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB), from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community, as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling, as well as climate modeling issues in terms of object-oriented design.

  13. Computer simulation of plasma behavior in open-ended linear theta machines. Scientific report 81-5

    Energy Technology Data Exchange (ETDEWEB)

    Stover, E. K.

    1981-04-01

    Zero-dimensional and one-dimensional fluid plasma computer models have been developed to study the behavior of linear theta pinch plasmas. Computer simulation results generated from these codes are compared with data obtained from two theta pinch experiments so that significant machine plasma behavior can be identified. The experiments examined are a collisional experiment, T_i ≈ 50 eV, n_e ≈ 10^17 cm^-3, where the plasma mean-free-path was significantly less than the plasma column length, and a hot-ion-species experiment, T_i ≈ 3 keV, n_e ≈ 10^16 cm^-3, where the ion mean-free-path was on the order of the plasma column length.

  14. Computer simulation of plasma behavior in open-ended linear theta machines. Scientific report 81-5

    International Nuclear Information System (INIS)

    Stover, E.K.

    1981-04-01

    Zero-dimensional and one-dimensional fluid plasma computer models have been developed to study the behavior of linear theta pinch plasmas. Computer simulation results generated from these codes are compared with data obtained from two theta pinch experiments so that significant machine plasma behavior can be identified. The experiments examined are a collisional experiment, T_i ≈ 50 eV, n_e ≈ 10^17 cm^-3, where the plasma mean-free-path was significantly less than the plasma column length, and a hot-ion-species experiment, T_i ≈ 3 keV, n_e ≈ 10^16 cm^-3, where the ion mean-free-path was on the order of the plasma column length

  15. Civacuve analysis software for mis machine examination of pressurized water reactor vessels; Civacuve logiciel d'analyse des controles mis des cuves de reacteurs nucleaires

    Energy Technology Data Exchange (ETDEWEB)

    Dubois, Ph.; Gagnor, A. [Intercontrole, 94 - Rungis (France)

    2001-07-01

    The CIVACUVE software product is used by INTERCONTROLE for the analysis of ultrasonic (UT) detection examinations performed by the In-Service Inspection Machine (MIS) on the vessels of nuclear power plants. The software is based on an adaptation of a SEGMENTATION algorithm (CEA CEREM), applied prior to any analysis, and is equipped with tools suited to industrial use. It allows the user to: perform image analysis with advanced graphic tools (zooms, true B-scan, 'contour' selection, etc.); back up all data in a database (complete and transparent storage of all information used and obtained during the different analysis operations); connect PCs to the database (export of reports and even of segmented points); issue examination reports, operating condition sheets, and sizing curves; and, finally, perform graphic and numerical comparisons between different inspections of the same vessel. Used in Belgium and France on different kinds of reactor vessels, CIVACUVE has shown that the principle of SEGMENTATION can be adapted to detection exams. The use of CIVACUVE brings an important gain in time as well as an improvement in analysis quality. Broad data export to PCs allows real flexibility with regard to clients' requirements and concerns.

  16. Performance and portability of the SciBy virtual machine

    DEFF Research Database (Denmark)

    Andersen, Rasmus; Vinter, Brian

    2010-01-01

    The Scientific Bytecode Virtual Machine is a virtual machine designed specifically for performance, security, and portability of scientific applications deployed in a Grid environment. The performance overhead normally incurred by virtual machines is mitigated using native optimized scientific li...

  17. Quantum machine learning.

    Science.gov (United States)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  18. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices December 2010 Table of Contents 1.0 Introduction 2.0 Responsibilities 2.1 OSTI/ESTSC 2.2 SIACs 2.3 Software Submitting Sites/Creators 2.4 Software Sensitivity Review 3.0 Software Announcement and Submission 3.1 STI Software Appropriate for Announcement 3.2

  19. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, the number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud computing on private and public clouds, for both legacy and contemporary experiments.

  20. Self-assembled software and method of overriding software execution

    Science.gov (United States)

    Bouchard, Ann M.; Osbourn, Gordon C.

    2013-01-08

    A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.

  1. Mobile technology and telemedicine for shoulder range of motion: validation of a motion-based machine-learning software development kit.

    Science.gov (United States)

    Ramkumar, Prem N; Haeberle, Heather S; Navarro, Sergio M; Sultan, Assem A; Mont, Michael A; Ricchetti, Eric T; Schickendantz, Mark S; Iannotti, Joseph P

    2018-03-07

    Mobile technology offers the prospect of delivering high-value care with increased patient access and reduced costs. Advances in mobile health (mHealth) and telemedicine have been inhibited by the lack of interconnectivity between devices and software and inability to process consumer sensor data. The objective of this study was to preliminarily validate a motion-based machine learning software development kit (SDK) for the shoulder compared with a goniometer for 4 arcs of motion: (1) abduction, (2) forward flexion, (3) internal rotation, and (4) external rotation. A mobile application for the SDK was developed and "taught" 4 arcs of shoulder motion. Ten subjects without shoulder pain or prior shoulder surgery performed the arcs of motion for 5 repetitions. Each motion was measured by the SDK and compared with a physician-measured manual goniometer measurement. Angular differences between SDK and goniometer measurements were compared with univariate and power analyses. The comparison between the SDK and goniometer measurement detected a mean difference of less than 5° for all arcs of motion (P > .05), with a 94% chance of detecting a large effect size from a priori power analysis. Mean differences for the arcs of motion were: abduction, -3.7° ± 3.2°; forward flexion, -4.9° ± 2.5°; internal rotation, -2.4° ± 3.7°; and external rotation -2.6° ± 3.4°. The SDK has the potential to remotely substitute for a shoulder range of motion examination within 5° of goniometer measurements. An open-source motion-based SDK that can learn complex movements, including clinical shoulder range of motion, from consumer sensors offers promise for the future of mHealth, particularly in telemonitoring before and after orthopedic surgery. Copyright © 2018 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
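
    A minimal sketch of the validation arithmetic behind the reported mean differences; the paired measurements below are invented, not from the study.

```python
# Hedged sketch: mean and standard deviation of paired angular differences
# between SDK and goniometer measurements for one arc of motion.
import numpy as np

sdk        = np.array([172.0, 168.0, 175.0, 170.0, 169.0])  # degrees, SDK
goniometer = np.array([176.0, 171.0, 178.0, 174.0, 173.0])  # degrees, manual

diff = sdk - goniometer
print(f"mean difference = {diff.mean():.1f} deg +/- {diff.std(ddof=1):.1f}")
# A mean difference within +/- 5 degrees would support the paper's conclusion
# that the SDK can substitute for a manual range-of-motion examination.
```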

  2. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  3. Software Innovations Speed Scientific Computing

    Science.gov (United States)

    2012-01-01

    To help reduce the time needed to analyze data from missions like those studying the Sun, Goddard Space Flight Center awarded SBIR funding to Tech-X Corporation of Boulder, Colorado. That work led to commercial technologies that help scientists accelerate their data analysis tasks. Thanks to its NASA work, the company doubled its number of headquarters employees to 70 and generated about $190,000 in revenue from its NASA-derived products.

  4. Direct numerical control of machine tools in a nuclear research center by the CAMAC system

    International Nuclear Information System (INIS)

    Zwoll, K.; Mueller, K.D.; Becks, B.; Erven, W.; Sauer, M.

    1977-01-01

    The production of mechanical parts in research centers can be improved by connecting several numerically controlled machine tools to a central process computer via a data link. The CAMAC Serial Highway, with its expandable structure, yields an economic and flexible system for this purpose. The CAMAC system also facilitates the development of modular components controlling the machine tools themselves. A CAMAC installation controlling three different machine tools connected to a central computer (PDP11) via the CAMAC Serial Highway is described. Besides this application, part of the CAMAC hardware and software can also be used for a great variety of scientific experiments.

  5. Scientific Computation Application Partnerships in Materials and Chemical Sciences, Charge Transfer and Charge Transport in Photoactivated Systems, Developing Electron-Correlated Methods for Excited State Structure and Dynamics in the NWChem Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, Christopher J. [Univ. of Minnesota, Minneapolis, MN (United States)

    2017-11-12

    Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.

  6. Virtual reality devices integration in scientific visualization software in the VtkVRPN framework; Integration de peripheriques de realite virtuelle dans des applications de visualisation scientifique au sein de la plate-forme VtkVRPN

    Energy Technology Data Exchange (ETDEWEB)

    Journe, G.; Guilbaud, C

    2005-07-01

    High-quality scientific visualization software relies on ergonomic navigation and exploration, which are essential for performing efficient data analysis. To help address this issue, management of virtual reality devices has been developed inside the CEA 'VtkVRPN' framework. This framework is based on VTK, a 3D graphics library, and VRPN, a virtual reality device management library. This document describes the developments done during a post-graduate training course. (authors)

  7. Modal analysis of spindle of grinder machine based on ANSYS

    Directory of Open Access Journals (Sweden)

    HE Chaocong

    2015-10-01

    Full Text Available The object of this research is a certain type of grinding-wheel spindle. A 3D model of the spindle is established in SolidWorks and imported into ANSYS for modal analysis. The natural frequencies, mode shapes and critical speed of the spindle model are obtained, and the resulting data are analyzed. The results show that the spindle structure is reasonable, that the machining accuracy can be ensured, and that the positions where deformation is most severe, and where fatigue fracture of the main shaft may occur, can be identified, which also provides a theoretical basis for further design optimization and precision control.
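
    As a generic illustration (not the authors' ANSYS model), natural frequencies in modal analysis follow from the generalized eigenvalue problem K x = ω² M x; the toy 3-DOF stiffness and mass matrices below are assumptions for the example.

```python
# Hedged sketch: natural frequencies and mode shapes of a discretized shaft
# from the generalized eigenvalue problem K x = w^2 M x.
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 2.0e7, -1.0e7,  0.0   ],
              [-1.0e7,  2.0e7, -1.0e7 ],
              [ 0.0,   -1.0e7,  1.0e7 ]])   # stiffness [N/m], invented
M = np.diag([5.0, 5.0, 5.0])                # mass [kg], invented

w_squared, modes = eigh(K, M)               # eigenvalues are w^2 in (rad/s)^2
freqs_hz = np.sqrt(w_squared) / (2.0 * np.pi)
print("natural frequencies [Hz]:", np.round(freqs_hz, 1))
# The mode shapes (columns of `modes`) show where deformation concentrates,
# which is how the largest-deformation locations are identified.
```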

  8. Modal analysis of spindle of grinder machine based on ANSYS

    OpenAIRE

    HE Chaocong; LIU Peipei; YAN Chunfei; WANG Muhuan; LIN Jun

    2015-01-01

    The object of research is to a certain type grinding wheel spindle for which a 3D model of the spindle is established by SolidWorks software and ANSYS software is imported for model analysis.Natural frequency,vibration type and critical speed of the spindle model are obtained and the resulting data are scientifically analyzed.The results show that the spindle structure is reasonable,the machining accuracy can be ensured and the position where the most severe deformation and the main shaft fat...

  9. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of softw...
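
    A minimal numerical sketch of the FX principle the abstract names: Fourier-transform each station's signal ("F"), then cross-multiply and accumulate ("X"). The synthetic signals and the injected 3-sample delay are invented, and real correlators add delay models, fringe rotation and long-term accumulation.

```python
# Hedged sketch of FX correlation on synthetic two-station data.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_spectra = 256, 100
common = rng.standard_normal((n_spectra, 2 * n_channels))
station_a = common + 0.1 * rng.standard_normal(common.shape)
station_b = np.roll(common, 3, axis=1) + 0.1 * rng.standard_normal(common.shape)

# F step: channelize each time segment with an FFT.
spec_a = np.fft.rfft(station_a, axis=1)
spec_b = np.fft.rfft(station_b, axis=1)

# X step: cross-multiply (conjugating one station) and average over segments.
cross_spectrum = np.mean(np.conj(spec_a) * spec_b, axis=0)

# The peak of the inverse transform recovers the injected delay.
lag = np.argmax(np.abs(np.fft.irfft(cross_spectrum)))
print("recovered delay (samples):", lag)  # expect 3
```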

  10. Singer CNC sewing and embroidery machine

    Directory of Open Access Journals (Sweden)

    Lokodi Zsolt

    2011-12-01

    Full Text Available This paper presents the adaptation of a classic foot-pedal-operated Singer sewing machine to a computerized numerical control (CNC) sewing and embroidery machine. This machine is composed of a Singer sewing machine and a two-degrees-of-freedom XY stage designed specifically for this application. The whole system is controlled from a PC using adequate CNC control software.

  11. Machine learning for micro-tomography

    Science.gov (United States)

    Parkinson, Dilworth Y.; Pelt, Daniël. M.; Perciano, Talita; Ushizima, Daniela; Krishnan, Harinarayan; Barnard, Harold S.; MacDowell, Alastair A.; Sethian, James

    2017-09-01

    Machine learning has revolutionized a number of fields, but many micro-tomography users have never used it for their work. The micro-tomography beamline at the Advanced Light Source (ALS), in collaboration with the Center for Applied Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory, has now deployed a series of tools to automate data processing for ALS users using machine learning. This includes new reconstruction algorithms, feature extraction tools, and image classification and recommendation systems for scientific images. Some of these tools operate in automated pipelines on data as it is collected, or as stand-alone software. Others are deployed on computing resources at Berkeley Lab, from workstations to supercomputers, and made accessible to users through either scripting or easy-to-use graphical interfaces. This paper presents a progress report on this work.

  12. Mastering machine learning with scikit-learn

    CERN Document Server

    Hackeling, Gavin

    2014-01-01

    If you are a software developer who wants to learn how machine learning models work and how to apply them effectively, this book is for you. Familiarity with machine learning fundamentals and Python will be helpful, but is not essential.
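
    By way of illustration, a minimal example of the scikit-learn fit/predict workflow such a book teaches; the bundled iris dataset stands in for the book's own examples.

```python
# Minimal scikit-learn workflow: split data, fit a model, evaluate it.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```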

  13. N286.7-99, A Canadian standard specifying software quality management system requirements for analytical, scientific, and design computer programs and its implementation at AECL

    International Nuclear Information System (INIS)

    Abel, R.

    2000-01-01

    Analytical, scientific, and design computer programs (referred to in this paper as 'scientific computer programs') are developed for use in a large number of ways by the user-engineer to support and prove engineering calculations and assumptions. These computer programs are subject to frequent modifications inherent in their application and are often used for critical calculations and analysis relative to safety and functionality of equipment and systems. N286.7-99(4) was developed to establish appropriate quality management system requirements to deal with the development, modification, and application of scientific computer programs. N286.7-99 provides particular guidance regarding the treatment of legacy codes

  14. Haunted by the ghost in the machine. Commentary on "The spirituality of human consciousness: a Catholic evaluation of some current neuro-scientific interpretations".

    Science.gov (United States)

    Miller, James B

    2012-09-01

    Metaphysical and epistemological dualism informs much contemporary discussion of the relationships of science and religion, in particular in relation to the neurosciences and the religious understanding of the human person. This dualism is a foundational artifact of modern culture; however, contemporary scientific research and historical theological scholarship encourage a more holistic view wherein human personhood is most fittingly understood as an emergent phenomenon of, but not simply reducible to, evolutionary and developmental neurobiology.

  15. Machine Shop Grinding Machines.

    Science.gov (United States)

    Dunn, James

    This curriculum manual is one in a series of machine shop curriculum manuals intended for use in full-time secondary and postsecondary classes, as well as part-time adult classes. The curriculum can also be adapted to open-entry, open-exit programs. Its purpose is to equip students with basic knowledge and skills that will enable them to enter the…

  16. Car2x with software defined networks, network functions virtualization and supercomputers technical and scientific preparations for the Amsterdam Arena telecoms fieldlab

    NARCIS (Netherlands)

    Meijer R.J.; Cushing R.; De Laat C.; Jackson P.; Klous S.; Koning R.; Makkes M.X.; Meerwijk A.

    2015-01-01

    In the invited talk 'Car2x with SDN, NFV and supercomputers' we report on how our past work with SDN [1, 2] allows the design of a smart mobility fieldlab in the huge parking lot of the Amsterdam Arena. We explain how we can engineer and test software that handles the complex conditions of the Car2X

  17. Car2x with software defined networks, network functions virtualization and supercomputers : technical and scientific preparations for the Amsterdam Arena telecoms fieldlab

    NARCIS (Netherlands)

    Meijer, R.; Cushing, R.; de Laat, C.; Jackson, P.; Klous, S.; Koning, R.; Makkes, M.; Meerwijk, A.; Smari, W.W.

    2015-01-01

    In the invited talk “Car2x with SDN, NFV and supercomputers” we report on how our past work with SDN [1, 2] allows the design of a smart mobility fieldlab in the huge parking lot of the Amsterdam Arena. We explain how we can engineer and test software that handles the complex conditions of the Car2X

  18. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  19. Scientific instruments, scientific progress and the cyclotron

    International Nuclear Information System (INIS)

    Baird, David; Faust, Thomas

    1990-01-01

    Philosophers speak of science in terms of theory and experiment, yet when they speak of the progress of scientific knowledge they speak in terms of theory alone. In this article it is claimed that scientific knowledge consists of, among other things, scientific instruments and instrumental techniques and not simply of some kind of justified beliefs. It is argued that one aspect of scientific progress can be characterized relatively straightforwardly - the accumulation of new scientific instruments. The development of the cyclotron is taken to illustrate this point. Eight different activities which promoted the successful completion of the cyclotron are recognised. The importance is in the machine rather than the experiments which could be run on it and the focus is on how the cyclotron came into being, not how it was subsequently used. The completed instrument is seen as a useful unit of scientific progress in its own right. (UK)

  20. Next Generation Software Development

    National Research Council Canada - National Science Library

    Manna, Zohar

    2005-01-01

    Under this grant we have studied the development of a scientifically sound basis for software development that builds on widely used pragmatic methods but is firmly grounded in well-established formal...

  1. The Machine Scoring of Writing

    Science.gov (United States)

    McCurry, Doug

    2010-01-01

    This article provides an introduction to the kind of computer software that is used to score student writing in some high stakes testing programs, and that is being promoted as a teaching and learning tool to schools. It sketches the state of play with machines for the scoring of writing, and describes how these machines work and what they do.…

  2. Current trends in free software research

    OpenAIRE

    Navarro Bosch, Ramon; Vila Marta, Sebastià

    2009-01-01

    This report analyzes how scientific research is studying free software. We find which research is being done on free software by looking into scientific journals and conference publications. The data thus obtained is analyzed and the most salient trends related to free software are identified. We also review the main works published in each free software research area.

  3. Machine Vision Handbook

    CERN Document Server

    2012-01-01

    The automation of visual inspection is becoming more and more important in modern industry as a consistent, reliable means of judging the quality of raw materials and manufactured goods. The Machine Vision Handbook equips the reader with the practical details required to engineer integrated mechanical-optical-electronic-software systems. Machine vision is first set in the context of basic information on light, natural vision, colour sensing and optics. The physical apparatus required for mechanized image capture – lenses, cameras, scanners and light sources – is discussed, followed by detailed treatment of various image-processing methods, including an introduction to the QT image processing system. QT is unique to this book, and provides an example of a practical machine vision system along with extensive libraries of useful commands, functions and images which can be implemented by the reader. The main text of the book is completed by studies of a wide variety of applications of machine vision in insp...

  4. Scientific Equipment Division - Overview

    International Nuclear Information System (INIS)

    Halik, J.

    2001-01-01

    Full text: The Scientific Equipment Division consists of the Design Group and the Mechanical Workshop. The activity of the Division includes the following: - designing of devices and equipment for experiments in physics, their mechanical construction and assembly, in particular vacuum chambers and installations for HV and UHV; - maintenance and upgrading of the existing installations and equipment in our Institute; - participation of our engineers and technicians in design work, equipment assembly and maintenance for experiments in foreign laboratories. The Design Group is equipped with PC computers, AutoCAD graphic software (release 2000 and Mechanical Desktop 4.0) and an A0 plotter, which allows us to produce drawings and 2- and 3-dimensional mechanical documentation to world standards. The Mechanical Workshop can offer a wide range of machining and treatment methods with satisfactory tolerances and surface quality. It offers the following possibilities: - turning - cylindrical elements of a length up to 2000 mm and a diameter up to 400 mm, and also disc-type elements of a diameter up to 600 mm and a length not exceeding 300 mm; - milling - elements of length up to 1000 mm and gear wheels of diameter up to 300 mm; - grinding - flat surfaces of dimensions up to 300 mm x 1000 mm and cylindrical elements of a diameter up to 200 mm and a length up to 800 mm; - drilling - holes of a diameter up to 50 mm; - welding - electrical and gas welding, including TIG vacuum-tight welding; - soft and hard soldering; - mechanical works including precision engineering; - plastics treatment - machining and polishing using diamond milling, modelling, lamination of various shapes and materials, including plexiglas, scintillators and light-guides; - painting - paint spraying, with the possibility of using a furnace-fired drier of internal dimensions 800 mm x 800 mm x 800 mm. Our workshop possesses a CNC milling machine which can be used for machining of work-pieces up to 500 kg

  5. Mini lathe machine converted to CNC

    Directory of Open Access Journals (Sweden)

    Alexandru Morar

    2012-06-01

    Full Text Available This paper presents the adaptation of a mechanical mini-lathe to a computerized numerical control (CNC) lathing machine. This machine is composed of an ASIST mini-lathe and a two-degrees-of-freedom XZ stage designed specifically for this application. The whole system is controlled from a PC using adequate CNC control software.

  6. Machine function based control code algebras

    NARCIS (Netherlands)

    Bergstra, J.A.

    Machine functions have been introduced by Earley and Sturgis in [6] in order to provide a mathematical foundation of the use of the T-diagrams proposed by Bratman in [5]. Machine functions describe the operation of a machine at a very abstract level. A theory of hardware and software based on

  7. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
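
    The "appropriate metadata" mentioned in record 7 must capture the technology stack an application depends on. The sketch below shows one minimal way to record such a manifest; the field names are illustrative, not a published metadata standard.

```python
# Capture a machine-readable snapshot of the technology stack so that a
# future container or emulator build can reproduce it. Field names are
# invented for illustration.
import json
import platform
import sys
from importlib import metadata

def environment_manifest(packages: list[str]) -> dict:
    return {
        "os": platform.platform(),
        "architecture": platform.machine(),
        "python": sys.version,
        "packages": {p: metadata.version(p) for p in packages},
    }

if __name__ == "__main__":
    with open("environment-manifest.json", "w") as f:
        json.dump(environment_manifest(["numpy", "scipy"]), f, indent=2)
```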

  8. Sustainable machining

    CERN Document Server

    2017-01-01

    This book provides an overview on current sustainable machining. Its chapters cover the concept in economic, social and environmental dimensions. It provides the reader with proper ways to handle several pollutants produced during the machining process. The book is useful on both undergraduate and postgraduate levels and it is of interest to all those working with manufacturing and machining technology.

  9. Mr Lars Leijonborg, Minister for Higher Education and Research of Sweden visiting the cavern ATLAS, the control room of ATLAS and the machine LHC at Point 1 with Collaboration Spokesperson P. Jenni and Dr. Jos Engelen, Chief Scientific Officer of CERN.

    CERN Multimedia

    Maximilien Brice

    2008-01-01

    Mr Lars Leijonborg, Minister for Higher Education and Research of Sweden visiting the cavern ATLAS, the control room of ATLAS and the machine LHC at Point 1 with Collaboration Spokesperson P. Jenni and Dr. Jos Engelen, Chief Scientific Officer of CERN.

  10. In praise of open software

    CERN Multimedia

    2000-01-01

    Much scientific software is proprietary and beyond the reach of poorer scientific communities. This issue will become critical as companies build bioinformatics tools for genomics. The principle of open-source software needs to be defended by academic research institutions (1/2 p).

  11. Man - Machine Communication

    CERN Document Server

    Petersen, Peter; Nielsen, Henning

    1984-01-01

    This report describes a Man-to-Machine Communication module which, together with a STAC, can take care of all operator inputs from the touch-screen, tracker balls and mechanical buttons. The MMC module can also contain a G64 card, which could be a GPIB driver, but many other G64 cards could be used. The software services the input devices and makes the results accessible from the CAMAC bus. NODAL functions for Man-Machine Communication are implemented in the STAC and in the ICC.

  12. Value Framing: A Prelude to Software Problem Framing

    NARCIS (Netherlands)

    Wieringa, Roelf J.; Gordijn, Jaap; van Eck, Pascal; Cox, K.; Hall, J.G.; Rapanotti, L.

    2004-01-01

    Software problem framing is a way to find specifications for software. Software problem frames can be used to structure the environment of a software system (the machine) and specify desired software properties in such a way that we can show that software with these properties will help achieve the

  13. Case Analysis on Application of SPSS Software in Forestry Production and Scientific Research (SPSS在林业生产和科学研究中的应用实例解析)

    Institute of Scientific and Technical Information of China (English)

    琚松苗

    2012-01-01

    SPSS (Statistical Package for the Social Sciences), a statistical analysis tool, provides functions for data management, statistical analysis, trend study, tabulation and drawing, word processing and so on. In this paper the application of the statistical software SPSS in forestry production and scientific research is introduced through typical cases, from the perspective of forestry science and technology promotion.

  14. Electrical machines diagnosis

    CERN Document Server

    Trigeassou, Jean-Claude

    2013-01-01

    Monitoring and diagnosis of electrical machine faults is a scientific and economic issue which is motivated by objectives for reliability and serviceability in electrical drives.This book provides a survey of the techniques used to detect the faults occurring in electrical drives: electrical, thermal and mechanical faults of the electrical machine, faults of the static converter and faults of the energy storage unit.Diagnosis of faults occurring in electrical drives is an essential part of a global monitoring system used to improve reliability and serviceability. This diagnosis is perf

  15. Formal modeling of virtual machines

    Science.gov (United States)

    Cremers, A. B.; Hibbard, T. N.

    1978-01-01

    Systematic software design can be based on the development of a 'hierarchy of virtual machines', each representing a 'level of abstraction' of the design process. The reported investigation presents the concept of 'data space' as a formal model for virtual machines. The presented model of a data space combines the notions of data type and mathematical machine to express the close interaction between data and control structures which takes place in a virtual machine. One of the main objectives of the investigation is to show that control-independent data type implementation is only of limited usefulness as an isolated tool of program development, and that the representation of data is generally dictated by the control context of a virtual machine. As a second objective, a better understanding is to be developed of virtual machine state structures than was heretofore provided by the view of the state space as a Cartesian product.

  16. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and k-nearest neighbor prediction model performance on the same data set.
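
    To make the analogy idea concrete: a k-nearest-neighbor estimator predicts the effort of a new project as the average effort of the k most similar historical projects. The sketch below is a generic illustration of that principle; the features and toy data are invented, and it is not the NASA model itself.

```python
# Toy analogy-based (k-nearest-neighbor) effort estimator.
import numpy as np

def knn_estimate(history_X, history_effort, project, k=3):
    X = np.asarray(history_X, dtype=float)
    scale = X.std(axis=0) + 1e-9              # normalise feature ranges
    d = np.linalg.norm((X - project) / scale, axis=1)
    nearest = np.argsort(d)[:k]               # k most similar past projects
    return float(np.mean(np.asarray(history_effort)[nearest]))

# Features: [KSLOC, team size]; effort in person-months (invented data).
past_projects = [[20, 5], [120, 30], [45, 10], [300, 60]]
past_effort = [80, 600, 200, 1800]
print(knn_estimate(past_projects, past_effort, np.array([50.0, 12.0])))
```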

  17. Machine Learning for Security

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Applied statistics, aka ‘Machine Learning’, offers a wealth of techniques for answering security questions. It’s a much hyped topic in the big data world, with many companies now providing machine learning as a service. This talk will demystify these techniques, explain the math, and demonstrate their application to security problems. The presentation will include how-to’s on classifying malware, looking into encrypted tunnels, and finding botnets in DNS data. About the speaker Josiah is a security researcher with HP TippingPoint DVLabs Research Group. He has over 15 years of professional software development experience. Josiah used to do AI, with work focused on graph theory, search, and deductive inference on large knowledge bases. As rules only get you so far, he moved from AI to using machine learning techniques identifying failure modes in email traffic. There followed digressions into clustered data storage and later integrated control systems. Current ...

  18. Quantum Virtual Machine (QVM)

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low level quantum assembly codes and returns the results of such executions.
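
    The abstract's key design point is the pluggable QPU: the same execution interface can be backed by a simulator or by real hardware. The sketch below illustrates that plugin pattern in Python; the class and method names are hypothetical and do not reflect the actual QVM API.

```python
# Illustration of a pluggable quantum processing unit (QPU) interface.
from abc import ABC, abstractmethod

class QPU(ABC):
    @abstractmethod
    def execute(self, assembly: str, shots: int) -> dict:
        """Run low-level quantum assembly; return measurement counts."""

class VirtualQPU(QPU):
    """Simulator backend (stub: a real one would parse and simulate)."""
    def execute(self, assembly: str, shots: int) -> dict:
        return {"0": shots}  # deterministic placeholder result

def run(backend: QPU, program: str, shots: int = 1024) -> dict:
    return backend.execute(program, shots)

print(run(VirtualQPU(), "H 0\nMEASURE 0"))
```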

  19. Simple machines

    CERN Document Server

    Graybill, George

    2007-01-01

    Just how simple are simple machines? With our ready-to-use resource, they are simple to teach and easy to learn! Chock-full of information and activities, we begin with a look at force, motion and work, and examples of simple machines in daily life are given. With this background, we move on to different kinds of simple machines including: Levers, Inclined Planes, Wedges, Screws, Pulleys, and Wheels and Axles. An exploration of some compound machines follows, such as the can opener. Our resource is a real time-saver, as all the reading passages and student activities are provided. Presented in s

  20. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review and analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  1. Face machines

    Energy Technology Data Exchange (ETDEWEB)

    Hindle, D.

    1999-06-01

    The article surveys the latest equipment available from the world's manufacturers of a range of machines for tunnelling. These are grouped under the headings: excavators; impact hammers; road headers; and shields and tunnel boring machines. Products of thirty manufacturers are referred to. Addresses and fax numbers of companies are supplied. 5 tabs., 13 photos.

  2. Electric machine

    Science.gov (United States)

    El-Refaie, Ayman Mohamed Fawzi [Niskayuna, NY; Reddy, Patel Bhageerath [Madison, WI

    2012-07-17

    An interior permanent magnet electric machine is disclosed. The interior permanent magnet electric machine comprises a rotor comprising a plurality of radially placed magnets each having a proximal end and a distal end, wherein each magnet comprises a plurality of magnetic segments and at least one magnetic segment towards the distal end comprises a high resistivity magnetic material.

  3. Machine Learning.

    Science.gov (United States)

    Kirrane, Diane E.

    1990-01-01

    As scientists seek to develop machines that can "learn," that is, solve problems by imitating the human brain, a gold mine of information on the processes of human learning is being discovered, expert systems are being improved, and human-machine interactions are being enhanced. (SK)

  4. Nonplanar machines

    International Nuclear Information System (INIS)

    Ritson, D.

    1989-05-01

    This talk examines methods available to minimize, but never entirely eliminate, degradation of machine performance caused by terrain following. Breaking of planar machine symmetry for engineering convenience and/or monetary savings must be balanced against small performance degradation, and can only be decided on a case-by-case basis. 5 refs

  5. Machine speech and speaking about machines

    Energy Technology Data Exchange (ETDEWEB)

    Nye, A. [Univ. of Wisconsin, Whitewater, WI (United States)

    1996-12-31

    Current philosophy of language prides itself on scientific status. It boasts of being no longer contaminated with queer mental entities or idealist essences. It theorizes language as programmable variants of formal semantic systems, reimaginable either as the properly epiphenomenal machine functions of computer science or the properly material neural networks of physiology. Whether or not such models properly capture the physical workings of a living human brain is a question that scientists will have to answer. I, as a philosopher, come at the problem from another direction. Does contemporary philosophical semantics, in its dominant truth-theoretic and related versions, capture actual living human thought as it is experienced, or does it instead reflect, regardless of (perhaps dubious) scientific credentials, pathology of thought, a pathology with a disturbing social history.

  6. TAYLOR’S SCIENTIFIC MANAGEMENT

    OpenAIRE

    Dimitrios N. KOUMPAROULIS; Dionysios K. SOLOMOS

    2012-01-01

    Frederick Taylor is known as the father of modern management. Taylor's scientific management revolutionized industry and helped shape the modern organization, because it explains how to increase production by working smarter, not harder. Taylor's ideas were not limited to serving the company's bottom line; the increase in productivity benefited the workforce as well. The principles of scientific management have become a machine of uni...

  7. The economics of information systems and software

    CERN Document Server

    Veryard, Richard

    2014-01-01

    The Economics of Information Systems and Software focuses on the economic aspects of information systems and software, including advertising, evaluation of information systems, and software maintenance. The book first elaborates on value and values, software business, and scientific information as an economic category. Discussions focus on information products and information services, special economic properties of information, culture and convergence, hardware and software products, materiality and consumption, technological progress, and software flexibility. The text then takes a look at a

  8. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  9. Machine Learning

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Machine learning, which builds on ideas in computer science, statistics, and optimization, focuses on developing algorithms to identify patterns and regularities in data, and using these learned patterns to make predictions on new observations. Boosted by its industrial and commercial applications, the field of machine learning is quickly evolving and expanding. Recent advances have seen great success in the realms of computer vision, natural language processing, and broadly in data science. Many of these techniques have already been applied in particle physics, for instance for particle identification, detector monitoring, and the optimization of computer resources. Modern machine learning approaches, such as deep learning, are only just beginning to be applied to the analysis of High Energy Physics data to approach more and more complex problems. These classes will review the framework behind machine learning and discuss recent developments in the field.

  10. Machine Translation

    Indian Academy of Sciences (India)

    Research MT System Example: The 'Janus' Translating Phone Project. The Janus ... based on laptops, and simultaneous translation of two speakers in a dialogue. For more ... The current focus in MT research is on using machine learning.

  11. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  12. Automating horizontal boring and milling machine

    International Nuclear Information System (INIS)

    Naqvi, S.A.R.; Mahmood, T.; Choudhry, M.A.; Hanif, A.

    2012-01-01

    Aiming at the modification requirements of many old imported machine tools in industry, schemes suited to such renovation are presented in this paper. A horizontal boring and milling machine (HBM) involved in the machining of the Al-Khalid tank has been modified using Mitsubishi FX-1N and FX-2N PLCs. The developed software controls all the functions of the said machine, including switching the oil pump on and off, spindle rotation and machine movement in all axes. All the decisions required by the machine for actuation of instructions are based on the data acquired from the control panel, timers and limit switches. The developed software also minimizes downtime, improves operator safety and ensures error-free actuation of instructions. (author)
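
    The decision logic described — actuation gated by panel inputs, timers and limit switches — is classic interlock logic. The following Python sketch mimics one rung of such logic for the spindle; all signal names are invented, and a real retrofit would express this as PLC ladder code, not Python.

```python
# Rough Python analogue of a PLC interlock rung for spindle control.
from dataclasses import dataclass

@dataclass
class Inputs:
    start_pressed: bool
    stop_pressed: bool
    axis_limit_hit: bool
    oil_pressure_ok: bool

def spindle_command(inp: Inputs, running: bool) -> bool:
    if inp.stop_pressed or inp.axis_limit_hit or not inp.oil_pressure_ok:
        return False   # any stop request or fault halts the spindle
    if inp.start_pressed:
        return True    # start only on explicit operator request
    return running     # otherwise latch the previous state

print(spindle_command(Inputs(True, False, False, True), running=False))
```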

  13. Development of a software concept for computer-aided technical detail planning for machines in German hard coal mining; Entwicklung eines Softwarekonzeptes fuer rechnergestuetzte maschinentechnische Datailplanung im deutschen Steinkohlenbergbau

    Energy Technology Data Exchange (ETDEWEB)

    Borstell, D

    1994-12-31

    CAD systems have long been an aid in German hard coal mining for reducing costs in all technical planning tasks. The use of computers will offer as yet unused possibilities for further cost savings in the future. For this purpose, this book introduces a new software concept for the technical planning for machines in the mines of the Ruhr. By the continued and thorough use of the potential of modern computer techniques, by the application of the knowledge of planning science orientated towards practice and by transferring computer-aided planning applications from other branches of industry, a further contribution is to be made to reducing costs in technical planning. The heart of the future technical planning workplace will be a graphically orientated surface with the three-dimensional representation of the pit structure on the screen. On this surface, after choosing the work area in the pit structure, there is access to the available software tools. These support the planning engineer in information and design work (connection to databank, 2D/3D-CAD, libraries of operating means and standard parts) and give support to the method of procedure (through expert systems, sample specifications, checklists). They will also offer help in inspection and decision-making (by simulation and calculation routines, expert systems) and in supporting publicity activities (text processing, desktop publishing). The computer-aided planning system of the future will develop from the two-dimensional design environment usual today into a comprehensive integrated 3D engineering system. (orig.)

  14. Usability in Scientific Databases

    Directory of Open Access Journals (Sweden)

    Ana-Maria Suduc

    2012-07-01

    Full Text Available Usability, most often defined as the ease of use and acceptability of a system, affects users' performance and their job satisfaction when working with a machine. Therefore, usability is a very important aspect which must be considered in the process of system development. The paper presents numerical data related to the history of scientific research on the usability of information systems, as reflected in the information provided by three important scientific databases, Science Direct, ACM Digital Library and IEEE Xplore Digital Library, for different queries related to this field.

  15. Preliminary Test of Upgraded Conventional Milling Machine into PC Based CNC Milling Machine

    International Nuclear Information System (INIS)

    Abdul Hafid

    2008-01-01

    CNC (Computerized Numerical Control) milling machines pose a challenge for innovation in the field of machining. To obtain machining quality equivalent to that of a CNC milling machine, a conventional milling machine was upgraded into a PC-based CNC milling machine through mechanical and instrumentation changes. As a replacement control, a servo drive and proximity sensors were used. A computer program was constructed to issue instructions to the milling machine. The program structure consists of a GUI model and a ladder diagram, implemented in a programming system called RTX software. The result of the upgrade is the computer program and CNC instruction jobs. This is a first step, and the work will be continued. By improving the performance of the milling machine, the user can work more optimally and safely, with reduced risk of accidents. (author)

  16. Metric for Measuring Software Power | Akwukwuma | Journal of the ...

    African Journals Online (AJOL)

    The term “power” has been used to describe software in the software community, especially by software vendors. However, there has been no formal definition of software power, nor has there been any scientific method of determining software power. It is therefore the objective of this paper to examine the attributes of software ...

  17. The Three Pillars of Machine Programming

    OpenAIRE

    Gottschlich, Justin; Solar-Lezama, Armando; Tatbul, Nesime; Carbin, Michael; Rinard, Martin; Barzilay, Regina; Amarasinghe, Saman; Tenenbaum, Joshua B; Mattson, Tim

    2018-01-01

    In this position paper, we describe our vision of the future of machine programming through a categorical examination of three pillars of research. Those pillars are: (i) intention, (ii) invention, and (iii) adaptation. Intention emphasizes advancements in the human-to-computer and computer-to-machine-learning interfaces. Invention emphasizes the creation or refinement of algorithms or core hardware and software building blocks through machine learning (ML). Adaptation emphasizes advances in t...

  18. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
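
    Record 18's central idea — write down the model first, then derive inference from it — can be shown in a few lines. The following toy example states a Beta-Binomial model for a biased coin and computes its posterior in closed form; it uses SciPy and stands in for the far richer models a probabilistic programming language such as Infer.NET supports.

```python
# Model-based toy example: Beta prior + Binomial likelihood for a coin's
# unknown bias; conjugacy gives the posterior in closed form.
from scipy import stats

heads, tails = 27, 13            # observed data
prior_a, prior_b = 1.0, 1.0      # Beta(1, 1) = uniform prior on the bias

posterior = stats.beta(prior_a + heads, prior_b + tails)
print(f"posterior mean bias: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
```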

  19. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These notions are not entirely accurate, so the concept of open source software needs to be introduced, covering its history, licences and how to choose a licence, as well as considerations in choosing among the available open source software. Keywords: Licence, Open Source, HAKI

  20. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    ... information that is returned from the tools to the human user, and the forms in which these outputs are presented. STAGE OF DEVELOPMENT: What... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM, APPENDIX 2, INTRODUCTION: This document and the Automated Software Tool Monitoring Program (Appendix 1) are... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types

  1. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (Dataflow Graph)

  2. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways; here, we consider a range of review conditions and software solutions: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.
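
    The 'surveillance' workflow described — machine learning plus traditional information retrieval to flag newly published studies relevant to a living review — boils down to a text classifier trained on earlier screening decisions. Below is a minimal sketch of that idea using scikit-learn; the abstracts and labels are invented, and the real EPPI-Reviewer pipeline is considerably more elaborate.

```python
# Toy relevance classifier for incoming publication abstracts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

abstracts = [
    "randomised trial of cognitive therapy for depression",
    "qualitative study of patient experience in primary care",
    "cohort study of statin use and cardiovascular outcomes",
    "protein folding dynamics simulated at atomic resolution",
]
relevant = [1, 1, 1, 0]  # screening decisions from earlier review rounds

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(abstracts, relevant)

# Score a newly published record for review relevance.
print(model.predict_proba(["trial of exercise therapy for depression"])[:, 1])
```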

  3. Machine Protection

    International Nuclear Information System (INIS)

    Zerlauth, Markus; Schmidt, Rüdiger; Wenninger, Jörg

    2012-01-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012

  4. Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.; Carroll, Thomas E.; Muller, George

    2017-04-21

    The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and Probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as an example of a widely used data driven classification/modeling strategy.
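
    Of the two model families named in the abstract, the Hidden Markov Model is easily demonstrated: score an observed event sequence under a 'benign' and an 'attack' model and compare likelihoods. The sketch below implements the scaled forward algorithm in NumPy; every probability in it is made up for illustration.

```python
# HMM forward algorithm (scaled): log-likelihood of an observation
# sequence under a model (pi: initial, A: transition, B: emission).
import numpy as np

def forward_loglik(pi, A, B, obs):
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()          # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

# Two hidden states; observations: 0 = normal event, 1 = rare event.
pi = np.array([0.8, 0.2])
A_benign = np.array([[0.9, 0.1], [0.3, 0.7]])
B_benign = np.array([[0.99, 0.01], [0.9, 0.1]])
A_attack = np.array([[0.5, 0.5], [0.1, 0.9]])
B_attack = np.array([[0.9, 0.1], [0.4, 0.6]])
seq = [0, 1, 1, 0, 1]
print("benign:", forward_loglik(pi, A_benign, B_benign, seq))
print("attack:", forward_loglik(pi, A_attack, B_attack, seq))
```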

  5. Machine Protection

    CERN Document Server

    Zerlauth, Markus; Wenninger, Jörg

    2012-01-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012.

  6. Machine Protection

    Energy Technology Data Exchange (ETDEWEB)

    Zerlauth, Markus; Schmidt, Rüdiger; Wenninger, Jörg [European Organization for Nuclear Research, Geneva (Switzerland)

    2012-07-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012.

  7. Utilising artificial intelligence in software defined wireless sensor network

    CSIR Research Space (South Africa)

    Matlou, OG

    2017-10-01

    Full Text Available Software Defined Wireless Sensor Network (SDWSN) is realised by infusing the Software Defined Network (SDN) model into Wireless Sensor Networks (WSN); the aim is to overcome the challenges of WSN. Artificial Intelligence (AI) and machine learning...

  8. Teletherapy machine

    International Nuclear Information System (INIS)

    Panyam, Vinatha S.; Rakshit, Sougata; Kulkarni, M.S.; Pradeepkumar, K.S.

    2017-01-01

    Radiation Standards Section (RSS), RSSD, BARC is the national metrology institute for ionizing radiation. RSS develops and maintains radiation standards for X-ray, beta, gamma and neutron radiations. In radiation dosimetry, traceability, accuracy and consistency of radiation measurements is very important especially in radiotherapy where the success of patient treatment is dependent on the accuracy of the dose delivered to the tumour. Cobalt teletherapy machines have been used in the treatment of cancer since the early 1950s and India had its first cobalt teletherapy machine installed at the Cancer Institute, Chennai in 1956

  9. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more streamlined, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software on nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after presentations of the scientific and technical papers. These papers are reported here in their integral form in the following sections.

  10. Division of Scientific Equipment - Overview

    International Nuclear Information System (INIS)

    Halik, J.

    2002-01-01

    Full text: The Scientific Equipment Division consists of the Design Group and the Mechanical Workshop. The activity of the Division includes the following: * designs of devices and equipment for experiments in physics; their mechanical construction and assembly. In particular, these are vacuum chambers and installations for HV and UHV;* maintenance and upgrading of the existing installations and equipment in our Institute; * participation of our engineers and technicians in design works, equipment assembly and maintenance for experiments in foreign laboratories. The Design Group is equipped with PC-computers and AutoCAD graphic software (release 2000 and Mechanical Desktop 4.0) and an A0 plotter, which allow us to make drawings and 2- and 3-dimensional mechanical documentation to the world standards. The Mechanical Workshop offers a wide range of machining and treatment methods with satisfactory tolerances and surface quality. They include: * turning - cylindrical elements of a length up to 2000 mm and a diameter up to 400 mm, and also disc type elements of a diameter up to 600 mm and a length not exceeding 300 mm, * milling - elements of length up to 1000 mm and gear wheels of diameter up to 300 mm, * grinding - flat surfaces of dimensions up to 300 mm x 1000 mm and cylindrical elements of a diameter up to 200 mm and a length up to 800 mm, * drilling - holes of a diameter up to 50 mm, * welding - electrical and gas welding, including TIG vacuum-tight welding, * soft and hard soldering, * mechanical works including precision engineering, * plastics treatment - machining and polishing using diamond milling, modelling, lamination of various shapes and materials, including plexiglas, scintillators and light-guides, * painting - paint spraying with possibility of using furnace-fired drier of internal dimensions of 800 mm x 800 mm x 800 mm. Our workshop is equipped with the CNC milling machine which can be used for machining of work pieces up to 500 kg. The machine

  11. Diamond turning machine controller implementation

    Energy Technology Data Exchange (ETDEWEB)

    Garrard, K.P.; Taylor, L.W.; Knight, B.F.; Fornaro, R.J.

    1988-12-01

    The standard controller for a Pneumo ASG 2500 Diamond Turning Machine, an Allen Bradley 8200, has been replaced with a custom high-performance design. This controller consists of four major components. Axis position feedback information is provided by a Zygo Axiom 2/20 laser interferometer with 0.1 micro-inch resolution. Hardware interface logic couples the computer's digital and analog I/O channels to the diamond turning machine's analog motor controllers, the laser interferometer, and other machine status and control information. It also provides front panel switches for operator override of the computer controller and implements the emergency stop sequence. The remaining two components, the control computer hardware and software, are discussed in detail below.

  12. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  13. The machine in multimedia analytics

    NARCIS (Netherlands)

    Zahálka, J.

    2017-01-01

    This thesis investigates the role of the machine in multimedia analytics, a discipline that combines visual analytics with multimedia analysis algorithms in order to unlock the potential of multimedia collections as sources of knowledge in scientific and applied domains. Specifically, the central

  14. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  15. Machine testing

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with a laboratory exercise of 3 hours' duration as part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercise includes a series of tests carried out by the student on a conventional and a numerically controlled lathe, respectively. This document...

  16. Machine rates for selected forest harvesting machines

    Science.gov (United States)

    R.W. Brinker; J. Kinard; Robert Rummer; B. Lanford

    2002-01-01

    Very little new literature has been published on the subject of machine rates and machine cost analysis since 1989 when the Alabama Agricultural Experiment Station Circular 296, Machine Rates for Selected Forest Harvesting Machines, was originally published. Many machines discussed in the original publication have undergone substantial changes in various aspects, not...

  17. Open Science: Open source licenses in scientific research

    OpenAIRE

    Guadamuz, Andres

    2006-01-01

    The article examines the validity of OSS (open source software) licenses for scientific, as opposed to creative works. It draws on examples of OSS licenses to consider their suitability for the scientific community and scientific research.

  18. Fiscal 1997 project on the R and D of industrial scientific technology under consignment from NEDO. Report on the results of the R and D of new software structuring models (R and D of micromachine cooperative control use software); 1997 nendo sangyo kagaku gijutsu kenkyu kaihatsu jigyo Shin Energy Sangyo Gijutsu Sogo Kaihatsu Kiko itaku. Shin software kozoka model no kenkyu kaihatsu (bisho kikai kyocho seigyoyo software no kenkyu kaihatsu) seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    R and D was conducted on software structuring models which ease the development and maintenance of software systems and meet the diversification of needs. As for the study of the programming language for cooperative control, R and D of the agent-oriented language Flage was carried out, covering expansion of the language functions, arrangement of the network functions, development of exercises, etc. As to the formulation of agent knowledge, proposed were processes to make a program from the specifications, and EVA, a mechanism for responding to changes in the specifications of existing programs. In relation to the basic theory of cooperation systems, a study was made mainly of the object-oriented attribute grammar OOAG as a model representing cooperative computation in the software process as a rule group. Concerning the study of the situation recognition mechanism, models of communication and reasoning among cooperating agents were researched. 187 refs., 107 figs., 23 tabs.

  19. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been widely published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving testing as a science, on the grounds that the processes by which it is applied follow the steps of the scientific method: inputs, processes, outputs. This paper examines the similarities between testing and science, and the characteristics of testing as a science.

  20. Software platform virtualization in chemistry research and university teaching.

    Science.gov (United States)

    Kind, Tobias; Leamy, Tim; Leary, Julie A; Fiehn, Oliver

    2009-11-16

    Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, Linux or Mac OS X. Instead of installing software on different computers, it is possible to install those applications on a single computer using virtual machine software. Software platform virtualization allows a single host operating system to execute multiple guest operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages, we have confirmed that the computational speed penalty for using virtual machines is low, around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and the development of software for different operating systems. In order to obtain maximum performance, the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs, especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics, as well as the missing cheminformatics education at universities worldwide.
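
    A simple way to measure the speed penalty the authors report is to time an identical CPU-bound workload natively and inside the guest. The sketch below is one such micro-benchmark; the ~5-10% figure in the abstract is their measurement, and any particular setup may differ.

```python
# Micro-benchmark: run on the host OS and again inside the virtual
# machine, then compare the best-of-N timings.
import time

def workload(n: int = 200_000) -> float:
    total = 0.0
    for i in range(1, n):
        total += i ** 0.5      # deterministic CPU-bound kernel
    return total

def best_time(repeats: int = 5) -> float:
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        workload()
        times.append(time.perf_counter() - t0)
    return min(times)          # best-of-N suppresses scheduling noise

# Overhead in percent: (guest_time / host_time - 1) * 100
print(f"best of 5: {best_time():.4f} s")
```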

  1. Human-machine interface upgrade

    International Nuclear Information System (INIS)

    Kropik, M.; Matejka, K.; Sklenka, L.; Chab, V.

    2002-01-01

    The article describes a new human-machine interface that was installed at the VR-1 training reactor. The human-machine interface upgrade was completed in the summer of 2001. The interface was designed with respect to functional, ergonomic and aesthetic requirements. The interface is based on a personal computer equipped with two displays. One display enables alphanumeric communication between the reactor operator and the nuclear reactor I and C. The second display is a graphical one. It presents the status of the reactor, principal parameters (such as power and period), control rod positions, and the time course of the reactor power. Furthermore, it is possible to set parameters, to show the active core configuration, to perform reactivity calculations, etc. The software for the new human-machine interface was produced with the InTouch development tool of the Wonderware Company. It is possible to switch the language of the interface between Czech and English, because the reactor hosts many foreign students and visitors. Microcomputer-based communication units with appropriate software were developed to connect the new human-machine interface with the existing reactor I and C. The new human-machine interface at the VR-1 training reactor improves the comfort and safety of the reactor's utilisation, facilitates experiments and training, and provides better support for foreign visitors. (orig.)

  2. Automation of a universal machine

    International Nuclear Information System (INIS)

    Rodriguez S, J.

    1997-01-01

    The development of the hardware and software of a control system for a servo-hydraulic machine is presented. The universal machine is an Instron, model 1331, used for mechanical tests. The software includes the acquisition of measurement data, processing, and graphic presentation of the results for 'tension'-type assays. The control is based on a PPI (Programmable Peripheral Interface) 8255, in which the different states of the machine are set. The control functions of the machine are: a) start of an assay, b) pause in the assay, c) end of the assay, d) choice of the control mode of the machine, which can be load, stroke or strain mode. For the data acquisition, a commercial card, National Products model DAS-16, plugged into a slot of a PC, was used. Three transducers provide the analog signals: a load cell, an LVDT and an extensometer. All the data are digitalized and handled in order to present the results in the appropriate working units. A stress-strain graph is obtained on the screen of the PC for a tension test of a specific material. The points of maximum stress, rupture stress and yield stress of the material under test are shown. (Author)
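
    The processing step — turning digitised load-cell and extensometer readings into a stress-strain curve and extracting the characteristic stresses — is straightforward to sketch. The toy data and the 'ultimate = maximum, rupture = final point' conventions below are illustrative; the paper does not specify how its yield point is located, so that step is omitted.

```python
# Convert digitised tension-test readings into engineering stress-strain
# and report ultimate (maximum) and rupture (final) stresses.
import numpy as np

def stress_strain(load_N, extension_mm, area_mm2, gauge_mm):
    stress = np.asarray(load_N) / area_mm2        # MPa, since N/mm^2 = MPa
    strain = np.asarray(extension_mm) / gauge_mm  # dimensionless
    return stress, strain

load = [0, 500, 1000, 1400, 1600, 1500, 900]      # invented readings (N)
ext = [0.0, 0.05, 0.10, 0.20, 0.45, 0.80, 1.10]   # invented readings (mm)
stress, strain = stress_strain(load, ext, area_mm2=78.5, gauge_mm=50.0)
print(f"ultimate stress: {stress.max():.1f} MPa")
print(f"rupture stress:  {stress[-1]:.1f} MPa")
```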

  3. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department. The program also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software; this will make the process of calculating teleroentgenograms easier, because methodological points will be placed automatically.

  4. Introduction: Minds, Bodies, Machines

    Directory of Open Access Journals (Sweden)

    Deirdre Coleman

    2008-10-01

    Full Text Available This issue of 19 brings together a selection of essays from an interdisciplinary conference on 'Minds, Bodies, Machines' convened last year by Birkbeck's Centre for Nineteenth-Century Studies, University of London, in partnership with the English programme, University of Melbourne and software developers Constraint Technologies International (CTI). The conference explored the relationship between minds, bodies and machines in the long nineteenth century, with a view to understanding the history of our technology-driven, post-human visions. It is in the nineteenth century that the relationship between the human and the machine under post-industrial capitalism becomes a pervasive theme. From Blake on the mills of the mind by which we are enslaved, to Carlyle's and Arnold's denunciation of the machinery of modern life, from Dickens's sooty fictional locomotive Mr Pancks, who 'snorted and sniffed and puffed and blew, like a little labouring steam-engine', and 'shot out […] cinders of principles, as if it were done by mechanical revolvency', to the alienated historical body of the late-nineteenth-century factory worker under Taylorization, whose movements and gestures were timed, regulated and rationalised to maximize efficiency; we find a cultural preoccupation with the mechanisation of the nineteenth-century human body that uncannily resonates with modern dreams and anxieties around technologies of the human.

  5. Electric machines

    CERN Document Server

    Gross, Charles A

    2006-01-01

    BASIC ELECTROMAGNETIC CONCEPTS: Basic Magnetic Concepts; Magnetically Linear Systems: Magnetic Circuits; Voltage, Current, and Magnetic Field Interactions; Magnetic Properties of Materials; Nonlinear Magnetic Circuit Analysis; Permanent Magnets; Superconducting Magnets; The Fundamental Translational EM Machine; The Fundamental Rotational EM Machine; Multiwinding EM Systems; Leakage Flux; The Concept of Ratings in EM Systems; Summary; Problems. TRANSFORMERS: The Ideal n-Winding Transformer; Transformer Ratings and Per-Unit Scaling; The Nonideal Three-Winding Transformer; The Nonideal Two-Winding Transformer; Transformer Efficiency and Voltage Regulation; Practical Considerations; The Autotransformer; Operation of Transformers in Three-Phase Environments; Sequence Circuit Models for Three-Phase Transformer Analysis; Harmonics in Transformers; Summary; Problems. BASIC MECHANICAL CONSIDERATIONS: Some General Perspectives; Efficiency; Load Torque-Speed Characteristics; Mass Polar Moment of Inertia; Gearing; Operating Modes; Translational Systems; A Comprehensive Example: The Elevator; P...

  6. Charging machine

    International Nuclear Information System (INIS)

    Medlin, J.B.

    1976-01-01

    A charging machine for loading fuel slugs into the process tubes of a nuclear reactor includes a tubular housing connected to the process tube, a charging trough connected to the other end of the tubular housing, a device for loading the charging trough with a group of fuel slugs, means for equalizing the coolant pressure in the charging trough with the pressure in the process tubes, means for pushing the group of fuel slugs into the process tube and a latch and a seal engaging the last object in the group of fuel slugs to prevent the fuel slugs from being ejected from the process tube when the pusher is removed and to prevent pressure liquid from entering the charging machine. 3 claims, 11 drawing figures

  7. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  8. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  9. Genesis machines

    CERN Document Server

    Amos, Martyn

    2014-01-01

    Silicon chips are out. Today's scientists are using real, wet, squishy, living biology to build the next generation of computers. Cells, gels and DNA strands are the 'wetware' of the twenty-first century. Much smaller and more intelligent, these organic computers open up revolutionary possibilities. Tracing the history of computing and revealing a brave new world to come, Genesis Machines describes how this new technology will change the way we think not just about computers - but about life itself.

  10. Controls and Machine Protection Systems

    CERN Document Server

    Carrone, E.

    2016-01-01

    Machine protection, as part of accelerator control systems, can be managed with a 'functional safety' approach, which takes into account product life cycle, processes, quality, industrial standards and cybersafety. This paper will discuss strategies to manage such complexity and the related risks, with particular attention to fail-safe design and safety integrity levels, software and hardware standards, testing, and verification philosophy. It will also discuss an implementation of a machine protection system at the SLAC National Accelerator Laboratory's Linac Coherent Light Source (LCLS).

  11. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  12. Scientific progress report 1980

    International Nuclear Information System (INIS)

    1981-01-01

    The R&D projects in this field and the infrastructural tasks mentioned are handled in seven working groups and two project groups: computer systems; numerical and applied mathematics; software development; process computer systems hardware; nuclear electronics, measurement and automatic control technology; research on components and irradiation tests; central data processing; processing of process data in medicine; and cooperation in the BERNET project at the 'Wissenschaftliches Rechenzentrum Berlin (WRB)' (scientific computing centre in Berlin). (orig./WB)

  13. The Evolution of Software Publication in Astronomy

    Science.gov (United States)

    Cantiello, Matteo

    2018-01-01

    Software is a fundamental component of the scientific research process. As astronomical discoveries increasingly rely on complex numerical calculations and the analysis of big data sets, publishing and documenting software is a fundamental step in ensuring transparency and reproducibility of results. I will briefly discuss the recent history of software publication and highlight the challenges and opportunities ahead.

  14. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  15. Toward Intelligent Software Defect Detection

    Science.gov (United States)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
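
    As a hedged illustration of the learning-by-example idea described above, the sketch below trains a classifier on token n-grams of individual source lines. The lines, labels, and features are invented for illustration and are not the paper's actual feature set.

```python
# Hypothetical sketch: learning line-level defect detectors by example.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier

# Each training example is a single source line; labels mark known defects.
lines = ["strcpy(buf, input);", "if (x = 0) return;", "int n = length(a);"]
labels = [1, 1, 0]  # 1 = defective, 0 = clean (invented labels)

# Token n-grams stand in for richer static-analysis features.
vec = CountVectorizer(analyzer="word", token_pattern=r"\S+", ngram_range=(1, 2))
X = vec.fit_transform(lines)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(vec.transform(["strcpy(dst, src);"])))
```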

  16. Development of micro pattern cutting simulation software

    International Nuclear Information System (INIS)

    Lee, Jong Min; Song, Seok Gyun; Choi, Jeong Ju; Novandy, Bondhan; Kim, Su Jin; Lee, Dong Yoon; Nam, Sung Ho; Je, Tae Jin

    2008-01-01

    The micro pattern machining on the surface of a wide mold is not easy to simulate with conventional software. In this paper, software is developed for micro pattern cutting simulation. The 3D geometry of v-groove, rectangular groove, pyramid, and pillar patterns is visualized using C++ and the OpenGL library. The micro cutting force is also simulated for each pattern

  17. Dynamic Modal Analysis of Vertical Machining Centre Components

    OpenAIRE

    Anayet U. Patwari; Waleed F. Faris; A. K. M. Nurul Amin; S. K. Loh

    2009-01-01

    The paper presents a systematic procedure and details of the use of experimental and analytical modal analysis techniques for the structural dynamic evaluation of a vertical machining centre. The main results deal with assessment of the mode shapes of the different components of the vertical machining centre. A simplified experimental modal analysis of the different components of the milling machine was carried out. The model of the different machine tool structures is made by design software...

  18. Manipulator for plasma-assisted machining of components made of materials with low machinability

    International Nuclear Information System (INIS)

    Lyaoshchukov, M.M.; Agadzhanyan, R.A.

    1984-01-01

    The All-Union Scientific-Research and Technological Institute of Pump Engineering developed, and the ''Uralgidromash'' Production Association has adopted, a remotely controlled manipulator for the plasma-assisted machining (PAM) of components made of materials with low machinability. The manipulator is distinguished by its universal design and can be used for machining both the external and internal surfaces of bodies of revolution, as well as end faces and various curvilinear surfaces

  19. Representational Machines

    DEFF Research Database (Denmark)

    Photography not only represents space. Space is produced photographically. Since its inception in the 19th century, photography has brought to light a vast array of represented subjects. Always situated in some spatial order, photographic representations have been operatively underpinned by social...... to the enterprises of the medium. This is the subject of Representational Machines: How photography enlists the workings of institutional technologies in search of establishing new iconic and social spaces. Together, the contributions to this edited volume span historical epochs, social environments, technological...... possibilities, and genre distinctions. Presenting several distinct ways of producing space photographically, this book opens a new and important field of inquiry for photography research....

  20. Shear machines

    International Nuclear Information System (INIS)

    Astill, M.; Sunderland, A.; Waine, M.G.

    1980-01-01

    A shear machine for irradiated nuclear fuel elements has a replaceable shear assembly comprising a fuel element support block, a shear blade support and a clamp assembly which hold the fuel element to be sheared in contact with the support block. A first clamp member contacts the fuel element remote from the shear blade and a second clamp member contacts the fuel element adjacent the shear blade and is advanced towards the support block during shearing to compensate for any compression of the fuel element caused by the shear blade (U.K.)

  1. VIRTUAL MACHINES IN EDUCATION – CNC MILLING MACHINE WITH SINUMERIK 840D CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    Ireneusz Zagórski

    2014-11-01

    Full Text Available Machining processes nowadays could not be conducted without their inseparable elements: the cutting edge and, frequently, numerically controlled milling machines. Milling and lathe machining centres are standard equipment in many companies of the machinery industry, e.g. automotive or aircraft. It is for that reason that tertiary education should account for this rising demand. This entails introducing into the curricula forms of teaching which enable the visualisation of machining, of the milling process, and of virtual production, as well as the simulation of virtual machining centres. The Siemens Virtual Machine (Virtual Workshop) sets an example of such software, whose high functionality offers a range of learning experiences, such as: learning the design of machine tools, their configuration, basic operating functions, and the basics of CNC.

  2. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements - and an effective system for managing them - the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cycle.

  3. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electric machines, from generators to motors: the motor as a power source of the machine tool, and electric equipment for machine tools such as switches in the main circuit, automatic machines, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part handles wiring diagrams, covering the basic electric circuits of machine tools and the wiring diagrams of machines such as milling machines, planers, and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions according to the fault diagnosis and the diagnostic method of voltage and resistance measurement by tester.

  4. Environmentally Friendly Machining

    CERN Document Server

    Dixit, U S; Davim, J Paulo

    2012-01-01

    Environment-Friendly Machining provides an in-depth overview of environmentally-friendly machining processes, covering numerous different types of machining in order to identify which practice is the most environmentally sustainable. The book discusses three systems at length: machining with minimal cutting fluid, air-cooled machining and dry machining. Also covered is a way to conserve energy during machining processes, along with useful data and detailed descriptions for developing and utilizing the most efficient modern machining tools. Researchers and engineers looking for sustainable machining solutions will find Environment-Friendly Machining to be a useful volume.

  5. System support software for TSTA

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-01-01

    The software at the Tritium Systems Test Assembly (TSTA) is logically broken into two parts, the system support software and the subsystem software. The purpose of the system support software is to isolate the subsystem software from the physical hardware. In this sense the system support software forms the kernel of the software at TSTA. The kernel software performs several functions. It gathers data from CAMAC modules and makes that data available for subsystem processes. It services requests to send commands to CAMAC modules. It provides a system of logging functions and provides for a system-wide global program state that allows highly structured interaction between subsystem processes. The kernel's most visible function is to provide the Man-Machine Interface (MMI). The MMI allows the operators a window into the physical hardware and subsystem process state. Finally the kernel provides a data archiving and compression function that allows archival data to be accessed and plotted. Such kernel software as developed and implemented at TSTA is described

  6. Making better scientific figures

    Science.gov (United States)

    Hawkins, Ed; McNeall, Doug

    2016-04-01

    In the words of the UK government chief scientific adviser "Science is not finished until it's communicated" (Walport 2013). The tools to produce good visual communication have never been so easily accessible to scientists as at the present. Correspondingly, it has never been easier to produce and disseminate poor graphics. In this presentation, we highlight some good practice and offer some practical advice in preparing scientific figures for presentation to peers or to the public. We identify common mistakes in visualisation, including some made by the authors, and offer some good reasons not to trust defaults in graphics software. In particular, we discuss the use of colour scales and share our experiences in running a social media campaign (http://tiny.cc/endrainbow) to replace the "rainbow" (also "jet", or "spectral") colour scale as the default in (climate) scientific visualisation.
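
    A minimal sketch of the advice above, assuming matplotlib: the same field rendered with the "jet" rainbow colormap and with a perceptually uniform alternative such as "viridis", of the kind the campaign advocates.

```python
import numpy as np
import matplotlib.pyplot as plt

# A smooth test field on which to compare colour scales.
x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
z = np.exp(-(x**2 + y**2))

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.imshow(z, cmap="jet")      # rainbow: not perceptually uniform
ax1.set_title("jet (avoid)")
ax2.imshow(z, cmap="viridis")  # perceptually uniform; Matplotlib default since 2.0
ax2.set_title("viridis")
plt.show()
```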

  7. NASA's Scientific Visualization Studio

    Science.gov (United States)

    Mitchell, Horace G.

    2003-01-01

    Since 1988, the Scientific Visualization Studio (SVS) at NASA Goddard Space Flight Center has produced scientific visualizations of NASA's scientific research and remote sensing data for public outreach. These visualizations take the form of images, animations, and end-to-end systems and have been used in many venues: from the network news to science programs such as NOVA, from museum exhibits at the Smithsonian to White House briefings. This presentation will give an overview of the major activities and accomplishments of the SVS, and some of the most interesting projects and systems developed at the SVS will be described. Particular emphasis will be given to the practices and procedures by which the SVS creates visualizations, from the hardware and software used to the structures and collaborations by which products are designed, developed, and delivered to customers. The web-based archival and delivery system for SVS visualizations at svs.gsfc.nasa.gov will also be described.

  8. Mining software specifications methodologies and applications

    CERN Document Server

    Lo, David

    2011-01-01

    An emerging topic in software engineering and data mining, specification mining tackles software maintenance and reliability issues that cost economies billions of dollars each year. The first unified reference on the subject, Mining Software Specifications: Methodologies and Applications describes recent approaches for mining specifications of software systems. Experts in the field illustrate how to apply state-of-the-art data mining and machine learning techniques to address software engineering concerns. In the first set of chapters, the book introduces a number of studies on mining finite

  9. Machine Protection

    CERN Document Server

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an ...

  10. The Machine / Job Features Mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Alef, M. [KIT, Karlsruhe; Cass, T. [CERN; Keijser, J. J. [NIKHEF, Amsterdam; McNab, A. [Manchester U.; Roiser, S. [CERN; Schwickerath, U. [CERN; Sfiligoi, I. [Fermilab

    2017-11-22

    Within the HEPiX virtualization group and the Worldwide LHC Computing Grid’s Machine/Job Features Task Force, a mechanism has been developed which provides access to detailed information about the current host and the current job to the job itself. This allows user payloads to access meta information, independent of the current batch system or virtual machine model. The information can be accessed either locally via the filesystem on a worker node, or remotely via HTTP(S) from a webserver. This paper describes the final version of the specification from 2016 which was published as an HEP Software Foundation technical note, and the design of the implementations of this version for batch and virtual machine platforms. We discuss early experiences with these implementations and how they can be exploited by experiment frameworks.
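
    A hedged sketch of a client using the mechanism, assuming the MACHINEFEATURES and JOBFEATURES environment variables of the specification point either to a local directory with one file per key or to a base URL. The key names queried here (hs06, wall_limit) are examples and should be checked against the published technical note.

```python
import os
import urllib.request

def read_feature(area_env, key):
    """Read one feature key, locally from a directory or remotely via HTTP(S)."""
    base = os.environ.get(area_env)
    if base is None:
        return None  # mechanism not available on this resource
    if base.startswith(("http://", "https://")):
        with urllib.request.urlopen(f"{base}/{key}") as resp:
            return resp.read().decode().strip()
    with open(os.path.join(base, key)) as f:
        return f.read().strip()

print(read_feature("MACHINEFEATURES", "hs06"))      # power of the host (example key)
print(read_feature("JOBFEATURES", "wall_limit"))    # limit for this job (example key)
```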

  11. Configurable software for satellite graphics

    Energy Technology Data Exchange (ETDEWEB)

    Hartzman, P D

    1977-12-01

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during real-time operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  12. Multivariate Statistical Analysis Software Technologies for Astrophysical Research Involving Large Data Bases

    Science.gov (United States)

    Djorgovski, S. G.

    1994-01-01

    We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications and has
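
    GID3* itself is not shown here, so the following stand-in sketch uses scikit-learn's decision tree to illustrate attribute-based classification of detected objects; the attributes and toy data are invented.

```python
# Illustrative stand-in for SKICAT-style decision-tree object classification.
from sklearn.tree import DecisionTreeClassifier

# Rows: per-object image attributes (e.g. area, ellipticity, FWHM) -- invented.
X = [[12.0, 0.05, 2.1], [85.0, 0.40, 3.9], [10.5, 0.08, 2.0], [120.0, 0.55, 4.4]]
y = [0, 1, 0, 1]  # 0 = star, 1 = galaxy (toy labels)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.predict([[90.0, 0.45, 4.0]]))  # -> [1], classified as galaxy
```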

  13. Knowledge-Based Software Management

    International Nuclear Information System (INIS)

    Sally Schaffner; Matthew Bickley; Brian Bevins; Leon Clancy; Karen White

    2003-01-01

    Management of software in a dynamic environment such as is found at Jefferson Lab can be a daunting task. Software development tasks are distributed over a wide range of people with varying skill levels. The machine configuration is constantly changing requiring upgrades to software at both the hardware control level and the operator control level. In order to obtain high quality support from vendor service agreements, which is vital to maintaining 24/7 operations, hardware and software must be kept at industry's current levels. This means that periodic upgrades independent of machine configuration changes must take place. It is often difficult to identify and organize the information needed to guide the process of development, upgrades and enhancements. Dependencies between support software and applications need to be consistently identified to prevent introducing errors during upgrades and to allow adequate testing to be planned and performed. Developers also need access to information regarding compilers, make files and organized distribution directories. This paper describes a system under development at Jefferson Lab which will provide software developers and managers this type of information in a timely user-friendly fashion. The current status and future plans for the system will be detailed

  14. Live Replication of Paravirtual Machines

    OpenAIRE

    Stodden, Daniel

    2009-01-01

    Virtual machines offer a fair degree of system state encapsulation, which promotes practical advances in fault tolerance, system debugging, profiling and security applications. This work investigates deterministic replay and semi-active replication for system paravirtualization, a software discipline trading guest kernel binary compatibility for reduced dependency on costly trap-and-emulate techniques. A primary contribution is evidence that trace capturing under a piecewise deterministic exec...

  15. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  16. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  17. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  18. Data-driven machine control : a feasibility study on YieldStar

    NARCIS (Netherlands)

    Mehrafrouz, M.

    2014-01-01

    Traditionally machine control software focusses on the control flow; this is also the situation within ASML and YieldStar. With the increased complexity of the machine control software more and more data is needed to accurately control a tool like YieldStar. In other software application areas, like

  19. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
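
    The paper's sample code is in R; the following is a hedged Python analogue of the core idea, using a regression forest on a 0/1 response so that predictions are direct probability estimates. The simulated data are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
p_true = 1 / (1 + np.exp(-X[:, 0]))  # true P(Y=1 | x), logistic in the first feature
y = rng.binomial(1, p_true)          # observed binary response

# A consistent nonparametric regressor fit to a 0/1 response acts as a
# "probability machine": its predictions estimate P(Y=1 | x) directly.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
p_hat = rf.predict(X[:5])
print(np.round(p_hat, 2), np.round(p_true[:5], 2))
```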

  20. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes; grinding; components accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  1. Study of on-machine error identification and compensation methods for micro machine tools

    International Nuclear Information System (INIS)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-01-01

    Micro machining plays an important role in the manufacturing of miniature products which are made of various materials with complex 3D shapes and tight machining tolerances. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by the re-installation of the workpiece, the measurement and compensation method should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very tiny, the measurement method should be non-contact. By integrating the image reconstruction method, camera pixel correction, coordinate transformation, the error identification algorithm, and the trajectory auto-correction method, a vision-based error measurement and compensation method that can inspect the micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation was developed in this study. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to reconstruct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual cutting points and theoretical cutting points were calculated and used to correct the NC program. With the use of the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results
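
    A hedged sketch of the vision step described above, assuming OpenCV: Canny edge detection followed by contour extraction. The image file name and thresholds are placeholders, and the study's calibration, mapping, and NC-correction steps are omitted.

```python
import cv2

# Placeholder image of the machined workpiece (hypothetical file name).
img = cv2.imread("workpiece.png", cv2.IMREAD_GRAYSCALE)

# Detect edges of the machined contour; thresholds would need tuning.
edges = cv2.Canny(img, 100, 200)

# Reconstruct the actual contour from the detected edges.
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)
print(len(largest), "contour points recovered")
```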

  2. Machine Protection

    International Nuclear Information System (INIS)

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an interlock system providing the glue between these systems. The most recent accelerator, the LHC, will operate with about 3 × 10¹⁴ protons per beam, corresponding to an energy stored in each beam of 360 MJ. This energy can cause massive damage to accelerator equipment in case of uncontrolled beam loss, and a single accident damaging vital parts of the accelerator could interrupt operation for years. This article provides an overview of the requirements for protection of accelerator equipment and introduces the various protection systems. Examples are mainly from LHC, SNS and ESS

  3. Computational Simulations and the Scientific Method

    Science.gov (United States)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  4. Scientific Misconduct.

    Science.gov (United States)

    Goodstein, David

    2002-01-01

    Explores scientific fraud, asserting that while few scientists actually falsify results, the field has become so competitive that many are misbehaving in other ways; an example would be unreasonable criticism by anonymous peer reviewers. (EV)

  5. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  6. Machine terms dictionary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1979-04-15

    This book gives descriptions of machine terms, covering machine design, drawing, machining methods, machine tools, machine materials, automobiles, measurement and control, electricity, basics of electronics, information technology, quality assurance, AutoCAD, and FA terms, as well as important formulas of mechanical engineering.

  7. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  8. Giro form reading machine

    Science.gov (United States)

    Minh Ha, Thien; Niggeler, Dieter; Bunke, Horst; Clarinval, Jose

    1995-08-01

    Although giro forms are used by many people in daily life for money remittance in Switzerland, the processing of these forms at banks and post offices is only partly automated. We describe an ongoing project for building an automatic system that is able to recognize various items printed or written on a giro form. The system comprises three main components, namely, an automatic form feeder, a camera system, and a computer. These components are connected in such a way that the system is able to process a batch of forms without any human interaction. We present two real applications of our system in the field of payment services, which require the reading of both machine-printed and handwritten information that may appear on a giro form. One particular feature of giro forms is their flexible layout, i.e., information items are located differently from one form to another, thus requiring an additional analysis step to localize them before recognition. A commercial optical character recognition software package is used for recognition of machine-printed information, whereas handwritten information is read by our own algorithms, the details of which are presented. The system is implemented using a client/server architecture providing a high degree of flexibility to change. Preliminary results are reported supporting our claim that the system is usable in practice.

  9. Numerical modeling and optimization of machining duplex stainless steels

    Directory of Open Access Journals (Sweden)

    Rastee D. Koyee

    2015-01-01

    Full Text Available The shortcomings of analytical and empirical machining models must be overcome while the demands of industry are fulfilled. Three-dimensional finite element modeling (FEM) introduces an attractive alternative to bridge the gap between purely empirical and fundamental scientific quantities and to fulfill industry needs. However, the challenging aspects which hinder the successful adoption of FEM in the machining sector of the manufacturing industry have to be solved first. One of the greatest challenges is the identification of the correct set of machining simulation input parameters. This study presents a new methodology to inversely calculate the input parameters when simulating the machining of standard duplex EN 1.4462 and super duplex EN 1.4410 stainless steels. JMatPro software is first used to model the elastic–viscoplastic and physical behavior of the work material. In order to effectively obtain an optimum set of inversely identified friction coefficients, thermal contact conductance, Cockcroft–Latham critical damage value, percentage reduction in flow stress, and Taylor–Quinney coefficient, Taguchi-VIKOR coupled with a Firefly Algorithm Neural Network System is applied. The optimization procedure effectively minimizes the overall differences between the experimentally measured performances, such as cutting forces, tool nose temperature and chip thickness, and the numerically obtained ones at any specified cutting condition. The optimum set of input parameters is verified and used for the next step of 3D-FEM application. In the next stage of the study, design of experiments, numerical simulations, and fuzzy rule modeling approaches are employed to optimize types of chip breaker, insert shapes, process conditions, cutting parameters, and tool orientation angles based on many important performances. Through this study, not only a new methodology for defining the optimal set of controllable parameters for turning simulations is introduced, but also

  10. The uranium machine

    International Nuclear Information System (INIS)

    Walker, M.

    1990-01-01

    The German atom bomb is a chimera. Scientists such as Carl Friedrich von Weizsaecker and Werner Heisenberg have been claiming for a long time that they refused to carry out research in the Third Reich because they did not want to put such a terrible weapon into Hitler's hand. The author produces evidence proving that the German physicists were never in a position to carry out a research project on the scale of the 'Manhattan Project', quite apart from the fact that they lacked important technical prerequisites for isotope separation. With a detective's touch the author succeeds in reconstructing the competition for the bomb in minute detail. This book is the most detailed and precise analysis of the reality of that uranium machine which for four decades has haunted scientific and journalistic literature. (orig./HP) [de]

  11. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  12. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  13. Integrating multiple scientific computing needs via a Private Cloud infrastructure

    International Nuclear Information System (INIS)

    Bagnasco, S; Berzano, D; Brunetti, R; Lusso, S; Vallero, S

    2014-01-01

    In a typical scientific computing centre, diverse applications coexist and share a single physical infrastructure. An underlying Private Cloud facility eases the management and maintenance of heterogeneous use cases such as multipurpose or application-specific batch farms, Grid sites catering to different communities, parallel interactive data analysis facilities and others. It allows to dynamically and efficiently allocate resources to any application and to tailor the virtual machines according to the applications' requirements. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques; for example, rolling updates can be performed easily and minimizing the downtime. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, that hosts a full-fledged WLCG Tier-2 site and a dynamically expandable PROOF-based Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The Private Cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem (used in two different configurations for worker- and service-class hypervisors) and the OpenWRT Linux distribution (used for network virtualization). A future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and by using mainstream contextualization tools like CloudInit.

  14. Research on software behavior trust based on hierarchy evaluation

    Science.gov (United States)

    Long, Ke; Xu, Haishui

    2017-08-01

    In view of the correlations in software behavior, we evaluate software behavior credibility at two levels: control flow and data flow. At the control flow level, a method for tracing software behavior based on a support vector machine (SVM) is proposed. At the data flow level, behavioral evidence evaluation based on a fuzzy decision analysis method is put forward.
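
    As an illustrative sketch of the control-flow level, the code below trains an SVM on n-gram features of behavior traces; the traces, labels, and feature choice are invented and not taken from the paper.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import SVC

# Toy call traces; label 1 marks behavior considered untrusted (invented data).
traces = ["open read read close", "open write close", "open read exec close"]
labels = [0, 0, 1]

# n-grams of consecutive calls capture simple control-flow patterns.
vec = CountVectorizer(ngram_range=(1, 2))
X = vec.fit_transform(traces)

svm = SVC(kernel="rbf", gamma="scale").fit(X, labels)
print(svm.predict(vec.transform(["open read exec close"])))
```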

  15. The 1988 Directory of Educational Software Publishing Companies.

    Science.gov (United States)

    Electronic Learning, 1988

    1988-01-01

    Based on questionnaires sent to educational software companies in January 1988, this directory lists 78 companies. Information given includes company address, curriculum subject areas for which the company publishes software, types of machines and operating systems on which the software operates, and grade level for which it is targeted. (LRW)

  16. Photonometers for coating and sputtering machines

    Directory of Open Access Journals (Sweden)

    Václavík J.

    2013-05-01

    Full Text Available The concept of photonometers (an alternative name for the optical monitor of a vacuum deposition process) for coating and sputtering machines is based on photonometers produced by companies like SATIS or HV Dresden. Photonometers have been developed in the TOPTEC centre and its predecessor VOD (Optical Development Workshop of the Institute of Plasma Physics AS CR) for more than 10 years. The article describes the current status of the technology and ideas which will be incorporated in the next development steps. The hardware and software used on the coating machines B63D and VNA600 and the sputtering machine UPM810 are presented.

  17. Photonometers for coating and sputtering machines

    Science.gov (United States)

    Oupický, P.; Jareš, D.; Václavík, J.; Vápenka, D.

    2013-04-01

    The concept of photonometers (an alternative name for the optical monitor of a vacuum deposition process) for coating and sputtering machines is based on photonometers produced by companies like SATIS or HV Dresden. Photonometers have been developed in the TOPTEC centre and its predecessor VOD (Optical Development Workshop of the Institute of Plasma Physics AS CR) for more than 10 years. The article describes the current status of the technology and ideas which will be incorporated in the next development steps. The hardware and software used on the coating machines B63D and VNA600 and the sputtering machine UPM810 are presented.

  18. Operating System For Numerically Controlled Milling Machine

    Science.gov (United States)

    Ray, R. B.

    1992-01-01

    The OPMILL program is an operating system for a Kearney and Trecker milling machine, providing a fast, easy way to program the manufacture of machine parts with an IBM-compatible personal computer. It gives the machinist an "equation plotter" feature, which plots equations that define movements and converts the equations to a milling-machine-controlling program that moves the cutter along the defined path. The system includes tool-manager software handling up to 25 tools and automatically adjusts to account for each tool. Developed on an IBM PS/2 computer running DOS 3.3 with 1 MB of random-access memory.
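
    A hypothetical sketch of the "equation plotter" idea: sample a curve defined by an equation and emit straight-line moves. The G-code dialect and feed rate below are placeholders, not OPMILL's actual output format.

```python
import math

def equation_to_gcode(f, x0, x1, steps=20, feed=120.0):
    """Approximate the curve y = f(x) by a sequence of linear moves."""
    lines = [f"G01 F{feed:.1f}"]  # set feed rate for linear interpolation
    for i in range(steps + 1):
        x = x0 + (x1 - x0) * i / steps
        lines.append(f"G01 X{x:.3f} Y{f(x):.3f}")
    return "\n".join(lines)

# Cut along y = sin(x) between 0 and pi.
print(equation_to_gcode(math.sin, 0.0, math.pi))
```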

  19. Addiction Machines

    Directory of Open Access Journals (Sweden)

    James Godley

    2011-10-01

    Full Text Available Entry into the crypt William Burroughs shared with his mother opened and shut around a failed re-enactment of William Tell’s shot through the prop placed upon a loved one’s head. The accidental killing of his wife Joan completed the installation of the addictation machine that spun melancholia as manic dissemination. An early encryptment to which was added the audio portion of abuse deposited an undeliverable message in WB. William could never tell, although his corpus bears the inscription of this impossibility as another form of possibility. James Godley is currently a doctoral candidate in English at SUNY Buffalo, where he studies psychoanalysis, Continental philosophy, and nineteenth-century literature and poetry (British and American). His work on the concept of mourning and “the dead” in Freudian and Lacanian approaches to psychoanalytic thought and in Gothic literature has also spawned an essay on zombie porn. Since entering the Academy of Fine Arts Karlsruhe in 2007, Valentin Hennig has studied in the classes of Silvia Bächli, Claudio Moser, and Corinne Wasmuht. In 2010 he spent a semester at the Dresden Academy of Fine Arts. His work has been shown in group exhibitions in Freiburg and Karlsruhe.

  20. Machine musicianship

    Science.gov (United States)

    Rowe, Robert

    2002-05-01

    The training of musicians begins by teaching basic musical concepts, a collection of knowledge commonly known as musicianship. Computer programs designed to implement musical skills (e.g., to make sense of what they hear, perform music expressively, or compose convincing pieces) can similarly benefit from access to a fundamental level of musicianship. Recent research in music cognition, artificial intelligence, and music theory has produced a repertoire of techniques that can make the behavior of computer programs more musical. Many of these were presented in a recently published book/CD-ROM entitled Machine Musicianship. For use in interactive music systems, we are interested in those which are fast enough to run in real time and that need only make reference to the material as it appears in sequence. This talk will review several applications that are able to identify the tonal center of musical material during performance. Beyond this specific task, the design of real-time algorithmic listening through the concurrent operation of several connected analyzers is examined. The presentation includes discussion of a library of C++ objects that can be combined to perform interactive listening and a demonstration of their capability.

  1. Using Learning Analytics to Understand Scientific Modeling in the Classroom

    Directory of Open Access Journals (Sweden)

    David Quigley

    2017-11-01

    Full Text Available Scientific models represent ideas, processes, and phenomena by describing important components, characteristics, and interactions. Models are constructed across various scientific disciplines, such as the food web in biology, the water cycle in Earth science, or the structure of the solar system in astronomy. Models are central for scientists to understand phenomena, construct explanations, and communicate theories. Constructing and using models to explain scientific phenomena is also an essential practice in contemporary science classrooms. Our research explores new techniques for understanding scientific modeling and engagement with modeling practices. We work with students in secondary biology classrooms as they use a web-based software tool—EcoSurvey—to characterize organisms and their interrelationships found in their local ecosystem. We use learning analytics and machine learning techniques to answer the following questions: (1 How can we automatically measure the extent to which students’ scientific models support complete explanations of phenomena? (2 How does the design of student modeling tools influence the complexity and completeness of students’ models? (3 How do clickstreams reflect and differentiate student engagement with modeling practices? We analyzed EcoSurvey usage data collected from two different deployments with over 1,000 secondary students across a large urban school district. We observe large variations in the completeness and complexity of student models, and large variations in their iterative refinement processes. These differences reveal that certain key model features are highly predictive of other aspects of the model. We also observe large differences in student modeling practices across different classrooms and teachers. We can predict a student’s teacher based on the observed modeling practices with a high degree of accuracy without significant tuning of the predictive model. These results highlight
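
    A hedged sketch of the final analysis described above: per-student counts of modeling actions used to predict the teacher. The feature names and toy data are invented; the study's actual clickstream features are richer.

```python
from sklearn.ensemble import RandomForestClassifier

# Columns: [organisms added, links added, revisions] per student (toy data).
X = [[12, 30, 4], [10, 28, 5], [3, 2, 0], [4, 3, 1]]
teacher = ["A", "A", "B", "B"]

# Predicting the teacher from observed modeling practices.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, teacher)
print(clf.predict([[11, 25, 3]]))  # -> ['A']
```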

  2. Virtual Machine Language

    Science.gov (United States)

    Grasso, Christopher; Page, Dennis; O'Reilly, Taifun; Fteichert, Ralph; Lock, Patricia; Lin, Imin; Naviaux, Keith; Sisino, John

    2005-01-01

    Virtual Machine Language (VML) is a mission-independent, reusable software system for programming for spacecraft operations. Features of VML include a rich set of data types, named functions, parameters, IF and WHILE control structures, polymorphism, and on-the-fly creation of spacecraft commands from calculated values. Spacecraft functions can be abstracted into named blocks that reside in files aboard the spacecraft. These named blocks accept parameters and execute in a repeatable fashion. The sizes of uplink products are minimized by the ability to call blocks that implement most of the command steps. This block approach also enables some autonomous operations aboard the spacecraft, such as aerobraking, telemetry conditional monitoring, and anomaly response, without developing autonomous flight software. Operators on the ground write blocks and command sequences in a concise, high-level, human-readable programming language (also called VML ). A compiler translates the human-readable blocks and command sequences into binary files (the operations products). The flight portion of VML interprets the uplinked binary files. The ground subsystem of VML also includes an interactive sequence- execution tool hosted on workstations, which runs sequences at several thousand times real-time speed, affords debugging, and generates reports. This tool enables iterative development of blocks and sequences within times of the order of seconds.

  3. Terra Harvest software architecture

    Science.gov (United States)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  4. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to be increased significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines only if one is willing to change the paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers to optimize existing software running on current and future hardware. Low-level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We will report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from the global level up to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...
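
    An illustrative sketch of the decomposition idea: each part of an application is monitored separately rather than only end-to-end. Wall-clock timing stands in for the hardware counters accessed via tools such as perfmon2.

```python
import time
from contextlib import contextmanager

@contextmanager
def monitor(part):
    """Time one decomposed part of the application and report it by name."""
    t0 = time.perf_counter()
    yield
    print(f"{part}: {time.perf_counter() - t0:.4f} s")

with monitor("setup"):
    data = list(range(1_000_000))
with monitor("compute"):
    total = sum(x * x for x in data)
```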

  5. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  6. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  7. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The program's murder-mystery flavor is stressed; it focuses on illegal hunting and game…

  8. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  9. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  10. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    Science.gov (United States)

    Djorgovski, S. George

    1994-01-01

    We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest machine learning and expert systems software technology in order to classify the detected objects objectively and uniformly, and to facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complete database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of STATPROG, a package for multivariate statistical analysis of small and moderate-size data sets. The package was tested extensively on a number of real scientific applications and has produced real, published results.
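
    The GID3* induction software itself is not described further in this record, but the kind of object classification SKICAT automates can be illustrated with any decision-tree learner. Below is a minimal sketch using scikit-learn on synthetic data; the feature names and the labeling rule are hypothetical stand-ins, not SKICAT's actual catalog attributes.

```python
# Minimal sketch of decision-tree object classification in the spirit of
# SKICAT's catalog construction. scikit-learn's CART learner stands in for
# GID3*, and the four "catalog attributes" are invented for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000
# Hypothetical per-object measurements: magnitude, ellipticity, FWHM, peak/area.
X = rng.normal(size=(n, 4))
# Toy ground truth: extended (galaxy-like) objects have larger FWHM and peak/area.
y = (X[:, 2] + 0.5 * X[:, 3] > 0).astype(int)  # 0 = star-like, 1 = galaxy-like

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```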

  11. WATERLOOP V2/64: A highly parallel machine for numerical computation

    Science.gov (United States)

    Ostlund, Neil S.

    1985-07-01

    Current technological trends suggest that the high performance scientific machines of the future are very likely to consist of a large number (greater than 1024) of processors connected and communicating with each other in some as yet undetermined manner. Such an assembly of processors should behave as a single machine in obtaining numerical solutions to scientific problems. However, the appropriate way of organizing both the hardware and software of such an assembly of processors is an unsolved and active area of research. It is particularly important to minimize the organizational overhead of interprocessor communication, global synchronization, and contention for shared resources if the performance of a large number (n) of processors is to be anything like the desirable n times the performance of a single processor. In many situations, adding a processor actually decreases the performance of the overall system, since the extra organizational overhead is larger than the extra processing power added. The systolic loop architecture is a new multiple processor architecture which attempts to solve the problem of how to organize a large number of asynchronous processors into an effective computational system while minimizing the organizational overhead. This paper gives a brief overview of the basic systolic loop architecture, systolic loop algorithms for numerical computation, and a 64-processor implementation of the architecture, WATERLOOP V2/64, which is being used as a testbed for exploring the hardware, software, and algorithmic aspects of the architecture.
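
    The abstract's claim that adding a processor can reduce overall performance follows from a simple overhead model. The toy formula below is ours, introduced only to make that reasoning explicit; it is not taken from the paper.

```latex
% Toy model (not from the paper): n processors of unit power, with total
% organizational overhead c(n) (communication, synchronization, contention)
% expressed as a fraction of one processor-step. Effective speedup:
\[
  S(n) = \frac{n}{1 + c(n)}, \qquad
  S(n+1) > S(n) \iff \frac{n+1}{n} > \frac{1 + c(n+1)}{1 + c(n)} .
\]
% If c(n) grows faster than linearly, the right-hand ratio eventually
% exceeds (n+1)/n, and adding a processor *decreases* S(n).
```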

  12. Scientific communication

    Directory of Open Access Journals (Sweden)

    Aleksander Kobylarek

    2017-09-01

    Full Text Available The article tackles the problem of models of communication in science. The formal division of communication processes into oral and written does not resolve the problem of attitude. The author defines successful communication as a win-win game, based on the respect and equality of the partners, regardless of their position in the world of science. The core characteristics of the process of scientific communication are indicated, such as openness, fairness, support, and creation. The task of creating the right atmosphere for science communication belongs to moderators, who should not allow privilege and differentiation of position to affect scientific communication processes.

  13. Scientific millenarianism

    International Nuclear Information System (INIS)

    Weinberg, A.M.

    1997-01-01

    Today, for the first time, scientific concerns are seriously being addressed that span future times--hundreds, even thousands, or more years in the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about and even suggested how to cope with in the distant future? Can the four potential catastrophes--bolide impact, CO2 warming, radioactive wastes and thermonuclear war--be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper

  14. Scientific meetings

    International Nuclear Information System (INIS)

    1973-01-01

    One of the main aims of the IAEA is to foster the exchange of scientific and technical information and one of the main ways of doing this is to convene international scientific meetings. They range from large international conferences bringing together several hundred scientists, smaller symposia attended by an average of 150 to 250 participants and seminars designed to instruct rather than inform, to smaller panels and study groups of 10 to 30 experts brought together to advise on a particular programme or to develop a set of regulations. The topics of these meetings cover every part of the Agency's activities and form a backbone of many of its programmes. (author)

  15. A Tutorial on Software Obfuscation

    OpenAIRE

    Banescu, Sebastian; Pretschner, Alexander

    2017-01-01

    Protecting a digital asset once it leaves the cyber trust boundary of its creator is a challenging security problem. The creator is an entity which can range from a single person to an entire organization. The trust boundary of an entity is represented by all the (virtual or physical) machines controlled by that entity. Digital assets range from media content to code, and include items such as: music, movies, computer games and premium software features. The business model of t...

  16. Software and commands on VAX CERN. User's guide

    International Nuclear Information System (INIS)

    Balashov, V.K.; Trofimov, V.V.

    1990-01-01

    This guide describes the structure of the applications software available on VAX-type computers at JINR. The software includes program libraries, scientific programs, and commands developed at CERN. 20 refs

  17. Control System Design for Automatic Cavity Tuning Machines

    Energy Technology Data Exchange (ETDEWEB)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; /Fermilab; Goessel, A.; Iversen, J.; Klinke, D.; /DESY

    2009-05-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a high level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  18. Control System Design for Automatic Cavity Tuning Machines

    International Nuclear Information System (INIS)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; Goessel, A.; Iversen, J.; Klinke, D.

    2009-01-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a high level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  19. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
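
    As a deliberately tiny illustration of the probabilistic framework the review describes (representing and updating uncertainty about a model), consider Bayesian inference of a coin's bias. The example below is ours, not the review's.

```python
# Bayesian update of a Beta prior over a coin's bias p after observing
# heads/tails counts. Beta is conjugate to the Bernoulli likelihood, so the
# posterior is available in closed form. Toy illustration, not from the review.
from scipy import stats

heads, tails = 7, 3
posterior = stats.beta(2 + heads, 2 + tails)  # prior Beta(2, 2) plus data

print("posterior mean of p:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```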

  20. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  1. Taylor’s Scientific Management

    Directory of Open Access Journals (Sweden)

    Dimitrios Koumparoulis

    2012-08-01

    Full Text Available Frederick Taylor is known as the father of modern management. Taylor's scientific management revolutionized industry and helped shape the modern organization, because it explains how to increase production by working smarter, not harder. Taylor's ideas were not limited to serving the company's bottom line; the increase in productivity benefited the workforce as well. The principles of scientific management became a machine of universal efficiency, as scientific management saw widespread use worldwide and beyond the scope of the workplace. Taylor's theories on using science and statistical fact have become a guideline that many have followed to great success.

  2. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    [Scanned-report abstract; only OCR fragments survive. The recoverable details identify AIRMICS report ASQBG-1-89-001 (October 1988) on software tools for software maintenance, listing program-analysis and maintenance tools such as a Cobol structuring facility, VS Cobol II, and Fortran static code analyzers.]

  3. Empirical model for estimating the surface roughness of machined ...

    African Journals Online (AJOL)

    Empirical model for estimating the surface roughness of machined ... as well as surface finish is one of the most critical quality measures in mechanical products. ... various cutting speeds have been developed using regression analysis software.

  4. electrical-thermal coupling of induction machine for improved

    African Journals Online (AJOL)

    parameter method was used in the thermal model of the machine. The system of ... Thermal modeling is important for design purpose, fault detection ... dependent problems are challenging both in software development ... numerical solution.

  5. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The EPIQR method is supported by a multimedia computer program. Several modules help users of the method to process the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and prepare quality reports. This article presents the structure and the main features of the software. (au)

  6. The next scientific revolution.

    Science.gov (United States)

    Hey, Tony

    2010-11-01

    For decades, computer scientists have tried to teach computers to think like human experts. Until recently, most of those efforts have failed to come close to generating the creative insights and solutions that seem to come naturally to the best researchers, doctors, and engineers. But now, Tony Hey, a VP of Microsoft Research, says we're witnessing the dawn of a new generation of powerful computer tools that can "mash up" vast quantities of data from many sources, analyze them, and help produce revolutionary scientific discoveries. Hey and his colleagues call this new method of scientific exploration "machine learning." At Microsoft, a team has already used it to innovate a method of predicting with impressive accuracy whether a patient with congestive heart failure who is released from the hospital will be readmitted within 30 days. It was developed by directing a computer program to pore through hundreds of thousands of data points on 300,000 patients and "learn" the profiles of patients most likely to be rehospitalized. The economic impact of this prediction tool could be huge: If a hospital understands the likelihood that a patient will "bounce back," it can design programs to keep him stable and save thousands of dollars in health care costs. Similar efforts to uncover important correlations that could lead to scientific breakthroughs are under way in oceanography, conservation, and AIDS research. And in business, deep data exploration has the potential to unearth critical insights about customers, supply chains, advertising effectiveness, and more.

  7. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents, and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens questions on how to begin the path toward permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects are covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  8. Preparing a scientific manuscript in Linux: Today's possibilities and limitations.

    Science.gov (United States)

    Tchantchaleishvili, Vakhtang; Schmitto, Jan D

    2011-10-22

    Increasing numbers of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow the preparation of a submission-ready scientific manuscript without the need to use proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, and discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux.

  9. Hyperspectral Soil Mapper (HYSOMA) software interface: Review and future plans

    Science.gov (United States)

    Chabrillat, Sabine; Guillaso, Stephane; Eisele, Andreas; Rogass, Christian

    2014-05-01

    code was developed as standalone IDL software to allow easy implementation in the hyperspectral and non-hyperspectral communities. Indeed, within the hyperspectral community, the IDL language is very widely used, and for non-expert users who do not have an ENVI license, such software can be executed as a binary version using the free IDL virtual machine under various operating systems. Based on the growing interest of users in the software interface, the experimental software was adapted for a public release version in 2012, and since then ~80 users of hyperspectral soil products have downloaded the soil algorithms at www.gfz-potsdam.de/hysoma. The software interface was distributed for free as IDL plug-ins under the IDL virtual machine. Up to now, distribution of HYSOMA has been based on a closed-source license model, for non-commercial and educational purposes. Currently, HYSOMA is under further development in the context of the EnMAP satellite mission, for extension and implementation in the EnMAP Box as EnSoMAP (EnMAP SOil MAPper). The EnMAP Box is a freely available, platform-independent software distributed under an open source license. In the presentation we will focus on an update of the HYSOMA software interface status and the upcoming implementation in the EnMAP Box. Scientific software validation, the associated publication record, and user responses, as well as software management and the transition to open source, will be discussed.

  10. Machine technology: a survey

    International Nuclear Information System (INIS)

    Barbier, M.M.

    1981-01-01

    An attempt was made to find existing machines that have been upgraded and that could be used for large-scale decontamination operations outdoors. Such machines are in the building industry, the mining industry, and the road construction industry. The road construction industry has yielded the machines in this presentation. A review is given of operations that can be done with the machines available

  11. Machine Shop Lathes.

    Science.gov (United States)

    Dunn, James

    This guide, the second in a series of five machine shop curriculum manuals, was designed for use in machine shop courses in Oklahoma. The purpose of the manual is to equip students with basic knowledge and skills that will enable them to enter the machine trade at the machine-operator level. The curriculum is designed so that it can be used in…

  12. Superconducting rotating machines

    International Nuclear Information System (INIS)

    Smith, J.L. Jr.; Kirtley, J.L. Jr.; Thullen, P.

    1975-01-01

    The opportunities and limitations of the applications of superconductors in rotating electric machines are given. The relevant properties of superconductors and the fundamental requirements for rotating electric machines are discussed. The current state-of-the-art of superconducting machines is reviewed. Key problems, future developments and the long range potential of superconducting machines are assessed

  13. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA. Why have software QA. What is the role of software QA. What is necessary to ensure the success of software QA

  14. Man-machine interactions 3

    CERN Document Server

    Czachórski, Tadeusz; Kozielski, Stanisław

    2014-01-01

    Man-Machine Interaction is an interdisciplinary field of research that covers many aspects of science focused on humans and machines in conjunction.  The basic goal of the study is to improve and invent new ways of communication between users and computers, and many different subjects are involved in reaching the long-term research objective of an intuitive, natural and multimodal way of interaction with machines.  The rapid evolution of the methods by which humans interact with computers is observed nowadays, and new approaches allow using computing technologies to support people on a daily basis, making computers more usable and receptive to users' needs.   This monograph is the third edition in the series and presents important ideas, current trends and innovations in the man-machine interactions area.  The aim of this book is to introduce not only hardware and software interfacing concepts, but also to give insights into the related theoretical background. The reader is provided with a compilation of high...

  15. Tensor Network Quantum Virtual Machine (TNQVM)

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    There is a lack of state-of-the-art quantum computing simulation software that scales on heterogeneous systems like Titan. Tensor Network Quantum Virtual Machine (TNQVM) provides a quantum simulator that leverages a distributed network of GPUs to simulate quantum circuits in a manner that leverages recent results from tensor network theory.
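
    TNQVM's distributed GPU contraction engine is not reproducible here, but the basic operation a tensor-network simulator performs (contracting gate tensors into a state tensor) can be shown in a few lines of numpy. The circuit below, a Hadamard followed by a CNOT producing a Bell state, is a generic sketch, not TNQVM code.

```python
# Contract a 2-qubit circuit (H on qubit 0, then CNOT) as tensors.
# Generic sketch of tensor-network-style simulation, not TNQVM's implementation.
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float).reshape(2, 2, 2, 2)

psi = np.zeros((2, 2)); psi[0, 0] = 1.0    # |00> as a rank-2 tensor
psi = np.einsum('ai,ij->aj', H, psi)       # apply H to qubit 0
psi = np.einsum('abij,ij->ab', CNOT, psi)  # apply CNOT (control q0, target q1)
print(psi.reshape(4))                      # Bell state: [0.707, 0, 0, 0.707]
```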

  16. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2017-01-01

    Empirical software engineering aims at making software engineering claims measurable, i.e., to analyze and understand phenomena in software engineering and to evaluate software engineering approaches and solutions. Due to the involvement of humans and the multitude of fields for which software...... is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...... an extra specific expertise that they offer as service to other teams, thus, fostering cross-team collaboration. The paper outlines the general course setup, topics addressed, and it provides initial lessons learned....

  17. SASAgent: an agent based architecture for search, retrieval and composition of scientific models.

    Science.gov (United States)

    Felipe Mendes, Luiz; Silva, Laryssa; Matos, Ely; Braga, Regina; Campos, Fernanda

    2011-07-01

    Scientific computing is a multidisciplinary field that goes beyond the use of the computer as a machine where researchers write simple texts or presentations, or store the analyses and results of their experiments. Because of the huge hardware/software resources invested in experiments and simulations, this new approach to scientific computing currently adopted by research groups is well represented by e-Science. This work proposes a new architecture based on intelligent agents to search, retrieve and compose simulation models generated in the context of research projects related to the biological domain. The SASAgent architecture is described as multi-tier, comprising three main modules, where the CelO ontology, represented mainly by the semantic knowledge base, satisfies requirements posed by e-Science projects. Preliminary results suggest that the proposed architecture is promising for achieving the requirements found in e-Science projects, considering mainly the biological domain. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Writing references and using citation management software.

    Science.gov (United States)

    Sungur, Mukadder Orhan; Seyhan, Tülay Özkan

    2013-09-01

    The correct citation of references is obligatory to gain scientific credibility, to honor the original ideas of previous authors and to avoid plagiarism. Currently, researchers can easily find, cite and store references using citation management software. In this review, two popular citation management software programs (EndNote and Mendeley) are summarized.

  19. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404

  20. Producing software by integration: challenges and research directions (keynote)

    OpenAIRE

    Inverardi , Paola; Autili , Marco; Di Ruscio , Davide; Pelliccione , Patrizio; Tivoli , Massimo

    2013-01-01

    Software is increasingly produced according to a certain goal and by integrating existing software produced by third parties, typically black-box, and often provided without machine-readable documentation. This implies that development processes of the near future will have to deal explicitly with an inherent incompleteness of information about existing software, notably about its behaviour. Therefore, on one side a software producer will less and less know the precise behav...

  1. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  2. Vending machine assessment methodology. A systematic review.

    Science.gov (United States)

    Matthews, Melissa A; Horacek, Tanya M

    2015-07-01

    The nutritional quality of food and beverage products sold in vending machines has been implicated as a contributing factor to the development of an obesogenic food environment. How comprehensive, reliable, and valid are the current assessment tools for vending machines to support or refute these claims? A systematic review was conducted to summarize, compare, and evaluate the current methodologies and available tools for vending machine assessment. A total of 24 relevant research studies published between 1981 and 2013 met inclusion criteria for this review. The methodological variables reviewed in this study include assessment tool type, study location, machine accessibility, product availability, healthfulness criteria, portion size, price, product promotion, and quality of scientific practice. There were wide variations in the depth of the assessment methodologies and product healthfulness criteria utilized among the reviewed studies. Of the reviewed studies, 39% evaluated machine accessibility, 91% evaluated product availability, 96% established healthfulness criteria, 70% evaluated portion size, 48% evaluated price, 52% evaluated product promotion, and 22% evaluated the quality of scientific practice. Of all reviewed articles, 87% reached conclusions that provided insight into the healthfulness of vended products and/or vending environment. Product healthfulness criteria and complexity for snack and beverage products was also found to be variable between the reviewed studies. These findings make it difficult to compare results between studies. A universal, valid, and reliable vending machine assessment tool that is comprehensive yet user-friendly is recommended. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  4. BADMINTON TRAINING MACHINE WITH IMPACT MECHANISM

    Directory of Open Access Journals (Sweden)

    B. F. YOUSIF

    2011-02-01

    Full Text Available In the current work, a new machine was designed and fabricated for badminton training purposes. In the design process, CATIA software was used to design and simulate the machine components. The design was based on a direct impact method to launch the shuttle, using a spring as the source of the impact. Hooke's law was used theoretically to determine the initial and maximum lengths of the springs. The main feature of the machine is that it can move in two axes (up and down, left and right). For the control system, an infrared sensor and a touch switch were integrated with a microcontroller. The final product was fabricated locally, and testing proved that the machine operates properly.
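
    The spring-impact design can be sized with Hooke's law plus energy conservation: equating spring energy (1/2)kx^2 with shuttle kinetic energy (1/2)mv^2 gives v = x*sqrt(k/m) in the lossless case. The sketch below uses illustrative values, not numbers from the paper.

```python
# Illustrative spring sizing for a direct-impact shuttle launcher.
# All values are assumptions for the example, not the paper's parameters.
import math

k = 800.0   # spring constant, N/m (assumed)
x = 0.06    # spring compression, m (assumed)
m = 0.005   # shuttlecock mass, kg (about 5 g)

force = k * x                 # Hooke's law: peak spring force, N
v = x * math.sqrt(k / m)      # lossless launch speed from energy balance, m/s
print(f"peak force {force:.1f} N, ideal launch speed {v:.1f} m/s")
```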

  5. Design of Control System for Kiwifruit Automatic Grading Machine

    Directory of Open Access Journals (Sweden)

    Xingjian Zuo

    2013-05-01

    Full Text Available The kiwifruit automatic grading machine is an important machine for postharvest processing of kiwifruit, and its control system ensures that the machine operates intelligently. The control system designed in this paper comprises a host computer and a slave microcontroller. The host computer provides a visual grading interface for the machine, built with LabVIEW software; the slave microcontroller adopts an STC89C52 microcontroller as its core, and the C language is used to write programs controlling a position-sensor module, push-pull electromagnets, motor driving modules, and a power supply, which govern the operation of the machine as well as the raising and lowering of the grading baffle plates. The desired control effect was obtained through testing, and intelligent operation of the machine was realized.
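
    The record names LabVIEW on the host and C on the STC89C52 but does not publish the command protocol. Purely to illustrate the host-to-slave split it describes, here is a hypothetical serial dispatch in Python: the port name, opcodes, and framing are all invented, and pyserial stands in for the LabVIEW host.

```python
# Hypothetical host-side grading command dispatch. The opcodes, frame layout,
# and port are invented for illustration; the actual system uses LabVIEW and
# an unpublished STC89C52-side protocol.
import serial  # pyserial

GRADE_TO_BAFFLE = {"A": 0x01, "B": 0x02, "C": 0x03}  # invented opcodes

def send_grade(port: serial.Serial, grade: str) -> None:
    """Frame and send one grading command: [0xAA, opcode, checksum]."""
    op = GRADE_TO_BAFFLE[grade]
    frame = bytes([0xAA, op, (0xAA + op) & 0xFF])
    port.write(frame)

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1) as port:
        send_grade(port, "A")  # raise the baffle for grade-A fruit
```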

  6. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  7. Use of machine learning techniques for modeling of snow depth

    Directory of Open Access Journals (Sweden)

    G. V. Ayzel

    2017-01-01

    Full Text Available Snow exerts a significant regulating effect on the land hydrological cycle, since it controls the intensity of heat and water exchange between the soil-vegetation cover and the atmosphere. Estimating spring flood runoff or rain floods on mountainous rivers requires understanding of the snow cover dynamics on a watershed. In our work, the problem of modeling snow cover depth is solved using both available databases of hydro-meteorological observations and easily accessible scientific software that allows complete reproduction of the results and further development of this theme by the scientific community. In this research we used daily observational data on snow cover and surface meteorological parameters, obtained at three stations situated in different geographical regions: Col de Porte (France), Sodankyla (Finland), and Snoqualmie Pass (USA). Statistical modeling of snow cover depth is based on a set of freely distributed, present-day machine learning models: decision trees, adaptive boosting, and gradient boosting. It is demonstrated that combining modern machine learning methods with available meteorological data provides good accuracy in snow cover modeling. The best results for every investigated site were obtained by the ensemble method of gradient boosting over decision trees; this model reproduces well both the periods of snow cover accumulation and those of its melting. The purposeful character of the learning process for gradient boosting models, their ensemble character, and the use of the redundancy of a test sample in the learning procedure make this type of model a good and sustainable research tool. The results obtained can be used for estimating snow cover characteristics for river basins where hydro-meteorological information is absent or insufficient.
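
    The ensemble the abstract singles out, gradient boosting over decision trees, is available in standard open tooling, in line with the authors' emphasis on easily accessible software. A minimal sketch on synthetic data (not the stations' records) follows.

```python
# Minimal sketch of gradient boosting over decision trees for snow-depth
# regression. Features and target are synthetic stand-ins for the daily
# meteorological observations described in the abstract.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
air_temp = rng.normal(-5.0, 8.0, n)      # daily mean air temperature, deg C
precip = rng.gamma(2.0, 1.5, n)          # daily precipitation, mm
day_of_year = rng.integers(1, 366, n)
X = np.column_stack([air_temp, precip, day_of_year])
# Toy target: snow accumulates when cold and wet, melts when warm.
y = np.clip(precip * (air_temp < 0) * 5.0 - np.maximum(air_temp, 0.0), 0.0, None)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))
```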

  8. Scientific Services on the Cloud

    Science.gov (United States)

    Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong

    Scientific computing was one of the first ever applications for parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reasons as businesses and other professionals. The hardware is provided, maintained, and administered by a third party. Software abstraction and virtualization provide reliability and fault tolerance. Graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and by far the easiest high-performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.

  9. Accelerator Operators and Software Development

    International Nuclear Information System (INIS)

    April Miller; Michele Joyce

    2001-01-01

    At Thomas Jefferson National Accelerator Facility, accelerator operators perform tasks in their areas of specialization in addition to their machine operations duties. One crucial area in which operators contribute is software development. Operators with programming skills are uniquely qualified to develop certain controls applications because of their expertise in the day-to-day operation of the accelerator. Jefferson Lab is one of the few laboratories that utilizes the skills and knowledge of operators to create software that enhances machine operations. Through the programs written by operators, Jefferson Lab has improved machine efficiency and beam availability. Because many of these applications involve automation of procedures and need graphical user interfaces, the scripting language Tcl and the Tk toolkit have been adopted. In addition to automation, some operator-developed applications are used for information distribution. For this purpose, several standard web development tools such as perl, VBScript, and ASP are used. Examples of applications written by operators include injector steering, spin angle changes, system status reports, magnet cycling routines, and quantum efficiency measurements. This paper summarizes how the unique knowledge of accelerator operators has contributed to the success of the Jefferson Lab control system. *This work was supported by the U.S. DOE contract No. DE-AC05-84-ER40150

  10. Virtual machine performance benchmarking.

    Science.gov (United States)

    Langer, Steve G; French, Todd

    2011-10-01

    The attractions of virtual computing are many: reduced costs, reduced resources and simplified maintenance. Any one of these would be compelling for a medical imaging professional attempting to support a complex practice on limited resources in an era of ever tightened reimbursement. In particular, the ability to run multiple operating systems optimized for different tasks (computational image processing on Linux versus office tasks on Microsoft operating systems) on a single physical machine is compelling. However, there are also potential drawbacks. High performance requirements need to be carefully considered if they are to be executed in an environment where the running software has to execute through multiple layers of device drivers before reaching the real disk or network interface. Our lab has attempted to gain insight into the impact of virtualization on performance by benchmarking the following metrics on both physical and virtual platforms: local memory and disk bandwidth, network bandwidth, and integer and floating point performance. The virtual performance metrics are compared to baseline performance on "bare metal." The results are complex, and indeed somewhat surprising.
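
    The paper's exact benchmark suite is not reproduced in this abstract, but the flavor of such metrics (run identically on bare metal and inside a VM, then compared) can be sketched as follows; the methodology here is illustrative only.

```python
# Illustrative micro-benchmarks of the kind compared across bare metal and
# virtual machines: memory bandwidth and floating-point throughput.
import time
import numpy as np

def mem_bandwidth_gbs(size_mb=256, reps=5):
    """Copy a large buffer repeatedly; report read+write traffic in GB/s."""
    src = np.ones(size_mb * 1024 * 1024 // 8)   # float64 buffer
    t0 = time.perf_counter()
    for _ in range(reps):
        _ = src.copy()                          # one full read + one full write
    dt = time.perf_counter() - t0
    return 2 * reps * src.nbytes / dt / 1e9

def matmul_gflops(n=1024, reps=10):
    """Dense matmul throughput; each product costs about 2*n**3 flops."""
    a = np.random.rand(n, n); b = np.random.rand(n, n)
    t0 = time.perf_counter()
    for _ in range(reps):
        _ = a @ b
    dt = time.perf_counter() - t0
    return 2 * reps * n**3 / dt / 1e9

print(f"memory bandwidth ~ {mem_bandwidth_gbs():.1f} GB/s")
print(f"matmul throughput ~ {matmul_gflops():.1f} GFLOP/s")
```

    Running the same script on the physical host and inside the guest gives a directly comparable pair of numbers per metric, which is the essence of the comparison the paper describes.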

  11. Predicting Software Suitability Using a Bayesian Belief Network

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
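
    The abstract does not publish the network's structure or probability tables, so the sketch below is a toy two-parent fragment with invented numbers. It shows only the mechanics: how parent factors (skill, maturity) combine into a suitability estimate, and how Bayes' rule inverts the query.

```python
# Toy Bayesian-network fragment P(suitable | skill, maturity), with invented
# conditional probability tables, evaluated by enumeration. Illustrative only.
from itertools import product

p_skill = {"high": 0.6, "low": 0.4}
p_maturity = {"high": 0.5, "low": 0.5}
# P(suitable = True | skill, maturity), all values invented:
p_suitable = {("high", "high"): 0.9, ("high", "low"): 0.7,
              ("low", "high"): 0.6, ("low", "low"): 0.3}

# Marginal P(suitable) by summing over the parents.
p_true = sum(p_skill[s] * p_maturity[m] * p_suitable[(s, m)]
             for s, m in product(p_skill, p_maturity))

# Diagnostic query via Bayes' rule: P(skill = high | suitable = True).
num = sum(p_skill["high"] * p_maturity[m] * p_suitable[("high", m)]
          for m in p_maturity)
print(f"P(suitable) = {p_true:.3f}")
print(f"P(skill=high | suitable) = {num / p_true:.3f}")
```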

  12. Is writing style predictive of scientific fraud?

    DEFF Research Database (Denmark)

    Braud, Chloé Elodie; Søgaard, Anders

    2017-01-01

    The problem of detecting scientific fraud using machine learning was recently introduced, with initial, positive results from a model taking into account various general indicators. The results seem to suggest that writing style is predictive of scientific fraud. We revisit these initial experiments, and show that the leave-one-out testing procedure they used likely leads to a slight over-estimate of the predictability, but also that simple models can outperform their proposed model by some margin. We go on to explore more abstract linguistic features, such as linguistic complexity...

  13. Compendium of Scientific Linacs

    Energy Technology Data Exchange (ETDEWEB)

    Clendenin, James E

    2003-05-16

    The International Committee supported the proposal of the Chairman of the XVIII International Linac Conference to issue a new compendium of linear accelerators. The last one was published in 1976. The Local Organizing Committee of Linac96 decided to set up a sub-committee for this purpose. Contrary to the catalogues of high-energy accelerators, which compile accelerators with energies above 1 GeV, we have not defined a specific energy limit. Microtrons and cyclotrons are not in this compendium, and data from the thousands of medical and industrial linacs have not been collected. Therefore, only scientific linacs are listed in the present compendium. Each linac found in this research and involved in a physics context was considered. It could be used, for example, either as an injector for high energy accelerators, or in nuclear physics, materials physics, free electron lasers or synchrotron light machines. Linear accelerators are developed on three continents only: America, Asia, and Europe. This geographical distribution is kept as a basis. The compendium contains the parameters and status of scientific linacs. Most of these linacs are operational; however, many facilities under construction or in design study are also included. A special mention is made at the end of studies for future linear colliders.

  14. Verified scientific findings

    International Nuclear Information System (INIS)

    Bullinger, M.G.

    1982-01-01

    In this essay, the author attempts to enlighten the reader as to the meaning of the term ''verified scientific findings'' in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples given here are the generally accepted regulations with regard to technology (that is, sections 7a and 18b of the WHG (law on water economy) and section 3, sub-section 1 of the machine and engine protection laws), the status of technology (section 3, sub-section 6 of the BImSchG (Fed. law on prevention of air-borne pollution)), and the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The ''status of science and technology'' as defined in sections 4 ff. of the Atomic Energy Law (AtomG) and in sections 3, 4, 12 (2) of the First Radiation Protection Ordinance (1.StrlSch.VO) is also discussed. The author defines this term, in his opinion a dynamic one, as the generally recognized results of scientific research and the respective possibilities of their practical technological application. (orig.) [de

  15. Memoised Garbage Collection for Software Model Checking

    NARCIS (Netherlands)

    Nguyen, V.Y.; Ruys, T.C.; Kowalewski, S.; Philippou, A.

    Virtual machine based software model checkers like JPF and MoonWalker spend up to half of their verification time on garbage collection. This is no surprise, as after nearly each transition the heap has to be cleaned of garbage. To improve this, this paper presents the Memoised Garbage Collection

  16. Software for managing multicrate FASTBUS Systems

    International Nuclear Information System (INIS)

    Deiss, S.R.; Gustavson, D.B.

    1982-10-01

    The FASTBUS System Manager software that was designed and implemented on an LSI-11 system using PASCAL is described. Particular attention is given to the file structures, file access mechanisms, and basic routing algorithms. Portability to other machines and languages is described

  17. MITS machine operations

    International Nuclear Information System (INIS)

    Flinchem, J.

    1980-01-01

    This document contains procedures which apply to operations performed on individual P-1c machines in the Machine Interface Test System (MITS) at AiResearch Manufacturing Company's Torrance, California Facility

  18. Brain versus Machine Control.

    Directory of Open Access Journals (Sweden)

    Jose M Carmena

    2004-12-01

    Full Text Available Dr. Octopus, the villain of the movie "Spiderman 2", is a fusion of man and machine. Neuroscientist Jose Carmena examines the facts behind this fictional account of a brain-machine interface

  19. Applied machining technology

    CERN Document Server

    Tschätsch, Heinz

    2010-01-01

    Machining and cutting technologies are still crucial for many manufacturing processes. This reference presents all important machining processes in a comprehensive and coherent way. It includes many examples of concrete calculations, problems and solutions.

  20. Machining with abrasives

    CERN Document Server

    Jackson, Mark J

    2011-01-01

    Abrasive machining is key to obtaining the desired geometry and surface quality in manufacturing. This book discusses the fundamentals and advances in the abrasive machining processes. It provides a complete overview of developing areas in the field.

  1. Machine medical ethics

    CERN Document Server

    Pontier, Matthijs

    2015-01-01

    The essays in this book, written by researchers from both humanities and sciences, describe various theoretical and experimental approaches to adding medical ethics to a machine in medical settings. Medical machines are in close proximity with human beings, and getting closer: with patients who are in vulnerable states of health, who have disabilities of various kinds, with the very young or very old, and with medical professionals. In such contexts, machines are undertaking important medical tasks that require emotional sensitivity, knowledge of medical codes, human dignity, and privacy. As machine technology advances, ethical concerns become more urgent: should medical machines be programmed to follow a code of medical ethics? What theory or theories should constrain medical machine conduct? What design features are required? Should machines share responsibility with humans for the ethical consequences of medical actions? How ought clinical relationships involving machines to be modeled? Is a capacity for e...

  2. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  3. Machine protection systems

    CERN Document Server

    Macpherson, A L

    2010-01-01

    A summary of the Machine Protection System of the LHC is given, with particular attention given to the outstanding issues to be addressed, rather than the successes of the machine protection system from the 2009 run. In particular, the issues of Safe Machine Parameter system, collimation and beam cleaning, the beam dump system and abort gap cleaning, injection and dump protection, and the overall machine protection program for the upcoming run are summarised.

  4. Manifesto for the Software Development Professionalization

    Directory of Open Access Journals (Sweden)

    Red Latinoamericana en Ingeniería de Software (RedLatinaIS

    2013-12-01

    Full Text Available One of the central problems for current economic development and for industrial, social, and scientific competitiveness is the complexity of large, software-intensive systems and of the processes for their development and implementation. This complexity is defined by the number and heterogeneity of the interactions between hardware and software components, by their inter-relationships, by their incorporation into technical and organizational environments, and by their interfaces to humans. Mastering these systems requires scientific thought and action that are hierarchical and systematic; moreover, the success of products, services and organizations is increasingly determined by the availability of suitable software products. Therefore, highly qualified professionals are needed who are able to understand and master such systems, who are involved across the entire software engineering life cycle, and who can adopt different roles during development. This is the reasoning that guides this Manifesto, whose aim is to achieve the professionalization of software development.

  5. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  6. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    [Scanned front matter; only fragments survive. Recoverable details include the keywords (software license, software usage, ELA, Software as a Service, SaaS, Software Asset...), an acronym glossary (PaaS: Platform as a Service; SaaS: Software as a Service; SAM: Software Asset Management; SMS: System Management Server; SEWP: Solutions for Enterprise Wide...), and the statement that with the delivery of full cloud services, the cloud computing service model will transition from IaaS to SaaS, or Software as a Service.]

  7. Dictionary of machine terms

    International Nuclear Information System (INIS)

    1990-06-01

    This book is a dictionary of machine terms, opening with introductory remarks and a listing of the compilation committee. It gives descriptions of machine terms in alphabetical order from A to Z, and also includes abbreviations of machine terms, a symbol table, guidance on reading mathematical symbols, and abbreviations and terms used in drawings.

  8. Mankind, machines and people

    Energy Technology Data Exchange (ETDEWEB)

    Hugli, A

    1984-01-01

    The following questions are addressed: Is there a difference between machines and men, between human communication and communication with machines? Will we ever reach the point when the dream of artificial intelligence becomes a reality? Will thinking machines be able to replace the human spirit in all its aspects? Social consequences and philosophical aspects are addressed. 8 references.

  9. A Universal Reactive Machine

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Mørk, Simon; Sørensen, Morten U.

    1997-01-01

    Turing showed the existence of a model universal for the set of Turing machines, in the sense that given an encoding of any Turing machine as input, the universal Turing machine simulates it. We introduce the concept of universality for reactive systems and construct a CCS process universal...

  10. HTS machine laboratory prototype

    DEFF Research Database (Denmark)

    …machine. The machine comprises six stationary HTS field windings, wound from both YBCO and BSCCO tape, operated at liquid nitrogen temperature and enclosed in a cryostat, and a three-phase armature winding spinning at up to 300 rpm. This design has the full functionality of HTS synchronous machines. The design…

  11. Your Sewing Machine.

    Science.gov (United States)

    Peacock, Marion E.

    The programed instruction manual is designed to aid the student in learning the parts, uses, and operation of the sewing machine. Drawings of sewing machine parts are presented, and space is provided for the student's written responses. Following an introductory section identifying sewing machine parts, the manual deals with each part and its…

  12. Software testing in roughness calculation

    International Nuclear Information System (INIS)

    Chen, Y L; Hsieh, P F; Fu, W E

    2005-01-01

    A test method to determine the quality of the functions provided by software for roughness measurement is presented in this study. The functional quality of the software should be assessed throughout the entire life cycle of the software package, since the accuracy of specific functions, or outputs, is crucial for the analysis of experimental data. For scientific applications, commercial software is usually embedded in a specific instrument used for measurement or analysis during the manufacturing process. In general, the error ratio caused by the software becomes more apparent when dealing with relatively small quantities, such as measurements in the nanometer-scale range. The 'data generator' model proposed by NPL (UK) was applied in this study, and an example roughness package was tested and analyzed by this process. After selecting the 'reference results', the 'reference data' were generated by a programmable data generator. The filter function with a 0.8 mm cutoff, defined in ISO 11562, was tested with 66 sinusoidal datasets at different wavelengths. Test results from the commercial software and from a program written by CMS were compared with theoretical data calculated from the ISO standards. For the filter function of this software, the results showed a significant disagreement between the reference and test results: the short-cutoff feature for filtering at high frequencies does not function properly, while the long-cutoff feature shows a maximum difference in filtering ratio of more than 70% between wavelengths of 300 μm and 500 μm. The commercial software therefore needs to be tested more extensively for each specific application, using appropriately designed reference datasets, to ensure its functional quality
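
    This testing strategy can be illustrated with a short sketch (Python with numpy; the evaluation length, sampling step, and wavelengths below are illustrative choices, not the paper's actual 66-sinusoid dataset): a unit sinusoid is passed through the ISO 11562 Gaussian weighting function, and the amplitude surviving in the roughness output is compared against the closed-form transmission, which is 50% at the cutoff wavelength.

```python
import numpy as np

ALPHA = np.sqrt(np.log(2) / np.pi)   # ISO 11562 Gaussian filter constant, ~0.4697

def measured_roughness_amplitude(lam_mm, cutoff_mm=0.8, dx=0.001):
    """Filter a unit-amplitude sinusoid with the ISO 11562 Gaussian weighting
    function and return the amplitude left in the roughness (high-pass) output."""
    x = np.arange(0.0, 8.0, dx)                       # 8 mm evaluation length
    z = np.sin(2 * np.pi * x / lam_mm)
    xk = np.arange(-cutoff_mm, cutoff_mm + dx, dx)    # weighting-function support
    s = np.exp(-np.pi * (xk / (ALPHA * cutoff_mm)) ** 2)
    s /= s.sum()                                      # discrete normalisation
    roughness = z - np.convolve(z, s, mode="same")    # profile minus mean line
    return np.abs(roughness[2000:6000]).max()         # central part, no edge effects

for lam in (0.3, 0.5, 0.8):                           # test wavelengths in mm
    theory = 1.0 - np.exp(-np.pi * (ALPHA * 0.8 / lam) ** 2)
    print(f"lambda={lam} mm: measured={measured_roughness_amplitude(lam):.3f}, "
          f"theoretical={theory:.3f}")
```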

  13. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  14. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  15. Remote access to mathematical software

    International Nuclear Information System (INIS)

    Dolan, E.; Hovland, P.; More, J.; Norris, B.; Smith, B.

    2001-01-01

    The network-oriented application services paradigm is becoming increasingly common for scientific computing. The popularity of this approach can be attributed to the numerous advantages to both user and developer provided by network-enabled mathematical software. The burden of installing and maintaining complex systems is lifted from the user, while enabling developers to provide frequent updates without disrupting service. Access to software with similar functionality can be unified under the same interface. Remote servers can utilize potentially more powerful computing resources than may be available locally. We discuss some of the application services developed by the Mathematics and Computer Science Division at Argonne National Laboratory, including the Network Enabled Optimization System (NEOS) Server and the Automatic Differentiation of C (ADIC) Server, as well as preliminary work on Web access to the Portable Extensible Toolkit for Scientific Computing (PETSc). We also provide a brief survey of related work

  16. Asynchronized synchronous machines

    CERN Document Server

    Botvinnik, M M

    1964-01-01

    Asynchronized Synchronous Machines focuses on the theoretical research on asynchronized synchronous (AS) machines, which are "hybrids” of synchronous and induction machines that can operate with slip. Topics covered in this book include the initial equations; vector diagram of an AS machine; regulation in cases of deviation from the law of full compensation; parameters of the excitation system; and schematic diagram of an excitation regulator. The possible applications of AS machines and its calculations in certain cases are also discussed. This publication is beneficial for students and indiv

  17. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  18. An Overall Perspective of Machine Translation with its Shortcomings

    Directory of Open Access Journals (Sweden)

    Alireza Akbari

    2014-01-01

    Full Text Available The demand for language translation has increased strikingly in recent times owing to cross-cultural communication and the exchange of information. In order to communicate well, text should be translated correctly and completely in each field, such as legal documents, technical texts, scientific texts, publicity leaflets, and instructional materials. In this connection, machine translation is of great importance. The term “Machine Translation” was first proposed by George Artsrouni and Smirnov Troyanski (1933), who designed a storage device based on paper tape. This paper investigates an overall perspective of machine translation models and their metrics in detail, and finally scrutinizes the shortcomings of machine translation.

  19. IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009

    Science.gov (United States)

    2011-03-01

    …capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to… [glossary excerpt] GDM: global delivery model; PROSPER: Infosys production support methodology; Q&P: quality and productivity; R&D: research and development; SaaS: software as a service; SDLC: Software Development Life Cycle; Table 10: Scientific Estimation Coverage by Service Line.

  20. Towards Measuring the Abstractness of State Machines based on Mutation Testing

    Directory of Open Access Journals (Sweden)

    Thomas Baar

    2017-01-01

    Full Text Available Abstract. The notation of state machines is widely adopted as a formalism for describing the behaviour of systems. Usually, multiple state machine models can be developed for the very same software system. Some of these models might turn out to be equivalent but, in many cases, different state machines describing the same system also differ in their level of abstraction. In this paper, we present an approach to actually measure the abstractness level of state machines w.r.t. a given implemented software system. A state machine is considered less abstract when it is conceptually closer to the implemented system. In our approach, this distance between state machine and implementation is measured by applying coverage criteria known from software mutation testing. The abstractness of state machines can thus be treated as a new metric; as with other metrics, a known value for the abstractness of a given state machine allows its quality to be assessed in terms of a simple number. In model-based software development projects, the abstractness metric can help prevent model degradation, since it can actually measure the semantic distance from the behavioural specification of a system, in the form of a state machine, to the current implementation of the system. In contrast to other metrics for state machines, abstractness cannot be statically computed from the state machine's structure; it requires executing both the state machine and the corresponding system implementation. The article is published in the author's wording.
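
    A minimal sketch of how such a metric could be computed, per one plausible reading of the abstract (the formula and names here are assumptions, not the paper's actual definitions): derive a test suite from the state machine, run it against mutants of the implementation, and take the surviving-mutant ratio as abstractness.

```python
def abstractness(mutants_total, mutants_killed):
    """Hypothetical sketch: a state machine whose derived test suite kills
    fewer mutants of the implementation is 'farther' from it, i.e. more
    abstract.  Kill ratio -> closeness; 1 - kill ratio -> abstractness."""
    if mutants_total == 0:
        raise ValueError("need at least one mutant")
    return 1.0 - mutants_killed / mutants_total

# e.g. tests generated from a coarse state machine kill 40 of 100 mutants:
print(abstractness(100, 40))  # 0.6 -> fairly abstract w.r.t. the implementation
```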

  1. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
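
    A minimal sketch of the map-reduce organisation the abstract describes (Python; the JSON event fields and the reducer are invented for illustration and are not MAUS's actual API):

```python
import json
from multiprocessing import Pool

def map_event(raw):
    """Per-event work (a stand-in for a simulation/reconstruction step)."""
    event = json.loads(raw)
    event["n_hits"] = len(event.get("hits", []))   # hypothetical field
    return event

def reduce_events(events):
    """Aggregate step, e.g. feeding an online data-quality histogram."""
    return sum(e["n_hits"] for e in events)

if __name__ == "__main__":
    raw_events = [json.dumps({"hits": [1, 2]}), json.dumps({"hits": [3]})]
    with Pool() as pool:                 # parallel on a personal machine
        mapped = pool.map(map_event, raw_events)
    print(reduce_events(mapped))         # -> 3
```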

  2. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  3. Automated Bug Assignment: Ensemble-based Machine Learning in Large Scale Industrial Contexts

    OpenAIRE

    Jonsson, Leif; Borg, Markus; Broman, David; Sandahl, Kristian; Eldh, Sigrid; Runeson, Per

    2016-01-01

    Bug report assignment is an important part of software maintenance. In particular, incorrect assignments of bug reports to development teams can be very expensive in large software development projects. Several studies propose automating bug assignment techniques using machine learning in open source software contexts, but no study exists for large-scale proprietary projects in industry. The goal of this study is to evaluate automated bug assignment techniques that are based on machine learni...

  4. Attacking Machine Learning models as part of a cyber kill chain

    OpenAIRE

    Nguyen, Tam N.

    2017-01-01

    Machine learning is gaining popularity in the network security domain as many more network-enabled devices get connected, as malicious activities become stealthier, and as new technologies like Software Defined Networking emerge. Compromising a machine learning model is a desirable goal for an attacker; in fact, spammers have been quite successful in getting through machine learning enabled spam filters for years. While previous works have been done on adversarial machine learning, none has been considered within...

  5. Gram staining with an automatic machine.

    Science.gov (United States)

    Felek, S; Arslan, A

    1999-01-01

    This study was undertaken to develop a new Gram-staining machine controlled by a micro-controller and to investigate the quality of the slides stained in it. The machine was designed and produced by the authors and runs on standard 220 V AC. Staining, washing, and drying periods are controlled by a timer built into the micro-controller, and software was written containing the staining algorithm and its time intervals. One hundred and forty smears were prepared from Escherichia coli, Staphylococcus aureus, Neisseria sp., blood culture, trypticase soy broth, and direct pus and sputum smears for comparison studies. Half of the slides in each group were stained with the machine and the other half by hand, and all were then examined by four different microbiologists. Machine-stained slides had higher clarity and less debris than the hand-stained slides (p < …); in hand-stained slides, some Gram-positive organisms showed poor Gram-positive staining features (p < …). Gram staining with the automatic machine increases staining quality and helps to decrease the workload in a busy diagnostic laboratory.

  6. Requirements Engineering in Building Climate Science Software

    Science.gov (United States)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the

  7. A Method for Design of Modular Reconfigurable Machine Tools

    Directory of Open Access Journals (Sweden)

    Zhengyi Xu

    2017-02-01

    Full Text Available Presented in this paper is a method for the design of modular reconfigurable machine tools (MRMTs). An MRMT is capable of using a minimal number of modules, through reconfiguration, to perform the required machining tasks for a family of parts. The proposed method consists of three steps: module identification, module determination, and layout synthesis. In the first step, module components are collected from a family of general-purpose machines to establish a module library. In the second step, for a given family of parts to be machined, a set of needed modules is selected from the module library to construct the desired reconfigurable machine tool; a greedy sketch of this step is given below. In the third step, a final machine layout is decided through evaluation against a number of performance indices. Based on this method, a software package has been developed that can design an MRMT for a given part family.
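
    Step two, module determination, resembles a set-cover problem; a greedy sketch follows (the module names and capabilities are hypothetical, and the paper's actual selection procedure may differ):

```python
def select_modules(required_tasks, library):
    """Greedy sketch of module determination: pick, from the module library,
    a small set of modules whose combined capabilities cover the machining
    tasks of the part family."""
    uncovered, chosen = set(required_tasks), []
    while uncovered:
        best = max(library, key=lambda m: len(library[m] & uncovered))
        if not library[best] & uncovered:
            raise ValueError(f"no module covers: {uncovered}")
        chosen.append(best)
        uncovered -= library[best]
    return chosen

library = {"3-axis head": {"milling", "drilling"},
           "rotary table": {"indexing"},
           "turning spindle": {"turning"}}
print(select_modules({"milling", "drilling", "indexing"}, library))
```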

  8. Tank monitor and control system (TMACS) software configuration management plan

    International Nuclear Information System (INIS)

    GLASSCOCK, J.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) describes the methodology for control of computer software developed and supported by the Systems Development and Integration (SD and I) organization of Lockheed Martin Services, Inc. (LMSI) for the Tank Monitor and Control System (TMACS). This plan controls changes to the software and configuration files used by TMACS. The controlled software includes the Gensym software package, Gensym knowledge base files developed for TMACS, C-language programs used by TMACS, the operating system on the production machine, language compilers, and all Windows NT commands and functions which affect the operating environment. The configuration files controlled include the files downloaded to the Acromag and Westronic field instruments

  9. The JET level-1 software

    International Nuclear Information System (INIS)

    McCullen, P.A.; Farthing, J.W.

    1998-01-01

    The complex nature of the JET machine requires a large amount of control parameter preparation, selection and validation before a pulse may be started. Level-1 is defined as the centralized, cross-subsystem control of JET. Before it was introduced over 10 years ago, the Session Leader (SL) who is responsible for specifying the parameter settings for a JET pulse, had virtually no software available to help him except for a simple editor used for the creation of control waveforms. Most of the required parameter settings were calculated by hand and then passed on either verbally or via hand-written forms. These parameters were then set by a large number of people - Local Unit Responsible Officers (LUROs) and CODAS Duty Officers (CDOs) using a wide selection of dedicated software. At this time the Engineer in Charge (EiC) would largely depend on the LUROs to inform him that conditions were ready. He never set control parameters personally and had little or no software available to him to see what many of the settings were. The first implementation of Level-1 software went some way towards improving the task of pulse schedule preparation in that the SL could specify his requirements via a computer interface and store them in a database for later use. At that time the maximum number of parameters that could be handled was 500. (author)

  10. Understanding dental CAD/CAM for restorations - dental milling machines from a mechanical engineering viewpoint. Part A: chairside milling machines.

    Science.gov (United States)

    Lebon, Nicolas; Tapie, Laurent; Duret, Francois; Attal, Jean-Pierre

    2016-01-01

    The dental milling machine is an important device in the dental CAD/CAM chain. Nowadays, dental numerical controlled (NC) milling machines are available for dental surgeries (chairside solution). This article provides a mechanical engineering approach to NC milling machines to help dentists understand the involvement of technology in digital dentistry practice. First, some technical concepts and definitions associated with NC milling machines are described from a mechanical engineering viewpoint. The technical and economic criteria of four chairside dental NC milling machines that are available on the market are then described. The technical criteria are focused on the capacities of the embedded technologies of these milling machines to mill both prosthetic materials and types of shape restorations. The economic criteria are focused on investment costs and interoperability with third-party software. The clinical relevance of the technology is assessed in terms of the accuracy and integrity of the restoration.

  11. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    …commercial off-the-shelf (COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS. …[2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the…

  12. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
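
    The general idea of monitoring an executing program against requirement constraints can be sketched as follows (Python; the constraint predicates stand in for SAGE requirements, whose actual syntax is not reproduced here):

```python
class Monitor:
    """Sketch: evaluate requirement constraints against a running program's
    state at each step and record any violations."""
    def __init__(self, constraints):
        self.constraints = constraints        # name -> predicate over state
        self.violations = []

    def check(self, step, state):
        for name, pred in self.constraints.items():
            if not pred(state):
                self.violations.append((step, name, dict(state)))

mon = Monitor({"temperature below limit": lambda s: s["temp"] < 100.0,
               "valve open implies pump on": lambda s: (not s["valve"]) or s["pump"]})
for step, state in enumerate([{"temp": 95.0, "valve": False, "pump": False},
                              {"temp": 101.5, "valve": True, "pump": True}]):
    mon.check(step, state)
print(mon.violations)   # step 1 violates the temperature constraint
```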

  13. OPENING REMARKS: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2006-01-01

    as the national and regional electricity grid, carbon sequestration, virtual engineering, and the nuclear fuel cycle. The successes of the first five years of SciDAC have demonstrated the power of using advanced computing to enable scientific discovery. One measure of this success could be found in the President’s State of the Union address in which President Bush identified ‘supercomputing’ as a major focus area of the American Competitiveness Initiative. Funds were provided in the FY 2007 President’s Budget request to increase the size of the NERSC-5 procurement to between 100-150 teraflops, to upgrade the LCF Cray XT3 at Oak Ridge to 250 teraflops and acquire a 100 teraflop IBM BlueGene/P to establish the Leadership computing facility at Argonne. We believe that we are on a path to establish a petascale computing resource for open science by 2009. We must develop software tools, packages, and libraries as well as the scientific application software that will scale to hundreds of thousands of processors. Computer scientists from universities and the DOE’s national laboratories will be asked to collaborate on the development of the critical system software components such as compilers, light-weight operating systems and file systems. Standing up these large machines will not be business as usual for ASCR. We intend to develop a series of interconnected projects that identify cost, schedule, risks, and scope for the upgrades at the LCF at Oak Ridge, the establishment of the LCF at Argonne, and the development of the software to support these high-end computers. The critical first step in defining the scope of the project is to identify a set of early application codes for each leadership class computing facility. These codes will have access to the resources during the commissioning phase of the facility projects and will be part of the acceptance tests for the machines. Applications will be selected, in part, by breakthrough science, scalability, and

  14. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods developed and algorithms will, however, be of wider interest.
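
    One of the simplest members of the preconditioned iterative family referred to here is the Jacobi-preconditioned conjugate gradient method; a serial numpy sketch follows (the parallel, domain-decomposed variants distribute exactly these matrix-vector and inner-product operations):

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=200):
    """Conjugate gradient with a Jacobi (diagonal) preconditioner."""
    M_inv = 1.0 / np.diag(A)              # Jacobi preconditioner
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])    # small SPD test matrix
b = np.array([1.0, 2.0])
print(pcg(A, b))                          # ~ [0.0909, 0.6364]
```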

  15. A computer architecture for intelligent machines

    Science.gov (United States)

    Lefebvre, D. R.; Saridis, G. N.

    1992-01-01

    The theory of intelligent machines proposes a hierarchical organization for the functions of an autonomous robot based on the principle of increasing precision with decreasing intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed. The authors present a computer architecture that implements the lower two levels of the intelligent machine. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Execution-level controllers for motion and vision systems are briefly addressed, as well as the Petri net transducer software used to implement coordination-level functions. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.

  16. The machine/job features mechanism

    Science.gov (United States)

    Alef, M.; Cass, T.; Keijser, J. J.; McNab, A.; Roiser, S.; Schwickerath, U.; Sfiligoi, I.

    2017-10-01

    Within the HEPiX virtualization group and the Worldwide LHC Computing Grid’s Machine/Job Features Task Force, a mechanism has been developed which provides access to detailed information about the current host and the current job to the job itself. This allows user payloads to access meta information, independent of the current batch system or virtual machine model. The information can be accessed either locally via the filesystem on a worker node, or remotely via HTTP(S) from a webserver. This paper describes the final version of the specification from 2016 which was published as an HEP Software Foundation technical note, and the design of the implementations of this version for batch and virtual machine platforms. We discuss early experiences with these implementations and how they can be exploited by experiment frameworks.
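
    A payload reads the features as key-per-file entries under the directories named by the MACHINEFEATURES and JOBFEATURES environment variables, or fetches the same keys over HTTP(S). The sketch below assumes that layout; the specific key names shown are recalled from the specification and should be treated as assumptions:

```python
import os
from pathlib import Path

def read_features(env_var):
    """Each feature is a small file named after its key, per the MJF layout."""
    base = os.environ.get(env_var)
    if base is None or base.startswith(("http://", "https://")):
        return {}   # the remote HTTP(S) case is omitted in this sketch
    return {p.name: p.read_text().strip()
            for p in Path(base).iterdir() if p.is_file()}

machine = read_features("MACHINEFEATURES")
job = read_features("JOBFEATURES")
# e.g. keys such as 'total_cpu' or 'shutdowntime' (names assumed here):
if "shutdowntime" in machine:
    print("host drains at unix time", machine["shutdowntime"])
```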

  17. Risky module prediction for nuclear I and C software

    International Nuclear Information System (INIS)

    Kim, Young Mi; Kim, Hyeon Soo

    2012-01-01

    As software based digital I and C (Instrumentation and Control) systems are used more prevalently in nuclear plants, enhancement of software dependability has become an important issue in the area of nuclear I and C systems. Critical attributes of software dependability are safety and reliability. These attributes are tightly related to software failures caused by faults. Software testing and V and V (Verification and Validation) activities are hence important for enhancing software dependability. If the risky modules of safety-critical software can be predicted, it will be possible to focus on testing and V and V activities more efficiently and effectively. It should also make it possible to better allocate resources for regulation activities. We propose a prediction technique to estimate risky software modules by adopting machine learning models based on software complexity metrics. An empirical study with various machine learning algorithms was executed for comparing the prediction performance. Experimental results show SVMs (Support Vector Machines) perform as well or better than the other methods.
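
    The approach can be sketched with scikit-learn (the complexity metrics and labels below are synthetic, and the paper's actual metrics, data, and tuning are not reproduced):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# synthetic per-module complexity metrics: LOC, cyclomatic complexity, fan-out
X = np.column_stack([rng.integers(20, 2000, n),
                     rng.integers(1, 60, n),
                     rng.integers(0, 25, n)]).astype(float)
# toy ground truth: complex, highly coupled modules tend to be risky
y = ((X[:, 1] > 30) & (X[:, 2] > 10)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
print("holdout accuracy:", model.score(X_te, y_te))
```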

  18. Pattern recognition & machine learning

    CERN Document Server

    Anzai, Y

    1992-01-01

    This is the first text to provide a unified and self-contained introduction to visual pattern recognition and machine learning. It is useful as a general introduction to artificial intelligence and knowledge engineering, and no previous knowledge of pattern recognition or machine learning is necessary. It covers the basics of various pattern recognition and machine learning methods. Translated from Japanese, the book also features chapter exercises, keywords, and summaries.

  19. Support vector machines applications

    CERN Document Server

    Guo, Guodong

    2014-01-01

    Support vector machines (SVM) have both a solid mathematical background and good performance in practical applications. This book focuses on the recent advances and applications of the SVM in different areas, such as image processing, medical practice, computer vision, pattern recognition, machine learning, applied statistics, business intelligence, and artificial intelligence. The aim of this book is to create a comprehensive source on support vector machine applications, especially some recent advances.

  20. The Newest Machine Material

    International Nuclear Information System (INIS)

    Seo, Yeong Seop; Choe, Byeong Do; Bang, Meong Sung

    2005-08-01

    This book gives descriptions of machine materials, covering the classification and selection of machine materials, the structure and bonding of materials, the solidification of metals and crystal structure, equilibrium diagrams, the properties of metal materials, elasticity and plasticity, the examination of metal structure, and material testing and nondestructive testing. It also explains steel materials such as heat-treated steel, cast iron and cast steel, as well as nonferrous metal materials, non-metallic materials, and new materials.

  1. Introduction to machine learning

    OpenAIRE

    Baştanlar, Yalın; Özuysal, Mustafa

    2014-01-01

    The machine learning field, which can be briefly defined as enabling computers to make successful predictions using past experiences, has exhibited an impressive development recently with the help of the rapid increase in the storage capacity and processing power of computers. Together with many other disciplines, machine learning methods have been widely employed in bioinformatics. The difficulties and cost of biological analyses have led to the development of sophisticated machine learning app...

  2. Machinability of advanced materials

    CERN Document Server

    Davim, J Paulo

    2014-01-01

    Machinability of Advanced Materials addresses the level of difficulty involved in machining a material, or multiple materials, with the appropriate tooling and cutting parameters.  A variety of factors determine a material's machinability, including tool life rate, cutting forces and power consumption, surface integrity, limiting rate of metal removal, and chip shape. These topics, among others, and multiple examples comprise this research resource for engineering students, academics, and practitioners.

  3. Machining of titanium alloys

    CERN Document Server

    2014-01-01

    This book presents a collection of examples illustrating the recent research advances in the machining of titanium alloys. These materials have excellent strength and fracture toughness as well as low density and good corrosion resistance; however, machinability is still poor due to their low thermal conductivity and high chemical reactivity with cutting tool materials. This book presents solutions to enhance machinability in titanium-based alloys and serves as a useful reference to professionals and researchers in aerospace, automotive and biomedical fields.

  4. On the Use of Machine Learning for Identifying Botnet Network Traffic

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2016-01-01

    …contemporary approaches use machine learning techniques for identifying malicious traffic. This paper presents a survey of contemporary botnet detection methods that rely on machine learning for identifying botnet network traffic. The paper provides a comprehensive overview of the existing scientific work, thus contributing to a better understanding of the capabilities, limitations and opportunities of using machine learning for identifying botnet traffic. Furthermore, the paper outlines possibilities for the future development of machine learning-based botnet detection systems.

  5. Tribology in machine design

    CERN Document Server

    Stolarski, Tadeusz

    1999-01-01

    ""Tribology in Machine Design is strongly recommended for machine designers, and engineers and scientists interested in tribology. It should be in the engineering library of companies producing mechanical equipment.""Applied Mechanics ReviewTribology in Machine Design explains the role of tribology in the design of machine elements. It shows how algorithms developed from the basic principles of tribology can be used in a range of practical applications within mechanical devices and systems.The computer offers today's designer the possibility of greater stringen

  6. Induction machine handbook

    CERN Document Server

    Boldea, Ion

    2002-01-01

    Often called the workhorse of industry, the advent of power electronics and advances in digital control are transforming the induction motor into the racehorse of industrial motion control. Now, the classic texts on induction machines are nearly three decades old, while more recent books on electric motors lack the necessary depth and detail on induction machines.The Induction Machine Handbook fills industry's long-standing need for a comprehensive treatise embracing the many intricate facets of induction machine analysis and design. Moving gradually from simple to complex and from standard to

  7. Chaotic Boltzmann machines

    Science.gov (United States)

    Suzuki, Hideyuki; Imura, Jun-ichi; Horio, Yoshihiko; Aihara, Kazuyuki

    2013-01-01

    The chaotic Boltzmann machine proposed in this paper is a chaotic pseudo-billiard system that works as a Boltzmann machine. Chaotic Boltzmann machines are shown numerically to have computing abilities comparable to conventional (stochastic) Boltzmann machines. Since no randomness is required, efficient hardware implementation is expected. Moreover, the ferromagnetic phase transition of the Ising model is shown to be characterised by the largest Lyapunov exponent of the proposed system. In general, a method to relate probabilistic models to nonlinear dynamics by derandomising Gibbs sampling is presented. PMID:23558425
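
    The paper's chaotic pseudo-billiard dynamics are not reproduced here; for reference, this is the conventional stochastic Gibbs update of a standard Boltzmann machine, the baseline the chaotic variant is compared against:

```python
import numpy as np

def gibbs_step(s, W, b, rng):
    """One sweep of stochastic Gibbs sampling over binary units s in {0,1};
    W is symmetric with zero diagonal, b is the bias vector."""
    for i in rng.permutation(len(s)):
        z = W[i] @ s + b[i]               # local field at unit i
        s[i] = rng.random() < 1.0 / (1.0 + np.exp(-z))
    return s

rng = np.random.default_rng(1)
W = np.array([[0.0, 1.5], [1.5, 0.0]])    # ferromagnetic coupling
b = np.zeros(2)
s = rng.integers(0, 2, size=2).astype(float)
for _ in range(1000):
    s = gibbs_step(s, W, b, rng)
print(s)   # strongly coupled units tend to agree
```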

  8. Electrical machines & drives

    CERN Document Server

    Hammond, P

    1985-01-01

    Containing approximately 200 problems (100 worked), the text covers a wide range of topics concerning electrical machines, placing particular emphasis upon electrical-machine drive applications. The theory is concisely reviewed and focuses on features common to all machine types. The problems are arranged in order of increasing levels of complexity and discussions of the solutions are included where appropriate to illustrate the engineering implications. This second edition includes an important new chapter on mathematical and computer simulation of machine systems and revised discussions o

  9. Nanocomposites for Machining Tools

    Directory of Open Access Journals (Sweden)

    Daria Sidorenko

    2017-10-01

    Full Text Available Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials. A promising way to improve the performance characteristics of these materials is to design new nanocomposites based on them. The application of micromechanical modeling during the elaboration of composite materials for machining tools can reduce the financial and time costs for development of new tools, with enhanced performance. This article reviews the main groups of nanocomposites for machining tools and their performance.

  10. Machine listening intelligence

    Science.gov (United States)

    Cella, C. E.

    2017-05-01

    This manifesto paper will introduce machine listening intelligence, an integrated research framework for acoustic and musical signals modelling, based on signal processing, deep learning and computational musicology.

  11. Machine learning with R

    CERN Document Server

    Lantz, Brett

    2013-01-01

    Written as a tutorial to explore and understand the power of R for machine learning. This practical guide that covers all of the need to know topics in a very systematic way. For each machine learning approach, each step in the process is detailed, from preparing the data for analysis to evaluating the results. These steps will build the knowledge you need to apply them to your own data science tasks.Intended for those who want to learn how to use R's machine learning capabilities and gain insight from your data. Perhaps you already know a bit about machine learning, but have never used R; or

  12. Rotating electrical machines

    CERN Document Server

    Le Doeuff, René

    2013-01-01

    In this book a general matrix-based approach to modeling electrical machines is promulgated. The model uses instantaneous quantities for key variables and enables the user to easily take into account associations between rotating machines and static converters (such as in variable speed drives).   General equations of electromechanical energy conversion are established early in the treatment of the topic and then applied to synchronous, induction and DC machines. The primary characteristics of these machines are established for steady state behavior as well as for variable speed scenarios. I

  13. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; increasing reliability by means of software redundancy; maintenance of software for long-term operating behavior. (HP) [de]

  14. Are there intelligent Turing machines?

    OpenAIRE

    Bátfai, Norbert

    2015-01-01

    This paper introduces a new computing model based on cooperation among Turing machines, called orchestrated machines. Like universal Turing machines, orchestrated machines are also designed to simulate Turing machines, but they can also modify the original operation of the included Turing machines to create a new layer of collective behavior. Using this new model we can define some interesting notions related to the cooperation ability of Turing machines, such as the intelligence quo...

  15. MPS beam control software architecture

    International Nuclear Information System (INIS)

    Krauter, K.; Crane, M.

    1993-01-01

    The new Machine Protection System (MPS) now being tested at SLAC has a beam control subsystem resident in processors located close to the beam monitoring devices within the machine. There are two types of beam control micros: Algorithm Processors (AP's) which collect and evaluate data from monitoring devices, and a Supervisor (SUPE) which collects and evaluates data from all the AP's. The SUPE also receives the global machine beamcode indicating beam presence, and passes it on to the AP's. The SUPE receives the beamcode pattern from the Master Pattern Generator (MPG) via a shared-memory communication link. MIL-1553 serial communication is used between the SUPE and the AP's, and between the AP's and the monitoring devices. Multitasking software is used to allow high priority handling of data evaluation and low priority handling of host/user interfacing and event reporting. Pipelining of data between acquisition and evaluation and reporting is used to accommodate the processing capacity, while still supporting full processing at the 360 Hz broadcast rate of the beamcode pattern

  17. Load Balancing Scientific Applications

    Energy Technology Data Exchange (ETDEWEB)

    Pearce, Olga Tkachyshyn [Texas A & M Univ., College Station, TX (United States)

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
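
    A generic decision rule of the kind such work formalises (a sketch, not necessarily the model developed in the dissertation) weighs the per-step penalty of waiting for the slowest rank against the one-time cost of redistributing work:

```python
def should_rebalance(loads, steps_remaining, rebalance_cost):
    """SPMD sketch: the per-step penalty is max(load) - mean(load), since
    every rank waits for the slowest one at each synchronisation point."""
    mean = sum(loads) / len(loads)
    per_step_penalty = max(loads) - mean
    return per_step_penalty * steps_remaining > rebalance_cost

loads = [1.0, 1.1, 1.0, 2.4]          # seconds of work per rank this step
print(should_rebalance(loads, steps_remaining=50, rebalance_cost=10.0))  # True
```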

  18. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  19. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  20. An open-source solution for advanced imaging flow cytometry data analysis using machine learning.

    Science.gov (United States)

    Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew

    2017-01-01

    Imaging flow cytometry (IFC) enables the high-throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high-content, information-rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner, using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Raw image files (.rif) from an imaging flow cytometer, compensated and corrected into the proprietary .cif file format, are imported into the open-source software CellProfiler, where an image-processing pipeline identifies cells and subcellular compartments, allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches on "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye, including subtle measured differences in label-free detection channels such as bright-field and dark-field imagery
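
    Downstream of feature extraction, the supervised step can be sketched as follows (scikit-learn's random forest as a stand-in for the classifiers offered by CellProfiler Analyst; the per-cell features and labels are synthetic):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_cells, n_features = 500, 50          # per-cell morphological features
X = rng.normal(size=(n_cells, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # toy "cell type" label

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # held-out accuracy
```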

  1. Machine learning for identifying botnet network traffic

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2013-01-01

    …Due to the promise of non-invasive and resilient detection, botnet detection based on network traffic analysis has drawn special attention from the research community. Furthermore, many authors have turned their attention to the use of machine learning algorithms as the means of inferring botnet-related knowledge from the monitored traffic. This paper presents a review of contemporary botnet detection methods that use machine learning as a tool for identifying botnet-related traffic. The main goal of the paper is to provide a comprehensive overview of the field by summarizing current scientific efforts. The contribution of the paper is three-fold. First, the paper provides a detailed insight into the existing detection methods by investigating which bot-related heuristics were assumed by the detection systems and how different machine learning techniques were adapted in order to capture botnet-related knowledge…

  2. Chuck for machining armature casings and angles

    International Nuclear Information System (INIS)

    Tashlitskii, A.I.; Matskevich, A.I.

    1984-01-01

    When machining T-joints and angles, the test specimen must be fixed before being placed in the desired position; this is quite a complex operation and is achieved in several stages. At the Scientific Production Combine ''Kislorodmash,'' a new chuck was designed which, with one pressing of the jaws, seats and fixes the specimen. In the clamped condition, the chuck allows the specimen to be rotated and fixed in one of four positions; rotating and fixing are manual. The chuck ensures a precise interdependence of the axes of the branches being machined, as the specimen remains fixed throughout the period of machining; it fixes the specimen reliably, and there are no clearances when the specimen is fixed with the special wedge. When using the chuck, the ancillary movements of the operator are reduced to a minimum, thus increasing labor productivity

  3. Test Driven Development of Scientific Models

    Science.gov (United States)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By their own testimony, many developers find TDD addictive after only a few days of exposure and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
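
    The red-green rhythm looks like this in practice (a Python/pytest sketch, since pFUnit targets Fortran; the function under test is invented, and in a real cycle the tests live in their own file and are written first):

```python
# A red-green cycle sketch (run with pytest). Step 1: the tests are written
# first and fail; step 2: the minimal implementation makes them pass.
import pytest

def running_mean(xs, window):
    """Step 2: the minimal implementation that satisfies the tests below."""
    if window < 1:
        raise ValueError("window must be >= 1")
    return [sum(xs[i:i + window]) / window
            for i in range(len(xs) - window + 1)]

def test_running_mean_window_two():
    assert running_mean([1.0, 3.0, 5.0], window=2) == [2.0, 4.0]

def test_running_mean_rejects_bad_window():
    with pytest.raises(ValueError):
        running_mean([1.0], window=0)
```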

  4. Numerical simulator of the CANDU fueling machine driving desk

    International Nuclear Information System (INIS)

    Doca, Cezar

    2008-01-01

    As a national and European premiere, in the 2003-2005 period two CANDU fueling machine heads, no. 4 and no. 5, for the Cernavoda Nuclear Power Plant Unit 2 were successfully tested at the Institute for Nuclear Research Pitesti. To perform these tests, a special CANDU fueling machine test rig was built and remains available for this purpose. The design of the CANDU fueling machine test rig at the Institute for Nuclear Research Pitesti is a replica of the similar equipment operating in CANDU 6 nuclear power plants. The high technical level of the CANDU fueling machine tests required the use of an efficient data acquisition and processing computer control system. The challenging goal was to build a computer system (hardware and software) designed and engineered to control the test and calibration process of these fuel handling machines. The design takes care both of the functionality required to correctly control the CANDU fueling machine and of the additional functionality required to assist the testing process. Both the fueling machine testing rig and its staff were successfully assessed by the AECL representatives during two missions. At the same time, a numerical simulator for CANDU fueling machine operator training was, and is still being, developed at the Institute for Nuclear Research Pitesti. The paper presents the numerical simulator: a special PC program (software) which simulates the graphics, functions and operations of the main desk of the computer control system. The simulator permits 'driving' a CANDU fueling machine in two manners, manual or automatic, and is dedicated to the training of operators who operate the CANDU fueling machine in a nuclear power plant with a CANDU reactor. (author)

  5. 12: Assuring the quality of critical software

    International Nuclear Information System (INIS)

    Jacky, J.; Kalet, I.

    1987-01-01

    The authors recommend quality assurance procedures for radiation therapy software. Software quality assurance deals with preventing, detecting and repairing programming errors. Error detection difficulties are most severe in computer-based control systems, for example therapy machine control systems, because it may be impossible for users to confirm correct operation while treatments are in progress, or to intervene if things go wrong. Software quality assurance techniques observed in other industries in which public safety is at risk are reviewed. In some of these industries software must be approved or certified before it can be used. Approval is subject to technical reviews and audits by experts other than the program authors. The main obstacles to adoption of these techniques in the radiation therapy field are costs, lack of familiarity and doubts regarding efficacy. 18 refs

  6. Technical innovation and policy of scientific technique

    International Nuclear Information System (INIS)

    Song, Wi Jin

    2006-04-01

    This book deals with systems of innovation and science and technology policy: main viewpoints and topics, technology policy and technological learning, the spread of the internet and structural change in the information and communications industry, the characteristics of technological innovation in software as open source, the transition to a national innovation system, changes in the activity of public science and technology, theories of technological innovation, the evolution of technological innovation policy, and citizen participation.

  7. Microsoft Azure machine learning

    CERN Document Server

    Mund, Sumit

    2015-01-01

    The book is intended for those who want to learn how to use Azure Machine Learning. Perhaps you already know a bit about Machine Learning, but have never used ML Studio in Azure; or perhaps you are an absolute newbie. In either case, this book will get you up-and-running quickly.

  8. The Hooey Machine.

    Science.gov (United States)

    Scarnati, James T.; Tice, Craig J.

    1992-01-01

    Describes how students can make and use Hooey Machines to learn how mechanical energy can be transferred from one object to another within a system. The Hooey Machine is made using a pencil, eight thumbtacks, one pushpin, tape, scissors, graph paper, and a plastic lid. (PR)

  9. Nanocomposites for Machining Tools

    DEFF Research Database (Denmark)

    Sidorenko, Daria; Loginov, Pavel; Mishnaevsky, Leon

    2017-01-01

    Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials...

  10. A nucleonic weighing machine

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The design and operation of a nucleonic weighing machine fabricated for continuous weighing of material over conveyor belt are described. The machine uses a 40 mCi cesium-137 line source and a 10 litre capacity ionization chamber. It is easy to maintain as there are no moving parts. It can also be easily removed and reinstalled. (M.G.B.)
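
    The record does not spell out the measurement principle, but nucleonic belt weighers of this kind conventionally infer the load from gamma-ray attenuation. A sketch of the standard Beer-Lambert relation, with symbols chosen here purely for illustration:

    ```latex
    % I     = transmitted intensity measured by the ionization chamber
    % I_0   = intensity measured with an empty belt
    % \mu_m = mass attenuation coefficient of the conveyed material
    % \sigma = mass per unit area on the belt (the quantity to be weighed)
    I = I_0 \, e^{-\mu_m \sigma}
    \quad\Longrightarrow\quad
    \sigma = \frac{1}{\mu_m} \ln\frac{I_0}{I}
    ```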

  11. An asymptotical machine

    Science.gov (United States)

    Cristallini, Achille

    2016-07-01

    A new and intriguing machine may be obtained replacing the moving pulley of a gun tackle with a fixed point in the rope. Its most important feature is the asymptotic efficiency. Here we obtain a satisfactory description of this machine by means of vector calculus and elementary trigonometry. The mathematical model has been compared with experimental data and briefly discussed.

  12. Machine learning with R

    CERN Document Server

    Lantz, Brett

    2015-01-01

    Perhaps you already know a bit about machine learning but have never used R, or perhaps you know a little R but are new to machine learning. In either case, this book will get you up and running quickly. It would be helpful to have a bit of familiarity with basic programming concepts, but no prior experience is required.

  13. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    To most people the concept of abstract machines is connected to the name of Alan Turing and the development of the modern computer. The Turing machine is universal, axiomatic and symbolic (e.g., operating on symbols). Inspired by Foucault, Deleuze and Guattari extended the concept of abstract...

  14. Human Machine Learning Symbiosis

    Science.gov (United States)

    Walsh, Kenneth R.; Hoque, Md Tamjidul; Williams, Kim H.

    2017-01-01

    Human Machine Learning Symbiosis is a cooperative system where both the human learner and the machine learner learn from each other to create an effective and efficient learning environment adapted to the needs of the human learner. Such a system can be used in online learning modules so that the modules adapt to each learner's learning state both…

  15. Learning Machines Implemented on Non-Deterministic Hardware

    OpenAIRE

    Gupta, Suyog; Sindhwani, Vikas; Gopalakrishnan, Kailash

    2014-01-01

    This paper highlights new opportunities for designing large-scale machine learning systems as a consequence of blurring traditional boundaries that have allowed algorithm designers and application-level practitioners to stay -- for the most part -- oblivious to the details of the underlying hardware-level implementations. The hardware/software co-design methodology advocated here hinges on the deployment of compute-intensive machine learning kernels onto compute platforms that trade-off deter...

  16. Precision machining commercialization

    International Nuclear Information System (INIS)

    1978-01-01

    To accelerate precision machining development so as to realize more of the potential savings within the next few years of known Department of Defense (DOD) part procurement, the Air Force Materials Laboratory (AFML) is sponsoring the Precision Machining Commercialization Project (PMC). PMC is part of the Tri-Service Precision Machine Tool Program of the DOD Manufacturing Technology Five-Year Plan. The technical resources supporting PMC are provided under sponsorship of the Department of Energy (DOE). The goal of PMC is to minimize precision machining development time and cost risk for interested vendors. PMC will do this by making available the high precision machining technology as developed in two DOE contractor facilities, the Lawrence Livermore Laboratory of the University of California and the Union Carbide Corporation, Nuclear Division, Y-12 Plant, at Oak Ridge, Tennessee

  17. Introduction to machine learning.

    Science.gov (United States)

    Baştanlar, Yalin; Ozuysal, Mustafa

    2014-01-01

    The machine learning field, which can be briefly defined as enabling computers to make successful predictions using past experiences, has exhibited an impressive development recently with the help of the rapid increase in the storage capacity and processing power of computers. Together with many other disciplines, machine learning methods have been widely employed in bioinformatics. The difficulties and cost of biological analyses have led to the development of sophisticated machine learning approaches for this application area. In this chapter, we first review the fundamental concepts of machine learning such as feature assessment, unsupervised versus supervised learning and types of classification. Then, we point out the main issues of designing machine learning experiments and their performance evaluation. Finally, we introduce some supervised learning methods.
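
    As an illustration of the supervised workflow the chapter outlines (training on past experience, then evaluating predictions on unseen data), here is a minimal scikit-learn sketch; the dataset and classifier are chosen arbitrarily and do not come from the chapter itself:

    ```python
    from sklearn.datasets import load_iris
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Train on "past experiences" (the training split), then evaluate
    # predictions on data the model has never seen (the test split).
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42)

    model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
    print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
    ```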

  18. LHC Report: machine development

    CERN Multimedia

    Rogelio Tomás García for the LHC team

    2015-01-01

    Machine development weeks are carefully planned in the LHC operation schedule to optimise and further study the performance of the machine. The first machine development session of Run 2 ended on Saturday, 25 July. Despite various hiccoughs, it allowed the operators to make great strides towards improving the long-term performance of the LHC.   The main goals of this first machine development (MD) week were to determine the minimum beam-spot size at the interaction points given existing optics and collimation constraints; to test new beam instrumentation; to evaluate the effectiveness of performing part of the beam-squeezing process during the energy ramp; and to explore the limits on the number of protons per bunch arising from the electromagnetic interactions with the accelerator environment and the other beam. Unfortunately, a series of events reduced the machine availability for studies to about 50%. The most critical issue was the recurrent trip of a sextupolar corrector circuit –...

  19. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  20. Fully automatic CNC machining production system

    Directory of Open Access Journals (Sweden)

    Lee Jeng-Dao

    2017-01-01

    Customized manufacturing is increasing year by year, and changing consumption habits have shortened product life cycles. Many countries therefore view Industry 4.0 as a path to more efficient and more flexible automated production. Developing an automatic loading and unloading CNC machining system with vision inspection is a first step in this industrial upgrading. In this study, a CNC controller is adopted as the main controller commanding the robot, the conveyor, and other equipment. Machine vision systems are used to detect the position of the material on the conveyor and the edge of the machining material. In addition, Open CNC and SCADA software are utilized for real-time monitoring, remote control, alarm e-mail notification, and parameter collection. Furthermore, RFID has been added for employee classification and management. Machine handshaking has been successfully implemented to achieve automatic vision detection, edge-tracing measurement, machining, and collection of system parameters for data analysis, accomplishing industrial automation system integration with real-time monitoring.
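
    A minimal sketch of the kind of vision step described above (locating the workpiece and its edge on the conveyor); it assumes OpenCV and a hypothetical camera frame "workpiece.png", and is not the authors' actual implementation:

    ```python
    import cv2

    # Hypothetical camera frame of the conveyor; the file name is illustrative.
    frame = cv2.imread("workpiece.png")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)  # edge map of the machining material

    # Assume the largest contour is the workpiece; its bounding box gives the
    # position the robot needs for automatic loading.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    print(f"workpiece at ({x}, {y}), size {w}x{h}")
    ```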

  1. Create, share and learn. Experiences with free software, free culture and collaboration in formal and non formal education in Colombia

    OpenAIRE

    Medina Cardona, Luis Fernando

    2012-01-01

    New media technologies, especially software, have had a great impact on modern society. The combination of computers, software and networks as creative machines is present in everyday life. This article focuses on these interactions, especially from the perspective of free and open source software (FOSS). In doing so, the influence of its values is traced from the original hacker culture ethics to the four freedoms of free software, showing a software-explained culture as in the software studies discipline...

  2. Robotic Software for the Thacher Observatory

    Science.gov (United States)

    Lawrence, George; Luebbers, Julien; Eastman, Jason D.; Johnson, John A.; Swift, Jonathan

    2018-06-01

    The Thacher Observatory—a research and educational facility located in Ojai, CA—uses a 0.7 meter telescope to conduct photometric research on a variety of targets including eclipsing binaries, exoplanet transits, and supernovae. Currently, observations are automated using commercial software. In order to expand the flexibility for specialized scientific observations and to increase the educational value of the facility on campus, we are adapting and implementing the custom observatory control software and queue scheduling developed for the Miniature Exoplanet Radial Velocity Array (MINERVA) to the Thacher Observatory. We present the design and implementation of this new software as well as its demonstrated functionality on the Thacher Observatory.

  3. Smile (System/Machine-Independent Local Environment)

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, J.G.

    1988-04-01

    This document defines the characteristics of Smile, a System/machine-independent local environment. This environment consists primarily of a number of primitives (types, macros, procedure calls, and variables) that a program may use; these primitives provide facilities, such as memory allocation, timing, tasking and synchronization, beyond those typically provided by a programming language. The intent is that a program will be portable from system to system and from machine to machine if it relies only on the portable aspects of its programming language and on the Smile primitives. For this to be so, Smile itself must be implemented on each system and machine, most likely using non-portable constructions; that is, while the environment provided by Smile is intended to be portable, the implementation of Smile is not necessarily so. In order to make the implementation of Smile as easy as possible and thereby expedite the porting of programs to a new system or a new machine, Smile has been defined to provide a minimal portable environment; that is, simple primitives are defined, out of which more complex facilities may be constructed using portable procedures. The implementation of Smile can be any of the following: the underlying software environment for the operating system of an otherwise "bare" machine, a "guest" system environment built upon a preexisting operating system, an environment within a "user" process run by an operating system, or a single environment for an entire machine, encompassing both system and "user" processes. In the first three of these cases the tasks provided by Smile are "lightweight processes" multiplexed within preexisting processes or the system, while in the last case they also include the system processes themselves.

  4. Advances in software science and technology

    CERN Document Server

    Kamimura, Tsutomu

    1994-01-01

    This serial is a translation of the original works within the Japan Society of Software Science and Technology. A key source of information for computer scientists in the U.S., the serial explores the major areas of research in software and technology in Japan. These volumes are intended to promote worldwide exchange of ideas among professionals.This volume includes original research contributions in such areas as Augmented Language Logic (ALL), distributed C language, Smalltalk 80, and TAMPOPO-an evolutionary learning machine based on the principles of Realtime Minimum Skyline Detection.

  5. Advances in software science and technology

    CERN Document Server

    Kakuda, Hiroyasu; Ohno, Yoshio

    1992-01-01

    Advances in Software Science and Technology, Volume 3 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 11 chapters, this volume begins with an overview of the development of a system of writing tools called SUIKOU that analyzes a machine-readable Japanese document textually. This text then presents the conditioned attribute grammars (CAGs) and a system for evaluating them that can be applied to natural-language processing. Other chapters c

  6. Software Reuse Within the Earth Science Community

    Science.gov (United States)

    Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.

    2006-01-01

    Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very

  7. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  8. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  9. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  10. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    [Only reference fragments survive in this record, citing work on multiversion software reliability: Eckhardt, D.E., et al., on multiversion software subject to coincident errors, IEEE Trans. Software Eng. SE-11:1511-1517; and Knight, J.C. and N.G. Leveson, 1986, Experimental evaluation of the assumption of independence in multiversion software, IEEE Trans. Software Eng.]

  11. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  12. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  13. Software - Naval Oceanography Portal

    Science.gov (United States)

    Navigation listing for the U.S. Naval Observatory (USNO) Earth Orientation software page: GPS-based products, VLBI-based products, auxiliary software, supporting software, and the Earth Orientation Matrix Calculator.

  14. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R... Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony 1. Software

  15. Energy Science and Technology Software Center

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, E.M.

    1995-03-01

    The Energy Science and Technology Software Center (ESTSC) is the U.S. Department of Energy's (DOE) centralized software management facility. It is operated under contract for the DOE Office of Scientific and Technical Information (OSTI) and is located in Oak Ridge, Tennessee. The ESTSC is authorized by DOE and the U.S. Nuclear Regulatory Commission (NRC) to license and distribute DOE- and NRC-sponsored software developed by national laboratories and other facilities and by contractors of DOE and NRC. ESTSC also has selected software from the Nuclear Energy Agency (NEA) of the Organisation for Economic Cooperation and Development (OECD) through a software exchange agreement that DOE has with the agency.

  16. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem-solving methods.
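
    In the spirit of the book's interactive routines (issuing commands repeatedly while changing parameters to observe the effects), a small illustrative simulation; the experiment and the numbers are ours, not the book's:

    ```python
    import random

    def head_fraction(n, p=0.5, seed=1):
        """Fraction of heads in n simulated coin flips with bias p."""
        rng = random.Random(seed)
        return sum(rng.random() < p for _ in range(n)) / n

    # Re-run with different n to watch the relative frequency approach p.
    for n in (10, 100, 10_000):
        print(n, head_fraction(n))
    ```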

  17. Optimizing infrastructure for software testing using virtualization

    International Nuclear Information System (INIS)

    Khalid, O.; Shaikh, A.; Copy, B.

    2012-01-01

    Virtualization technology and cloud computing have brought a paradigm shift in the way we utilize, deploy and manage computer resources. They allow fast deployment of multiple operating systems as containers on physical machines, which can be either discarded after use or check-pointed for later re-deployment. At the European Organization for Nuclear Research (CERN), we have been using virtualization technology to quickly set up virtual machines for our developers with pre-configured software, enabling them to quickly test and deploy a new version of a software patch for a given application. This paper reports both on the techniques that have been used to set up a private cloud on commodity hardware and on the optimization techniques we used to remove deployment-specific performance bottlenecks. (authors)
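
    A rough sketch of the checkpoint-and-redeploy cycle described above, driving libvirt's virsh command line from Python; the VM name and workflow are hypothetical, and the paper's actual tooling is not reproduced here:

    ```python
    import subprocess

    VM = "test-node"  # hypothetical virtual machine name

    def virsh(*args):
        """Thin wrapper around libvirt's command-line client."""
        subprocess.run(["virsh", *args], check=True)

    virsh("snapshot-create-as", VM, "pristine")  # checkpoint the configured VM
    virsh("start", VM)                           # boot it for the test run
    # ... deploy the software patch and run the tests against the VM here ...
    virsh("destroy", VM)                         # hard power-off after the test
    virsh("snapshot-revert", VM, "pristine")     # roll back for the next test
    ```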

  18. Timing system control software in the SLC

    International Nuclear Information System (INIS)

    Thompson, K.; Phinney, N.

    1985-04-01

    A new timing system that allows precision (approx. 1 to 2 ns) control of the trigger times of klystrons, beam position monitors, and other devices on a pulse-to-pulse basis at up to 360 Hz is in operation in the first third of the SLAC linear accelerator. The control software is divided between a central host VAX and local Intel 8086-based microprocessor clusters. Facilities exist to set up and adjust the timing of devices or groups of devices independently for beam pulses having different destinations and purposes, which are run in an interlaced fashion during normal machine operation. Upgrading of the system is currently underway, using a new version of the Programmable Delay Unit CAMAC module to allow pipelining of timing information for three machine pulses. An overview of the current state of the system is presented in this paper, with an emphasis on software control.

  19. Operator aid system for Dhruva fueling machine

    International Nuclear Information System (INIS)

    Misra, S.M.; Ramaswamy, L.R.; Gohel, N.; Bharadwaj, G.; Ranade, M.R.; Khadilkar, M.G.

    1997-01-01

    Systems with significant software content are replacing old hardwired logic systems. These systems are not only versatile but also easy to modify through program changes. Extensive use of such systems in critical real-time operating environments warrants not only thorough training on simulators and documentation but also a fault-tolerant design able to bring the operation to a safe state in case of error. With new graphical user interfaces and advances in personal computer hardware, the dynamic status of the physical environment can be shown on the visual display in near real time. These visual aids, along with software covering all the interlocks, assist an operator in his professional work. This paper highlights the operator aid system for the Dhruva fueling machine. (author). 6 refs., 1 fig

  20. The AERA. Dream machines and the practice of computation at the Mathematisch Centrum in Amsterdam

    Directory of Open Access Journals (Sweden)

    Gerard Alberts

    2008-06-01

    AERA. Dream machines and computing practices at the Mathematical Center. Dream machines may be just as effective as the ones materialised. Their symbolic thrust can be quite powerful. The Amsterdam ‘Mathematisch Centrum’ (Mathematical Center), founded February 11, 1946, created a Computing Department in an effort to realise its goal of serving society. When Aad van Wijngaarden was appointed as head of the Computing Department, however, he claimed space for scientific research and computer construction, next to computing as a service. Still, the computing service following the five-stage style of Hartree’s numerical analysis remained a dominant characteristic of the work of the Computing Department. The high level of ambition held by Aad van Wijngaarden led to ever renewed projections of big automatic computers, symbolised by the never-built AERA. Even a machine that was actually constructed, the ARRA, which followed A.D. Booth’s design of the ARC, never made it into real operation. It did serve Van Wijngaarden to bluff his way into the computer age by midsummer 1952. Not until January 1954 did the computing department have a working stored program computer, which for reasons of policy went under the same name: ARRA. After just one other machine, the ARMAC, had been produced, a separate company, Electrologica, was set up for the manufacture of computers, which produced the rather successful X1 computer. The combination of ambition and absence of a working machine led to a high level of work on programming, way beyond the usual ideas of libraries of subroutines. Edsger W. Dijkstra in particular led the way to an emphasis on the duties of the programmer within the pattern of numerical analysis. Programs generating programs, known elsewhere as autocoding systems, were at the ‘Mathematisch Centrum’ called ‘superprograms’. Practical examples were usually called a ‘complex’, in Dutch, where in English one might say

  1. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  2. OntoSoft: A Software Registry for Geosciences

    Science.gov (United States)

    Garijo, D.; Gil, Y.

    2017-12-01

    The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.
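
    To make the idea of machine-readable software metadata concrete, a toy record with the kinds of fields such a registry captures; the field names and values are invented for illustration and are not the actual OntoSoft ontology terms:

    ```python
    # Purely illustrative metadata record; the real OntoSoft ontology defines
    # its own vocabulary for describing scientific software.
    software_record = {
        "name": "FlowSim",
        "description": "Hypothetical groundwater flow simulator",
        "version": "2.1.0",
        "license": "Apache-2.0",
        "programming_language": "Fortran",
        "repository": "https://example.org/flowsim",
        "preferred_citation": "Doe et al. (2016), Journal of Hypothetical Hydrology",
    }

    # A registry can then be searched and compared across such fields, e.g.:
    matches = [r["name"] for r in [software_record] if r["license"] == "Apache-2.0"]
    print(matches)
    ```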

  3. Machining of glass fiber reinforced polyamide

    Directory of Open Access Journals (Sweden)

    2007-12-01

    The machinability of a 30 wt% glass fiber reinforced polyamide (PA) was investigated by means of drilling tests. A disk was cut from an extruded rod and drilled on the flat surface: thrust was acquired during drilling at different drilling speeds, feed rates and drill diameters. Differential scanning calorimetry (DSC) and indentation were used to characterize the PA so as to evaluate the intrinsic lack of homogeneity of the extruded material. In conclusion, it was observed that the chip formation mechanism affects the dependence of thrust on the machining parameters. A traditional modeling approach is able to predict thrust only in the presence of a continuous chip. In some conditions, thrust increases as drilling speed increases and feed rate decreases; this evidence suggests that the general scientific approach of treating the machining of plastics in analogy with metals should not be relied upon. Moreover, the thrust can be significantly affected by the workpiece fabrication effect, as well as by the machining parameters; therefore, the fabrication effect is not negligible in the definition of an optimum for the machining process.

  4. Medical device software: defining key terms.

    Science.gov (United States)

    Pashkov, Vitalii; Gutorova, Nataliya; Harkusha, Andrii

    One of the areas of significant growth in medical devices has been the role of software - as an integral component of a medical device, as a standalone device and, more recently, as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is in itself not a criterion for its qualification or not as a medical device. It is therefore necessary to clarify some criteria for the qualification of standalone software as a medical device. Materials and methods: Ukrainian, European Union and United States of America legislation, guidelines developed by the European Commission and the Food and Drug Administration, recommendations presented by an international voluntary group, and scientific works. This article is based on dialectical, comparative, analytic, synthetic and comprehensive research methods. The legal regulation of software used for medical purposes in Ukraine is limited to one definition. In the European Union and the United States of America, special guidelines have been developed and applied that help developers, manufacturers and end users to differentiate software types according to medical purpose criteria. Software is becoming more and more incorporated into medical devices. Developers and manufacturers may not have initially appreciated potential risks to patients and users; such a situation could have dangerous results for patients or users. It is necessary to develop and adopt legislation that defines the criteria for the qualification of medical device software and the application of the classification criteria to such software, provides some illustrative examples, and gives step-by-step recommendations to qualify software as a medical device.

  5. Control system software, simulation, and robotic applications

    Science.gov (United States)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  6. Machine Learning and Radiology

    Science.gov (United States)

    Wang, Shijun; Summers, Ronald M.

    2012-01-01

    In this paper, we give a short introduction to machine learning and survey its applications in radiology. We focused on six categories of applications in radiology: medical image segmentation, registration, computer aided detection and diagnosis, brain function or activity analysis and neurological disease diagnosis from fMR images, content-based image retrieval systems for CT or MRI images, and text analysis of radiology reports using natural language processing (NLP) and natural language understanding (NLU). This survey shows that machine learning plays a key role in many radiology applications. Machine learning identifies complex patterns automatically and helps radiologists make intelligent decisions on radiology data such as conventional radiographs, CT, MRI, and PET images and radiology reports. In many applications, the performance of machine learning-based automatic detection and diagnosis systems has shown to be comparable to that of a well-trained and experienced radiologist. Technology development in machine learning and radiology will benefit from each other in the long run. Key contributions and common characteristics of machine learning techniques in radiology are discussed. We also discuss the problem of translating machine learning applications to the radiology clinical setting, including advantages and potential barriers. PMID:22465077

  7. Machine learning and radiology.

    Science.gov (United States)

    Wang, Shijun; Summers, Ronald M

    2012-07-01

    In this paper, we give a short introduction to machine learning and survey its applications in radiology. We focused on six categories of applications in radiology: medical image segmentation, registration, computer aided detection and diagnosis, brain function or activity analysis and neurological disease diagnosis from fMR images, content-based image retrieval systems for CT or MRI images, and text analysis of radiology reports using natural language processing (NLP) and natural language understanding (NLU). This survey shows that machine learning plays a key role in many radiology applications. Machine learning identifies complex patterns automatically and helps radiologists make intelligent decisions on radiology data such as conventional radiographs, CT, MRI, and PET images and radiology reports. In many applications, the performance of machine learning-based automatic detection and diagnosis systems has shown to be comparable to that of a well-trained and experienced radiologist. Technology development in machine learning and radiology will benefit from each other in the long run. Key contributions and common characteristics of machine learning techniques in radiology are discussed. We also discuss the problem of translating machine learning applications to the radiology clinical setting, including advantages and potential barriers. Copyright © 2012. Published by Elsevier B.V.

  8. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  9. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development, but it is poorly supported by common development environments, which focus mainly on low-level programming tasks. We posit the need for agile software assessment, which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  10. Software Engineering Tools for Scientific Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We design and demonstrate the feasibility of extending the open source Eclipse integrated development environment (IDE) to support the full range of capabilities now...

  11. Software scaffolds to promote regulation during scientific inquiry learning

    NARCIS (Netherlands)

    Manlove, S.A.; Lazonder, Adrianus W.; de Jong, Anthonius J.M.

    2007-01-01

    This research addresses issues in the design of online scaffolds for regulation within inquiry learning environments. The learning environment in this study included a physics simulation, data analysis tools, and a model editor for students to create runnable models. A regulative support tool called

  12. The State of Software for Evolutionary Biology.

    Science.gov (United States)

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  13. Linear accelerator quality assurance using EPIQA software

    International Nuclear Information System (INIS)

    Bozhikov, S.; Sokerov, H.; Tonev, A.; Ivanova, K.

    2012-01-01

    Unlike treatment with static fields, using a dynamic multileaf collimator (dMLC), there are significant dosimetric issues which must be assessed before dynamic therapy can be implemented. The advanced techniques require some additional commissioning and quality assurance tests. The results of standard quality assurance (QA) machine tests and commissioning tests for volume modulated arc therapy (VMAT) using electronic portal image device (EPID) and 'EPIQA' software are presented. (authors)

  14. Virtual Machine Logbook - Enabling virtualization for ATLAS

    International Nuclear Information System (INIS)

    Yao Yushu; Calafiura, Paolo; Leggett, Charles; Poffet, Julien; Cavalli, Andrea; Frederic, Bapst

    2010-01-01

    ATLAS software has been developed mostly on the CERN linux cluster lxplus or on similar facilities at the experiment Tier 1 centers. The fast rise of virtualization technology has the potential to change this model, turning every laptop or desktop into an ATLAS analysis platform. In the context of the CernVM project we are developing a suite of tools and CernVM plug-in extensions to promote the use of virtualization for ATLAS analysis and software development. The Virtual Machine Logbook (VML), in particular, is an application to organize the work of physicists on multiple projects, logging their progress and speeding up "context switches" from one project to another. An important feature of VML is the ability to share, with a single click, the status of a given project with other colleagues. VML builds upon the save and restore capabilities of mainstream virtualization software like VMware, and provides a technology-independent client interface to them. A lot of emphasis in the design and implementation has gone into optimizing the save and restore process, to make it practical to store many VML entries on a typical laptop disk or to share a VML entry over the network. At the same time, taking advantage of CernVM's plugin capabilities, we are extending the CernVM platform to help increase the usability of ATLAS software. For example, we added the ability to start the ATLAS event display on any computer running CernVM simply by clicking a button in a web browser. We want to integrate VML seamlessly with CernVM's unique file system design to distribute ATLAS software efficiently to every physicist's computer. The CernVM File System (CVMFS) downloads files on demand via HTTP and caches them locally for future use. This reduces download sizes by one order of magnitude, making it practical for a developer to work with multiple software releases on a virtual machine.

  15. An introduction to machine learning with Scikit-Learn

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    This tutorial gives an introduction to the scientific ecosystem for data analysis and machine learning in Python. After a short introduction of machine learning concepts, we will demonstrate on High Energy Physics data how a basic supervised learning analysis can be carried out using the Scikit-Learn library. Topics covered include data loading facilities and data representation, supervised learning algorithms, pipelines, model selection and evaluation, and model introspection.
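
    A compact sketch of the topics the tutorial lists (pipelines, model selection, evaluation) using Scikit-Learn; the dataset and parameter grid are placeholders rather than the tutorial's High Energy Physics example:

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Pipeline: preprocessing and estimator chained into one model object.
    pipe = Pipeline([("scale", StandardScaler()), ("svm", SVC())])

    # Model selection: cross-validated grid search over a toy parameter grid.
    search = GridSearchCV(pipe, {"svm__C": [0.1, 1.0, 10.0]}, cv=5)
    search.fit(X_train, y_train)

    # Evaluation on held-out data.
    print(search.best_params_, "test accuracy:", search.score(X_test, y_test))
    ```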

  16. DNA-based machines.

    Science.gov (United States)

    Wang, Fuan; Willner, Bilha; Willner, Itamar

    2014-01-01

    The base sequence in nucleic acids encodes substantial structural and functional information into the biopolymer. This encoded information provides the basis for the tailoring and assembly of DNA machines. A DNA machine is defined as a molecular device that exhibits the following fundamental features. (1) It performs a fuel-driven mechanical process that mimics macroscopic machines. (2) The mechanical process requires an energy input, "fuel." (3) The mechanical operation is accompanied by an energy consumption process that leads to "waste products." (4) The cyclic operation of the DNA devices, involves the use of "fuel" and "anti-fuel" ingredients. A variety of DNA-based machines are described, including the construction of "tweezers," "walkers," "robots," "cranes," "transporters," "springs," "gears," and interlocked cyclic DNA structures acting as reconfigurable catenanes, rotaxanes, and rotors. Different "fuels", such as nucleic acid strands, pH (H⁺/OH⁻), metal ions, and light, are used to trigger the mechanical functions of the DNA devices. The operation of the devices in solution and on surfaces is described, and a variety of optical, electrical, and photoelectrochemical methods to follow the operations of the DNA machines are presented. We further address the possible applications of DNA machines and the future perspectives of molecular DNA devices. These include the application of DNA machines as functional structures for the construction of logic gates and computing, for the programmed organization of metallic nanoparticle structures and the control of plasmonic properties, and for controlling chemical transformations by DNA machines. We further discuss the future applications of DNA machines for intracellular sensing, controlling intracellular metabolic pathways, and the use of the functional nanostructures for drug delivery and medical applications.

  17. VVER NPPs fuel handling machine control system

    International Nuclear Information System (INIS)

    Mini, G.; Rossi, G.; Barabino, M.; Casalini, M.

    2002-01-01

    In order to increase the safety level of the fuel handling machine on WWER NPPs, Ansaldo Nucleare was asked to design and supply a new control system. Two fuel handling machine (FHM) control system units have already been supplied for Temelin NPP, and further supplies are in process for the Atommash company, which is in charge of supplying FHMs for NPPs located in Russia, Ukraine and China. The computer-based system takes into account all the operational safety interlocks, so that it is able to prevent incorrect and dangerous manoeuvres in the case of operator error. The control system design criteria, hardware and software architecture, and quality assurance control are in accordance with the most recent international requirements and standards, in particular regarding electromagnetic disturbance immunity and seismic compatibility. The hardware architecture of the control system is based on the ABB INFI 90 system. The microprocessor-based ABB INFI 90 system incorporates and improves upon many of the time-proven control capabilities of the Bailey Network 90, validated over 14,000 installations world-wide. The control system accommodates all the previously installed sensors and devices of the machine, notably the Russian-designed angular position sensors named 'selsyn'. Nevertheless, it is fully compatible with the most recent sensors and devices currently available on the market (for example, multiturn absolute encoders). All control logic was developed using the standard INFI 90 Engineering Work Station, interconnecting blocks extracted from an extensive SAMA library through a graphical (CAD) approach, allowing easier intelligibility, more flexibility, and updated, coherent documentation. The data acquisition system and the man-machine interface are implemented by ABB in cooperation with Ansaldo. The flexible and powerful software structure of 1090 workstations (APMS - Advanced Plant Monitoring System, or Tenore NT) has been successfully used to interface the

  18. Fundamentals of machine design

    CERN Document Server

    Karaszewski, Waldemar

    2011-01-01

    A forum of researchers, educators and engineers involved in various aspects of Machine Design provided the inspiration for this collection of peer-reviewed papers. The resultant dissemination of the latest research results, and the exchange of views concerning the future research directions to be taken in this field will make the work of immense value to all those having an interest in the topics covered. The book reflects the cooperative efforts made in seeking out the best strategies for effecting improvements in the quality and the reliability of machines and machine parts and for extending

  19. Machine Learning for Hackers

    CERN Document Server

    Conway, Drew

    2012-01-01

    If you're an experienced programmer interested in crunching data, this book will get you started with machine learning-a toolkit of algorithms that enables computers to train themselves to automate useful tasks. Authors Drew Conway and John Myles White help you understand machine learning and statistics tools through a series of hands-on case studies, instead of a traditional math-heavy presentation. Each chapter focuses on a specific problem in machine learning, such as classification, prediction, optimization, and recommendation. Using the R programming language, you'll learn how to analyz

  20. Creativity in Machine Learning

    OpenAIRE

    Thoma, Martin

    2016-01-01

    Recent machine learning techniques can be modified to produce creative results. Those results did not exist before; it is not a trivial combination of the data which was fed into the machine learning system. The obtained results come in multiple forms: As images, as text and as audio. This paper gives a high level overview of how they are created and gives some examples. It is meant to be a summary of the current work and give people who are new to machine learning some starting points.

  1. Automated Software Acceleration in Programmable Logic for an Efficient NFFT Algorithm Implementation: A Case Study.

    Science.gov (United States)

    Rodríguez, Manuel; Magdaleno, Eduardo; Pérez, Fernando; García, Cristhian

    2017-03-28

    Non-equispaced Fast Fourier transform (NFFT) is a very important algorithm in several technological and scientific areas such as synthetic aperture radar, computational photography, medical imaging, telecommunications, seismic analysis and so on. However, its computational complexity is high. In this paper, we describe an efficient NFFT implementation with a hardware coprocessor using an All-Programmable System-on-Chip (APSoC). This is a hybrid device that employs an Advanced RISC Machine (ARM) as the Processing System, with Programmable Logic for high-performance digital signal processing through parallelism and pipeline techniques. The algorithm has been coded in C with pragma directives to optimize the architecture of the system. We have used the very novel Software-Defined System-on-Chip (SDSoC) development tool, which simplifies the interface and partitioning between hardware and software. This provides shorter development cycles and iterative improvements by exploring several architectures of the global system. The computational results show that hardware acceleration significantly outperforms the software-based implementation.
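
    For reference, the transform being accelerated can be evaluated directly at O(N·M) cost, which is exactly what makes fast (NFFT) schemes and hardware coprocessors attractive. A naive NumPy evaluation, with node and frequency conventions assumed here for illustration:

    ```python
    import numpy as np

    def ndft(x, c):
        """Naive non-equispaced discrete Fourier transform.

        Evaluates f(x_j) = sum_k c_k * exp(-2*pi*1j*k*x_j) at arbitrary nodes
        x_j in [-1/2, 1/2). Cost is O(N*M); NFFT algorithms approximate the
        same sums in roughly O(N log N + M) at a controlled error.
        """
        N = len(c)
        k = np.arange(-N // 2, N // 2)                 # frequency indices
        return np.exp(-2j * np.pi * np.outer(x, k)) @ c

    # Hypothetical usage: 8 random coefficients evaluated at 5 random nodes.
    rng = np.random.default_rng(0)
    coeffs = rng.standard_normal(8) + 1j * rng.standard_normal(8)
    nodes = rng.uniform(-0.5, 0.5, size=5)
    print(ndft(nodes, coeffs))
    ```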

  2. Encapsulating Software Platform Logic by Aspect-Oriented Programming : A Case Study in Using Aspects for Language Portability

    NARCIS (Netherlands)

    Kats, L.C.; Visser, E.

    2010-01-01

    Software platforms such as the Java Virtual Machine or the CLR .NET virtual machine have their own ecosystem of a core programming language or instruction set, libraries, and developer community. Programming languages can target multiple software platforms to increase interoperability or to boost
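
    The paper itself uses aspect-oriented programming for platform portability; as a loose, language-shifted analogy only, a Python decorator can play the role of advice that weaves platform-specific logic around a portable core (all names below are hypothetical):

    ```python
    import sys

    def platform_aspect(func):
        """Hypothetical 'advice': weaves a platform-specific concern around a
        portable core function instead of scattering it through the code."""
        def wrapper(path):
            if sys.platform.startswith("win"):
                path = path.replace("/", "\\")   # platform-specific concern
            return func(path)
        return wrapper

    @platform_aspect
    def open_resource(path):
        """Portable core logic, oblivious to the platform concern."""
        return f"opening {path}"

    print(open_resource("data/config.ini"))
    ```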

  3. Semantic representation of scientific literature: bringing claims, contributions and named entities onto the Linked Open Data cloud

    Directory of Open Access Journals (Sweden)

    Bahar Sateli

    2015-12-01

    Motivation. Finding relevant scientific literature is one of the essential tasks researchers are facing on a daily basis. Digital libraries and web information retrieval techniques provide rapid access to a vast amount of scientific literature. However, no further automated support is available that would enable fine-grained access to the knowledge 'stored' in these documents. The emerging domain of Semantic Publishing aims at making scientific knowledge accessible to both humans and machines, by adding semantic annotations to content, such as a publication's contributions, methods, or application domains. However, despite the promises of better knowledge access, the manual annotation of existing research literature is prohibitively expensive for wide-spread adoption. We argue that a novel combination of three distinct methods can significantly advance this vision in a fully-automated way: (i) Natural Language Processing (NLP) for Rhetorical Entity (RE) detection; (ii) Named Entity (NE) recognition based on the Linked Open Data (LOD) cloud; and (iii) automatic knowledge base construction for both NEs and REs using semantic web ontologies that interconnect entities in documents with the machine-readable LOD cloud. Results. We present a complete workflow to transform scientific literature into a semantic knowledge base, based on the W3C standards RDF and RDFS. A text mining pipeline, implemented based on the GATE framework, automatically extracts rhetorical entities of type Claims and Contributions from full-text scientific literature. These REs are further enriched with named entities, represented as URIs to the linked open data cloud, by integrating the DBpedia Spotlight tool into our workflow. Text mining results are stored in a knowledge base through a flexible export process that provides for a dynamic mapping of semantic annotations to LOD vocabularies through rules stored in the knowledge base. We created a gold standard corpus from computer
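
    A small sketch of the kind of output such a pipeline produces: one rhetorical entity serialized as RDF and linked into the LOD cloud. It uses the rdflib library, and the namespaces and identifiers are placeholders, not the paper's actual ontologies:

    ```python
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF

    # Placeholder namespaces; the paper's own vocabularies are not reproduced.
    EX = Namespace("http://example.org/paper#")
    SRO = Namespace("http://example.org/rhetorical#")

    g = Graph()
    claim = URIRef(EX["sentence-12"])               # hypothetical sentence ID
    g.add((claim, RDF.type, SRO.Contribution))      # rhetorical entity type
    g.add((claim, SRO.text, Literal("We present a complete workflow ...")))
    g.add((claim, SRO.mentions,                     # link into the LOD cloud
           URIRef("http://dbpedia.org/resource/Text_mining")))

    print(g.serialize(format="turtle"))
    ```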

  4. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects that are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
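
    One concrete example of a simple product metric of the kind discussed above; the formula is the standard defect-density definition, and the numbers are invented:

    ```python
    def defect_density(defects_found, kloc):
        """Standard product metric: defects per thousand lines of code."""
        return defects_found / kloc

    # Invented project numbers, purely illustrative.
    print(defect_density(defects_found=42, kloc=12.5))  # 3.36 defects per KLOC
    ```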

  5. Scientific Workflow Management in Proteomics

    Science.gov (United States)

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703
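
    The Taverna integration itself is too large for a short example, but the core idea, chaining independent tools so that each step consumes the previous step's output, can be sketched in a few lines of Python; the tool names and file formats below are placeholders rather than real proteomics software.

        # Minimal sketch of a linear analysis workflow: each step is an
        # independent tool wrapped as a function, and a manager chains them
        # by feeding each output to the next input. Names are placeholders;
        # this is not Taverna's API.
        from typing import Callable, List

        def convert_raw(path: str) -> str:
            print(f"converting {path} to an open format")
            return path.replace(".raw", ".mzML")

        def search_peptides(path: str) -> str:
            print(f"searching spectra in {path}")
            return path.replace(".mzML", ".pepXML")

        def infer_proteins(path: str) -> str:
            print(f"inferring proteins from {path}")
            return path.replace(".pepXML", ".protXML")

        def run_workflow(steps: List[Callable[[str], str]], data: str) -> str:
            for step in steps:   # the workflow manager, reduced to a loop
                data = step(data)
            return data

        result = run_workflow([convert_raw, search_peptides, infer_proteins],
                              "sample01.raw")
        print("final result:", result)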

  6. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  7. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  8. Scientific Computing and Apple's Intel Transition

    CERN Document Server

    CERN. Geneva

    2006-01-01

    Intel's published processor roadmap and how it may affect the future of personal and scientific computing. About the speaker: Eric Albert is a Senior Software Engineer in Apple's Core Technologies group. During Mac OS X's transition to Intel processors he has worked on almost every part of the operating system, from the OS kernel and compiler tools to appli...

  9. Application of a 16-bit microprocessor to the digital control of machine tools

    International Nuclear Information System (INIS)

    Issaly, Alain

    1979-01-01

    After an overview of machine tools (various types, definitions and standardization, associated technologies for motors and position sensors), this research thesis describes the principles of computer-based digital control: classification of machine tool command systems, machining programming, programming languages, the dialog function, the interpolation function, the servo-control function, and the tool compensation function. The author then reports the application of a 16-bit microprocessor to the computer-based digital control of a machine tool: feasibility, selection of the microprocessor, hardware presentation, software development and description, machining mode, and translation-loading mode.
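
    Of the command functions listed above, interpolation lends itself to a short illustration. The sketch below shows two-axis linear interpolation using the integer-only accumulator technique typical of early microprocessor controllers; it is written in Python for readability, whereas the thesis targets 16-bit microprocessor code.

        # Minimal sketch: two-axis linear interpolation with integer error
        # accumulators, the style of algorithm suited to a 16-bit controller.
        # Each yielded pair is one pulse command (-1, 0, or +1) per axis.

        def interpolate_line(dx: int, dy: int):
            """Yield (step_x, step_y) pulses moving from (0, 0) to (dx, dy)."""
            sx = 1 if dx >= 0 else -1
            sy = 1 if dy >= 0 else -1
            dx, dy = abs(dx), abs(dy)
            steps = max(dx, dy, 1)
            acc_x = acc_y = 0              # integer error accumulators
            for _ in range(steps):
                step_x = step_y = 0
                acc_x += dx
                if acc_x >= steps:         # time for an X pulse
                    acc_x -= steps
                    step_x = sx
                acc_y += dy
                if acc_y >= steps:         # time for a Y pulse
                    acc_y -= steps
                    step_y = sy
                yield step_x, step_y

        x = y = 0
        for px, py in interpolate_line(5, 3):
            x, y = x + px, y + py          # pulses applied to the axis drives
        print(x, y)                        # reaches (5, 3)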

  10. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    Full Text Available of experiments. In the context of three research areas (wildfire research, flood modelling, and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  11. Scientific Assistant Virtual Laboratory (SAVL)

    Science.gov (United States)

    Alaghband, Gita; Fardi, Hamid; Gnabasik, David

    2007-03-01

    The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school student interest, insight and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL: * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations. * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software. * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network. We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.
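
    The mention of polymorphic interfaces and C# lambda expressions suggests questions whose grading logic is supplied as a function. A rough Python analogue (the original system is in C#, and every name below is invented for illustration) might look like this:

        # Rough Python analogue of a question-network node whose answer
        # check is supplied as a lambda. All names here are invented; the
        # SAVL Master Question Network itself is implemented in C#.
        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Question:
            prompt: str
            check: Callable[[float], bool]      # pluggable answer checker
            follow_ups: List["Question"] = field(default_factory=list)
            # links to other nodes form the question network

            def assess(self, answer: float) -> bool:
                return self.check(answer)

        g = 9.81
        q = Question(
            prompt="A ball falls from rest for 2 s. What is its speed (m/s)?",
            check=lambda v: abs(v - g * 2) < 0.1,   # tolerance-based grading
        )

        print(q.assess(19.6))   # True
        print(q.assess(12.0))   # False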

  12. 55th International Conference of Machine Design Departments 2014

    CERN Document Server

    Berka, Ondrej; Petr, Karel; Lopot, František; Dub, Martin

    2016-01-01

    This book is based on the 55th International Conference of Machine Design Departments 2014 (ICMD 2014), which was hosted by the Czech Technical University in September 2014. It features scientific articles addressing progressive themes from the field of machine design. The book covers a broad range of topics including tribology, hydraulics, materials science, product innovation and experimental methods, and presents the latest interdisciplinary high-tech work. Readers interested in the latest research results in machine design and manufacturing engineering will value this collection of contributions from leading academic scientists and experts from around the world.

  13. Coordinate measuring machines

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with three exercises of 2 hours duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measuring: 1) Measuring and verification of tolerances on coordinate measuring machines, 2) Traceability and uncertainty during coordinate measurements, 3) Digitalisation and Reverse Engineering. This document contains a short description of each step in the exercise and schemes with room for taking notes of the results.

  14. Enter the machine

    Science.gov (United States)

    Palittapongarnpim, Pantita; Sanders, Barry C.

    2018-05-01

    Quantum tomography infers quantum states from measurement data, but it becomes infeasible for large systems. Machine learning enables tomography of highly entangled many-body states and suggests a new powerful approach to this problem.

  15. Performance evaluation of scientific programs on advanced architecture computers

    International Nuclear Information System (INIS)

    Walker, D.W.; Messina, P.; Baille, C.F.

    1988-01-01

    Recently a number of advanced architecture machines have become commercially available. These new machines promise better cost-performance than traditional computers, and some of them have the potential to compete with current supercomputers, such as the Cray X/MP, in terms of maximum performance. This paper describes an ongoing project to evaluate a broad range of advanced architecture computers using a number of complete scientific application programs. The computers to be evaluated include distributed-memory machines such as the NCUBE, INTEL and Caltech/JPL hypercubes and the MEIKO computing surface; shared-memory, bus-architecture machines such as the Sequent Balance and the Alliant; very long instruction word machines such as the Multiflow Trace 7/200 computer; traditional supercomputers such as the Cray X/MP and Cray-2; and SIMD machines such as the Connection Machine. Currently 11 application codes from a number of scientific disciplines have been selected, although it is not intended to run all codes on all machines. Results are presented for two of the codes (QCD and missile tracking), and future work is proposed.
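
    A minimal sketch of the timing harness such an evaluation implies is given below; the kernel is a stand-in for a full application code, and nothing here reflects the project's actual benchmark suite.

        # Minimal sketch: timing the same kernel for a benchmark comparison
        # across machines. The kernel is a placeholder for a real code.
        import time

        def kernel(n: int) -> float:
            """Placeholder compute kernel: a simple reduction."""
            total = 0.0
            for i in range(n):
                total += (i % 7) * 0.5
            return total

        def benchmark(fn, *args, repeats: int = 5) -> float:
            """Return the best wall-clock time over several repeats."""
            best = float("inf")
            for _ in range(repeats):
                start = time.perf_counter()
                fn(*args)
                best = min(best, time.perf_counter() - start)
            return best

        print(f"kernel best time: {benchmark(kernel, 1_000_000):.4f} s")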

  16. Introduction to AC machine design

    CERN Document Server

    Lipo, Thomas A

    2018-01-01

    AC electrical machine design is a key skill set for developing competitive electric motors and generators for applications in industry, aerospace, and defense. This book presents a thorough treatment of AC machine design, starting from basic electromagnetic principles and continuing through the various design aspects of an induction machine. Introduction to AC Machine Design includes one chapter each on the design of permanent magnet machines, synchronous machines, and thermal design. It also offers a basic treatment of the use of finite elements to compute the magnetic field within a machine, without interfering with the initial comprehension of the core subject matter. Based on the author's notes and years of classroom instruction, Introduction to AC Machine Design: * Brings to light more advanced principles of machine design, not just the basic principles of AC and DC machine behavior * Introduces electrical machine design to neophytes while also serving as a resource for experienced designers * ...
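
    As a taste of the basic principles such a treatment starts from: the synchronous speed of an AC machine follows directly from supply frequency and pole count, and slip measures how far an induction machine's rotor lags it. A quick check in Python:

        # Synchronous speed and slip of an induction machine, from the
        # standard textbook relations n_s = 120 f / p and s = (n_s - n) / n_s.

        def synchronous_speed_rpm(freq_hz: float, poles: int) -> float:
            """Synchronous speed in rpm: n_s = 120 f / p."""
            return 120.0 * freq_hz / poles

        def slip(n_sync: float, n_rotor: float) -> float:
            """Per-unit slip: s = (n_s - n) / n_s."""
            return (n_sync - n_rotor) / n_sync

        ns = synchronous_speed_rpm(60.0, 4)   # 1800 rpm: 4-pole, 60 Hz machine
        print(ns, slip(ns, 1746.0))           # slip = 0.03 at 1746 rpm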

  17. Framework for man-machine interface design evaluation system considering cognitive factor

    International Nuclear Information System (INIS)

    Itoh, Toru; Sasaki, Kazunori; Yoshikawa, Hidekazu; Takahashi, Makoto; Furuta, Tomihiko.

    1994-01-01

    It is necessary to improve human reliability in order to achieve higher reliability of the total plant system, taking into account the development of plant automation and the improvement of machine reliability. The role of the man-machine system will therefore become increasingly important. Accordingly, evaluation of man-machine system design information is desired, in order to comprehensively solve the mismatch problem between the plant information presented by the man-machine system and the information required by the operator. This paper discusses the required functions and software framework for the man-machine interface design evaluation system. The system has features to extract potential problems inherent in the design information of the man-machine system by simulating the operator's behavior, the plant system, and the man-machine system, considering the operator's cognitive performance and time dependency. (author)
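
    A toy sketch of the mismatch check described above, comparing the information presented by the interface against the information a simulated operator requires at each time step, is given below; all signals and time steps are invented for illustration.

        # Toy sketch: flag mismatches between what the interface presents
        # and what a simulated operator needs at each time step.
        # All signal names and time steps are invented.
        presented = {
            0: {"reactor_power", "coolant_flow"},
            1: {"reactor_power"},
            2: {"coolant_flow", "valve_status"},
        }
        required = {
            0: {"reactor_power"},
            1: {"reactor_power", "coolant_flow"},  # flow needed, not shown
            2: {"valve_status"},
        }

        for t in sorted(required):
            missing = required[t] - presented.get(t, set())
            if missing:
                print(f"t={t}: potential mismatch, operator lacks {sorted(missing)}")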

  18. Top scientific research center deploys Zambeel Aztera (TM) network storage system in high performance environment

    CERN Multimedia

    2002-01-01

    " The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory has implemented a Zambeel Aztera storage system and software to accelerate the productivity of scientists running high performance scientific simulations and computations" (1 page).

  19. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  20. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    International Nuclear Information System (INIS)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H

    2016-01-01

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard Model-View-ViewModel (MVVM) design pattern was chosen as the major architecture of ACE for its neat coding structure, deep modularization, easy maintainability, and seamless communication with other clinical software. ACE consists of: 1) a patient data import module integrated with the clinical patient database server; 2) a module that simultaneously displays 2D DICOM images and RT structures; 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library; and 4) a contour evaluation module that uses supervised pattern recognition algorithms to detect contouring errors and display the detection results. ACE relies on supervised learning algorithms for all image and data processing; implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification.
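
    A minimal sketch of the supervised contour-error detection step is given below, using scikit-learn in Python in place of ACE's C#/Accord.Net stack; the geometric features and training data are invented for illustration.

        # Minimal sketch: supervised detection of contouring errors from
        # simple geometric features, with scikit-learn standing in for
        # ACE's Accord.Net. Features and labels are invented.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Features per contour: [volume_cc, centroid_offset_mm, roughness]
        X_train = np.array([
            [350.0,  2.1, 0.3],   # plausible organ contour
            [340.0,  1.8, 0.4],
            [ 12.0, 45.0, 2.9],   # mislabeled / wrong-organ contour
            [  9.5, 50.2, 3.1],
        ])
        y_train = np.array([0, 0, 1, 1])   # 0 = acceptable, 1 = error

        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        clf.fit(X_train, y_train)

        candidate = np.array([[11.0, 47.5, 3.0]])
        print("error detected" if clf.predict(candidate)[0]
              else "contour acceptable")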