WorldWideScience

Sample records for machine scientific software

  1. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J. [ORNL]; Symons, Christopher T. [ORNL]

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  2. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  3. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  4. Scientific Bases of Human-Machine Communication by Voice

    Science.gov (United States)

    Schafer, Ronald W.

    1995-10-01

    The scientific bases for human-machine communication by voice are in the fields of psychology, linguistics, acoustics, signal processing, computer science, and integrated circuit technology. The purpose of this paper is to highlight the basic scientific and technological issues in human-machine communication by voice and to point out areas of future research opportunity. The discussion is organized around the following major issues in implementing human-machine voice communication systems: (i) hardware/software implementation of the system, (ii) speech synthesis for voice output, (iii) speech recognition and understanding for voice input, and (iv) usability factors related to how humans interact with machines.

  5. Testing Scientific Software: A Systematic Literature Review.

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models and computing evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider the special challenges posed by scientific software, such as oracle problems, when developing testing techniques.
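
    The oracle problem named above is worth unpacking: for much scientific code there is no independent source of the expected output, so a test cannot simply compare against a known answer. Metamorphic testing is one commonly cited workaround, checking relations that must hold between runs instead of absolute values. A minimal Python sketch, assuming a simple trapezoid-rule integrator (the function and relations are illustrative, not taken from the study):

      import numpy as np

      def trapezoid_integral(f, a, b, n=1000):
          # Composite trapezoid rule; the "true" value is often unknown in
          # practice, which is exactly the oracle problem.
          x = np.linspace(a, b, n + 1)
          y = f(x)
          return np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2.0

      def test_metamorphic_relations():
          base = trapezoid_integral(np.sin, 0.0, 1.0)
          # Relation 1: scaling the integrand scales the integral.
          scaled = trapezoid_integral(lambda x: 3.0 * np.sin(x), 0.0, 1.0)
          assert np.isclose(scaled, 3.0 * base)
          # Relation 2: splitting the interval preserves the sum.
          left = trapezoid_integral(np.sin, 0.0, 0.5)
          right = trapezoid_integral(np.sin, 0.5, 1.0)
          assert np.isclose(left + right, base)

      test_metamorphic_relations()
      print("metamorphic relations hold")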

  6. POLYSHIFT Communications Software for the Connection Machine System CM-200

    Directory of Open Access Journals (Sweden)

    William George

    1994-01-01

    We describe the use and implementation of a polyshift function PSHIFT for circular shifts and end-off shifts. Polyshift is useful in many scientific codes using regular grids, such as finite difference codes in several dimensions, multigrid codes, molecular dynamics computations, and lattice gauge physics computations such as quantum chromodynamics (QCD) calculations. Our implementation of the PSHIFT function on the Connection Machine systems CM-2 and CM-200 offers a speedup of up to a factor of 3–4 compared with CSHIFT when the local data motion within a node is small. The PSHIFT routine is included in the Connection Machine Scientific Software Library (CMSSL).
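
    For readers unfamiliar with the operation: a circular (end-around) shift moves every grid element by a fixed offset with wraparound, which is what periodic finite-difference stencils need. A short NumPy sketch of the idea, with np.roll standing in for the role CSHIFT/PSHIFT play on the Connection Machine (grid and stencil are illustrative):

      import numpy as np

      # Periodic 1-D grid and a centered second-difference stencil built
      # entirely from circular shifts, in the style of CSHIFT/PSHIFT codes.
      n = 16
      u = np.sin(2.0 * np.pi * np.arange(n) / n)

      left = np.roll(u, 1)          # end-around shift by +1
      right = np.roll(u, -1)        # end-around shift by -1
      lap = left - 2.0 * u + right  # discrete Laplacian on a periodic grid

      print(lap[:4])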

  7. Publishing Platform for Scientific Software - Lessons Learned

    Science.gov (United States)

    Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim

    2015-04-01

    Scientific software has become an indispensable commodity for the production, processing and analysis of empirical data, but also for modelling and simulation of complex processes. Software has a significant influence on the quality of research results. For strengthening the recognition of the academic performance of scientific software development, for increasing its visibility and for promoting the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews, a blueprint for a scientific software publishing platform and a systematic implementation plan have been designed. In addition, the potential of journals, software repositories and persistent identifiers has been evaluated to improve the publication and dissemination of reusable software solutions. It is important that procedures for publishing software as well as methods and tools for software engineering are reflected in the architecture of the platform, in order to improve the quality of the software and the results of research. In addition, it is necessary to work continuously on improving specific conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include policies for the development and publication of scientific software in the institutions, but also policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge a combined bottom-up / top-down approach is considered that will be implemented in parallel in different scientific domains, e.g. in earth sciences, climate research and

  8. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Damevski, Kostadin [Virginia State Univ., Petersburg, VA (United States)

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  9. 2006 XSD Scientific Software Workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Evans, K., Jr.; De Carlo, F.; Jemian, P.; Lang, J.; Lienert, U.; Maclean, J.; Newville, M.; Tieman, B.; Toby, B.; van Veenendaal, B.; Univ. of Chicago

    2006-01-22

    In May of 2006, a committee was formed to assess the fundamental needs and opportunities in scientific software for x-ray data reduction, analysis, modeling, and simulation. This committee held a series of discussions throughout the summer, conducted a poll of the members of the x-ray community, and held a workshop. This report details the findings and recommendations of the committee. Each experiment performed at the APS requires three crucial ingredients: the powerful x-ray source, an optimized instrument to perform measurements, and computer software to acquire, visualize, and analyze the experimental observations. While the APS has invested significant resources in the accelerator, investment in other areas such as scientific software for data analysis and visualization has lagged behind. This has led to the adoption of a wide variety of software with variable levels of usability. In order to maximize the scientific output of the APS, it is essential to support the broad development of real-time analysis and data visualization software. As scientists attack problems of increasing sophistication and deal with larger and more complex data sets, software is playing an ever more important role. Furthermore, our need for excellent and flexible scientific software can only be expected to increase, as the upgrade of the APS facility and the implementation of advanced detectors create a host of new measurement capabilities. New software analysis tools must be developed to take full advantage of these capabilities. It is critical that the APS take the lead in software development and in the implementation of theory in software to ensure the continued success of this facility. The topics described in this report are relevant to the APS today and critical for the APS upgrade plan. Implementing these recommendations will have a positive impact on the scientific productivity of the APS today and will be even more critical in the future.

  10. Case Study on Algebraic Software Methodologies for Scientific Computing

    Directory of Open Access Journals (Sweden)

    Magne Haveraaen

    2000-01-01

    The use of domain specific languages and appropriate software architectures are currently seen as the way to enhance reusability and improve software productivity. Here we outline a use of algebraic software methodologies and advanced program constructors to improve the abstraction level of software for scientific computing. This leads us to the language of coordinate free numerics as an alternative to the traditional coordinate dependent array notation. This provides the backdrop for the three accompanying papers: Coordinate Free Programming of Computational Fluid Dynamics Problems, centered around an example of using coordinate free numerics; Machine and Collection Abstractions for User-Implemented Data-Parallel Programming, exploiting the higher abstraction level when parallelising code; and An Algebraic Programming Style for Numerical Software and its Optimization, looking at high-level transformations enabled by the domain specific programming style.
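
    To make the contrast concrete, here is a hedged Python sketch of the coordinate-free style: the solver is written against abstract operators (grad, div) composed as div(grad(u)), rather than against explicit array indices. The operator definitions are illustrative, not taken from the accompanying papers:

      import numpy as np

      # Coordinate-dependent style: explicit indexing into the array.
      def laplacian_indexed(u, h):
          out = np.zeros_like(u)
          for i in range(1, len(u) - 1):
              out[i] = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / h**2
          return out

      # Coordinate-free style: compose abstract operators; the driver
      # never mentions indices or the grid layout.
      def grad(u, h):
          return (u[1:] - u[:-1]) / h   # forward difference onto cell faces

      def div(v, h):
          return (v[1:] - v[:-1]) / h   # backward difference back to cells

      def laplacian_operator(u, h):
          return div(grad(u, h), h)     # div(grad(u)) is the Laplacian

      h = 0.1
      u = np.sin(np.arange(0.0, 1.0, h))
      print(np.allclose(laplacian_indexed(u, h)[1:-1], laplacian_operator(u, h)))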

  11. Reproducibility and reusability of scientific software

    Science.gov (United States)

    Shamir, Lior

    2017-01-01

    Information science and technology has become an integral part of astronomy research, and due to the consistent growth in the size and impact of astronomical databases, that trend is bound to continue. While software is a vital part of information systems and data analysis processes, in many cases the importance of the software, and the standards of reporting on the use of source code, have not yet been elevated in the scientific communication process to the same level as other parts of the research. The purpose of the discussion is to examine the role of software in the scientific communication process in the light of transparency, reproducibility, and reusability of the research, and to discuss software in astronomy in comparison to other disciplines.

  12. Community Recommendations for Sustainable Scientific Software

    Directory of Open Access Journals (Sweden)

    Robert R. Downs

    2015-11-01

    Science software has contributed to research practices, but the sustainability of scientific software presents challenges for the future use of research resources. Identifying improvements for science software sustainability practices can contribute to the re-use of science software. A focus group study was conducted to identify ways to improve the science software sustainability practices of the Earth science community. A facilitated, roundtable discussion activity at the 2014 Federation of Earth Science Information Partners (ESIP) Summer Meeting elicited recommendations on community activities to improve practices for the sustainability of scientific software. These suggestions fell into three broad themes: (1) improving collaboration and community engagement through publications and presentations; (2) developing workshops, training, and documentation of best practices; and (3) creating incentives and motivation with awards, citation, and a reviewed software repository. In addition to the recommendations coming out of the roundtable activity, this paper highlights how community-led groups such as ESIP are key to moving a sustainable software effort in its various forms from concept to reality.

  13. Reliable Software Development for Machine Protection Systems

    CERN Document Server

    Anderson, D; Dragu, M; Fuchsberger, K; Garnier, JC; Gorzawski, AA; Koza, M; Krol, K; Misiowiec, K; Stamos, K; Zerlauth, M

    2014-01-01

    The Controls software for the Large Hadron Collider (LHC) at CERN, with more than 150 million lines of code, is among the largest known code bases in the world. Industry has been applying Agile software engineering techniques for more than two decades now, and the advantages of these techniques can no longer be ignored when managing the code base for large projects within the accelerator community. Furthermore, CERN is a particular environment due to high personnel turnover and manpower limitations, where applying Agile processes can improve both codebase management and code quality. This paper presents the successful application of the Agile software development process Scrum for machine protection systems at CERN, the quality standards and infrastructure introduced together with the Agile process, as well as the challenges encountered in adapting it to the CERN environment.

  14. 2006 XSD Scientific Software User Survey.

    Energy Technology Data Exchange (ETDEWEB)

    Jemian, P. R.

    2007-01-22

    In preparation for the 2006 XSD Scientific Software workshop, our committee sent a survey on June 16 to 100 users in the APS user community. This report contains the survey and the responses we received. The responses are presented in the order received.

  15. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of the strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process; not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
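
    The "gold standard" comparison mentioned at the end can be illustrated with a self-contained example: verify a numerical solver against a known analytical solution inside an automated test, so every build re-checks the physics. The solver and tolerance below are a minimal sketch, not Fluidity code:

      import numpy as np

      def solve_heat_equation(n=64, t_end=0.05):
          # Explicit finite differences for u_t = u_xx on [0,1], u(0)=u(1)=0,
          # starting from u(x,0) = sin(pi x).
          x = np.linspace(0.0, 1.0, n + 1)
          u = np.sin(np.pi * x)
          h = x[1] - x[0]
          dt = 0.4 * h**2        # satisfies the explicit stability limit dt <= h^2/2
          steps = int(t_end / dt)
          for _ in range(steps):
              u[1:-1] += dt * (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2
          return x, u, steps * dt

      def test_against_analytical_solution():
          x, u, t = solve_heat_equation()
          exact = np.exp(-np.pi**2 * t) * np.sin(np.pi * x)
          assert np.max(np.abs(u - exact)) < 1e-3, "regression against gold standard"

      test_against_analytical_solution()
      print("analytical comparison passed")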

  16. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    , algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single to multi and many-core architectures, require software developers to identify and properly implement methods that both exploit...... concurrency and maintain numerical efficiency. Graphical Processing Units (GPUs) have proven to be very effective units for computing the solution of scientific problems described by partial differential equations (PDEs). GPUs have today become standard devices in portable, desktop, and supercomputers, which...... makes parallel software design applicable, but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDEs solvers based on flexible-order finite difference approximations on structured regular grids. The library...

  17. Modeling software with finite state machines: a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr
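
    A minimal illustration of the executable-specification idea the book develops: the behavior lives in a declarative transition table, and a generic engine executes it. The states and events below are invented for illustration, and the sketch is Python rather than the book's own notation:

      # Transition table: (state, event) -> next state. The table *is* the
      # specification; the engine below is generic and never changes.
      TRANSITIONS = {
          ("idle", "start"): "running",
          ("running", "pause"): "paused",
          ("paused", "start"): "running",
          ("running", "stop"): "idle",
          ("paused", "stop"): "idle",
      }

      def run(events, state="idle"):
          for event in events:
              state = TRANSITIONS.get((state, event), state)  # ignore illegal events
              print(f"{event:>6} -> {state}")
          return state

      assert run(["start", "pause", "start", "stop"]) == "idle"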

  18. Software Defects, Scientific Computation and the Scientific Method

    CERN Document Server

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation independent behaviour and supports the widely observed phenomenon that defects clust...

  19. Adaptable Assertion Checking for Scientific Software Components

    Energy Technology Data Exchange (ETDEWEB)

    Dahlgren, T L; Devanbu, P T

    2004-03-12

    We present a proposal for lowering the overhead of interface contract checking for science and engineering applications. Run-time enforcement of assertions is a well-known technique for improving the quality of software; however, the performance penalty is often too high for their retention during deployment, especially for long-running applications that depend upon iterative operations. With an efficient adaptive approach the benefits of run-time checking can continue to accrue with minimal overhead. Examples from scientific software interfaces being developed in the high performance computing research community will be used to measure the efficiency and effectiveness of this approach.
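
    The adaptive idea can be sketched in a few lines: enforce an interface contract only on a sampled fraction of calls, so long-running iterative applications keep most of the benefit at a fraction of the cost. The decorator below is a hypothetical Python analogue, not the project's actual enforcement machinery:

      import random

      def contract(precondition, sample_rate=0.1, seed=42):
          """Check precondition on roughly sample_rate of all calls."""
          rng = random.Random(seed)
          def wrap(fn):
              def checked(*args, **kwargs):
                  if rng.random() < sample_rate:  # sampled enforcement
                      assert precondition(*args, **kwargs), \
                          f"contract violated in {fn.__name__}"
                  return fn(*args, **kwargs)
              return checked
          return wrap

      @contract(lambda xs: all(x >= 0 for x in xs), sample_rate=0.2)
      def mean_sqrt(xs):
          return sum(x ** 0.5 for x in xs) / len(xs)

      for _ in range(1000):   # only about 200 of these calls pay for the check
          mean_sqrt([1.0, 4.0, 9.0])
      print("done")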

  20. Scientific Data Management Integrated Software Infrastructure Center

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, A.; Liao, W.K.

    2008-10-29

    This work provides software that enables scientific applications to more efficiently access available storage resources at different levels of interfaces. We developed scalable techniques and optimizations for the PVFS parallel file system, MPI-IO, and the parallel netCDF I/O library. These implementations were evaluated using production application I/O kernels as well as popular I/O benchmarks and demonstrated promising results. The software developed under this work has been made available to the public via the MCS and ANL web sites.
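
    For context, netCDF stores self-describing arrays, and the parallel library provides the same data model with MPI-IO underneath. A serial Python sketch of that data model, assuming the netCDF4 package is installed (file and variable names are illustrative):

      import numpy as np
      from netCDF4 import Dataset

      # Create a small self-describing dataset; parallel netCDF exposes the
      # same dimensions/variables model on top of MPI-IO.
      with Dataset("checkpoint.nc", "w") as ds:
          ds.createDimension("x", 128)
          ds.createDimension("time", None)          # unlimited record dimension
          temp = ds.createVariable("temperature", "f8", ("time", "x"))
          temp.units = "K"
          temp[0, :] = 300.0 + np.random.rand(128)  # write one record

      with Dataset("checkpoint.nc") as ds:
          print(ds["temperature"].shape, ds["temperature"].units)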

  1. Writing software or writing scientific articles?

    CERN Document Server

    Basaglia, Tullio; Dressendorfer, P V; Larkin, A; Pia, M G

    2008-01-01

    An analysis of publications related to high energy physics computing in refereed journals is presented. The distribution of papers associated with various fields of computing relevant to high energy physics is critically analyzed. The relative publication rate of software papers is evaluated in comparison to other closely related physics disciplines, such as nuclear physics, radiation protection and medical physics, and to hardware publications. The results hint that, in spite of the significant effort invested in high energy physics computing and its fundamental role in the experiments, this research area is underrepresented in scientific literature; nevertheless the analysis of citations highlights the significant impact of software publications in experimental research.

  2. BENCHMARKING MACHINE LEARNING TECHNIQUES FOR SOFTWARE DEFECT DETECTION

    Directory of Open Access Journals (Sweden)

    Saiqa Aleem

    2015-06-01

    Machine learning approaches are good at solving problems where limited information is available. In most cases, software domain problems can be characterized as learning processes that depend on various circumstances and change accordingly. A predictive model is constructed using machine learning approaches to classify modules as defective or non-defective. Machine learning techniques help developers retrieve useful information after classification and enable them to analyse data from different perspectives. Machine learning techniques have proven to be useful for software bug prediction. This study used publicly available data sets of software modules and provides a comparative performance analysis of different machine learning techniques for software bug prediction. Results showed that most of the machine learning methods performed well on software bug datasets.
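
    A hedged sketch of what such a benchmark looks like in practice, using scikit-learn with a synthetic stand-in for the public defect datasets the study used (feature counts and model choices are illustrative):

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.tree import DecisionTreeClassifier

      # Synthetic stand-in for a module-metrics dataset: rows are modules,
      # columns are static code metrics, labels are defective/non-defective.
      X, y = make_classification(n_samples=500, n_features=20,
                                 weights=[0.8], random_state=0)

      models = {
          "naive bayes": GaussianNB(),
          "logistic regression": LogisticRegression(max_iter=1000),
          "decision tree": DecisionTreeClassifier(random_state=0),
          "random forest": RandomForestClassifier(random_state=0),
      }
      for name, model in models.items():
          scores = cross_val_score(model, X, y, cv=5, scoring="f1")
          print(f"{name:20s} F1 = {scores.mean():.3f} +/- {scores.std():.3f}")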

  3. Performance Engineering Technology for Scientific Component Software

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years, we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress

  4. The need for scientific software engineering in the pharmaceutical industry.

    Science.gov (United States)

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  5. The need for scientific software engineering in the pharmaceutical industry

    Science.gov (United States)

    Luty, Brock; Rose, Peter W.

    2016-12-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  6. Center for Technology for Advanced Scientific Component Software (TASCS) Consolidated Progress Report July 2006 - March 2009

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, D E; McInnes, L C; Govindaraju, M; Bramley, R; Epperly, T; Kohl, J A; Nieplocha, J; Armstrong, R; Shasharina, S; Sussman, A L; Sottile, M; Damevski, K

    2009-04-14

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  7. Man-Machine Interface Design for Modeling and Simulation Software

    Directory of Open Access Journals (Sweden)

    Arnstein J. Borstad

    1986-07-01

    Computer aided design (CAD) systems, or more generally interactive software, are today being developed for various application areas like VLSI design, mechanical structure design, avionics design, cartographic design, architectural design, office automation, publishing, etc. Such tools are becoming more and more important in order to be productive and to be able to design quality products. One important part of CAD software development is man-machine interface (MMI) design.

  8. Integration of CMM software standards for nanopositioning and nanomeasuring machines

    Science.gov (United States)

    Sparrer, E.; Machleidt, T.; Hausotte, T.; Manske, E.; Franke, K.-H.

    2011-06-01

    The paper focuses on the utilization of nanopositioning and nanomeasuring machines as three-dimensional coordinate measuring machines by means of the internationally harmonized communication protocol Inspection plus plus for Dimensional Measurement Equipment (abbreviated I++DME). I++DME was designed in 1999 to enable the interoperability of different measuring hardware, like coordinate measuring machines, form testers, and camshaft or crankshaft measuring machines, with a priori unknown third-party controlling and analyzing software. Our recent work has focused on the implementation of a modular, standard-conformant command interpreter server for the Inspection plus plus protocol. This communication protocol enables the application of I++DME-compliant graphical controlling software, which is easy to operate and less error prone than the currently used textual programming via MathWorks MATLAB. The function and architecture of the I++DME command interpreter are discussed, and the principle of operation is demonstrated by controlling a nanopositioning and nanomeasuring machine with Hexagon Metrology's controlling and analyzing software QUINDOS 7 via the I++DME command interpreter server.
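
    The interpreter-server pattern described here is straightforward to sketch: a line-oriented TCP server parses incoming protocol commands and dispatches them to machine-control callbacks. The command names and reply format below are simplified placeholders, not the actual I++DME grammar:

      import socketserver

      def goto(x, y, z):
          return f"moved to {x},{y},{z}"   # would drive the machine axes

      # Hypothetical command set standing in for the real protocol grammar.
      COMMANDS = {
          "StartSession": lambda: "session started",
          "EndSession": lambda: "session ended",
          "GoTo": goto,
      }

      class InterpreterHandler(socketserver.StreamRequestHandler):
          def handle(self):
              for raw in self.rfile:                    # one command per line
                  line = raw.decode().strip()           # e.g. "GoTo(1.0,2.0,3.0)"
                  name, _, argtext = line.partition("(")
                  args = [float(a) for a in argtext.rstrip(")").split(",") if a]
                  handler = COMMANDS.get(name)
                  reply = handler(*args) if handler else f"error: unknown {name}"
                  self.wfile.write((reply + "\n").encode())

      if __name__ == "__main__":
          with socketserver.TCPServer(("localhost", 9999), InterpreterHandler) as srv:
              srv.serve_forever()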

  9. Acquiring Software Design Schemas: A Machine Learning Perspective

    Science.gov (United States)

    Harandi, Mehdi T.; Lee, Hing-Yan

    1991-01-01

    In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system is presented. The paper also addresses issues associated with generalizing common features, such as biases. The generalization process is illustrated using an example.

  10. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  11. Scientific Data Analysis and Software Support: Geodynamics

    Science.gov (United States)

    Klosko, Steven; Sanchez, B. (Technical Monitor)

    2000-01-01

    The support on this contract centers on development of data analysis strategies, geodynamic models, and software codes to study four-dimensional geodynamic and oceanographic processes, as well as studies and mission support for near-Earth and interplanetary satellite missions. SRE had a subcontract to maintain the optical laboratory for the LTP, where instruments such as MOLA and GLAS are developed. NVI performed work on a Raytheon laser altimetry task through a subcontract, providing data analysis and final data production for distribution to users. HBG had a subcontract for specialized digital topography analysis and map generation. Over the course of this contract, Raytheon ITSS staff have supported over 60 individual tasks. Some tasks have remained in place during this entire interval whereas others have been completed and were of shorter duration. Over the course of events, task numbers were changed to reflect changes in the character of the work or new funding sources. The description presented below will detail the technical accomplishments that have been achieved according to their science and technology areas. What will be shown is a brief overview of the progress that has been made in each of these investigative and software development areas. Raytheon ITSS staff members have received many awards for their work on this contract, including GSFC Group Achievement Awards for TOPEX Precision Orbit Determination and the Joint Gravity Model One Team. NASA JPL gave the TOPEX/POSEIDON team a medal commemorating the completion of the primary mission and a Certificate of Appreciation. Raytheon ITSS has also received a Certificate of Appreciation from GSFC for its extensive support of the Shuttle Laser Altimeter Experiment.

  12. Building Scientific Workflows for the Geosciences with Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability using this model.

  13. PC Cluster Machine Equipped with High-Speed Communication Software

    CERN Document Server

    Tanaka, M

    2004-01-01

    A high performance Beowulf (PC cluster) machine installed with the Linux operating system and MPI (Message Passing Interface) for interprocessor communications has been constructed using Gigabit Ethernet and the communication software GAMMA (Genoa Active Message Machine), instead of the standard TCP/IP protocol. Fast C/Fortran compilers have been exploited with the GAMMA communication libraries. This method has eliminated the large communication overhead of TCP/IP and resulted in a significant increase in the computational performance of real application programs, including the first-principle molecular dynamics simulation code. (Keywords: non TCP/IP, active messages, small latency, fast C/Fortran compilers, materials science, first-principle molecular dynamics)
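
    The communication overhead at issue is typically measured with a ping-pong microbenchmark. A sketch using mpi4py, the Python MPI bindings (run under mpirun with two ranks); this is a generic latency test analogous to those used to compare GAMMA with TCP/IP, not code from the paper:

      # Run with: mpirun -n 2 python pingpong.py
      import time
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      buf = np.zeros(1, dtype="d")   # small message: measures latency, not bandwidth
      reps = 10000

      comm.Barrier()
      t0 = time.perf_counter()
      for _ in range(reps):
          if rank == 0:
              comm.Send(buf, dest=1)
              comm.Recv(buf, source=1)
          else:
              comm.Recv(buf, source=0)
              comm.Send(buf, dest=0)
      elapsed = time.perf_counter() - t0

      if rank == 0:
          # One round trip is two messages; report one-way latency.
          print(f"one-way latency: {elapsed / (2 * reps) * 1e6:.2f} us")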

  14. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers....... Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which...... then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort....
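
    To see what is being synthesized, here is a toy Python interpreter for a hypothetical three-instruction legacy machine. The paper's point is that hand-writing such loops (and matching just-in-time back-ends) for every emulated/target platform pair is redundant, so they are generated from a single machine specification instead; this sketch is only the hand-written baseline being automated away:

      # Toy legacy instruction set: (opcode, operand) pairs.
      def interpret(program):
          stack = []
          for opcode, operand in program:
              if opcode == "PUSH":
                  stack.append(operand)
              elif opcode == "ADD":
                  stack.append(stack.pop() + stack.pop())
              elif opcode == "PRINT":
                  print(stack[-1])
              else:
                  raise ValueError(f"unknown opcode {opcode}")

      interpret([("PUSH", 2), ("PUSH", 40), ("ADD", None), ("PRINT", None)])  # prints 42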

  15. 76 FR 54800 - International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group, San...

    Science.gov (United States)

    2011-09-02

    ... Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... Application for Reconsideration for the workers and former workers of International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, California (subject firm)....

  16. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools...

  17. Component-based software for high-performance scientific computing

    Science.gov (United States)

    Alexeev, Yuri; Allan, Benjamin A.; Armstrong, Robert C.; Bernholdt, David E.; Dahlgren, Tamara L.; Gannon, Dennis; Janssen, Curtis L.; Kenny, Joseph P.; Krishnan, Manojkumar; Kohl, James A.; Kumfert, Gary; Curfman McInnes, Lois; Nieplocha, Jarek; Parker, Steven G.; Rasmussen, Craig; Windus, Theresa L.

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  18. Software Aging Prediction Based on Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Xiaozhi Du

    2013-11-01

    In the research on software aging and rejuvenation, one of the most important questions is when to trigger the rejuvenation action, and it is useful to predict the system resource utilization state efficiently in order to determine the rejuvenation time. In this paper, we propose a software aging prediction model based on the extreme learning machine (ELM) for a real VOD system. First, data on the parameters of system resources and the application server are collected. The data are then preprocessed by normalization and principal component analysis (PCA), and ELMs are constructed to model the extracted data series of systematic parameters. Finally, we obtain the predicted system resource data by computing the sum of the outputs of these ELMs. Experiments show that the proposed software aging prediction method based on wavelet transform and ELM is superior to artificial neural networks (ANN) and support vector machines (SVM) in terms of prediction precision and efficiency. Based on the models employed here, software rejuvenation policies can be triggered by actual measurements.
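
    An extreme learning machine is simple enough to state in full: the hidden-layer weights are random and fixed, and only the output weights are fitted, by least squares. A minimal NumPy sketch on a synthetic series (the VOD system's resource data are not reproduced here, and preprocessing such as PCA is omitted):

      import numpy as np

      rng = np.random.default_rng(0)

      def elm_fit(X, y, hidden=50):
          # Random, untrained hidden layer; only beta is learned.
          W = rng.normal(size=(X.shape[1], hidden))
          b = rng.normal(size=hidden)
          H = np.tanh(X @ W + b)
          beta, *_ = np.linalg.lstsq(H, y, rcond=None)
          return W, b, beta

      def elm_predict(X, W, b, beta):
          return np.tanh(X @ W + b) @ beta

      # Toy one-step-ahead prediction of a noisy "resource usage" series.
      t = np.linspace(0, 10, 400)
      series = np.sin(t) + 0.05 * rng.normal(size=t.size)
      X = np.column_stack([series[:-1], t[:-1]])  # lag-1 value and time
      y = series[1:]

      W, b, beta = elm_fit(X[:300], y[:300])
      pred = elm_predict(X[300:], W, b, beta)
      print("test RMSE:", np.sqrt(np.mean((pred - y[300:]) ** 2)))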

  1. Machine Learning and Software Quality Prediction: As an Expert System

    Directory of Open Access Journals (Sweden)

    Ekbal A. Rashid

    2014-04-01

    To improve software quality, errors must be removed from the software. This research paper presents a study of machine learning and software quality prediction as an expert system. The purpose of this paper is to apply machine learning approaches, such as case-based reasoning, to predict software quality. The main objective of this research is to minimize software costs: to predict errors in software modules correctly and use the results in future estimation. The novel ideas behind this system are, first, that knowledge base (KBS) building is an important task in CBR, and the knowledge base can be built based on the world's new problems along with the world's new solutions; second, reducing the maintenance cost by removing duplicate record sets from the KBS; and third, error prediction with the help of similarity functions. In this research four similarity functions have been used: Euclidean, Manhattan, Canberra, and Exponential. We feel that case-based models are particularly useful when it is difficult to define actual rules about a problem domain. For this purpose we have developed a case-based reasoning model and have validated it on student data. It was observed, after performing five experiments, that Euclidean and Exponential are both better for error calculation than Manhattan and Canberra. In order to obtain results we have used an indigenous tool. For finding the mean and standard deviation, SPSS version 16 has been used, and for generating graphs, MATLAB 7.10.0 has been used as an analyzing tool.
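
    The four similarity measures named above have standard definitions, sketched here in Python; the exponential form shown is one common way to turn a distance into a similarity, and the paper's exact parameterization may differ:

      import numpy as np

      def euclidean(x, y):
          return np.sqrt(np.sum((x - y) ** 2))

      def manhattan(x, y):
          return np.sum(np.abs(x - y))

      def canberra(x, y):
          # Each term is normalized by the magnitudes of the coordinates.
          return np.sum(np.abs(x - y) / (np.abs(x) + np.abs(y)))

      def exponential_similarity(x, y, gamma=1.0):
          return np.exp(-gamma * euclidean(x, y))  # maps distance into (0, 1]

      x, y = np.array([1.0, 2.0, 3.0]), np.array([2.0, 2.0, 1.0])
      for f in (euclidean, manhattan, canberra, exponential_similarity):
          print(f.__name__, f(x, y))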

  2. Cooperative Work and Sustainable Scientific Software Practices in R

    Science.gov (United States)

    Weber, N.

    2013-12-01

    Most scientific software projects are dependent on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors greatly differ: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.

  3. Voice recognition software can be used for scientific articles

    DEFF Research Database (Denmark)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob;

    2015-01-01

    INTRODUCTION: Dictation of scientific articles has been recognised as an efficient method for producing high-quality, first article drafts. However, a standardised transcription service by a secretary may not be available to all researchers, and voice recognition software (VRS) may therefore

  4. Smart dynamic software components enabling decision support in Machine-to-machine networks

    Directory of Open Access Journals (Sweden)

    Alexander Dannies

    2013-01-01

    The future Internet of Things will be extended by machine-to-machine communication technologies in order to include sensor information. The overwhelming amount of data will require autonomous decision-making processes that are executed directly at the location where data is generated or measured. An intelligent sensor system needs to be able to adapt to new parameters in its surroundings that were unknown at the time of deployment. In our paper we show that Java enables software updates on mobile devices, and that it is possible to run the algorithms required for decision-making processes on Java-based wireless sensor platforms.

  5. Spreading scientific philosophies with instruments: the case of Atwood's machine

    CERN Document Server

    Esposito, S

    2012-01-01

    We study how the paradigm of Newton's science, based on the organization of scientific knowledge as a series of mathematical laws, was definitively accepted in science courses - in the last decades of the XVIII century, in England as well as on the Continent - by means of the "universal" dynamical machine invented by George Atwood in the late 1770s just for this purpose. The spread of this machine, which occurred well before the appearance of Atwood's treatise describing the novel machine and the experiments to be performed with it, is a quite interesting historical case, which we consider in some detail. In particular, we focus on the "improvement" introduced by the Italian Giuseppe Saverio Poli and the subsequent "simplifications" of the machine, underlining the ongoing change of perspective after the definitive success of Newtonianism. The case studied here allows one to recognize the relevant role played by a properly devised instrument in the acceptance of a new paradigm by non-erudite scholars, in addition ...
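
    For reference, the relation the machine makes measurable: two masses m_1 > m_2 joined by a light string over a low-friction pulley fall with a uniform acceleration far smaller than g, so Newton's second law can be checked with ordinary timekeeping. In LaTeX notation (the standard ideal-machine result, with T the string tension; not quoted from the paper):

      a = \frac{(m_1 - m_2)\, g}{m_1 + m_2}, \qquad T = \frac{2\, m_1 m_2\, g}{m_1 + m_2}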

  6. Machine Code and Metaphysics: A Perspective on Software Engineering

    OpenAIRE

    2015-01-01

    A major, but too-little-considered problem for Software Engineering (SE) is a lack of consensus concerning Computer Science (CS) and how this relates to developing unpredictable computing technology. We consider some implications for SE of computer systems' differing scientific bases, exemplified by the International Standards Organisation's Open Systems Interconnection (ISO-OSI) layered architectural model. An architectural view allows comparison of computing technology components facilitatin...

  7. A Machine Learning based Efficient Software Reusability Prediction Model for Java Based Object Oriented Software

    Directory of Open Access Journals (Sweden)

    Surbhi Maggo

    2014-01-01

    Software reuse refers to the development of new software systems with the likelihood of completely or partially using existing components or resources, with or without modification. Reusability is the measure of the ease with which previously acquired concepts and objects can be used in new contexts. It is a promising strategy for improvements in software quality, productivity and maintainability, as it provides for cost-effective, reliable (on the consideration that prior testing and use have eliminated bugs) and accelerated (reduced time to market) development of software products. In this paper we present an efficient automation model for the identification and evaluation of reusable software components to measure the reusability levels (high, medium or low) of procedure-oriented, Java-based (object oriented) software systems. The presented model uses a metric framework for the functional analysis of the object oriented software components that targets essential attributes of reusability analysis, also taking the Maintainability Index into consideration to account for partial reuse. Further, the machine learning algorithm LMNN is explored to establish relationships between the functional attributes. The model works at the functional level rather than at the structural level. The system is implemented as a tool in Java, and the performance of the automation tool developed is recorded using criteria like precision, recall, accuracy and error rate. The results gathered indicate that the model can be effectively used as an efficient, accurate, fast and economic model for the identification of procedure-based reusable components from the existing inventory of software resources.

  8. Managing Scientific Software Complexity with Bocca and CCA

    Directory of Open Access Journals (Sweden)

    Benjamin A. Allan

    2008-01-01

    In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  10. Towards a Domain Specific Software Architecture for Scientific Data Distribution

    Science.gov (United States)

    Wilson, A.; Lindholm, D. M.

    2011-12-01

    A reference architecture is a "design that satisfies a clearly distinguished subset of the functional capabilities identified in the reference requirements within the boundaries of certain design and implementation constraints, also identified in reference requirements." [Tracz, 1995] Recognizing the value of a reference architecture, NASA ESDSWG's Standards Process Group (SPG) is introducing a multi-disciplinary science data systems (SDS) reference architecture in order to provide an implementation-neutral, template solution for an architecture to support scientific data systems in general [Burnett, et al, 2011]. This reference architecture describes common features and patterns in scientific data systems, and can thus provide guidelines for building and improving such systems. But guidelines alone may not be sufficient to actually build a system. A domain specific software architecture (DSSA) is "an assemblage of software components, specialized for a particular type of task (domain), generalized for effective use across that domain, composed in a standardized structure (topology) effective for building successful applications." [Tracz, 1995] It can be thought of as a relatively specific reference architecture. The "DSSA Process" is a software life cycle developed at Carnegie Mellon's Software Engineering Institute that is based on the development and use of domain-specific software architectures, components, and tools. The process has four distinct activities: 1) develop a domain specific base/model, 2) populate and maintain the library, 3) build applications, 4) operate and maintain applications [Armitage, 1993]. The DSSA process may provide the missing link between guidelines and actual system construction. In this presentation we focus specifically on the realm of scientific data access and distribution. Assuming the role of domain experts in building data access systems, we report the results of creating a DSSA for scientific data distribution. We describe …

  11. Continuous integration and quality control for scientific software

    Science.gov (United States)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well, and it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. One regular user is already the developer group of the DiFX software correlator project.
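
    The automation loop the abstract outlines could look roughly like the following sketch; cppcheck and doxygen are chosen here only as familiar examples of analysis and documentation tools, not necessarily what Wettzell runs, and the paths are invented:

        # Sketch of a nightly job: build everything, run static analysis,
        # regenerate documentation, and publish the logs as reports.
        import subprocess, pathlib

        WEB_ROOT = pathlib.Path("/var/www/reports")  # hypothetical report location

        def run(cmd, logfile):
            with open(WEB_ROOT / logfile, "w") as log:
                subprocess.run(cmd, stdout=log, stderr=subprocess.STDOUT, check=False)

        run(["make", "-C", "/srv/code"], "build.log")                 # central Makefile
        run(["cppcheck", "--enable=all", "/srv/code"], "static.log")  # static analysis
        run(["doxygen", "/srv/code/Doxyfile"], "docs.log")            # docs from comments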

  12. Execution time support for scientific programs on distributed memory machines

    Science.gov (United States)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consist of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives the appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.
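
    A toy, single-process illustration of the inspector/executor idea behind such primitives; the two-"process" data layout and function names are invented for the example:

        import numpy as np

        # Global array of 8 elements, block-distributed over 2 "processes".
        local = {0: np.array([10., 11., 12., 13.]),   # owns global 0..3
                 1: np.array([20., 21., 22., 23.])}   # owns global 4..7

        def inspector(global_idx, nlocal=4):
            # Derive the communication pattern at runtime: who owns each index?
            return [(g // nlocal, g % nlocal) for g in global_idx]

        def gather(global_idx):
            # Executor: fetch each referenced element from its owner.
            return np.array([local[p][i] for p, i in inspector(global_idx)])

        print(gather([1, 6, 3]))   # -> [11. 22. 13.]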

  13. Voice recognition software can be used for scientific articles

    DEFF Research Database (Denmark)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob

    2015-01-01

    INTRODUCTION: Dictation of scientific articles has been recognised as an efficient method for producing high-quality, first article drafts. However, standardised transcription service by a secretary may not be available for all researchers and voice recognition software (VRS) may therefore...... be an alternative. The purpose of this study was to evaluate the out-of-the-box accuracy of VRS. METHODS: Eleven young researchers without dictation experience dictated the first draft of their own scientific article after thorough preparation according to a pre-defined schedule. The dictate transcribed by VRS...

  14. Software architecture for time-constrained machine vision applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty of reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. Topic-based filtering, in which messages are published to topics, is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
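
    The topic-based publish/subscribe routing in the messaging layer can be pictured in a few lines of Python (a schematic of the pattern, not the authors' implementation; the topic names are invented):

        from collections import defaultdict

        class Broker:
            def __init__(self):
                self.subscribers = defaultdict(list)   # topic -> callbacks

            def subscribe(self, topic, callback):
                self.subscribers[topic].append(callback)

            def publish(self, topic, message):
                # Only subscribers interested in this topic receive the message.
                for cb in self.subscribers[topic]:
                    cb(message)

        broker = Broker()
        broker.subscribe("frame.acquired", lambda m: print("processing", m))
        broker.publish("frame.acquired", "frame-0001")   # routed by topic
        broker.publish("jam.detected", "line-3")         # no subscriber: dropped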

  15. A software state machine for computing astronomical coordinates

    Science.gov (United States)

    Percival, J. W.

    We consider the common problem of computing apparent and topocentric places of stars for the purpose of pointing a telescope. Detailed algorithmic descriptions exist (see, for example, Kaplan et al. AJ 1989, 97, 1197). In addition, several software packages such as NOVAS (Kaplan, 1990) and Starlink's SLALIB by Patrick Wallace considerably ease the burden of building specific application programs. A few problems remain, however. Portability can be a problem, in that some real-time platforms have grudging or non-existent support for Fortran, which is the language of implementation for NOVAS and SLALIB. Also, efficiency can be a problem if the subroutines try to do too much, not allowing the programmer to fragment the calculation as needed. SLALIB offers many convenient entry points, which avoids this problem, but the programmer is still left to weave the subroutines together to achieve a desired result. We have designed a portable software state machine, written in C, for use in the WIYN Telescope Control System. The state machine is in the form of a graph, with the nodes representing coordinate states (heliocentric FK4, topocentric apparent, or galactic, for example) and the edges representing the calculations required to move between states. The programmer provides a starting state and coordinate state vector, and a desired ending state. Using the current state and desired end state, the machine marches through the graph, performing the transitions in the proper order. This approach has several advantages. First, not only are the calculations well-defined, as they are in existing subroutine libraries, but their order of execution is embedded in the machine, rather than merely specified in documentation, removing a source of programming error. Second, each transition can be implemented exactly once, in exactly one place, while the state machine dynamically changes the order of events according to the state transition table. Third, transitions can be implemented in …
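
    The approach translates readily into other languages; a compact Python sketch, with invented state names and placeholder transformation functions, shows how the machine derives and applies the correct sequence of transitions:

        from collections import deque

        # Edges of the state graph: (from_state, to_state) -> transformation.
        # The lambdas stand in for real precession, aberration, etc. routines.
        edges = {("FK5", "apparent"):  lambda v: v,   # placeholder
                 ("apparent", "topo"): lambda v: v,
                 ("FK5", "galactic"):  lambda v: v}

        def path(start, goal):
            # Breadth-first search over the state graph; None if unreachable.
            queue, seen = deque([(start, [])]), {start}
            while queue:
                state, steps = queue.popleft()
                if state == goal:
                    return steps
                for (a, b), f in edges.items():
                    if a == state and b not in seen:
                        seen.add(b)
                        queue.append((b, steps + [f]))

        vec = (1.0, 0.0, 0.0)
        for transform in path("FK5", "topo"):   # order of execution is derived,
            vec = transform(vec)                # not hand-woven by the programmer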

  16. Software Redundancy for Machine Interlock System of KOMAC

    Energy Technology Data Exchange (ETDEWEB)

    Song, Younggi; Seol, Kyungtae; Kwon, Hyeokjung; Cho, Yongsub [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    The Korea multi-purpose accelerator complex (KOMAC) consists of low-energy components, including a 50-keV ion source, a low-energy beam transport (LEBT), a 3-MeV radio-frequency quadrupole (RFQ), and a 20-MeV drift tube linac (DTL), as well as high-energy components, including seven DTL tanks for the 100-MeV proton beam. The KOMAC includes 10 beam lines, 5 for 20-MeV beams and 5 for 100-MeV beams. Radiation from beam loss and faults of the linac components can cause substantial damage to the devices. Therefore, the KOMAC active protection system needs to minimize the beam loss radiation and ensure the safe operation of the machine. The purpose of an interlock system is to turn off the beam and components when an interlock occurs. The software-based interlock system was designed to double-check MPS operation and to support sequential operation driven by interlock signals. The interlock system combines hardware and software interlocks with redundancy, to protect the sensitive devices from the radiation caused by beam loss and from faults in the equipment. The local MPS for the main interlock has been fabricated, and its response time was within 3 μs. This response time satisfies the machine protection requirement, which is to inhibit the beam within a few milliseconds during 60 Hz beam operation. The interlock systems can inhibit the beam whenever one of the control systems detects an error from the local devices. A beam can be accelerated under machine and personnel protection conditions.
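
    In outline, the software layer independently re-evaluates the same fault inputs that trip the hardware MPS; a simplified sketch with invented signal names, not KOMAC's actual interlock logic:

        # Simplified redundancy idea: the software interlock double-checks
        # fault inputs and latches a beam inhibit if any are asserted.
        FAULT_INPUTS = {"rfq_arc": False, "beam_loss": True, "vacuum": False}

        def inhibit_beam(reasons):
            # In the real system this would command the ion source/LEBT off.
            print("beam inhibited:", ", ".join(reasons))

        def software_interlock(inputs):
            tripped = [name for name, fault in inputs.items() if fault]
            if tripped:
                inhibit_beam(tripped)

        software_interlock(FAULT_INPUTS)   # -> beam inhibited: beam_loss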

  17. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way of enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to data-mine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing …

  18. eSciMart: Web Platform for Scientific Software Marketplace

    Science.gov (United States)

    Kryukov, A. P.; Demichev, A. P.

    2016-10-01

    In this paper we suggest a design for a web marketplace where users of scientific application software and databases, presented in the form of web services, and their providers are both present. The model that will be the basis for the web marketplace is close to the customer-to-customer (C2C) model, which has been successfully used, for example, on auction sites such as eBay (ebay.com). Unlike the classical C2C model, the suggested marketplace focuses on application software in the form of web services and on standardization of the API through which application software will be integrated into the web marketplace. A prototype of such a platform, entitled eSciMart, is currently being developed at SINP MSU.

  19. Prolog-based prototyping software for machine vision

    Science.gov (United States)

    Batchelor, Bruce G.; Hack, Ralf; Jones, Andrew C.

    1996-10-01

    Prolog image processing (PIP) is a multi-media prototyping tool intended to assist designers of intelligent industrial machine vision systems. This is the latest in a series of Prolog-based systems that have been implemented at Cardiff specifically for this purpose. The software package provides fully integrated facilities for both interactive and programmed image processing, 'smart' documentation, guidance about which lighting/viewing set-up to use, speech/natural language input and speech output. It can also be used to control a range of electro-mechanical devices, such as lamps, cameras, lenses, pneumatic positioning mechanisms, robots, etc., via a low-cost hardware interfacing module. The software runs on a standard computer and, unlike its predecessors, carries out the image processing entirely in software. This article concentrates on the design and implementation of the PIP system, and presents programs for two demonstration applications: (a) recognizing a non-picture playing card; (b) recognizing a well-laid table place setting.

  20. Software Development for Digital Control of WDW Series Testing Machine and Measurement of KIC

    Institute of Scientific and Technical Information of China (English)

    黄兴; 马杭; 程昌钧

    2005-01-01

    Software has been developed for digital control of the WDW series testing machine and for the measurement of fracture toughness, using a modularized design. The software makes use of multi-threading and serial communication techniques, which allow it to accurately control the testing machine and measure the fracture toughness in real time. Three-point bending specimens were used in the measurement. The software operates stably and reliably, expanding the functions of the WDW series testing machine.
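
    Serial communication with a test-frame controller of this kind typically reduces to simple command/response exchanges; a hedged sketch using the pyserial package, with an invented port name and an invented command protocol:

        import serial  # pyserial

        # Hypothetical link to the testing machine's controller.
        ser = serial.Serial("COM3", baudrate=9600, timeout=1.0)

        ser.write(b"LOAD?\r\n")          # invented query for the load cell value
        reply = ser.readline().decode().strip()
        print("load reading:", reply)
        ser.close()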

  1. A Framework for Testing Scientific Software: A Case Study of Testing Amsterdam Discrete Dipole Approximation Software

    Science.gov (United States)

    Shao, Hongbing

    Software testing of scientific software systems often suffers from the test oracle problem, i.e., a lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types; testing of ADDA suffers from the test oracle problem. In this thesis work, I established a framework for testing scientific software systems and evaluated the framework using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo oracle to test ADDA in simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code, validating ADDA for use with homogeneous sphere scatterers. Then I used an experimental result obtained for light scattering by a homogeneous sphere to further validate the use of ADDA with sphere scatterers. ADDA produced a light scattering simulation comparable to the experimentally measured result. Then I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and homogeneity or non-homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo oracles, experimental results and metamorphic testing techniques, to test scientific software systems that suffer from the test oracle problem. Each of these techniques is necessary and contributes to the testing of the software under test.
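
    As a generic illustration of the metamorphic step (the simulate function and its signature are invented stand-ins, not ADDA's interface): when no oracle gives the "true" output, a known relation between inputs and outputs becomes the test itself.

        import math

        def simulate(radius, orientation_deg):
            # Stand-in for the real solver: a sphere's cross-section must not
            # depend on orientation, which is exactly what the test exploits.
            return math.pi * radius ** 2

        def test_rotation_invariance():
            # Metamorphic relation: for a sphere, rotating the input must
            # leave the output unchanged, even though the exact "correct"
            # value is unknown to the test.
            base = simulate(1.0, 0.0)
            for angle in (30.0, 90.0, 215.0):
                assert math.isclose(simulate(1.0, angle), base)

        test_rotation_invariance()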

  2. Voice recognition software can be used for scientific articles

    DEFF Research Database (Denmark)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob

    2015-01-01

    be an alternative. The purpose of this study was to evaluate the out-of-the-box accuracy of VRS. METHODS: Eleven young researchers without dictation experience dictated the first draft of their own scientific article after thorough preparation according to a pre-defined schedule. The dictate transcribed by VRS...... with a median score of five (range: 3-9), which was improved with the addition of 5,000 words. CONCLUSION: The out-of-the-box performance of VRS was acceptable and improved after additional words were added. Further studies are needed to investigate the effect of additional software accuracy training....

  3. Voice recognition software can be used for scientific articles

    DEFF Research Database (Denmark)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob;

    2015-01-01

    INTRODUCTION: Dictation of scientific articles has been recognised as an efficient method for producing high-quality, first article drafts. However, standardised transcription service by a secretary may not be available for all researchers and voice recognition software (VRS) may therefore...... be an alternative. The purpose of this study was to evaluate the out-of-the-box accuracy of VRS. METHODS: Eleven young researchers without dictation experience dictated the first draft of their own scientific article after thorough preparation according to a pre-defined schedule. The dictate transcribed by VRS...... was compared with the same dictate transcribed by an experienced research secretary, and the effect of adding words to the vocabulary of the VRS was investigated. The number of errors per hundred words was used as outcome. Furthermore, three experienced researchers assessed the subjective readability using...

  4. Voice recognition software can be used for scientific articles.

    Science.gov (United States)

    Pommergaard, Hans-Christian; Huang, Chenxi; Burcharth, Jacob; Rosenberg, Jacob

    2015-02-01

    Dictation of scientific articles has been recognised as an efficient method for producing high-quality first article drafts. However, a standardised transcription service by a secretary may not be available to all researchers, and voice recognition software (VRS) may therefore be an alternative. The purpose of this study was to evaluate the out-of-the-box accuracy of VRS. Eleven young researchers without dictation experience dictated the first draft of their own scientific article after thorough preparation according to a pre-defined schedule. The dictation transcribed by VRS was compared with the same dictation transcribed by an experienced research secretary, and the effect of adding words to the vocabulary of the VRS was investigated. The number of errors per hundred words was used as the outcome. Furthermore, three experienced researchers assessed the subjective readability using a Likert scale (0-10). Dragon Nuance Premium version 12.5 was used as the VRS. The median number of errors per hundred words was 18 (range: 8.5-24.3), which improved when 15,000 words were added to the vocabulary. Subjective readability assessment showed that the texts were understandable, with a median score of five (range: 3-9), which improved with the addition of 5,000 words. The out-of-the-box performance of VRS was acceptable and improved after additional words were added. Further studies are needed to investigate the effect of additional software accuracy training.
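
    The outcome metric (errors per hundred words) can be made concrete with a short sketch using Python's difflib; the sample strings are invented, and real studies would use a more careful alignment:

        import difflib

        def errors_per_hundred_words(reference, transcript):
            ref, hyp = reference.split(), transcript.split()
            matcher = difflib.SequenceMatcher(a=ref, b=hyp)
            # Count words involved in any non-matching opcode as errors.
            errors = sum(max(i2 - i1, j2 - j1)
                         for op, i1, i2, j1, j2 in matcher.get_opcodes()
                         if op != "equal")
            return 100.0 * errors / len(ref)

        print(errors_per_hundred_words("the quick brown fox jumps",
                                       "the quick crown fox"))   # -> 40.0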

  5. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    Directory of Open Access Journals (Sweden)

    Alan D. Degenhart

    2011-01-01

    This paper presents “Craniux,” an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.

  6. Craniux: a LabVIEW-based modular software framework for brain-machine interface research.

    Science.gov (United States)

    Degenhart, Alan D; Kelly, John W; Ashmore, Robin C; Collinger, Jennifer L; Tyler-Kabara, Elizabeth C; Weber, Douglas J; Wang, Wei

    2011-01-01

    This paper presents "Craniux," an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.

  7. SOFTWARE-CONTROLLED SYSTEM OF ULTRA-PRECISION MACHINING AXISYMMETRIC ASPHERIC MIRROR

    Institute of Scientific and Technical Information of China (English)

    GUO Yinbiao; WEI Lizhen

    2006-01-01

    In order to improve machining accuracy and efficiency, a software-controlled system for the ultra-precision machining of axisymmetric aspheric mirrors, using techniques of error compensation, remote transmission and modularization, is designed on the basis of an industrial PC, the Windows 2000 platform and Visual Basic 6.0. Experiments show that this system realizes the functions of ultra-precision machining, machining error compensation, remote data transmission and automatic data transformation among first machining, compensation machining and accuracy measurement. Practical application shows that error compensation improves machining accuracy and remote transmission improves machining efficiency, while modularization avoids repeated work and improves design efficiency. The system has therefore met the ultra-precision machining needs of aspheric mirrors.

  8. Practical guide to machine vision software an introduction with LabVIEW

    CERN Document Server

    Kwon, Kye-Si

    2014-01-01

    For both students and engineers in R&D, this book explains machine vision in a concise, hands-on way, using the Vision Development Module of the LabVIEW software by National Instruments. Following a short introduction to the basics of machine vision and the technical procedures of image acquisition, the book goes on to guide readers in the use of the various software functions of LabVIEW's machine vision module. It covers typical machine vision tasks, including particle analysis, edge detection, pattern and shape matching, dimension measurements as well as optical character recognition, enabling …

  9. Final Report for "Center for Technology for Advanced Scientific Component Software"

    Energy Technology Data Exchange (ETDEWEB)

    Svetlana Shasharina

    2010-12-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used, by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into those applications, testing the tools in the applications, and modifying the tools to be more usable.

  10. Software design of the hybrid robot machine for ITER vacuum vessel assembly and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ming, E-mail: Ming.Li@lut.fi [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Wu, Huapeng; Handroos, Heikki [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Yang, Guangyou [School of Mechanical Engineering, Hubei University of Technology, Wuhan (China)

    2013-10-15

    A specific software design is elaborated in this paper for the hybrid robot machine used for ITER vacuum vessel (VV) assembly and maintenance. In order to provide multiple machining functions and the complicated, flexible and customizable GUI design demanded by the non-standardized VV assembly process on the one hand, and to guarantee stringent machining precision in the real-time motion control of the robot machine on the other, a client-server-control software architecture is proposed, which separates the user interaction, data communication and robot control implementation into different software layers. Correspondingly, three particular application protocols on top of TCP/IP are designed to transmit the data, commands and status between the client and the server, so as to deal with the abundant data streaming in the software. In order not to be affected by future modifications of the graphical user interface (GUI) during experiments in the VV assembly working field, the real-time control system is realized as a stand-alone module in the architecture, guaranteeing the control performance of the robot machine. After completion of the software development, a milling operation was tested on the robot machine, and the result demonstrates that both the specific GUI operability and the real-time motion control performance are adequately guaranteed by the software design.
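
    The client-server split over TCP/IP can be sketched with standard sockets; the one-line command protocol below is invented, standing in for the three application protocols the paper designs:

        import socket, threading, time

        def server():
            srv = socket.create_server(("127.0.0.1", 5000))
            conn, _ = srv.accept()
            cmd = conn.recv(1024).decode()          # e.g. "MOVE X 1.5"
            conn.sendall(b"STATUS OK\n")            # status back to the GUI client
            conn.close(); srv.close()

        threading.Thread(target=server, daemon=True).start()
        time.sleep(0.2)                             # let the server start listening
        cli = socket.create_connection(("127.0.0.1", 5000))
        cli.sendall(b"MOVE X 1.5\n")                # command channel
        print(cli.recv(1024).decode().strip())      # -> STATUS OK
        cli.close()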

  11. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  12. Final Scientific/Technical Report for "Enabling Exascale Hardware and Software Design through Scalable System Virtualization"

    Energy Technology Data Exchange (ETDEWEB)

    Dinda, Peter August [Northwestern Univ., Evanston, IL (United States)

    2015-03-17

    This report describes the activities, findings, and products of the Northwestern University component of the "Enabling Exascale Hardware and Software Design through Scalable System Virtualization" project. The purpose of this project has been to extend the state of the art of systems software for high-end computing (HEC) platforms, and to use systems software to better enable the evaluation of potential future HEC platforms, for example exascale platforms. Such platforms, and their systems software, have the goal of providing scientific computation at new scales, thus enabling new research in the physical sciences and engineering. Over time, the innovations in systems software for such platforms also become applicable to more widely used computing clusters, data centers, and clouds. This was a five-institution project, centered on the Palacios virtual machine monitor (VMM) systems software, a project begun at Northwestern and originally developed in a previous collaboration between Northwestern University and the University of New Mexico. In this project, Northwestern (including via our subcontract to the University of Pittsburgh) contributed to the continued development of Palacios, along with other team members. We took the leadership role in (1) continued extension of support for emerging Intel and AMD hardware, (2) integration and performance enhancement of overlay networking, (3) connectivity with architectural simulation, (4) binary translation, and (5) support for modern Non-Uniform Memory Access (NUMA) hosts and guests. We also took a supporting role in support for specialized hardware for I/O virtualization, profiling, configurability, and integration with configuration tools. The efforts we led (1-5) were largely successful and executed as expected, with code and papers resulting from them. The project demonstrated the feasibility of a virtualization layer for HEC computing, similar to such layers for cloud or datacenter computing. For effort (3) …

  13. Time-triggered State-machine Reliable Software Architecture for Micro Turbine Engine Control

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qi; XU Guoqiang; DING Shuiting

    2012-01-01

    The time-triggered (TT) embedded software pattern is well accepted in the aerospace industry for its high reliability. The finite-state-machine (FSM) design method is widely used for its high efficiency and predictable behavior. In this paper, a combined time-triggered and state-machine software architecture is implemented for a 25 kg thrust micro turbine engine (MTE) used in an unmanned aerial vehicle (UAV) system; a model-based-design development workflow for the airworthiness software directive DO-178B is also utilized. Experimental results show that the time-triggered state-machine software architecture and development method can shorten system development time, reduce system test cost and make the turbine engine comply more easily with airworthiness rules.
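
    The combination of a time-triggered executive with a finite state machine can be pictured in a few lines of Python; the states, the transition table and the 10 ms tick are invented for illustration and do not reproduce the authors' DO-178B workflow:

        import time

        def step(s):
            # State transition table for an invented start-up sequence.
            table = {"IDLE": "IGNITION", "IGNITION": "RUN", "RUN": "RUN"}
            return table[s]

        TICK = 0.01                      # 10 ms major cycle, fixed at design time
        state = "IDLE"
        for _ in range(5):               # tasks run only on the timer tick
            start = time.monotonic()
            state = step(state)          # FSM task: predictable, bounded work
            time.sleep(max(0.0, TICK - (time.monotonic() - start)))
        print(state)                     # -> RUN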

  14. Using Machine Learning for Risky Module Estimation of Safety-Critical Software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Mi; Jeong, Choong Heui [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2009-05-15

    With the rapid development of digital computer and information processing technologies, nuclear I&C (instrumentation and control) systems, which need safety-critical functions, have adopted digital technologies. Software used in safety-critical systems must have high dependability. Highly dependable software needs strict software testing and V&V activities. Regulatory demands for nuclear power plants are steadily increasing, but human resources and time for regulation are limited. Early prediction of risky software modules is therefore very useful for software testing and regulation activities. An early estimate can be built from a collection of internal metrics during the early development phase. Internal metrics are measures of a product derived from assessment of the product itself, while external metrics are measures derived from assessment of the behavior of the system. Internal metrics can be collected more easily and earlier than external metrics. In addition, internal metrics can be useful for estimating fault-prone software modules using machine learning. In this paper, we introduce the current research status and techniques related to estimating risky software modules using machine learning. Section 2 describes the overview of the estimation model using machine learning, and section 3 describes the processes of the estimation model. Section 4 describes several estimation models using machine learning. Section 5 concludes the paper.
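
    In outline, such an estimator learns a mapping from early internal metrics to fault-proneness; a sketch with invented metric rows, using scikit-learn's logistic regression as one possible learner:

        from sklearn.linear_model import LogisticRegression

        # Each row: [lines_of_code, cyclomatic_complexity, fan_out], internal
        # metrics collectable early in development. Labels: 1 = fault-prone.
        X = [[120, 4, 2], [900, 35, 14], [300, 10, 5], [1500, 60, 22]]
        y = [0, 1, 0, 1]

        model = LogisticRegression().fit(X, y)
        print(model.predict([[800, 30, 12]]))   # -> [1]: flag for extra V&V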

  15. DUNE as an Example of Sustainable Open Source Scientific Software Development

    OpenAIRE

    Blatt, Markus

    2013-01-01

    In this paper we describe how DUNE, an open source scientific software framework, is developed. Providing a sustainable software framework for the solution of partial differential equations is the main driver of DUNE's development. We take a look at how DUNE strives to remain sustainable software.

  16. An Evaluation of Output Quality of Machine Translation (Padideh Software vs. Google Translate

    Directory of Open Access Journals (Sweden)

    Haniyeh Sadeghi Azer

    2015-08-01

    This study aims to evaluate the translation quality of two machine translation systems in translating six different text-types from English to Persian. The evaluation was based on criteria proposed by Van Slype (1979). The proposed model for evaluation is a black-box type, comparative and adequacy-oriented evaluation. To conduct the evaluation, a questionnaire was given to end-users to examine and determine whether the machine-generated translations are intelligible and acceptable from their point of view, and which of the machine-generated translations produced by Padideh software and Google Translate is more acceptable and useful. The findings indicate that the machine-generated translations are intelligible and acceptable for certain text-types, and that Google Translate is the more acceptable from the end-users' point of view. Keywords: Machine Translation, Machine Translation Evaluation, Translation Quality

  17. Rapid development of scalable scientific software using a process oriented approach

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2011-01-01

    Scientific applications are often not written with multiprocessing, cluster computing or grid computing in mind. This paper suggests using Python and PyCSP to structure scientific software through Communicating Sequential Processes. Three scientific applications are used to demonstrate the features …
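
    The structuring idea can be conveyed without PyCSP itself, using channels and processes built from the standard library; an illustrative sketch of the CSP style, not PyCSP's actual API:

        import threading, queue

        def producer(cout):
            for i in range(3):
                cout.put(i * i)          # communicate results over the channel
            cout.put(None)               # poison pill ends the network

        def consumer(cin):
            while (v := cin.get()) is not None:
                print("got", v)

        chan = queue.Queue()             # stands in for a CSP channel
        procs = [threading.Thread(target=producer, args=(chan,)),
                 threading.Thread(target=consumer, args=(chan,))]
        for p in procs: p.start()
        for p in procs: p.join()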

  18. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Sussman, Alan [University of Maryland

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  19. The Use of Open Source Software for Open Architecture System on CNC Milling Machine

    Directory of Open Access Journals (Sweden)

    Dalmasius Ganjar Subagio

    2012-03-01

    A computer numerical control (CNC) milling machine system cannot be separated from its software, which is required to meet the provisions of Open Architecture capability: portability, extendability, interoperability, and scalability. When the prescribed life of a CNC milling machine has passed and the manufacturer has decided to discontinue it, the user will have problems maintaining the performance of the machine. This paper aims to show that using open source software (OSS) is the way to maintain engine performance. With the use of OSS, users no longer depend on the software built by the manufacturer, because OSS is open and can be developed independently. In this paper, USBCNC V.3.42 is used as an alternative OSS. The test results show that the workpiece matches the desired pattern, and that the performance of machines using OSS is similar to the performance of the machine using software from the manufacturer.

  20. A Machine Learning based Efficient Software Reusability Prediction Model for Java Based Object Oriented Software

    OpenAIRE

    Surbhi Maggo; Chetna Gupta

    2014-01-01

    Software reuse refers to the development of new software systems with the likelihood of completely or partially using existing components or resources with or without modification. Reusability is the measure of the ease with which previously acquired concepts and objects can be used in new contexts. It is a promising strategy for improvements in software quality, productivity and maintainability as it provides for cost effective, reliable (with the consideration that prior testing and use has...

  1. Multimedia extensions to prototyping software for machine vision

    Science.gov (United States)

    Batchelor, Bruce G.; Griffiths, Eric C.; Hack, Ralf; Jones, Andrew C.

    1996-10-01

    PIP (Prolog image processing) is a prototyping tool intended to assist designers of intelligent industrial machine vision systems. This article concentrates on the multi-media extensions to PIP, including: 1) on-line HELP, which allows the user to satisfy PIP goals from within the HELP facility, 2) the lighting advisor, which gives advice to a vision engineer about which lighting/viewing arrangement is appropriate in a given situation, 3) device control, for operating a robot work cell, 4) speech input and (simple) natural language understanding, 5) speech synthesis, and 6) remote operation of PIP via a local area network. At the time of writing, on-line access to PIP via the Internet is being developed.

  2. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.

  3. From scientific instrument to industrial machine : Coping with architectural stress in embedded systems

    NARCIS (Netherlands)

    Doornbos, R.; Loo, S. van

    2012-01-01

    Architectural stress is the inability of a system design to respond to new market demands. It is an important yet often concealed issue in high tech systems. In From scientific instrument to industrial machine, we look at the phenomenon of architectural stress in embedded systems in the context of a transmission electron microscope system built by FEI Company.

  4. The Dostoevsky Machine in Georgetown: scientific translation in the Cold War.

    Science.gov (United States)

    Gordin, Michael D

    2016-04-01

    Machine Translation (MT) is now ubiquitous in discussions of translation. The roots of this phenomenon - first publicly unveiled in the so-called 'Georgetown-IBM Experiment' on 9 January 1954 - not only displayed the technological utopianism still associated with dreams of a universal computer translator, but were deeply enmeshed in the political pressures of the Cold War and a dominating conception of scientific writing as both the goal of machine translation and its method. Machine translation was created, in part, as a solution to a perceived crisis sparked by the massive expansion of Soviet science. Scientific prose was also perceived as linguistically simpler, and so served as the model for how to turn a language into a series of algorithms. This paper follows the rise of the Georgetown program - the largest single program in the world - from 1954 to the (as it turns out, temporary) collapse of MT in 1964.

  5. From scientific instrument to industrial machine coping with architectural stress in embedded systems

    CERN Document Server

    Doornbos, Richard

    2012-01-01

    Architectural stress is the inability of a system design to respond to new market demands. It is an important yet often concealed issue in high tech systems. In From scientific instrument to industrial machine, we look at the phenomenon of architectural stress in embedded systems in the context of a transmission electron microscope system built by FEI Company. Traditionally, transmission electron microscopes are manually operated scientific instruments, but they also have enormous potential for use in industrial applications. However, this new market has quite different characteristics. There are strong demands for cost-effective analysis, accurate and precise measurements, and ease-of-use. These demands can be translated into new system qualities, e.g. reliability, predictability and high throughput, as well as new functions, e.g. automation of electron microscopic analyses, automated focusing and positioning functions. From scientific instrument to industrial machine takes a pragmatic approach to the problem …

  6. NetiNeti: discovery of scientific names from text using machine learning methods

    Directory of Open Access Journals (Sweden)

    Akella Lakshmi

    2012-08-01

    Background: A scientific name for an organism can be associated with almost all biological data. Name identification is an important step in many text mining tasks aiming to extract useful information from biological, biomedical and biodiversity text sources. A scientific name acts as an important metadata element to link biological information. Results: We present NetiNeti (Name Extraction from Textual Information-Name Extraction for Taxonomic Indexing), a machine learning based approach for recognition of scientific names, including the discovery of new species names, from text; it also handles misspellings, OCR errors and other variations in names. The system generates candidate names using rules for scientific names and applies probabilistic machine learning methods to classify names based on structural features of candidate names and features derived from their contexts. NetiNeti can also disambiguate scientific names from other names using the contextual information. We evaluated NetiNeti on legacy biodiversity texts and biomedical literature (MEDLINE). NetiNeti performs better (precision = 98.9%, recall = 70.5%) than a popular dictionary based approach (precision = 97.5%, recall = 54.3%) on a 600-page biodiversity book that was manually marked by an annotator. On a small set of PubMed Central’s full text articles annotated with scientific names, the precision and recall values are 98.5% and 96.2% respectively. NetiNeti found more than 190,000 unique binomial and trinomial names in more than 1,880,000 PubMed records when used on the full MEDLINE database. NetiNeti also successfully identifies almost all of the new species names mentioned within web pages. Conclusions: We present NetiNeti, a machine learning based approach for identification and discovery of scientific names. The system implementing the approach can be accessed at http://namefinding.ubio.org.
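
    As an illustration of the rule-based candidate generation stage (a trained classifier then keeps or discards candidates using contextual features), a crude sketch; the pattern below is a simplification for the example, not NetiNeti's actual rules:

        import re

        # Crude binomial pattern: capitalized genus + lowercase species epithet.
        BINOMIAL = re.compile(r"\b([A-Z][a-z]+)\s([a-z]{3,})\b")

        text = ("Specimens of Homo sapiens and Escherichia coli were compared; "
                "The results were inconclusive.")
        candidates = BINOMIAL.findall(text)
        # Includes false positives such as ('The', 'results'), which is why a
        # probabilistic classifier over structure and context is needed.
        print(candidates)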

  7. Learning from open source software projects to improve scientific review

    Directory of Open Access Journals (Sweden)

    Satrajit S Ghosh

    2012-04-01

    Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with many problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are not acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time, and once submitted cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on portions of the article that reflect the reviewer’s specialty or area of interest and place less of a burden on any one reviewer, enabling a more comprehensive and timely review. Providing reviewers with materials and methods to perform comprehensive evaluation would facilitate transparency, replication of results and enable greater scrutiny by people from different fields using different nomenclature, leading to greater clarity and cross-fertilization of ideas. Acknowledging reviewers makes it possible to quantitatively assess reviewer contributions, which could be integrated with assessments for promotions and grants. Quantifying review quality could help establish the importance of reviewers and information generated during a review, and assess the importance of a submitted article. Finally, we recommend expediting post-publication reviews and allowing the dialogue to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be addressed by building upon computer programming code management systems. In this article, we provide examples of current code review systems that offer opportunities for addressing the above problems, and offer suggestions for enhancing code review systems for …

  8. Lifelong personal health data and application software via virtual machines in the cloud.

    Science.gov (United States)

    Van Gorp, Pieter; Comuzzi, Marco

    2014-01-01

    Personal Health Records (PHRs) should remain the lifelong property of patients, who should be able to show them conveniently and securely to selected caregivers and institutions. In this paper, we present MyPHRMachines, a cloud-based PHR system taking a radically new architectural solution to health record portability. In MyPHRMachines, health-related data and the application software to view and/or analyze it are separately deployed in the PHR system. After uploading their medical data to MyPHRMachines, patients can access them again from remote virtual machines that contain the right software to visualize and analyze them without any need for conversion. Patients can share their remote virtual machine session with selected caregivers, who will need only a Web browser to access the pre-loaded fragments of their lifelong PHR. We discuss a prototype of MyPHRMachines applied to two use cases, i.e., radiology image sharing and personalized medicine.

  9. Virtual Machine-level Software Transactional Memory: Principles, Techniques, and Implementation

    Science.gov (United States)

    2015-08-13

    … VM-managed environment. ByteSTM is built by modifying Jikes RVM [3], a Java research virtual machine implemented in Java, using the optimizing … The results of the project have been publicly released as open-source software, and research papers have been published at international conferences.

  10. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  11. MINRES-QLP Pack and Reliable Reproducible Research via Supportable Scientific Software

    Directory of Open Access Journals (Sweden)

    Sou-Cheng Terrya Choi

    2014-07-01

    The MINRES-QLP Pack is a suite of standard and extended Krylov subspace methods for solving large linear systems and linear least-squares problems in which the coefficient matrices are potentially singular or ill-conditioned and possibly have special symmetries. Our purpose is to develop robust open-source MATLAB implementations of these algorithms that are faithful to the theory, following the philosophy of reproducible research (RR) and practicing development principles of what we call supportable scientific software (SSS), which promote reliable reproducible research (RRR). In this paper, we review key features in the ongoing theoretical and software development of our algorithms in the MINRES-QLP Pack. We highlight the most effective software engineering tools known to us that are potentially useful to other scientific research areas. We support open calls to create more incentives for practitioners of robust and reliable scientific software, such as citations and grants for quality software. We encourage introducing principles of RRR via SSS to computational science students in advanced courses of scientific computing, and to computational scientists through seminars, workshops, or conferences. To these ends, we started an experimental seminar course, “Reliable Mathematical Software” (IIT MATH-573), at our institution, and organized multiple sessions on “Reliable Computational Science” at the SIAM Annual Meeting 2014. We share our research practice and pedagogical experiences in this article.
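
    For readers who want to try the underlying Krylov method, SciPy ships a standard MINRES (the MINRES-QLP variants discussed in the paper are the authors' own MATLAB codes, not this routine); a minimal symmetric-system example:

        import numpy as np
        from scipy.sparse.linalg import minres

        # A small symmetric indefinite system; MINRES does not require
        # positive definiteness, unlike conjugate gradients.
        A = np.array([[4.0, 1.0, 0.0],
                      [1.0, -2.0, 1.0],
                      [0.0, 1.0, 3.0]])
        b = np.array([1.0, 0.0, 1.0])

        x, info = minres(A, b)
        print(info, np.allclose(A @ x, b, atol=1e-4))   # -> 0 True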

  12. Easyverifier 1.0: a software tool for revising scientific articles’ bibliographical citations

    Directory of Open Access Journals (Sweden)

    Freddy Alberto Correa Riveros

    2010-05-01

    The first academic revolution, which occurred in developed countries during the late 19th century, made research a university function in addition to the traditional task of teaching. A second academic revolution has tried to transform the university into a teaching, research and socio-economic development enterprise. The scientific article has become an excellent practical means for moving new knowledge between the university and the socio-economic environment. This work had two purposes. The first was to present some general considerations regarding research and the scientific article. The second was to describe a computational tool which supports revising a scientific article's citations, a step that is usually done manually and requires some experience. The software reads two text files, one containing the scientific article's content and the other the bibliography. A report is then generated that identifies the authors mentioned in the text but not indexed in the bibliography, and the authors listed in the bibliography but not mentioned in the text of the article. The software thus allows researchers and journal coordinators to detect mismatches between the citations in the text and the bibliographical references. The steps to develop the software were: analysis, design, implementation and use. For the analysis, a review of the literature on the composition of citations in scientific documents was important.
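
    The core check the tool performs can be sketched directly: extract cited surnames from the text, extract the surname opening each bibliography entry, and report the set differences (a simplification of the real matching rules, with invented sample strings):

        import re

        article = "As shown by (Garcia, 2001) and (Lopez, 1999), reuse improves quality."
        bibliography = ["Garcia, J. (2001). Reuse in practice.",
                        "Mendez, R. (2005). Citation styles."]

        cited = set(re.findall(r"\((\w+),\s*\d{4}\)", article))
        listed = {entry.split(",")[0] for entry in bibliography}

        print("cited but not listed:", cited - listed)   # -> {'Lopez'}
        print("listed but not cited:", listed - cited)   # -> {'Mendez'}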

  13. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  14. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...
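
    The record does not name the measurement interfaces used. As one illustration of a software-based power reading on the x86 side, recent Linux kernels expose Intel RAPL energy counters through the powercap sysfs tree; the sketch below assumes that interface is present and readable (it is not available on ARM, where external meters or platform-specific counters are needed).

        import time
        from pathlib import Path

        # Package-0 counter on a typical Linux/Intel system; the layout varies
        # by kernel and CPU, and reading it may require elevated privileges.
        ENERGY_FILE = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

        def measure(workload, *args):
            """Return (result, joules, seconds) for one run of `workload`."""
            e0, t0 = int(ENERGY_FILE.read_text()), time.monotonic()
            result = workload(*args)
            e1, t1 = int(ENERGY_FILE.read_text()), time.monotonic()
            joules = (e1 - e0) / 1e6   # counter is in microjoules; wrap-around ignored here
            return result, joules, t1 - t0

        _, joules, seconds = measure(sum, range(50_000_000))
        print(f"{joules:.2f} J over {seconds:.2f} s -> {joules / seconds:.2f} W average")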

  15. The Python interpreter as a framework for integrating scientific computing software-components

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available The focus of the Molecular Simulation Laboratory is to model molecular interactions. In particular, we are working on automated docking and molecular visualization. Building and simulating complex molecular systems requires the tight interoperation of a variety of software tools originating from various scientific disciplines and usually developed independently of each other. Over the last ten years we have evolved a strategy for addressing the formidable software engineering problem of integrating such heterogeneous software tools. The basic idea is that the Python interpreter serves as the integration framework and provides a powerful and flexible glue for rapidly prototyping applications from reusable software components (i.e., Python packages). We no longer think in terms of programs, but rather in terms of packages which can be loaded dynamically into the interpreter when needed, and instantly extend our framework (i.e., the Python interpreter) with new functionality. We have written more than 30 packages (>2500 classes) providing support for applications ranging from scientific visualization and visual programming to molecular simulations and virtual reality. Moreover, some of our components have been reused successfully by other laboratories for their own research. Applications created from our software components have been distributed to over 15,000 users around the world. In this paper we describe our approach and its various applications, discuss the reasons that make this approach so successful, and present lessons learned and pitfalls to avoid in order to maximize the reusability and interoperability of software components.
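
    The central idea, packages loaded dynamically into a running interpreter, needs only a few lines of standard Python to demonstrate. In this sketch the component names are placeholders (the standard library's math module stands in for a laboratory package), not the actual components described above.

        import importlib

        class Framework:
            """The interpreter as integration framework: any importable
            package becomes a component the moment it is loaded."""
            def __init__(self):
                self.components = {}

            def load(self, package_name):
                module = importlib.import_module(package_name)
                self.components[package_name] = module
                return module

        app = Framework()
        math_tools = app.load("math")   # stands in for a visualization or docking package
        print(math_tools.sqrt(2.0))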

  16. A Collaboration Model for Community-Based Software Development with Social Machines

    Directory of Open Access Journals (Sweden)

    Dave Murray-Rust

    2016-02-01

    Full Text Available Crowdsourcing is generally used for tasks with minimal coordination, providing limited support for dynamic reconfiguration. Modern systems, exemplified by social machines, are subject to continual flux in both the client and development communities and their needs. To support crowdsourcing of open-ended development, systems must dynamically integrate human creativity with machine support. While workflows can be used to handle structured, predictable processes, they are less suitable for social machine development and its attendant uncertainty. We present models and techniques for coordination of human workers in crowdsourced software development environments. We combine the Social Compute Unit—a model of ad-hoc human worker teams—with versatile coordination protocols expressed in the Lightweight Social Calculus. This allows us to combine coordination and quality constraints with dynamic assessments of end-user desires, dynamically discovering and applying development protocols.

  17. Deadline aware virtual machine scheduler for scientific grids and cloud computing

    CERN Document Server

    Khalid, Omer; Anthony, Richard; Petridis, Miltos; Parrot, Kevin; Schulz, Markus; 10.1109/WAINA.2010.107

    2010-01-01

    Virtualization technology has enabled applications to be decoupled from the underlying hardware, providing the benefits of portability, better control over the execution environment and isolation. It has been widely adopted in scientific grids and commercial clouds. Virtualization, despite its benefits, incurs a performance penalty, which can be significant for systems dealing with uncertainty, such as High Performance Computing (HPC) applications where jobs have tight deadlines and depend on other jobs before they can run. The major obstacle lies in bridging the gap between the performance requirements of a job and the performance offered by the virtualization technology if the jobs are to be executed in virtual machines. In this paper, we present a novel approach to optimize job deadlines when run in virtual machines by developing a deadline-aware algorithm that responds to job execution delays in real time, and dynamically optimizes jobs to meet their deadline obligations. Our approaches borrowed co...
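
    The record does not reproduce the algorithm itself; the sketch below only illustrates the stated idea of responding to execution delays in real time by comparing each job's projected finish against its deadline and adjusting the CPU share of its virtual machine. All names, units and the boost policy are invented for illustration.

        class Job:
            def __init__(self, name, deadline, remaining_work, speed=1.0):
                self.name = name
                self.deadline = deadline              # absolute finish-by time, seconds
                self.remaining_work = remaining_work  # seconds of work at full speed
                self.speed = speed                    # fraction of a core the VM gets

        def rebalance(jobs, now):
            """Give more CPU share to any VM whose job is projected to miss its deadline."""
            for job in jobs:
                projected_finish = now + job.remaining_work / job.speed
                if projected_finish > job.deadline:
                    job.speed = min(1.0, job.speed * 1.5)   # stand-in for the real response

        jobs = [Job("reco", deadline=100.0, remaining_work=60.0, speed=0.5)]
        rebalance(jobs, now=20.0)
        print(jobs[0].speed)   # boosted from 0.5 to 0.75: 20 + 60/0.75 = 100, now on time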

  18. Dynamic scheduling of virtual machines running hpc workloads in scientific grids

    CERN Document Server

    Khalid, Omer; Anthony, Richard; Petridis, Miltos; Parrot, Kevin; Schulz, Markus; 10.1145/1330555.1330556

    2010-01-01

    The primary motivation for the uptake of virtualization has been resource isolation, capacity management and resource customization, allowing resource providers to consolidate their resources in virtual machines. Various approaches have been taken to integrate virtualization into scientific Grids, especially in the arena of High Performance Computing (HPC), to run grid jobs in virtual machines, thus enabling better provisioning of the underlying resources and customization of the execution environment at runtime. Despite the gains, the virtualization layer also incurs a performance penalty, and it is not well understood how such an overhead will impact the performance of systems where jobs are scheduled with tight deadlines. The overhead varies with the type of workload, whether memory intensive, CPU intensive or network I/O bound, and can lead to unpredictable deadline estimation for the running jobs in the system. In our study, we have attempted to tackle this problem by developing an intelligent s...

  19. A Comparative Study of Three Machine Learning Methods for Software Fault Prediction

    Institute of Scientific and Technical Information of China (English)

    WANG Qi; ZHU Jie; YU Bo

    2005-01-01

    The contribution of this paper is a comparison of three popular machine learning methods for software fault prediction: classification tree, neural network and case-based reasoning. First, three different classifiers are built based on these three approaches. Second, the three classifiers use the same product metrics as predictor variables to identify the fault-prone components. Third, the prediction results are compared on two aspects: how good the models' predictive capabilities are, and how well the models support understanding of the process represented by the data.
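
    A present-day reconstruction of such a comparison takes only a few lines with scikit-learn; here synthetic data stands in for the paper's product metrics, and k-nearest neighbours serves as the usual stand-in for case-based reasoning. This is an illustrative sketch, not the paper's experimental setup.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.metrics import accuracy_score

        # Synthetic "product metrics" labelled fault-prone / not fault-prone.
        X, y = make_classification(n_samples=500, n_features=10, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        models = {
            "classification tree": DecisionTreeClassifier(random_state=0),
            "neural network": MLPClassifier(max_iter=2000, random_state=0),
            "case-based reasoning (k-NN)": KNeighborsClassifier(n_neighbors=5),
        }

        for name, model in models.items():
            model.fit(X_train, y_train)
            print(f"{name}: {accuracy_score(y_test, model.predict(X_test)):.3f}")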

  20. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    Science.gov (United States)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software exists fundamentally for the results it creates; the core research must take focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with building software in a sustainable fashion: short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the support structure necessary for the code to live far beyond its original milestones. Agile and lean software development methodologies including Scrum, Kanban, Continuous Delivery and Test-Driven Development have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as on promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how it addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.

  1. In Silico Identification Software (ISIS): A Machine Learning Approach to Tandem Mass Spectral Identification of Lipids

    Energy Technology Data Exchange (ETDEWEB)

    Kangas, Lars J.; Metz, Thomas O.; Isaac, Georgis; Schrom, Brian T.; Ginovska-Pangovska, Bojana; Wang, Luning; Tan, Li; Lewis, Robert R.; Miller, John H.

    2012-05-15

    Liquid chromatography-mass spectrometry-based metabolomics has gained importance in the life sciences, yet it is not supported by software tools for high-throughput identification of metabolites based on their fragmentation spectra. An algorithm (ISIS: in silico identification software) and its implementation are presented and show great promise in generating in silico spectra of lipids for the purpose of structural identification. Instead of using chemical reaction rate equations or rules-based fragmentation libraries, the algorithm uses machine learning to find accurate bond cleavage rates in a mass spectrometer employing collision-induced dissociation tandem mass spectrometry. A preliminary test of the algorithm with 45 lipids from a subset of lipid classes shows both high sensitivity and specificity.

  2. Execution time supports for adaptive scientific algorithms on distributed memory machines

    Science.gov (United States)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consist of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution-time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives the appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.
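
    The runtime derivation of communication patterns is the inspector half of the inspector/executor pattern: before the loop runs, the global indices it will touch are translated into per-owner fetch lists. The single-process sketch below shows only that translation step; the real PARTI primitives then issue the corresponding sends and receives on a distributed-memory machine, and all names here are illustrative.

        def build_schedule(needed_global_indices, owner_of, my_rank):
            """Inspector phase: group the off-process global indices a loop
            will touch by the rank that owns them."""
            fetch = {}
            for g in needed_global_indices:
                owner = owner_of(g)
                if owner != my_rank:
                    fetch.setdefault(owner, []).append(g)
            return fetch

        # Block distribution of a 100-element global array over 4 processes.
        block = 25
        owner_of = lambda g: g // block
        print(build_schedule([3, 30, 31, 77, 5], owner_of, my_rank=0))
        # {1: [30, 31], 3: [77]} -- rank 0 must gather from ranks 1 and 3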

  3. Predicting Software Faults in Large Space Systems using Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Bhekisipho Twala

    2011-07-01

    Full Text Available Recently, the use of machine learning (ML) algorithms has proven to be of great practical value in solving a variety of engineering problems, including the prediction of failure, fault, and defect-proneness as space system software becomes complex. One of the most active areas of recent research in ML has been the use of ensemble classifiers. We show how ML techniques (or classifiers) can be used to predict software faults in space systems, including many aerospace systems, and further use ensembles of individual classifiers, which vote for the most popular class, to improve the prediction of system software fault-proneness. Benchmarking results on four NASA public datasets show the Naive Bayes classifier to be the more robust software fault predictor, while most ensembles with a decision tree classifier as one of their components achieve higher accuracy rates. Defence Science Journal, 2011, 61(4), pp. 306-316, DOI: http://dx.doi.org/10.14429/dsj.61.1088
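
    The ensemble-by-voting idea maps directly onto a majority ("hard") vote over heterogeneous base classifiers. The sketch below shows the shape of such an ensemble with scikit-learn on synthetic data; the NASA datasets and the paper's exact classifier pool are not reproduced here.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import VotingClassifier
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.neighbors import KNeighborsClassifier

        X, y = make_classification(n_samples=400, n_features=8, random_state=1)

        ensemble = VotingClassifier(
            estimators=[("nb", GaussianNB()),
                        ("tree", DecisionTreeClassifier(random_state=1)),
                        ("knn", KNeighborsClassifier())],
            voting="hard",   # each classifier votes; the most popular class wins
        )
        ensemble.fit(X, y)
        print(ensemble.predict(X[:5]))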

  4. Provenance tracking for scientific software toolchains through on-demand release and archiving

    Science.gov (United States)

    Ham, David

    2017-04-01

    There is an emerging consensus that published computational science results must be backed by a provenance chain tying results to the exact versions of the input data and the code which generated them. There is also now an impressive range of web services devoted to revision control of software and to the archiving, in citeable form, of both software and input data. However, much scientific software itself builds on libraries and toolkits, and these themselves have dependencies. Further, it is common for cutting-edge research to depend on the latest version of software in online repositories rather than the official release version. This creates a situation in which an author who wishes to follow best practice in recording the provenance chain of their results must archive and cite unreleased versions of a series of dependencies. Here, we present an alternative which toolkit authors can easily implement to provide a semi-automatic mechanism for creating and archiving custom software releases of the precise version of a package used in a particular simulation. This approach leverages the excellent services provided by GitHub and Zenodo to generate a connected set of citeable DOIs for the archived software. We present the integration of this workflow into the Firedrake automated finite element framework as a practical example of this approach on a complex geoscientific toolchain.
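
    Firedrake's actual mechanism is not reproduced in the record, but the first step of any such workflow is pinning down exactly which versions a simulation used. The sketch below records installed package versions and the driving repository's commit; the subsequent archiving and DOI-minting steps via GitHub and Zenodo are omitted, and the package names are placeholders.

        import json
        import subprocess
        import sys
        from importlib import metadata

        def provenance_record(packages):
            """Capture the exact versions a run depends on, as raw material
            for an archived, citeable release."""
            record = {"python": sys.version, "packages": {}}
            for name in packages:
                try:
                    record["packages"][name] = metadata.version(name)
                except metadata.PackageNotFoundError:
                    record["packages"][name] = "unreleased/unknown"
            commit = subprocess.run(["git", "rev-parse", "HEAD"],
                                    capture_output=True, text=True)
            record["script_commit"] = commit.stdout.strip() or "not a git checkout"
            return json.dumps(record, indent=2)

        print(provenance_record(["numpy", "some-unreleased-toolkit"]))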

  5. Software protocol design: Communication and control in a multi-task robot machine for ITER vacuum vessel assembly and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ming, E-mail: ming.li@lut.fi [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Wu, Huapeng; Handroos, Heikki [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Yang, Guangyou [School of Mechanical Engineering, Hubei University of Technology, Wuhan (China); Wang, Yongbo [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland)

    2015-10-15

    Highlights: • A high-level protocol is proposed for data inter-transmission. • The protocol design is task-oriented for robot control in the software system. • The protocol functions as middleware in the software. • Running the protocol stand-alone as an independent process provides greater security. • The design provides a reference protocol for multi-task robot machines in industry. - Abstract: A specific communication and control protocol for the software design of a multi-task robot machine is proposed. In order to fulfill the requirements of the complicated multi-machining functions and high-performance motion control, the software design of the robot is divided into two main parts accordingly: a user-oriented HMI part and a robot-control-oriented real-time control system. The two parts of the software are deployed on different hardware for run-time performance, forming a client–server-control architecture. A high-level task-oriented protocol is therefore designed for data inter-communication between the HMI part and the control system part, in which all the data transmitted for a machining task is divided into three categories: trajectory-oriented data, task-control-oriented data and status-monitoring-oriented data. The protocol accordingly consists of three sub-protocols – a trajectory protocol, a task control protocol and a status protocol – which are deployed over Ethernet and run as independent processes on both the client and server computers. The protocols are able to manage the vast amount of streaming data generated by the multiple machining functions more efficiently. Since the protocol functions as middleware in the software and provides the data interface standards for the two development groups, it also permits both groups of developers to focus more closely on their own requirements-oriented design. By
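
    The three data categories lend themselves to three small message types routed by a common dispatcher. The sketch below shows one possible shape of such a split; the field names and JSON framing are assumptions for illustration, not the paper's actual protocol definition.

        import json
        from dataclasses import dataclass, field, asdict

        @dataclass
        class TrajectoryMsg:
            points: list = field(default_factory=list)   # machining waypoints
            category: str = "trajectory"

        @dataclass
        class TaskControlMsg:
            command: str = ""                            # e.g. "start", "pause", "abort"
            category: str = "task-control"

        @dataclass
        class StatusMsg:
            joint_positions: list = field(default_factory=list)
            alarm: str = ""
            category: str = "status"

        def encode(msg):
            # Frame a message for the HMI <-> real-time controller link.
            return (json.dumps(asdict(msg)) + "\n").encode()

        def dispatch(raw, handlers):
            msg = json.loads(raw)
            handlers[msg["category"]](msg)   # route to the matching sub-protocol

        dispatch(encode(TaskControlMsg(command="start")),
                 {"task-control": lambda m: print("control:", m["command"])})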

  6. ENHANCING SEISMIC CALIBRATION RESEARCH THROUGH SOFTWARE AUTOMATION AND SCIENTIFIC INFORMATION MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S D; Dodge, D A; Ganzberger, M D; Hauk, T F; Matzel, E M

    2007-07-06

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Program at LLNL has made significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Several achievements in schema design, data visualization, synthesis, and analysis were completed this year. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. As data volumes have increased, scientific information management issues such as data quality assessment, ontology mapping, and metadata collection that are essential for the production and validation of derived calibrations have negatively impacted researchers' abilities to produce products. New information management and analysis tools have resulted in demonstrated gains in the efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. Significant software engineering and development efforts have produced an object-oriented framework that provides database-centric coordination between scientific tools, users, and data. Nearly a half billion parameters, signals, measurements, and metadata entries are all stored in a relational database accessed by an extensive object-oriented multi-technology software framework that includes elements of stored procedures, real-time transactional database triggers and constraints, as well as coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable recording of processing flow and metadata. A core capability is the ability to rapidly select and present subsets of related signals and measurements to the researchers for analysis and distillation both visually (Java GUI client applications) and in batch mode

  7. Developing a software for tracking the memory states of the machines in the LHCb Filter Farm

    CERN Document Server

    Jain, Harshit

    2017-01-01

    The LHCb Event Filter Farm consists of more than 1500 server nodes with a total of roughly 65 TB of operating memory. The memory is crucial for the success of the LHCb experiment, since the proton-proton collisions are temporarily stored on these memory modules. Unfortunately, the aging nodes of the server farm occasionally suffer losses of their memory modules, and the lower the available memory, the lower the achievable performance. Relying on users or administrators to pay attention to this matter is inefficient; an automated approach is needed. The aim of this project was to develop software to monitor a set of test machines. The software stores the data of the memory sticks in advance in a database for future reference. It then checks the memory sticks at a later time to find any failures. In the case of any such losses the software looks up the database to find out which memory sticks have been lost and displays all information about those sticks in a log fi...
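
    The core check is an inventory diff: compare the memory sticks visible now against a stored baseline. The sketch below uses an in-memory SQLite table and invented slot/serial data; in practice the inventory would come from a hardware tool such as dmidecode, and the details of the LHCb tool are not reproduced in the record.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE sticks (node TEXT, slot TEXT, serial TEXT, size_gb INT)")

        def save_baseline(node, sticks):
            """Store the initially installed sticks: slot -> (serial, size_gb)."""
            conn.executemany("INSERT INTO sticks VALUES (?, ?, ?, ?)",
                             [(node, slot, sn, gb) for slot, (sn, gb) in sticks.items()])

        def find_lost(node, current_slots):
            """Report every baseline stick whose slot is no longer populated."""
            baseline = conn.execute(
                "SELECT slot, serial, size_gb FROM sticks WHERE node = ?", (node,)).fetchall()
            return [row for row in baseline if row[0] not in current_slots]

        save_baseline("node01", {"DIMM0": ("SN123", 16), "DIMM1": ("SN124", 16)})
        print(find_lost("node01", current_slots={"DIMM0"}))   # DIMM1 reported as lost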

  8. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, W. Spencer; Koothoor, Mimitha [Computing and Software Department, McMaster University, Hamilton (Canada)

    2016-04-15

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows numerical algorithms and code to be documented together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification.

  9. A Document-Driven Method for Certifying Scientific Computing Software for Use in Nuclear Safety Analysis

    Directory of Open Access Journals (Sweden)

    W. Spencer Smith

    2016-04-01

    Full Text Available This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows numerical algorithms and code to be documented together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification.

  10. The Big Effects of Short-term Efforts: Mentorship and Code Integration in Open Source Scientific Software

    Directory of Open Access Journals (Sweden)

    Erik H Trainer

    2014-07-01

    Full Text Available Scientific progress relies crucially on software, yet in practice there are significant challenges to scientific software production and maintenance. We conducted a case study of a bioinformatics software library called Biopython to investigate the promise of Google Summer of Code (GSoC), a program that pays students to work on open-source projects for the summer, for addressing these challenges. We find three positive outcomes of GSoC in the Biopython community: the addition of new features to the Biopython codebase, training, and personal development. We also find, however, that mentors face several challenges related to GSoC project selection and ranking. We believe that because GSoC provides an occasion to extend the software with capabilities that can be used to produce new knowledge, and to train successive generations of potential contributors to the software, it can play a vital role in the sustainability of open-source scientific software.

  11. Secure State UML: Modeling and Testing Security Concerns of Software Systems Using UML State Machines

    Directory of Open Access Journals (Sweden)

    S. Batool

    2014-05-01

    Full Text Available In this research we present a technique by which extended UML models can be converted to standard UML models so that existing Model Based Testing (MBT) techniques can be applied directly to them. Existing MBT techniques cannot be applied directly to extended UML models due to differences in modeling notation and new model elements. Verification of these models is also very important. Realizing and testing non-functional requirements such as efficiency, portability and security at the model level strengthens the ability of the model to reduce risk, cost and the probability of system failure in a cost-effective way. Access control is the most widely used technique for implementing security in software systems. Existing approaches for security modeling focus on the representation of access control policies, such as authentication and role-based access control, by introducing security-oriented model elements through extensions of the Unified Modelling Language (UML). But doing so hinders the potential application of MBT techniques to verify these models and test access control policies. In this research we introduce a technique, Secure State UML, to formally design security models with SecureUML and then transform them to UML state machine diagrams so that they can be tested and verified by existing MBT techniques. By applying the proposed technique to case studies, we found that MBT techniques can be applied to the resulting state machine diagrams and that the generated test paths have the potential to identify risks associated with violations of security constraints.
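
    Once a security model is expressed as a plain state machine, test generation can be as simple as enumerating event sequences over the transition table. The sketch below shows that baseline form of model-based test-path generation; the states and events are invented, and the paper's access-control models and transformation rules are not reproduced here.

        # A state machine as a transition table: (state, event) -> next state.
        TRANSITIONS = {
            ("LoggedOut", "login_ok"): "LoggedIn",
            ("LoggedOut", "login_fail"): "LoggedOut",
            ("LoggedIn", "open_record"): "Viewing",
            ("LoggedIn", "logout"): "LoggedOut",
            ("Viewing", "logout"): "LoggedOut",
        }

        def test_paths(start, max_depth):
            """Enumerate all event sequences up to max_depth as test paths."""
            paths = [([], start)]
            for _ in range(max_depth):
                paths = [(events + [e], t)
                         for events, state in paths
                         for (s, e), t in TRANSITIONS.items() if s == state]
                yield from paths

        for events, final in test_paths("LoggedOut", max_depth=2):
            print(events, "->", final)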

  12. Realization of a complex technical and scientific EDP project by means of 'software engineering'

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, B.

    1980-01-01

    The methods of software engineering used in commercial EDP are applied to a scientific and technical problem. The approach is illustrated by the numerical calculation of flows through turbomachines.

  13. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    Science.gov (United States)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly-detection properties, which are often embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly-detection property, or a modification to an existing property, is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly-detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that would otherwise have limited availability to the scientific community.
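
    Separating properties from collection code is the key move: a property is data, so it can be refined without touching the verification system's source. A minimal sketch of such a declarative property checked over a stream follows; the thresholds, names and property language are invented for illustration.

        from dataclasses import dataclass

        @dataclass
        class RangeProperty:
            """Readings must stay within [low, high] and change by at most
            max_step between consecutive samples."""
            name: str
            low: float
            high: float
            max_step: float

        def check_stream(prop, readings):
            previous = None
            for i, value in enumerate(readings):
                if not prop.low <= value <= prop.high:
                    yield (i, value, f"outside [{prop.low}, {prop.high}]")
                if previous is not None and abs(value - previous) > prop.max_step:
                    yield (i, value, f"jump > {prop.max_step}")
                previous = value

        air_temp = RangeProperty("air_temperature_C", low=-40, high=50, max_step=5)
        for anomaly in check_stream(air_temp, [12.1, 12.3, 30.0, 29.8, 99.0]):
            print(anomaly)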

  14. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    Science.gov (United States)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2013-07-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding

  15. PRAPRAG: software para planejamento racional de máquinas agrícolas PRAPRAG: software for rational planning of agricultural machines

    Directory of Open Access Journals (Sweden)

    Erivelto Mercante

    2010-04-01

    Full Text Available The PRAPRAG software is a tool for choosing the agricultural machines and implements that present the lowest cost per area or per quantity produced, as well as for planning machine acquisition for a farm property from both technical and economical points of view. It was written in the Borland Delphi 3.0 programming language, and from manufacturers' leaflets for machines and implements a database was created in which the user can register and modify their usage characteristics. The software proved to be a useful and user-friendly tool: it brings speed, safety and reliability to the productive and economic processes of farm properties, in the selection and acquisition of agricultural machinery sets and in determining labor costs.

  16. 75 FR 71146 - In the Matter of Certain Machine Vision Software, Machine Vision Systems, and Products Containing...

    Science.gov (United States)

    2010-11-22

    ..., California; Techno Soft Systemnics, Inc. (``Techno Soft'') of Japan; Fuji Machine Manufacturing Co., Ltd. of... the investigation as to Amistar based on a consent order and settlement agreement, and as to...

  17. Enhancing Seismic Calibration Research Through Software Automation and Scientific Information Management

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S D; Dodge, D A; Ganzberger, M D; Harris, D B; Hauk, T F

    2009-07-07

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Development (GNEMRD) Program at LLNL continues to make significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. New information management and analysis tools have resulted in demonstrated gains in efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. In contrast to previous years, software development work this past year has emphasized development of automation at the data ingestion level. This change reflects a gradually changing emphasis in our program from processing a few large data sets that result in a single integrated delivery, to processing many different data sets from a variety of sources. The increase in the number of sources has resulted in a large increase in the amount of metadata relative to the final volume of research products. Software developed this year addresses the problems of: (1) Efficient metadata ingestion and conflict resolution; (2) Automated ingestion of bulletin information; (3) Automated ingestion of waveform information from global data centers; and (4) Site Metadata and Response transformation required for certain products. This year, we also made a significant step forward in meeting a long-standing goal of developing and using a waveform correlation framework. Our objective for such a framework is to extract additional calibration data (e.g. mining blasts) and to study the extent to which correlated seismicity can be found in global and regional scale environments.

  18. nanoHUB.org: Experiences and Challenges in Software Sustainability for a Large Scientific Community

    Directory of Open Access Journals (Sweden)

    Lynn Zentner

    2014-07-01

    Full Text Available The science gateway nanoHUB.org, funded by the National Science Foundation (NSF, serves a large scientific community dedicated to research and education in nanotechnology with community-contributed simulation codes as well as a vast repository of other materials such as recorded presentations, teaching materials, and workshops and courses. Nearly 330,000 users annually access over 4400 items of content on nanoHUB, including 343 simulation tools. Arguably the largest nanotechnology facility in the world, nanoHUB has led the way not only in providing open access to scientific code in the nanotechnology community, but also in lowering barriers to the use of that code, by providing a platform where developers are able to easily and quickly deploy code written in a variety of languages with user-friendly graphical user interfaces and where users can run the latest versions of codes transparently on the grid or other powerful resources without ever having to download or update code. Being a leader in open access code deployment provides nanoHUB with opportunities and challenges as it meets the current and future needs of its community. This paper discusses the experiences of nanoHUB in addressing and adapting to the changing landscape of scientific software in ways that best serve its community and meet the needs of the largest portion of its user base.

  19. ENHANCING SEISMIC CALIBRATION RESEARCH THROUGH SOFTWARE AUTOMATION AND SCIENTIFIC INFORMATION MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S; Dodge, D A; Ganzberger, M D; Hauk, T F; Matzel, E M

    2008-07-03

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Development (GNEMRD) Program at LLNL continues to make significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. New information management and analysis tools have resulted in demonstrated gains in efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. The foundation of a robust, efficient data development and processing environment is comprised of many components built upon engineered versatile libraries. We incorporate proven industry 'best practices' throughout our code and apply source code and bug tracking management as well as automatic generation and execution of unit tests for our experimental, development and production lines. Significant software engineering and development efforts have produced an object-oriented framework that provides database centric coordination between scientific tools, users, and data. Over a half billion parameters, signals, measurements, and metadata entries are all stored in a relational database accessed by an extensive object-oriented multi-technology software framework that includes stored procedures, real-time transactional database triggers and constraints, as well as coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable management of processing methods and station parameters, responses and metadata. This allowed for the development of merged ground-truth (GT) data sets compiled by the NNSA labs and AFTAC that include hundreds of thousands of events and tens of millions of arrivals. The

  20. China's experimental pragmatics of "Scientific development" in wind power: Algorithmic struggles over software in wind turbines

    DEFF Research Database (Denmark)

    Kirkegaard, Julia

    2016-01-01

    This article presents a case study on the development of China's wind power market. As China's wind industry has experienced a quality crisis, the Chinese government has intervened to steer the industry towards a turn to quality, indicating a pragmatist and experimental mode of market development. ... This increased focus on quality, to ensure the sustainable and scientific development of China's wind energy market, requires improved indigenous Chinese innovation capabilities in wind turbine technology. To shed light on how the turn to quality impacts upon the industry and global competition, this study ... unfold over issues associated with intellectual property rights (IPRs), certification and standardisation of software algorithms. The article concludes that the use of this STS lens makes a fresh contribution to the often path-dependent, structuralist and hierarchical China literature, offering instead...

  1. The design strategy of scientific data quality control software for Euclid mission

    CERN Document Server

    Brescia, Massimo; Fredvik, Terje; Haugan, Stein Vidar Hagfors; Gozaliasl, Ghassem; Kirkpatrick, Charles; Kurki-Suonio, Hannu; Longo, Giuseppe; Nilsson, Kari; Wiesmann, Martin

    2016-01-01

    The most valuable asset of a space mission like Euclid is its data. Due to their huge volume, automatic quality control becomes a crucial aspect over the entire lifetime of the experiment. Here we focus on the design strategy for the Science Ground Segment (SGS) Data Quality Common Tools (DQCT), whose main role is to provide software solutions to gather, evaluate, and record quality information about the raw and derived data products from a primarily scientific perspective. The SGS DQCT will provide a quantitative basis for evaluating the application of reduction and calibration reference data, as well as diagnostic tools for quality parameters, flags, trend analysis diagrams and any other metadata parameter produced by the pipeline. In a large programme like Euclid, it is prohibitively expensive to process large amounts of data at the pixel level just for the purpose of quality evaluation. Thus, all measures of quality at the pixel level are implemented in the individual pipeline stages, and passed ...

  2. What Not To Do: Anti-patterns for Developing Scientific Workflow Software Components

    Science.gov (United States)

    Futrelle, J.; Maffei, A. R.; Sosik, H. M.; Gallager, S. M.; York, A.

    2013-12-01

    Scientific workflows promise to enable efficient scaling-up of researcher code to handle large datasets and workloads, as well as documentation of scientific processing via standardized provenance records, etc. Workflow systems and related frameworks for coordinating the execution of otherwise separate components are limited, however, in their ability to overcome software engineering design problems commonly encountered in pre-existing components, such as scripts developed externally by scientists in their laboratories. In practice, this often means that components must be rewritten or replaced in a time-consuming, expensive process. In the course of an extensive workflow development project involving large-scale oceanographic image processing, we have begun to identify and codify 'anti-patterns'--problematic design characteristics of software--that make components fit poorly into complex automated workflows. We have gone on to develop and document low-effort solutions and best practices that efficiently address the anti-patterns we have identified. The issues, solutions, and best practices can be used to evaluate and improve existing code, as well as guiding the development of new components. For example, we have identified a common anti-pattern we call 'batch-itis' in which a script fails and then cannot perform more work, even if that work is not precluded by the failure. The solution we have identified--removing unnecessary looping over independent units of work--is often easier to code than the anti-pattern, as it eliminates the need for complex control flow logic in the component. Other anti-patterns we have identified are similarly easy to identify and often easy to fix. We have drawn upon experience working with three science teams at Woods Hole Oceanographic Institution, each of which has designed novel imaging instruments and associated image analysis code. By developing use cases and prototypes within these teams, we have undertaken formal evaluations of
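
    The 'batch-itis' fix is mechanical: treat each independent unit of work separately so one failure cannot abandon the rest. A minimal sketch (with invented function names) looks like this:

        def process_batch(items, process_one):
            """Process each independent unit separately; record failures and keep going."""
            results, failures = [], []
            for item in items:
                try:
                    results.append(process_one(item))
                except Exception as exc:
                    failures.append((item, exc))
            return results, failures

        def shrink_image(path):
            if path.endswith(".bad"):
                raise ValueError(f"corrupt file: {path}")
            return path + ".small"

        done, failed = process_batch(["a.png", "b.bad", "c.png"], shrink_image)
        print(done)     # ['a.png.small', 'c.png.small'] -- work after the failure still ran
        print(failed)   # [('b.bad', ValueError('corrupt file: b.bad'))]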

  3. The Terabyte Analysis Machine Project: The Distance Machine Performance Report

    Institute of Scientific and Technical Information of China (English)

    James Annis; Koen Holtman; et al.

    2001-01-01

    The Terabyte Analysis Machine Project is developing hardware and software to analyze terabyte-scale datasets. The Distance Machine framework provides facilities to flexibly interface application-specific indexing and partitioning algorithms to large scientific databases.

  4. Towards Analysis-Driven Scientific Software Architecture: The Case for Abstract Data Type Calculus

    Directory of Open Access Journals (Sweden)

    Damian W.I. Rouson

    2008-01-01

    Full Text Available This article approaches scientific software architecture from three analytical paths. Each path examines discrete time advancement of multiphysics phenomena governed by coupled differential equations. The new object-oriented Fortran 2003 constructs provide a formal syntax for an abstract data type (ADT) calculus. The first analysis uses traditional object-oriented software design metrics to demonstrate the high cohesion and low coupling associated with the calculus. A second analysis, from the viewpoint of computational complexity theory, demonstrates that a more representative bug search strategy than that considered by Rouson et al. (ACM Trans. Math. Soft. 34(1) (2008)) reduces the number of lines searched in a code with λ total lines from O(λ²) to O(λ log₂ λ), which in turn becomes nearly independent of the overall code size in the context of ADT calculus. The third analysis derives from information theory an argument that ADT calculus simplifies developer communications in part by minimizing the growth in interface information content as developers add new physics to a multiphysics package.
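
    The flavour of an ADT calculus is easy to show with operator overloading, although the article's setting is Fortran 2003 defined operators rather than the Python used in this sketch. The point is that the time-advancement line reads like the governing equation itself; the toy physics (exponential decay) is invented for illustration.

        import numpy as np

        class Field:
            """A field with overloaded + and scalar * so integrators read
            like mathematics."""
            def __init__(self, values):
                self.values = np.asarray(values, dtype=float)

            def __add__(self, other):
                return Field(self.values + other.values)

            def __rmul__(self, scalar):
                return Field(scalar * self.values)

        def dudt(u):
            return Field(-u.values)   # toy physics: u' = -u

        u, dt = Field([1.0, 2.0, 4.0]), 0.1
        for _ in range(3):
            u = u + dt * dudt(u)      # forward Euler, written as the equation reads
        print(u.values)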

  5. The Software Technology Center at Lawrence Livermore National Laboratory: Software engineering technology transfer in a scientific R&D laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Zucconi, L.

    1993-12-01

    Software engineering technology transfer for productivity and quality improvement can be difficult to initiate and sustain in a non-profit research laboratory where the concepts of profit and loss do not exist. In this experience report, the author discusses the approach taken to establish and maintain a software engineering technology transfer organization at a large R&D laboratory.

  6. A Classical Fuzzy Approach for Software Effort Estimation on Machine Learning Technique

    OpenAIRE

    S.Malathi; Sridhar, S.

    2011-01-01

    Software Cost Estimation with resounding reliability, productivity and development effort is a challenging and onerous task. This has incited the software community to give much needed thrust and delve into extensive research in software effort estimation for evolving sophisticated methods. Estimation by analogy is one of the expedient techniques in software effort estimation field. However, the methodology utilized for the estimation of software effort by analogy is not able to handle the ca...

  7. A Classical Fuzzy Approach for Software Effort Estimation on Machine Learning Technique

    CERN Document Server

    Malathi, S

    2011-01-01

    Software Cost Estimation with resounding reliability, productivity and development effort is a challenging and onerous task. This has incited the software community to give much needed thrust and delve into extensive research in software effort estimation for evolving sophisticated methods. Estimation by analogy is one of the expedient techniques in the software effort estimation field. However, the methodology utilized for the estimation of software effort by analogy is not able to handle categorical data in an explicit and precise manner. A new approach has been developed in this paper to estimate software effort for projects represented by categorical or numerical data using reasoning by analogy and a fuzzy approach. The existing historical data sets, analyzed with fuzzy logic, produce accurate results in comparison to the data sets analyzed with the earlier methodologies.

  8. A Classical Fuzzy Approach for Software Effort Estimation on Machine Learning Technique

    Directory of Open Access Journals (Sweden)

    S.Malathi

    2011-11-01

    Full Text Available Software Cost Estimation with resounding reliability, productivity and development effort is a challenging and onerous task. This has incited the software community to give much needed thrust and delve into extensive research in software effort estimation for evolving sophisticated methods. Estimation by analogy is one of the expedient techniques in software effort estimation field. However, the methodology utilized for the estimation of software effort by analogy is not able to handle the categorical data in an explicit and precise manner. A new approach has been developed in this paper to estimate software effort for projects represented by categorical or numerical data using reasoning by analogy and fuzzy approach. The existing historical datasets, analyzed with fuzzy logic, produce accurate results in comparison to the dataset analyzed with the earlier methodologies.
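
    Analogy-based estimation with a fuzzy similarity measure can be sketched in a few lines: fuzzify the distance between a new project and each historical project, then average the efforts of the closest analogues weighted by similarity. The history values, features and triangular membership function below are invented; the paper's datasets and exact fuzzification are not reproduced here.

        # Historical projects: (team_experience in [0, 1], size in KLOC, effort in person-months).
        HISTORY = [(0.9, 10, 24), (0.4, 12, 60), (0.7, 50, 140), (0.2, 45, 260)]

        def membership(a, b, spread=0.5):
            """Triangular fuzzy similarity of two normalized feature values."""
            return max(0.0, 1.0 - abs(a - b) / spread)

        def estimate(experience, size_kloc, k=2):
            """Weight the k most similar past projects by fuzzy similarity."""
            max_size = max(size for _, size, _ in HISTORY)
            sims = [(membership(experience, exp)
                     * membership(size_kloc / max_size, size / max_size), effort)
                    for exp, size, effort in HISTORY]
            top = sorted(sims, reverse=True)[:k]
            total = sum(s for s, _ in top)
            return sum(s * e for s, e in top) / total if total else None

        print(f"estimated effort: {estimate(0.8, 15):.1f} person-months")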

  9. From curve fitting to machine learning an illustrative guide to scientific data analysis and computational intelligence

    CERN Document Server

    Zielesny, Achim

    2016-01-01

    This successful book provides, in its second edition, an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way, topics like mathematical optimization and evolutionary algorithms are touched upon. All concepts and ideas are outlined in a clear-cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls, but to address them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions, like the relation between machine learning and human intelligence. All topics are completely demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed with M...

  10. PyRDM: A Python-based library for automating the management and online publication of scientific software and data

    Directory of Open Access Journals (Sweden)

    Christian T. Jacobs

    2014-10-01

    Full Text Available The recomputability and reproducibility of results from scientific software requires access to both the source code and all associated input and output data. However, the full collection of these resources often does not accompany the key findings published in journal articles, thereby making it difficult or impossible for the wider scientific community to verify the correctness of a result or to build further research on it. This paper presents a new Python-based library, PyRDM, whose functionality aims to automate the process of sharing the software and data via online, citable repositories such as Figshare. The library is integrated into the workflow of an open-source computational fluid dynamics package, Fluidity, to demonstrate an example of its usage.
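
    The local half of such a workflow is gathering source and data into a single archive with a checksum manifest; the upload to a citable repository such as Figshare (and the DOI it returns) is the half omitted in this sketch, and the paths are placeholders rather than PyRDM's actual interface.

        import hashlib
        import json
        import tarfile
        from pathlib import Path

        def bundle_for_publication(source_dir, data_files, out="archive.tar.gz"):
            """Archive source code plus input/output data and write a
            SHA-256 manifest alongside, ready for repository upload."""
            manifest = {}
            with tarfile.open(out, "w:gz") as tar:
                for path in [*Path(source_dir).rglob("*.py"), *map(Path, data_files)]:
                    manifest[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
                    tar.add(path)
            Path(out + ".manifest.json").write_text(json.dumps(manifest, indent=2))
            return out

        # bundle_for_publication("src", ["input.dat", "results.csv"])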

  11. An Evaluation of Output Quality of Machine Translation (Padideh Software vs. Google Translate)

    Science.gov (United States)

    Azer, Haniyeh Sadeghi; Aghayi, Mohammad Bagher

    2015-01-01

    This study aims to evaluate the translation quality of two machine translation systems in translating six different text-types, from English to Persian. The evaluation was based on criteria proposed by Van Slype (1979). The proposed model for evaluation is a black-box type, comparative and adequacy-oriented evaluation. To conduct the evaluation, a…

  12. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Full Text Available Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as in several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of a programming model is just one building block in a comprehensive approach to large-scale collaborative development, which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges, and highlighting the advantages which the CCA approach offers for collaborative development.
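
    The CCA style of composition can be gestured at with a provides/uses port pattern: a component publishes the ports it provides, declares the ports it uses, and a framework wires them together. All names below are illustrative; the real CCA framework, with its SIDL-described language-neutral interfaces, is far richer.

        class Component:
            def __init__(self):
                self.provided, self.used = {}, {}

            def provides(self, port_name, obj):
                self.provided[port_name] = obj

            def uses(self, port_name):
                return self.used[port_name]

        def connect(user, provider, port_name):
            """Framework step: wire a uses-port to a provides-port."""
            user.used[port_name] = provider.provided[port_name]

        class Integrals(Component):
            def __init__(self):
                super().__init__()
                self.provides("integrals", lambda basis: f"integrals over {basis}")

        class SCFSolver(Component):
            def energy(self, basis):
                return self.uses("integrals")(basis)

        ints, scf = Integrals(), SCFSolver()
        connect(scf, ints, "integrals")
        print(scf.energy("cc-pVDZ"))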

  13. Charlotte: Scientific Modeling and Simulation Under the Software as a Service Paradigm Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA devotes considerable effort to supporting collaborating researchers. These researchers are interested in interacting with scientific models provided...

  14. Software para el procesamiento de los ensayos de la máquina sincrónica; Software for processing Synchronous Machine test data

    Directory of Open Access Journals (Sweden)

    Orlando Lázaro Rodríguez González

    2011-02-01

    Full Text Available Synchronous machine tests, such as the no-load, short-circuit and zero power factor tests, are carried out to trace the zero power-factor, regulation and external load characteristics, as well as the voltage regulation for any given state. Graphical methods for these calculations are cumbersome and imprecise. This work solves the problem quasi-analytically, using the facilities provided by MATLAB. The software follows the same steps as the traditional manual method and implements the magnetomotive force method. The saturated portion of the curve is fitted with an arctangent function in the least-squares sense, and the interception point between curves is computed with numerical methods to a precision exceeding that of the experimental data. The software is intended for use in industry and for teaching at universities.
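
    The arctangent fit in the least-squares sense is a one-call job in most numerical environments; the record's implementation is in MATLAB, so the SciPy sketch below is only an illustration, and the sample points for the open-circuit characteristic are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def arctan_model(i_field, a, b, c):
            # Saturating no-load characteristic: arctangent plus a linear term.
            return a * np.arctan(b * i_field) + c * i_field

        i_field = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])             # field current, A
        voltage = np.array([55.0, 102.0, 138.0, 160.0, 173.0, 181.0])  # terminal volts

        params, _ = curve_fit(arctan_model, i_field, voltage, p0=(100.0, 1.0, 1.0))
        print("fitted a, b, c:", params)
        # Interception points with load lines can then be found numerically,
        # e.g. by root-finding on arctan_model(i, *params) - slope * i.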

  15. Walking the Talk: Adopting and Adapting Sustainable Scientific Software Development processes in a Small Biology Lab

    Directory of Open Access Journals (Sweden)

    Michael R. Crusoe

    2016-11-01

    Full Text Available The khmer software project provides both research and production functionality for large-scale nucleic-acid sequence analysis. The software implements several novel data structures and algorithms that perform data pre-filtering for common bioinformatics tasks, including sequence mapping and de novo assembly. Development is driven by a small lab with one full-time developer (MRC), as well as several graduate students and a professor (CTB) who contribute regularly to research features. Here we describe our efforts to bring better design, testing, and more open development to the khmer software project as of version 1.1. The khmer software is developed openly at http://github.com/dib-lab/khmer/.

  16. Scientific Component Technology Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S; Bosl, B; Dahlgren, T; Kumfert, G; Smith, S

    2003-02-07

    The laboratory has invested a significant amount of resources towards the development of high-performance scientific simulation software, including numerical libraries, visualization, steering, software frameworks, and physics packages. Unfortunately, because this software was not designed for interoperability and re-use, it is often difficult to share these sophisticated software packages among applications due to differences in implementation language, programming style, or calling interfaces. This LDRD Strategic Initiative investigated and developed software component technology for high-performance parallel scientific computing to address problems of complexity, re-use, and interoperability for laboratory software. Component technology is an extension of scripting and object-oriented software development techniques that specifically focuses on the needs of software interoperability. Component approaches based on CORBA, COM, and Java technologies are widely used in industry; however, they do not support massively parallel applications in science and engineering. Our research focused on the unique requirements of scientific computing on ASCI-class machines, such as fast in-process connections among components, language interoperability for scientific languages, and data distribution support for massively parallel SPMD components.

  17. Harnessing the power of big data: infusing the scientific method with machine learning to transform ecology

    Science.gov (United States)

    Most efforts to harness the power of big data for ecology and environmental sciences focus on data and metadata sharing, standardization, and accuracy. However, many scientists have not accepted the data deluge as an integral part of their research because the current scientific method is not scalab...

  18. Lifelong personal health data and application software via virtual machines in the cloud

    OpenAIRE

    Van Gorp, P.; Comuzzi, M.

    2014-01-01

    Personal Health Records (PHRs) should remain the lifelong property of patients, who should be able to show them conveniently and securely to selected caregivers and institutions. In this paper, we present MyPHRMachines, a cloud-based PHR system taking a radically new architectural solution to health record portability. In MyPHRMachines, health-related data and the application software to view and/or analyze it are separately deployed in the PHR system. After uploading their medical data to My...

  19. An open CAM system for dentistry on the basis of China-made 5-axis simultaneous contouring CNC machine tool and industrial CAM software.

    Science.gov (United States)

    Lu, Li; Liu, Shusheng; Shi, Shenggen; Yang, Jianzhong

    2011-10-01

    A China-made 5-axis simultaneous contouring CNC machine tool and domestically developed industrial computer-aided manufacture (CAM) technology were used for full crown fabrication and measurement of crown accuracy, with the aim of establishing an open CAM system for dental processing and promoting the introduction of a domestic dental computer-aided design (CAD)/CAM system. Commercially available scanning equipment was used to make a basic digital tooth model after crown preparation, and the CAD software that comes with the scanning device was employed to design the crown; domestic industrial CAM software then processed the crown data to generate a solid model for machining, and the China-made 5-axis simultaneous contouring CNC machine tool completed machining of the whole crown. The internal accuracy of the crown was measured using 3D-MicroCT. The results showed that the China-made 5-axis simultaneous contouring CNC machine tool, in combination with domestic industrial CAM technology, can be used for crown making, and the crown was well positioned in the die. The internal accuracy was successfully measured using 3D-MicroCT. It is concluded that an open CAM system for dentistry on the basis of a China-made 5-axis simultaneous contouring CNC machine tool and domestic industrial CAM software has been established, and that development of the system will promote the introduction of domestically produced dental CAD/CAM systems.

  20. The Center for Technology for Advanced Scientific Component Software (TASCS) Lawrence Livermore National Laboratory - Site Status Update

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W

    2008-12-03

    This report summarizes LLNL's progress for the period April through September of 2008 for the Center for Technology for Advanced Scientific Component Software (TASCS) SciDAC. The TASCS project is organized into four major thrust areas: CCA Environment (72%), Component Technology Initiatives (16%), CCA Toolkit (8%), and User and Application Outreach & Support (4%). The percentage of LLNL's effort allocation is shown in parentheses for each thrust area. Major thrust areas are further broken down into activity areas; LLNL's effort directed to each activity is shown in Figure 1. Enhancements, Core Tools, and Usability are all part of CCA Environment, and Software Quality is part of Component Technology Initiatives. The balance of this report covers our accomplishments in each of these activity areas.

  1. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    DEFF Research Database (Denmark)

    Marshall, Ian

    2014-01-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work with a wide range of core and section configurations and can thus be used in future drilling projects. Corganiser is written in the Python programming language and is implemented both as a graphical web interface and a command-line interface. It can be accessed online at http://130.226.247.137/.

  2. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    Science.gov (United States)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes TLM and the results of a series of experiments which were run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.

  3. Effective software design and development for the new graph architecture HPC machines.

    Energy Technology Data Exchange (ETDEWEB)

    Dechev, Damian

    2012-03-01

    Software applications need to change and adapt as modern architectures evolve. Nowadays, advancement in chip design translates to increased parallelism. Exploiting such parallelism is a major challenge in modern software engineering. Multicore processors are about to introduce a significant change in the way we design and use fundamental data structures. In this work we describe the design and programming principles of a software library of highly concurrent, scalable, and nonblocking data containers. In this project we have created algorithms and data structures for handling fundamental computations in massively multithreaded contexts, and we have incorporated these into a usable library with a familiar look and feel. In this work we demonstrate the first design and implementation of a wait-free hash table. Our multiprocessor data structure design allows a large number of threads to concurrently insert, remove, and retrieve information. Non-blocking designs alleviate the problems traditionally associated with the use of mutual exclusion, such as bottlenecks and thread-safety. Lock-freedom provides the ability to share data without some of the drawbacks associated with locks; however, these designs remain susceptible to starvation. Furthermore, wait-freedom provides all of the benefits of lock-free synchronization with the added assurance that every thread makes progress in a finite number of steps. This implies deadlock-freedom, livelock-freedom, starvation-freedom, freedom from priority inversion, and thread-safety. The challenges of providing the desirable progress and correctness guarantees of wait-free objects make their design and implementation difficult. There are few wait-free data structures described in the literature. Using only standard atomic operations provided by the hardware, our design is portable; therefore, it is applicable to a variety of data-intensive applications, including the domains of embedded systems and supercomputers. Our experimental

  4. LISIRD 2: Applying Standards and Open Source Software in Exploring and Serving Scientific Data

    Science.gov (United States)

    Wilson, A.; Lindholm, D. M.; Ware Dewolfe, A.; Lindholm, C.; Pankratz, C. K.; Snow, M.; Woods, T. N.

    2009-12-01

    The LASP Interactive Solar IRradiance Datacenter (LISIRD), http://lasp.colorado.edu/lisird, seeks to provide exploration of and access to solar irradiance data, models and other related data. These irradiance datasets, from the SME, UARS, TIMED, and SORCE missions, are primarily a function of time and often also wavelength. Their measurements are typically made on a scale of seconds and derived products are provided at daily cadence. The first version of the LISIRD site was built using non-standard, proprietary software. The non-standard application structure and tight coupling to a variety of dataset representations made changes arduous and maintenance difficult. Eventually the software vendor decided to no longer support a critical software component, further decreasing the viability of the site. In LISIRD 2, through the application of the Java EE standard coupled with open source software to fetch and plot the data, the functionality of the original site is being improved while the code structure is being streamlined and simplified. With relatively minimal effort, the new site can access and serve a greater variety of datasets in an easier fashion, and produce responsive, interactive plots of datasets overlaid and/or linked in time. And it does so using a significantly smaller code base that is, at the same time, much more flexible and extensible. In particular, LISIRD 2 heavily leverages powerful, flexible functionality provided by the Time Series Data Server (TSDS). The OPeNDAP compliant TSDS supports requests for any data that are a function of time. It can support scalar, vector, and spectra data types. Through the use of the Unidata NetCDF-Java library and NcML, the TSDS supports multiple input and output formats and is easily extended to support more. It also supports a variety of filters that can be chained and applied to the data on the server before being delivered. TSDS thinning capabilities make it easy for the clients to request appropriate data
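
    The thinning (decimation) behaviour described above can be illustrated with a short sketch. This is not the TSDS API, just a hypothetical server-side filter that caps the number of samples returned to a client:

        import numpy as np

        def thin(times, values, max_points):
            """Decimate a time series so a client receives at most max_points samples."""
            stride = max(1, int(np.ceil(len(times) / max_points)))
            return times[::stride], values[::stride]

        t = np.linspace(0, 86400, 86400)             # one day of 1-second-cadence data
        irradiance = 1361 + 0.1 * np.sin(t / 7000)   # hypothetical irradiance signal
        t_thin, irr_thin = thin(t, irradiance, 1000)
        print(len(t_thin))  # <= 1000 points, suitable for a responsive plot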

  5. Implementing Scientific Simulation Codes Highly Tailored for Vector Architectures Using Custom Configurable Computing Machines

    Science.gov (United States)

    Rutishauser, David

    2006-01-01

    The motivation for this work comes from an observation that amidst the push for Massively Parallel (MP) solutions to high-end computing problems such as numerical physical simulations, large amounts of legacy code exist that are highly optimized for vector supercomputers. Because re-hosting legacy code often requires a complete re-write of the original code, which can be a very long and expensive effort, this work examines the potential to exploit reconfigurable computing machines in place of a vector supercomputer to implement an essentially unmodified legacy source code. Custom and reconfigurable computing resources could be used to emulate an original application's target platform to the extent required to achieve high performance. To arrive at an architecture that delivers the desired performance subject to limited resources involves solving a multi-variable optimization problem with constraints. Prior research in the area of reconfigurable computing has demonstrated that designing an optimum hardware implementation of a given application under hardware resource constraints is an NP-complete problem. The premise of the approach is that the general issue of applying reconfigurable computing resources to the implementation of an application, maximizing the performance of the computation subject to physical resource constraints, can be made a tractable problem by assuming a computational paradigm, such as vector processing. This research contributes a formulation of the problem and a methodology to design a reconfigurable vector processing implementation of a given application that satisfies a performance metric. A generic, parametric, architectural framework for vector processing implemented in reconfigurable logic is developed as a target for a scheduling/mapping algorithm that maps an input computation to a given instance of the architecture. This algorithm is integrated with an optimization framework to arrive at a specification of the architecture parameters

  6. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    Science.gov (United States)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  7. Porting the AVS/Express scientific visualization software to Cray XT4.

    Science.gov (United States)

    Leaver, George W; Turner, Martin J; Perrin, James S; Mummery, Paul M; Withers, Philip J

    2011-08-28

    Remote scientific visualization, where rendering services are provided by larger scale systems than are available on the desktop, is becoming increasingly important as dataset sizes increase beyond the capabilities of desktop workstations. Uptake of such services relies on access to suitable visualization applications and the ability to view the resulting visualization in a convenient form. We consider five rules from the e-Science community to meet these goals with the porting of a commercial visualization package to a large-scale system. The application uses message-passing interface (MPI) to distribute data among data processing and rendering processes. The use of MPI in such an interactive application is not compatible with restrictions imposed by the Cray system being considered. We present details, and performance analysis, of a new MPI proxy method that allows the application to run within the Cray environment yet still support MPI communication required by the application. Example use cases from materials science are considered.

  8. Nuquantus: Machine learning software for the characterization and quantification of cell nuclei in complex immunofluorescent tissue images

    Science.gov (United States)

    Gross, Polina; Honnorat, Nicolas; Varol, Erdem; Wallner, Markus; Trappanese, Danielle M.; Sharp, Thomas E.; Starosta, Timothy; Duran, Jason M.; Koller, Sarah; Davatzikos, Christos; Houser, Steven R.

    2016-03-01

    Determination of fundamental mechanisms of disease often hinges on histopathology visualization and quantitative image analysis. Currently, the analysis of multi-channel fluorescence tissue images is primarily achieved by manual measurements of tissue cellular content and sub-cellular compartments. Since the current manual methodology for image analysis is a tedious and subjective approach, there is clearly a need for an automated analytical technique to process large-scale image datasets. Here, we introduce Nuquantus (Nuclei quantification utility software) - a novel machine learning-based analytical method, which identifies, quantifies and classifies nuclei based on cells of interest in composite fluorescent tissue images, in which cell borders are not visible. Nuquantus is an adaptive framework that learns the morphological attributes of intact tissue in the presence of anatomical variability and pathological processes. Nuquantus allowed us to robustly perform quantitative image analysis on remodeling cardiac tissue after myocardial infarction. Nuquantus reliably classifies cardiomyocyte versus non-cardiomyocyte nuclei and detects cell proliferation, as well as cell death in different cell classes. Broadly, Nuquantus provides innovative computerized methodology to analyze complex tissue images that significantly facilitates image analysis and minimizes human bias.
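
    Nuquantus itself is not reproduced here, but the general pattern it follows (training a classifier on per-nucleus morphological features) can be sketched with scikit-learn; the features and labels below are synthetic stand-ins, not the tool's actual model:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Hypothetical morphological features per nucleus: area, eccentricity, intensity.
        X = rng.normal(size=(200, 3))
        # Stand-in labels: 1 = cardiomyocyte nucleus, 0 = non-cardiomyocyte nucleus.
        y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5)  # cross-validated accuracy
        print(f"mean CV accuracy: {scores.mean():.2f}")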

  9. Embedded software simulation testing environment based on virtual machine

    Institute of Scientific and Technical Information of China (English)

    殷永峰; 刘斌; 王志

    2011-01-01

    In order to enhance the quality and reliability of embedded software, an embedded software simulation testing environment based on a virtual machine was proposed. Virtual machine technology was introduced into the simulation testing field of embedded software, and the construction principles of the embedded software simulation testing environment are described. In addition, virtual machine technology is analyzed, and an embedded software simulation testing virtual machine (ESSTVM), based on an extended program-transplantation virtual machine model, is proposed, together with a design scheme for its memory management and instruction system. The ESSTVM was applied to the design of an avionics embedded software simulation testing environment (AESSTE): the system structure of the testing environment was studied, the design and realization of both the test development and test execution systems are elaborated, and the portability of AESSTE is analyzed. The results show that the proposed method can effectively improve the universality, portability, and maintainability of embedded software testing environments.

  10. The Machine within the Machine

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Although virtual machines are widespread across CERN, you probably won't have heard of them unless you work for an experiment. Virtual machines - known as VMs - allow you to create a separate machine within your own, allowing you to run Linux on your Mac, or Windows on your Linux - whatever combination you need.   Using a CERN Virtual Machine, Linux analysis software runs on a MacBook. When it comes to LHC data, one of the primary issues collaborations face is the diversity of computing environments among collaborators spread across the world. What if an institute cannot run the analysis software because they use different operating systems? "That's where the CernVM project comes in," says Gerardo Ganis, PH-SFT staff member and leader of the CernVM project. "We were able to respond to experimentalists' concerns by providing a virtual machine package that could be used to run experiment software. This way, no matter what hardware they have ...

  11. MACHINE LEARNING APPROACHES IN IMPROVING SERVICE LEVEL AGREEMENT-BASED ADMISSION CONTROL FOR A SOFTWARE-AS-A-SERVICE PROVIDER IN CLOUD

    Directory of Open Access Journals (Sweden)

    R. S. Mohana

    2013-01-01

    Full Text Available Software as a Service (SaaS) offers reliable access to software applications for end users over the Internet without direct investment in infrastructure and software. SaaS providers utilize resources of internal data centres or rent resources from a public Infrastructure as a Service (IaaS) provider in order to serve their customers. Internal hosting can inflate the cost of administration and maintenance, whereas renting from an IaaS provider can impact service quality due to its variable performance. To surmount these drawbacks, we propose admission control and scheduling algorithms for SaaS providers to effectively utilize public Cloud resources, maximizing profit by minimizing cost and improving customer satisfaction. A weakness of this approach is the robustness of the algorithms when handling errors in the dynamic cloud environment, and a machine learning method is needed to predict strategies and provision resources accordingly. The admission control, provided by an SLA-based trust model, uses different strategies to decide whether to accept user requests so that there is minimal performance impact, avoiding SLA penalties and yielding higher profit. The machine learning method aims at building a distributed system for cloud resource monitoring and prediction that includes learning-based methodologies for modelling and optimization of resource prediction models. Artificial Neural Networks (ANN) and Support Vector Machines (SVM), two typical machine learning strategies in the category of regression computation, can be employed for modelling resource state prediction. In addition, we conduct a widespread evaluation study to analyze which solution matches best in which scenario to maximize the SaaS provider's profit. Results obtained through our extensive simulation show that our proposed algorithms provide significant improvement (up to 40% cost saving over
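
    As a rough sketch of regression-based resource prediction feeding an admission decision, the following uses scikit-learn's SVR on invented load/response-time history; the SLA deadline and the admit rule are illustrative, not the paper's algorithm:

        import numpy as np
        from sklearn.svm import SVR

        # Hypothetical history: request load -> observed VM response time (seconds).
        load = np.array([[10], [20], [40], [60], [80], [100]])
        resp = np.array([0.2, 0.3, 0.6, 1.1, 1.9, 3.2])

        model = SVR(kernel="rbf", C=10.0).fit(load, resp)

        def admit(request_load, sla_deadline):
            """Admit a request only if the predicted response time meets the SLA."""
            predicted = model.predict([[request_load]])[0]
            return predicted <= sla_deadline

        print(admit(70, sla_deadline=1.5))  # True/False depending on the prediction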

  12. ASTRI SST-2M prototype and mini-array data reconstruction and scientific analysis software in the framework of the Cherenkov Telescope Array

    Science.gov (United States)

    Lombardi, Saverio; Antonelli, Lucio A.; Bastieri, Denis; Donnarumma, Imma; Lucarelli, Fabrizio; Madonna, Alberto; Mastropietro, Michele

    2016-07-01

    In the framework of the international Cherenkov Telescope Array (CTA) gamma-ray observatory, the Italian National Institute for Astrophysics (INAF) is developing a dual-mirror, small-sized, end-to-end prototype (ASTRI SST-2M), inaugurated in September 2014 at Mt. Etna (Italy), and a mini-array composed of nine ASTRI telescopes, proposed to be installed at the southern CTA site. The ASTRI mini-array is a collaborative effort led by INAF and carried out by institutes from Italy, Brazil, and South Africa. The project also includes the full data handling chain from raw data up to final scientific products. To this end, dedicated software for the online/on-site/off-site data reconstruction and scientific analysis is under development for both the ASTRI SST-2M prototype and the mini-array. The software is designed following a modular approach in which each single component and the entire pipeline are developed in compliance with the CTA requirements. Data reduction is conceived to run on parallel computing architectures, such as multi-core CPUs and graphics accelerators (GPUs), and on new hardware architectures based on low-power-consumption processors (e.g. ARM). The software components are coded in C++/Python/CUDA and wrapped by efficient pipelines written in Python. The final scientific products are then achieved by means of either science tools currently being used in the CTA Consortium (e.g. ctools) or specifically developed ones. In this contribution, we present the framework and the main software components of the ASTRI SST-2M prototype and mini-array data reconstruction and scientific analysis software package, and report the status of its development.

  13. System and Software Design for the Man Machine Interface System for Shin-Hanul Nuclear Power Plant Units 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Woong Seock; Kim, Chang Ho; Lee, Yoon Hee; Sohn, Se Do; Baek, Seung Min [KEPCO E and C, Daejeon (Korea, Republic of)

    2015-10-15

    The design of the safety MMIS (Man-Machine Interface System) has been performed using the POSAFE-Q Programmable Logic Controller (PLC), and the design of the non-safety MMIS has been performed using the OPERASYSTEM Distributed Control System (DCS). This paper describes the design experience gained from developing the MMIS on these new platforms. The SHN 1 and 2 MMIS has been developed using the POSAFE-Q platform for the safety systems and OPERASYSTEM for the non-safety systems. Through the use of standardized platforms, the safety system was built from the above hardware and software blocks, resulting in efficient safety system development. An integrated CASE tool has been set up for reliable software development, and the integrated development environment has been established formally, resulting in consistent work. Although an integrated development environment is in place, an independent verification and validation environment, including a testing environment, still needs to be set up; this more advanced environment will be used for future plants.

  14. Development of Fingerprint Attendance Machine Statistical Software Based on JAVA

    Institute of Scientific and Technical Information of China (English)

    李德平

    2012-01-01

    This paper analyzes why the attendance data exported by the fingerprint attendance machine's bundled software does not meet the actual statistical requirements of enterprises. It then proposes a solution that can meet enterprises' needs for real-time attendance statistics, designs new statistical algorithms on that basis, and develops attendance statistics software based on JAVA.

  15. Discussion on the Software Evaluation of Washing Machine Products

    Institute of Scientific and Technical Information of China (English)

    胡润泽

    2015-01-01

    Based on the basic definitions and test clauses of IEC 60335-1:2010, Household and similar electrical appliances - Safety - Part 1: General requirements, this paper summarizes and analyzes the applicability of software evaluation to washing machine products, and discusses the technical requirements for software evaluation in Annex R of IEC 60335-1:2010.

  16. Path planning for multi-head drilling machines and simulation software development

    Institute of Scientific and Technical Information of China (English)

    梁全

    2011-01-01

    This paper analyzes the drilling path planning problem for CNC drilling machines with multiple spindle heads. Taking full account of the machine's mechanical structure and machining efficiency requirements, it proposes a drilling path planning algorithm for multi-head CNC drilling machines. The structure of the DXF file is analyzed first; the hole group in two-dimensional space is then decomposed into hole groups in one-dimensional space for path planning. To verify the feasibility of the algorithm in real machining, simulation software was also developed; a drilling path was planned for a tube-plate part from an actual project and machined in simulation. The simulation results show that the algorithm is correct and reliable and can be used for drilling path planning on multi-head CNC drilling machines.
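
    The paper's exact algorithm is not reproduced in the abstract, but the idea of reducing a 2-D hole group to 1-D groups for path planning can be sketched as a row-wise serpentine ordering; the hole coordinates below are invented:

        from collections import defaultdict

        # Hypothetical hole coordinates, as if parsed from a DXF file.
        holes = [(10, 0), (50, 0), (30, 0), (20, 8), (60, 8), (40, 16)]

        # Group the 2-D hole set into 1-D rows by y, then serpentine through the
        # rows, alternating x direction to shorten travel (a common simplification,
        # not necessarily the paper's exact decomposition).
        rows = defaultdict(list)
        for x, y in holes:
            rows[y].append(x)

        path = []
        for i, y in enumerate(sorted(rows)):
            xs = sorted(rows[y], reverse=(i % 2 == 1))
            path.extend((x, y) for x in xs)

        print(path)  # drilling order for the tool head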

  17. Objective detection of apoptosis in rat renal tissue sections using light microscopy and free image analysis software with subsequent machine learning: Detection of apoptosis in renal tissue.

    Science.gov (United States)

    Macedo, Nayana Damiani; Buzin, Aline Rodrigues; de Araujo, Isabela Bastos Binotti Abreu; Nogueira, Breno Valentim; de Andrade, Tadeu Uggere; Endringer, Denise Coutinho; Lenz, Dominik

    2017-02-01

    The current study proposes an automated machine learning approach for the quantification of cells in cell death pathways according to DNA fragmentation. A total of 17 images of kidney histological slide samples from male Wistar rats were used. The slides were photographed using an Axio Zeiss Vert.A1 microscope with a 40x objective lens coupled with an Axio Cam MRC Zeiss camera and Zen 2012 software. The images were analyzed using CellProfiler (version 2.1.1) and CellProfiler Analyst open-source software. Out of the 10,378 objects, 4,970 (47.9%) were identified as TUNEL positive, and 5,408 (52.1%) were identified as TUNEL negative. On average, the sensitivity and specificity values of the machine learning approach were 0.80 and 0.77, respectively. Image cytometry provides a quantitative analytical alternative to the more traditional qualitative methods commonly used in such studies.
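
    For reference, the reported sensitivity and specificity follow directly from a confusion matrix; the counts below are illustrative, not the study's raw data:

        # Sensitivity and specificity from a confusion matrix (illustrative counts).
        tp, fn = 80, 20   # TUNEL-positive nuclei: correctly / incorrectly classified
        tn, fp = 77, 23   # TUNEL-negative nuclei: correctly / incorrectly classified

        sensitivity = tp / (tp + fn)   # 0.80, matching the study's reported average
        specificity = tn / (tn + fp)   # 0.77
        print(sensitivity, specificity)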

  18. Research on the Software Framework of Lockstitch Sewing Machine

    Institute of Scientific and Technical Information of China (English)

    胡延苏; 何德全; 高昂

    2009-01-01

    Current research on lockstitch sewing machine control systems focuses mainly on motor control and the implementation of the hardware system, with little attention paid to the software framework. This paper describes the design of the double-loop servo motor control system and the sewing machine control system, proposes a micro-task-based software framework for the sewing machine control system aimed at reducing software coupling, and analyzes and verifies the feasibility of the design.

  19. Community Capacity Building as a vital mechanism for enhancing the growth and efficacy of a sustainable scientific software ecosystem: experiences running a real-time bi-coastal "Open Science for Synthesis" Training Institute for young Earth and Environmental scientists

    Science.gov (United States)

    Schildhauer, M.; Jones, M. B.; Bolker, B.; Lenhardt, W. C.; Hampton, S. E.; Idaszak, R.; Rebich Hespanha, S.; Ahalt, S.; Christopherson, L.

    2014-12-01

    Continuing advances in computational capabilities, access to Big Data, and virtual collaboration technologies are creating exciting new opportunities for accomplishing Earth science research at finer resolutions, with much broader scope, using powerful modeling and analytical approaches that were unachievable just a few years ago. Yet, there is a perceptible lag in the abilities of the research community to capitalize on these new possibilities, due to lacking the relevant skill-sets, especially with regard to multi-disciplinary and integrative investigations that involve active collaboration. UC Santa Barbara's National Center for Ecological Analysis and Synthesis (NCEAS), and the University of North Carolina's Renaissance Computing Institute (RENCI), were recipients of NSF OCI S2I2 "Conceptualization awards", charged with helping define the needs of the research community relative to enabling science and education through "sustained software infrastructure". Over the course of our activities, a consistent request from Earth scientists was for "better training in software that enables more effective, reproducible research." This community-based feedback led to the creation of an "Open Science for Synthesis" Institute, an innovative three-week, bi-coastal training program for early-career researchers. We provided a mix of lectures, hands-on exercises, and working group experience on topics including: data discovery and preservation; code creation, management, sharing, and versioning; scientific workflow documentation and reproducibility; statistical and machine modeling techniques; virtual collaboration mechanisms; and methods for communicating scientific results. All technologies and quantitative tools presented were suitable for advancing open, collaborative, and reproducible synthesis research. In this talk, we will report on the lessons learned from running this ambitious training program, which involved coordinating classrooms between two remote sites, and

  20. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.

  1. A special CAD/CAM software package for electro-discharge machining of shrouded turbine blisks

    Institute of Scientific and Technical Information of China (English)

    李刚; 赵万生; 王振龙; 吴湘

    2007-01-01

    In this paper, a special-purpose CAD/CAM software package, BliskCad/Cam, based on the commercial CAD/CAM software Unigraphics, is developed to reduce the difficulties in CNC-EDM of shrouded turbine blisks. The software package consists of five modules, including an electrode design module, a path searching module, and a machining simulation module. Functions of BliskCad/Cam include parametric reconstruction of the 3-D model of the blisk, intelligent design of complex-shaped electrodes, automatic generation of NC codes, search for interference-free tool paths for multi-axis NC-EDM, and machining simulation. Experimental verification conducted using BliskCad/Cam shows that it satisfies the requirements and can realize precision machining while remarkably reducing auxiliary time.

  2. Man versus Machine: Software Training for Surgeons-An Objective Evaluation of Human and Computer-Based Training Tools for Cataract Surgical Performance.

    Science.gov (United States)

    Din, Nizar; Smith, Phillip; Emeriewen, Krisztina; Sharma, Anant; Jones, Simon; Wawrzynski, James; Tang, Hongying; Sullivan, Paul; Caputo, Silvestro; Saleh, George M

    2016-01-01

    This study aimed to address two queries: firstly, the relationship between two cataract surgical feedback tools for training, one human and one software based, and, secondly, evaluating microscope control during phacoemulsification using the software. Videos of surgeons with varying experience were enrolled and independently scored with the validated PhacoTrack motion capture software and the Objective Structured Assessment of Cataract Surgical Skill (OSACCS) human scoring tool. Microscope centration and path length travelled were also evaluated with the PhacoTrack software. Twenty-two videos were used to correlate PhacoTrack motion capture with OSACCS. The PhacoTrack path length, number of movements, and total procedure time were found to have high levels of Spearman's rank correlation with OSACCS: -0.6792619 (p = 0.001), -0.6652021 (p = 0.002), and -0.771529 (p = 0.001), respectively. Sixty-two videos were used to evaluate microscope camera control. Novice surgeons had their camera off the pupil centre at a far greater mean distance (SD) of 6.9 (3.3) mm, compared with 3.6 (1.6) mm for experts (p < 0.05). The expert surgeons maintained good microscope camera control and limited the total pupil path length travelled to 2512 (1031) mm, compared with 4049 (2709) mm for novices (p < 0.05). Good agreement between human and machine quantified measurements of surgical skill exists. Our results demonstrate that surrogate markers for camera control are predictors of surgical skills.
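
    A sketch of the two quantities at the core of this comparison, Spearman rank correlation and path length, computed with SciPy/NumPy on invented per-video data:

        import numpy as np
        from scipy.stats import spearmanr

        # Hypothetical per-video measurements (assumed data, not the study's).
        osaccs_scores = np.array([12, 18, 25, 31, 36, 40])              # human ratings
        path_lengths = np.array([4800, 4100, 3300, 2900, 2600, 2400])  # mm, software

        rho, p = spearmanr(osaccs_scores, path_lengths)
        print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")  # negative, as in the paper

        # Path length itself is just the summed distance between successive positions.
        def path_length(xy):
            return np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1))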

  3. Time Management of Scientific Research Projects Based on Microsoft Office Software

    Institute of Scientific and Technical Information of China (English)

    阮鹏

    2012-01-01

    This article elaborates the existing problems and difficulties in the time management of scientific research projects: the schedule in a project proposal is fuzzy and often diverges from the actual needs of later research; the planned and actual research content are hard to specify precisely, so the proposal's research content affects the decomposition of actual research tasks; and researchers often lack project management knowledge and methods and are unfamiliar with professional tools for project time management. Starting from actual research management practice, the author uses the Word and Project software in the Microsoft Office system to produce, respectively, a work breakdown structure and a Gantt chart of research progress, and illustrates their use with examples.
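
    A research-progress Gantt chart of the kind described can also be sketched programmatically; the following uses Python/matplotlib with hypothetical tasks (the article itself uses Microsoft Project):

        import matplotlib.pyplot as plt

        # Hypothetical research tasks: (name, start week, duration in weeks).
        tasks = [("Literature review", 0, 4), ("Experiments", 3, 8), ("Writing", 10, 5)]

        fig, ax = plt.subplots()
        for i, (name, start, dur) in enumerate(tasks):
            ax.broken_barh([(start, dur)], (i - 0.4, 0.8))  # one bar per task
            ax.text(start, i, name, va="center")
        ax.set_xlabel("Week")
        ax.set_yticks([])
        plt.savefig("gantt.png")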

  4. Design of a script interpreter virtual machine for embedded configuration software

    Institute of Scientific and Technical Information of China (English)

    廖义奎; 李智; 李为民; 韦卫星; 韦方海

    2012-01-01

    In order to enhance the functionality of embedded configuration software, a C-like script language is introduced. A compiler is designed to compile the script into intermediate code; using intermediate code improves running speed and reduces the design difficulty of the script interpreter. A script virtual machine design imitating a microprocessor architecture is proposed, in which the intermediate code is interpreted at run time. The virtual machine is mainly composed of program memory, an instruction decoder, an arithmetic unit, a program counter, a controller, and dynamic containers. Among these, the design of the dynamic container is the most critical: it can dynamically allocate memory and automatically release it, and is well suited to running under an embedded operating system. Experiments and tests show that the script interpreter virtual machine satisfies the design requirements of embedded configuration software.
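
    The fetch-decode-execute structure described above (program memory, instruction decoder, program counter) can be illustrated with a toy stack-based interpreter loop in Python; the opcodes are invented and are not the paper's instruction set:

        # Toy interpreter loop: a program memory, a program counter, and an
        # instruction decoder, mirroring the structure described in the abstract.
        program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
                   ("PRINT", None), ("HALT", None)]

        stack, pc = [], 0
        while True:
            op, arg = program[pc]          # fetch + decode
            pc += 1
            if op == "PUSH":
                stack.append(arg)
            elif op == "ADD":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "PRINT":
                print(stack[-1])           # -> 5
            elif op == "HALT":
                break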

  5. CIVACUVE analysis software for MIS machine examination of pressurized water reactor vessels

    Energy Technology Data Exchange (ETDEWEB)

    Dubois, Ph.; Gagnor, A. [Intercontrole, 94 - Rungis (France)

    2001-07-01

    The software product CIVACUVE is used by INTERCONTROLE for the analysis of UT detection examinations performed by the In-Service Inspection Machine (MIS) on the vessels of nuclear power plants. The software is based on an adaptation of a SEGMENTATION algorithm (CEA CEREM), which is applied prior to any analysis, and is equipped with tools adapted to industrial use. It allows the user to: perform image analysis with advanced graphic tools (zooms, true B-scan, 'contour' selection...); back up all data in a database (complete and transparent backup of all information used and obtained during the different analysis operations); connect PCs to the database (export of reports and even of segmented points); issue examination reports, operating condition sheets, and sizing curves; and, finally, perform graphic and numerical comparisons between different inspections of the same vessel. Used in Belgium and France on different kinds of reactor vessels, CIVACUVE has shown that the principle of SEGMENTATION can be adapted to detection examinations. The use of CIVACUVE produces an important time gain as well as better analysis quality. Wide data availability to PCs allows real flexibility with regard to clients' requirements and concerns.

  6. What Comes First, the OWL or the Bean? Creating Reusable Scientific Software with OWL/RDF Vocabularies.

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, Eric G.; Elsethagen, Todd O.; Kleese van Dam, Kerstin; Riihimaki, Laura D.

    2013-09-30

    In the spring of 2013, the U.S. White House by executive order mandated: "Government information shall be managed as an asset throughout its life cycle to promote interoperability and openness, and, wherever possible and legally permissible, to ensure that data are released to the public in ways that make the data easy to find, accessible, and usable." Key to the reusability of any scientific data is the availability of metadata describing the published data in a vocabulary that is familiar to its potential users. The objective of this paper is to help scientific application developers who want to adopt the continuous stream of new community vocabularies to make their data sharable, self-describing, and easily understood. To achieve this, we suggest best practices for semantic vocabulary and application integration, and discuss the tradeoffs of encoding vocabularies through code versus deriving code from vocabularies.
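
    A minimal sketch of publishing data against a vocabulary with rdflib in Python; the namespace URI and terms are placeholders, not a real community vocabulary, and rdflib 6+ is assumed so that serialize returns a string:

        from rdflib import Graph, Literal, Namespace, RDF

        EX = Namespace("http://example.org/vocab#")  # placeholder vocabulary
        g = Graph()
        g.bind("ex", EX)

        # Describe one observation using terms from the (placeholder) vocabulary.
        obs = EX["observation1"]
        g.add((obs, RDF.type, EX.Measurement))
        g.add((obs, EX.value, Literal(42.0)))

        print(g.serialize(format="turtle"))  # self-describing RDF for publication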

  7. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, an acronym for Automatically Programed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programing NC machine tools. The APT system includes the specification of the APT programing language and a language processor, which executes APT statements and generates the NC machine-tool motions they specify.


  9. The Application of Scientific Computing Software SCILAB in Exam Analysis

    Institute of Scientific and Technical Information of China (English)

    曲霄红

    2011-01-01

    After large-scale educational examinations, extensive exam results need to be analyzed and interpreted so that decisions about students can be made scientifically. The free scientific computing software SCILAB is a well-known open-source package, popular for its numerical calculation and result visualization. This paper mainly introduces the application of SCILAB's result visualization features to exam score analysis, plotting score distribution charts from common statistical indicators so that the distribution characteristics of the results are clear at a glance, helping education administrators grasp the overall examination situation.
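
    The same kind of score-distribution chart can be sketched in a few lines; the following uses Python/matplotlib as a stand-in for SCILAB's plotting, with mock scores:

        import numpy as np
        import matplotlib.pyplot as plt

        # Mock exam scores for 500 students (assumed data for illustration).
        scores = np.random.default_rng(1).normal(72, 12, 500).clip(0, 100)

        # Common statistical indicators, then the distribution chart itself.
        print(f"mean={scores.mean():.1f}, std={scores.std():.1f}, "
              f"median={np.median(scores):.1f}")
        plt.hist(scores, bins=np.arange(0, 101, 5), edgecolor="black")
        plt.xlabel("Score")
        plt.ylabel("Number of students")
        plt.savefig("score_distribution.png")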

  10. Quantum machine learning.

    Science.gov (United States)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  11. [Method for evaluating the mechanical isocenter of the gantry of a radiotherapy machine with motion picture trace analysis software].

    Science.gov (United States)

    Yanano, Nobutaka; Fujibuchi, Toshioh

    2012-01-01

    In recent years, the development of advanced radiotherapy technology has resulted in improvements in radiotherapy. Although radiotherapy systems have improved, the effects of gaps, the gyration center, and distortion of the rotation orbit cannot be neglected. Therefore, a verification method for the geometrical isocenter and rotation orbit in three-dimensional (3D) space is required. We developed a verification method for determining the geometrical isocenter. In this method, the rotation of the gantry, with a measurement target attached, is imaged from two directions and analyzed using motion picture trace analysis software. The measurement targets are tracked by the analysis, and the rotation orbit of the target is visually evaluated from the obtained coordinates and displacement distances. The gyration center in 3D space is calculated from the tracked coordinates and compared with the intersection of the side lasers and the crosshair. With this verification method, the rotation orbit and geometrical isocenter in 3D space were confirmed and visually evaluated. Thus, this method was effective in verifying the geometrical isocenter while addressing measurement precision and reproducibility.
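
    One common way to estimate a gyration center from tracked target coordinates is an algebraic least-squares circle fit; the sketch below (Python/NumPy, simulated 2-D track data) illustrates that idea and is not the authors' exact procedure:

        import numpy as np

        # Simulated 2-D target positions over one gantry rotation, with noise.
        theta = np.linspace(0, 2 * np.pi, 36)
        pts = np.c_[0.3 + 50 * np.cos(theta), -0.1 + 50 * np.sin(theta)]
        pts += np.random.default_rng(2).normal(0, 0.05, pts.shape)

        # Algebraic (Kasa) circle fit: x^2 + y^2 = 2a*x + 2b*y + c, solved by
        # linear least squares; (a, b) is the estimated gyration center.
        A = np.c_[2 * pts[:, 0], 2 * pts[:, 1], np.ones(len(pts))]
        rhs = (pts ** 2).sum(axis=1)
        (a_c, b_c, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        print(f"estimated gyration center: ({a_c:.3f}, {b_c:.3f})")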

  12. Educational Software.

    Science.gov (United States)

    Northwest Regional Educational Lab., Portland, OR.

    The third session of IT@EDU98 consisted of five papers on educational software and was chaired by Tran Van Hao (University of Education, Ho Chi Minh City, Vietnam). "Courseware Engineering" (Nguyen Thanh Son, Ngo Ngoc Bao Tran, Quan Thanh Tho, Nguyen Hong Lam) briefly describes the use of courseware. "Machine Discovery Theorems in Geometry: A…

  13. Software Patents.

    Science.gov (United States)

    Burke, Edmund B.

    1994-01-01

    Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…

  14. Software Reviews.

    Science.gov (United States)

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  15. Solving the Software Legacy Problem with RISA

    Science.gov (United States)

    Ibarra, A.; Gabriel, C.

    2012-09-01

    Nowadays hardware and system infrastructure evolve on time scales much shorter than the typical duration of space astronomy missions. Data processing software capabilities have to evolve to preserve the scientific return during the entire experiment life time. Software preservation is a key issue that has to be tackled before the end of the project to keep the data usable over many years. We present RISA (Remote Interface to Science Analysis) as a solution to decouple data processing software and infrastructure life-cycles, using JAVA applications and web-services wrappers to existing software. This architecture employs embedded SAS in virtual machines assuring a homogeneous job execution environment. We will also present the first studies to reactivate the data processing software of the EXOSAT mission, the first ESA X-ray astronomy mission launched in 1983, using the generic RISA approach.

  16. Software aids for reading scientific documentary texts: Evaluating software to help students understand scientific text

    Directory of Open Access Journals (Sweden)

    Brigitte Marin

    2007-03-01

    Full Text Available A series of studies allowed us to develop a hypertext system that helps students understand scientific texts and to evaluate the effect of two kinds of explanatory notes: one based on the "text base", the other on the "situation model". The notes were presented on paper and on screen, in reading-for-training situations and in situations where reading was tied to formulating and solving a scientific problem. The results show that the hypertext presentation helps students build a more coherent representation of the situation evoked by the text. This is particularly the case when the reader gets explanatory notes connected to the "situation model", i.e., notes that explicitly provide knowledge, explicitly linked to the information in the text, that allows the reader to fill in its "semantic gaps".

  17. Application of Open Source Scientific Numerical Calculation Software QtOctave

    Institute of Scientific and Technical Information of China (English)

    关东; 岳云娟; 邹晨

    2012-01-01

    QtOctave is a scientific numerical calculation software with a graphical integrated development environment for the Linux operating system. It is a Matlab-like package supported by the GNU project, and its Octave language is compatible with Matlab's m language. QtOctave can easily be applied to a numerical simulation of the Lorenz attractor; the experiment shows that QtOctave shares Matlab's advantages of being easy to use, easy to learn, and powerful. QtOctave is well worth popularizing in numerical simulation experiments and teaching activities.
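
    The Lorenz-attractor experiment mentioned above is easy to reproduce; here is a minimal sketch in Python with SciPy (QtOctave users would write the equivalent Octave m-code), using the classic parameter values:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Lorenz system with the classic textbook parameters.
        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        sol = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0], max_step=0.01)
        print(sol.y.shape)  # plot sol.y[0] vs sol.y[2] to see the attractor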

  18. Army-NASA aircrew/aircraft integration program. Phase 5: A3I Man-Machine Integration Design and Analysis System (MIDAS) software concept document

    Science.gov (United States)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Neukom, Christian; Nishimura, Sayuri; Prevost, Michael; Shankar, Renuka; Staveland, Lowell; Smith, Greg

    1992-01-01

    This is the Software Concept Document for the Man-machine Integration Design and Analysis System (MIDAS) being developed as part of Phase V of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The approach taken in this program since its inception in 1984 is that of incremental development with clearly defined phases. Phase 1 began in 1984 and subsequent phases have progressed at approximately 10-16 month intervals. Each phase of development consists of planning, setting requirements, preliminary design, detailed design, implementation, testing, demonstration and documentation. Phase 5 began with an off-site planning meeting in November, 1990. It is expected that Phase 5 development will be complete and ready for demonstration to invited visitors from industry, government and academia in May, 1992. This document, produced during the preliminary design period of Phase 5, is intended to record the top level design concept for MIDAS as it is currently conceived. This document has two main objectives: (1) to inform interested readers of the goals of the MIDAS Phase 5 development period, and (2) to serve as the initial version of the MIDAS design document which will be continuously updated as the design evolves. Since this document is written fairly early in the design period, many design issues still remain unresolved. Some of the unresolved issues are mentioned later in this document in the sections on specific components. Readers are cautioned that this is not a final design document and that, as the design of MIDAS matures, some of the design ideas recorded in this document will change. The final design will be documented in a detailed design document published after the demonstrations.

  19. Design of Multitask Man-Machine Interaction Software for Command and Control Systems Based on Tilcon

    Institute of Scientific and Technical Information of China (English)

    朱伟; 许春雷; 孔军

    2011-01-01

    Tilcon is a man-machine interface development tool widely used in the military domain, especially in command and control. This paper briefly introduces Tilcon's composition and its man-machine interaction processing flow, analyzes the shortcomings of handling multitask man-machine interaction events with Tilcon, designs a Tilcon-based multitask man-machine interaction software architecture for command and control systems, and gives a concrete implementation method. Engineering applications show that the architecture is clear, with good robustness, openness, and compatibility, and that it can significantly improve the real-time performance of man-machine interaction in command and control systems.

  20. Workstation software framework

    Science.gov (United States)

    Andolfato, L.; Karban, R.

    2008-08-01

    The Workstation Software Framework (WSF) is a state machine model driven development toolkit designed to generate event driven applications based on ESO VLT software. State machine models are used to generate executables. The toolkit provides versatile code generation options and it supports Mealy, Moore and hierarchical state machines. Generated code is readable and maintainable since it combines well known design patterns such as the State and the Template patterns. WSF promotes a development process that is based on model reusability through the creation of a catalog of state machine patterns.
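
    The kind of event-driven code WSF generates can be suggested with a toy State-pattern machine in Python; the states and events below are invented examples, not generated WSF output:

        # Minimal State-pattern sketch: each state decides the next state per event.
        class State:
            def on_event(self, event):
                return self  # default: stay in the same state

        class Idle(State):
            def on_event(self, event):
                return Running() if event == "start" else self

        class Running(State):
            def on_event(self, event):
                return Idle() if event == "stop" else self

        state = Idle()
        for event in ["start", "tick", "stop"]:
            state = state.on_event(event)  # event-driven transition
            print(event, "->", type(state).__name__)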

  1. Software Innovations Speed Scientific Computing

    Science.gov (United States)

    2012-01-01

    To help reduce the time needed to analyze data from missions like those studying the Sun, Goddard Space Flight Center awarded SBIR funding to Tech-X Corporation of Boulder, Colorado. That work led to commercial technologies that help scientists accelerate their data analysis tasks. Thanks to its NASA work, the company doubled its number of headquarters employees to 70 and generated about $190,000 in revenue from its NASA-derived products.

  2. SOFTWARE TOOLS; program development interface. [Base version (this version is not tailored to any one machine but serves as a portable base for the user, who can add 'primitives' or modify the base source to tailor SOFTWARE TOOLS to the local computing environment); FORTRAN IV and RATFOR]

    Energy Technology Data Exchange (ETDEWEB)

    Scherrer, D.K.

    One of the problems encountered by computer users is the lack of common utility routines across different computer systems. The software initially presented in Kernighan and Plauger's SOFTWARE TOOLS represented a first step toward a solution to this problem. A common editor, text formatter, sort, and other program development tools were presented through two mechanisms: (a) all source was written in RATFOR, a FORTRAN preprocessor language directly translatable into FORTRAN, and (b) system-dependent routines were pushed down either into macro replacements or primitive function calls, to be implemented by the individual charged with bringing up the utilities in the local computing environment. These mechanisms, together with the adoption of certain conventions pertaining to data types, permit many sites running different operating systems to implement these tools. If the shell, or command line interpreter, is implemented, this software can essentially define a portable "virtual operating system" providing inter-system uniformity at three levels of user interface: virtual machine (the primitives), utilities, and command language. The SOFTWARE TOOLS package consists of a set of program development utilities and a program library modelled after Bell Laboratories' proprietary UNIX operating system.

  3. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  4. Debugging the virtual machine

    Energy Technology Data Exchange (ETDEWEB)

    Miller, P.; Pizzi, R.

    1994-09-02

    A computer program is really nothing more than a virtual machine built to perform a task. The program's source code expresses abstract constructs using low-level language features. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low-level machine implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design into the source code. We introduce OODIE, an object-oriented language extension that allows programmers to specify a virtual debugging environment which includes the design and abstract data types of the virtual machine.
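
    OODIE itself is a language extension and the abstract gives no syntax, but the underlying idea - stating the design's abstract invariants in the source so that failures are reported at the level of the design rather than the machine - can be sketched in Python (a hypothetical illustration, not OODIE):

      # Sketch: attach a design-level invariant to an abstract data type
      # (here, a bounded stack) so that a violation is reported in the
      # vocabulary of the design, not as a distant low-level symptom.

      class BoundedStack:
          """Abstract data type: a stack holding at most `capacity` items."""

          def __init__(self, capacity):
              self.capacity = capacity
              self.items = []

          def _check_invariant(self):
              assert 0 <= len(self.items) <= self.capacity, (
                  f"design invariant violated: size {len(self.items)} "
                  f"not in [0, {self.capacity}]")

          def push(self, x):
              self.items.append(x)
              self._check_invariant()      # fails at the abstraction boundary

          def pop(self):
              x = self.items.pop()
              self._check_invariant()
              return x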

  5. The Altar Machine in the Church Mother of Gangi (Palermo, Italy. Interpretation of the past uses, scientific investigation and preservation challenge

    Directory of Open Access Journals (Sweden)

    Monaco Angela

    2012-05-01

    Full Text Available Abstract Background The aim of this work was to study the Altar Machine in the Church Mother of Gangi, a little town near Palermo (Italy), regarding its history, technical manufacture, constitutive materials and state of preservation. The Altar Machine dates back to the second half of the 18th century; it is made of carved and painted wood, and a complex system of winches and pulleys allows various statues and parts of the Machine to be moved in the manner of baroque scenographic machinery. Results The observation and survey of the mechanisms allowed hypotheses to be formulated about an earlier mode of operation of the Altar Machine. Laboratory analysis revealed the presence of many superimposed layers constituted by several different materials (protein binders, siccative oils, natural terpene resins, shellac, calcium carbonate, gypsum, lead white, brass, zinc white, iron oxides) and different wood species employed for the original and restoration elements of the Machine. This is due to the continuous use of an object that has demo-ethno-anthropological significance. Microclimate monitoring (relative humidity RH and temperature T) showed that most of the data fall outside the tolerance intervals, i.e. the RH and T limits defined by the international standards. In particular, T values were generally high (out of the tolerance range) but quite constant; on the other hand, RH values fell almost always inside the tolerance area but often exhibited dangerous variations. Conclusions The characterization of the constitutive materials provided useful information both to support the dating of the Machine proposed by the inscription and to obtain a base of data for a possible conservation work. The microclimate monitoring showed that the temperature and relative humidity values are not always suitable to correctly preserve the artefact. The careful in situ investigation confirmed an on-going climate

  6. Feature Recognition for Virtual Machining

    OpenAIRE

    Xú, Shixin; Anwer, Nabil; Qiao, Lihong

    2014-01-01

    Virtual machining uses software tools to simulate machining processes in virtual environments ahead of actual production. This paper proposes that feature recognition techniques can be applied in the course of virtual machining, for example to identify process problems and to present corresponding corrective advice. By comparison with the original CAD model, form errors of the machining features can be found, and corrections are then suggested to process designers. T...

  7. Research on the Development of System Software for CNC Grinder Machines Based on Windows CE

    Institute of Scientific and Technical Information of China (English)

    刘露; 樊泽明

    2011-01-01

    This paper investigates an ARM-based CNC grinder system. With Windows CE as the software development platform, the system's human-computer interface and the whole software platform are designed using the Embedded Visual C++ development tool. The paper mainly discusses the customization of the Windows CE operating system and the design of the file system and compilation modules, gives the design and development scheme for the human-computer interface and software system of the CNC grinder, and finally completes the development of the whole software system.

  8. Virtual reality devices integration in scientific visualization software in the VtkVRPN framework

    Energy Technology Data Exchange (ETDEWEB)

    Journe, G.; Guilbaud, C

    2005-07-01

    High-quality scientific visualization software relies on ergonomic navigation and exploration, which are essential for efficient data analysis. To help address this issue, management of virtual reality devices has been developed inside the CEA 'VtkVRPN' framework. This framework is based on VTK, a 3D graphics library, and VRPN, a virtual reality device management library. This document describes the developments done during a post-graduate training course. (authors)

  9. Software Program Reconstruction of the Automatic Medicine Packing Machine in Our Hospital

    Institute of Scientific and Technical Information of China (English)

    邓思韵; 王玉紫; 梁嘉俊; 吴昭仪

    2016-01-01

    OBJECTIVE: To improve the working performance of the automatic medicine packing machine in our hospital. METHODS: In view of the problems encountered in the use of the automatic medicine packing machine, the reconstruction of four software functions - drug inventory management, drug identification, specific drug sub-packaging and document printing for non-packaged drugs - is introduced, and related indicators before and after the reconstruction are compared. RESULTS: After the software reconstruction, compared with 2012, loss events caused by poorly selling drugs expiring in the machine fell by 70% in 2013; drug dispensing errors related to the packing machine dropped by one half; and drug checking time was shortened by one fifth. CONCLUSIONS: The software program reconstruction of the automatic medicine packing machine improves the work efficiency of the pharmacy and the accuracy of drug dispensing, further guarantees drug quality and safety, and meets the practical demands of our hospital.

  10. Research on the Software Control System of the Rotary Electronic Pattern Machine

    Institute of Scientific and Technical Information of China (English)

    章小龙

    2016-01-01

    Hardware-software co-design is very important for embedded systems and can effectively improve development efficiency and quality. Based on an analysis of the structural characteristics of the electronic pattern sewing machine with a rotatable presser frame and of its embedded system, this paper mainly studies the machine's motion control method and the principles of its simulation system, and designs and develops a simulation environment for the prototype's electromechanical control system. The simulation environment assists the development, debugging and correctness verification of the pattern machine's control system.

  11. Machine function based control code algebras

    NARCIS (Netherlands)

    Bergstra, J.A.

    2008-01-01

    Machine functions have been introduced by Earley and Sturgis in [6] in order to provide a mathematical foundation for the use of the T-diagrams proposed by Bratman in [5]. Machine functions describe the operation of a machine at a very abstract level. A theory of hardware and software based on machine functions.

  12. Singer CNC sewing and embroidery machine

    Directory of Open Access Journals (Sweden)

    Lokodi Zsolt

    2011-12-01

    Full Text Available This paper presents the adaptation of a classic foot-pedal-operated Singer sewing machine to a computerized numerical control (CNC) sewing and embroidery machine. The machine is composed of a Singer sewing machine and a two-degrees-of-freedom XY stage designed specifically for this application. The whole system is controlled from a PC using adequate CNC control software.

  13. A classification algorithm of defect prediction for software modules based on fuzzy support vector machines

    Institute of Scientific and Technical Information of China (English)

    郭丽娜; 杨杨

    2012-01-01

    The classification of imbalanced data is an important problem in machine learning research with wide application, for example in defect prediction for software modules. Classification methods based on support vector machines (SVMs) are among the mainstream approaches to imbalanced data and have received broad attention from researchers. Because software module defect metric datasets exhibit characteristics such as class imbalance and noise, prediction models based on the standard SVM cannot obtain satisfactory results. Building on the previously proposed fuzzy support vector machine for class imbalance learning (FSVM_CIL) and integrating sampling techniques, this paper introduces two improved algorithms. The first, FSVM_CIL_RUS, combines FSVM_CIL with random undersampling: before building the defect prediction model, the dataset is balanced by randomly undersampling the majority class. The second, FSVM_CIL_RBBag, is an ensemble algorithm that combines FSVM_CIL with roughly balanced bagging: base classifiers are built with FSVM_CIL and then ensembled to form the final classifier. Experimental results on two NASA software module defect metric datasets, CM1 and KC3, demonstrate the effectiveness of the proposed algorithms.
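
    FSVM_CIL's per-sample fuzzy memberships are beyond a short example, but the random-undersampling scheme of the first algorithm can be sketched with scikit-learn on synthetic data (a stand-in for the NASA datasets; a plain SVC replaces the fuzzy SVM):

      # Sketch of the FSVM_CIL_RUS scheme: undersample the majority class,
      # then train an SVM (fuzzy memberships omitted for brevity).
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import classification_report

      rng = np.random.default_rng(0)
      # synthetic imbalanced data: 900 non-defective (0) vs 100 defective (1)
      X = np.vstack([rng.normal(0, 1, (900, 5)), rng.normal(1, 1, (100, 5))])
      y = np.array([0] * 900 + [1] * 100)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y,
                                                random_state=0)

      # randomly undersample the majority class to the minority-class size
      maj = np.where(y_tr == 0)[0]
      mino = np.where(y_tr == 1)[0]
      keep = np.concatenate([rng.choice(maj, len(mino), replace=False), mino])

      clf = SVC(kernel="rbf").fit(X_tr[keep], y_tr[keep])
      print(classification_report(y_te, clf.predict(X_te)))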

  14. Mastering machine learning with scikit-learn

    CERN Document Server

    Hackeling, Gavin

    2014-01-01

    If you are a software developer who wants to learn how machine learning models work and how to apply them effectively, this book is for you. Familiarity with machine learning fundamentals and Python will be helpful, but is not essential.

  15. Design of Test and Control Software of a Measuring Machine Based on Single-Flank Testing for Fine-Pitch Gears

    Institute of Scientific and Technical Information of China (English)

    张万年; 石照耀

    2011-01-01

    Test and control software for a measuring machine based on single-flank testing of fine-pitch gears was developed in Visual C++. Two circular grating encoders measure the spindle angles, a capacitive displacement sensor measures the micro-displacement, an ACR9000 motion controller drives the two motors synchronously, and a PCI-1784 counter card acquires the data. The main functions of the software include gear and motor parameter input, electrical control, automatic measurement, data acquisition, deviation calculation, dynamic display and saving of measurement results, and report printing. Test results show that the software is stable and reliable, has a friendly man-machine interface and is convenient to operate, and that it can measure the transmission error of fine-pitch gears.

  16. When Machines Design Machines!

    DEFF Research Database (Denmark)

    2011-01-01

    Until recently we were the sole designers, alone in the driving seat making all the decisions. But we have created a world of complexity way beyond human ability to understand, control, and govern. Machines now do more trades than humans on stock markets; they control our power, water, gas and food supplies, manage our elevators, microclimates, automobiles and transport systems, and manufacture almost everything. It should come as no surprise that machines are now designing machines. The chips that power our computers and mobile phones, the robots and commercial processing plants on which we depend, all are now largely designed by machines. So what of us - will we be totally usurped, or are we looking at a new symbiosis with human and artificial intelligences combined to realise the best outcomes possible? In most respects we have no choice! Human abilities alone cannot solve any of the major...

  17. N286.7-99, A Canadian standard specifying software quality management system requirements for analytical, scientific, and design computer programs and its implementation at AECL

    Energy Technology Data Exchange (ETDEWEB)

    Abel, R. [R and M Abel Consultants Inc. (Canada)]

    2000-07-01

    Analytical, scientific, and design computer programs (referred to in this paper as 'scientific computer programs') are developed for use in a large number of ways by the user-engineer to support and prove engineering calculations and assumptions. These computer programs are subject to frequent modifications inherent in their application and are often used for critical calculations and analysis relative to safety and functionality of equipment and systems. N286.7-99(4) was developed to establish appropriate quality management system requirements to deal with the development, modification, and application of scientific computer programs. N286.7-99 provides particular guidance regarding the treatment of legacy codes.

  18. Advanced Analysis of Nontraditional Machining

    CERN Document Server

    Tsai, Hung-Yin

    2013-01-01

    Nontraditional machining utilizes thermal, chemical, electrical, mechanical and optical sources of energy to form and cut materials. Advanced Analysis of Nontraditional Machining explains in depth how each of these advanced machining processes works, their machining system components, and process variables and industrial applications, thereby offering advanced knowledge and scientific insight. This book also documents the latest and most frequently cited research results for a few key nonconventional machining processes on the topics of greatest concern for industrial applications, such as laser machining, electrical discharge machining, electropolishing of dies and molds, and wafer processing for integrated circuit manufacturing. This book also: Fills the gap in advanced knowledge of nonconventional machining between industry and research; Documents the latest and most frequently cited research on key nonconventional machining processes for the most sought-after topics in industrial applications; Demonstrates advanced multidisci...

  19. mlpy: Machine Learning Python

    CERN Document Server

    Albanese, Davide; Merler, Stefano; Riccadonna, Samantha; Jurman, Giuseppe; Furlanello, Cesare

    2012-01-01

    mlpy is a Python Open Source Machine Learning library built on top of NumPy/SciPy and the GNU Scientific Libraries. mlpy provides a wide range of state-of-the-art machine learning methods for supervised and unsupervised problems and it is aimed at finding a reasonable compromise among modularity, maintainability, reproducibility, usability and efficiency. mlpy is multiplatform, it works with Python 2 and 3 and it is distributed under GPL3 at the website http://mlpy.fbk.eu.

  20. mlpy: Machine Learning Python

    OpenAIRE

    Albanese, Davide; Visintainer, Roberto; Merler, Stefano; Riccadonna, Samantha; Jurman, Giuseppe; Furlanello, Cesare

    2012-01-01

    mlpy is a Python Open Source Machine Learning library built on top of NumPy/SciPy and the GNU Scientific Libraries. mlpy provides a wide range of state-of-the-art machine learning methods for supervised and unsupervised problems and it is aimed at finding a reasonable compromise among modularity, maintainability, reproducibility, usability and efficiency. mlpy is multiplatform, it works with Python 2 and 3 and it is distributed under GPL3 at the website http://mlpy.fbk.eu.

  1. Design and Development of AAC Control Software for the Multilayer Head Magazine of a CNC Milling-Boring Machine

    Institute of Scientific and Technical Information of China (English)

    刘志兵; 孙志强

    2011-01-01

    Taking the automatic attachment-head changing (AAC) of the multilayer head magazine of a CNC floor-type milling-boring machine equipped with a Siemens 840D CNC system as an example, this paper introduces a method for designing and developing machine-tool function control software based on CNC and PLC control programs, integrating functions such as user variables, user macro variable instructions and dual-port RAM data exchange, for reference.

  2. Car2x with software defined networks, network functions virtualization and supercomputers technical and scientific preparations for the Amsterdam Arena telecoms fieldlab

    NARCIS (Netherlands)

    Meijer R.J.; Cushing R.; De Laat C.; Jackson P.; Klous S.; Koning R.; Makkes M.X.; Meerwijk A.

    2015-01-01

    In the invited talk 'Car2x with SDN, NFV and supercomputers' we report on how our past work with SDN [1, 2] allows the design of a smart mobility fieldlab in the huge parking lot of the Amsterdam Arena. We explain how we can engineer and test software that handles the complex conditions of the Car2X

  4. Haunted by the ghost in the machine. Commentary on "The spirituality of human consciousness: a Catholic evaluation of some current neuro-scientific interpretations".

    Science.gov (United States)

    Miller, James B

    2012-09-01

    Metaphysical and epistemological dualism informs much contemporary discussion of the relationships of science and religion, in particular in relation to the neurosciences and the religious understanding of the human person. This dualism is a foundational artifact of modern culture; however, contemporary scientific research and historical theological scholarship encourage a more holistic view wherein human personhood is most fittingly understood as an emergent phenomenon of, but not simply reducible to, evolutionary and developmental neurobiology.

  5. Development of a Human-Machine Monitoring System with Configuration Software for Leak Detectors

    Institute of Scientific and Technical Information of China (English)

    潘赛虎; 段锁林; 高玉梅

    2011-01-01

    In order to implement human-machine interaction for leak detection devices, a novel human-machine monitoring system based on a panel PC has been designed. Through communication between the configuration software and the PLC, the system exchanges data, monitors the detection site in real time and controls automatic and semi-automatic operation of the detection device; meanwhile, it records and saves valve parameters and detection results. The monitoring system based on configuration software offers the superior performance of a commercial PC together with the simple, convenient operation of a touch screen. The implementation of the system provides a reference for the design of similar products.

  6. Interface Protocol and High-End Software Design for an Embedded Automatic Bending Machine

    Institute of Scientific and Technical Information of China (English)

    谭碧云; 王宜怀

    2012-01-01

    The automatic bending machine retains the high precision and high efficiency of the traditional CNC bending machine and adds automatic feeding and automatic slotting functions, further enhancing the selectivity of, and adaptability to, the processed materials. Based on the structure and functions of the automatic bending machine system, this paper defines the interface protocol between the PC and the bottom-level controller, so that instructions generated by the PC can be used directly by the bottom level via USB; the bottom level executes the corresponding instructions, finally realizing the automatic feeding and slotting functions. The paper also elaborates the design scheme of the system's PC software, which integrates techniques such as character-model extraction, digital image processing and bending processing, and generates a specific command file for the bottom level to use.

  7. Feasibility study, software design, layout and simulation of a two-dimensional Fast Fourier Transform machine for use in optical array interferometry

    Science.gov (United States)

    Boriakoff, Valentin

    1994-01-01

    The goal of this project was the feasibility study of a particular architecture of a digital signal processing machine operating in real time which could do in a pipeline fashion the computation of the fast Fourier transform (FFT) of a time-domain sampled complex digital data stream. The particular architecture makes use of simple identical processors (called inner product processors) in a linear organization called a systolic array. Through computer simulation the new architecture to compute the FFT with systolic arrays was proved to be viable, and computed the FFT correctly and with the predicted particulars of operation. Integrated circuits to compute the operations expected of the vital node of the systolic architecture were proven feasible, and even with a 2 micron VLSI technology can execute the required operations in the required time. Actual construction of the integrated circuits was successful in one variant (fixed point) and unsuccessful in the other (floating point).
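
    The operation pipelined by such a systolic node is the FFT butterfly; a minimal software radix-2 decimation-in-time FFT (purely illustrative, with no systolic hardware implied) shows the computation involved:

      # Minimal recursive radix-2 DIT FFT; each butterfly below is the kind
      # of multiply-accumulate an inner-product processor would pipeline.
      import numpy as np

      def fft(x):
          n = len(x)                  # n must be a power of two
          if n == 1:
              return x
          even, odd = fft(x[0::2]), fft(x[1::2])
          tw = np.exp(-2j * np.pi * np.arange(n // 2) / n)  # twiddle factors
          return np.concatenate([even + tw * odd, even - tw * odd])

      x = np.random.rand(16) + 1j * np.random.rand(16)
      assert np.allclose(fft(x), np.fft.fft(x))   # agrees with NumPy's FFT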

  8. Funding Research Software Development

    Science.gov (United States)

    Momcheva, Ivelina G.

    2017-01-01

    Astronomical software is used by each and every member of our scientific community. Purpose-built software is becoming ever more critical as we enter the regime of large datasets and simulations of increasing complexity. However, financial investments in building, maintaining and renovating the software infrastructure have been uneven. In this talk I will summarize past and current funding sources for astronomical software development, discuss other models of funding and introduce a new initiative for supporting community software at STScI. The purpose of this talk is to prompt discussion about how we allocate resources to this vital infrastructure.

  9. The Machine Scoring of Writing

    Science.gov (United States)

    McCurry, Doug

    2010-01-01

    This article provides an introduction to the kind of computer software that is used to score student writing in some high stakes testing programs, and that is being promoted as a teaching and learning tool to schools. It sketches the state of play with machines for the scoring of writing, and describes how these machines work and what they do.…

  10. Requirements Engineering in Building Climate Science Software

    Science.gov (United States)

    Batcheller, Archer L.

    2011-01-01

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…

  11. Current trends in free software research

    OpenAIRE

    Navarro Bosch, Ramon; Vila Marta, Sebastià

    2009-01-01

    This report analyzes how scientific research is studying free software. We find out what research is being done on free software by looking into scientific journal and conference publications. The data thus obtained are analyzed and the most salient trends related to free software are identified. We also review the main works published in each free software research area.

  13. Scientific workflows for bibliometrics

    NARCIS (Netherlands)

    Guler, A.T.; Waaijer, C.J.; Palmblad, M.

    2016-01-01

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows can be made publicly available.

  14. Software Design and Development In a Scientific Environment: Lessons Learned During the Development of STAR, an Astrophysical Analysis and Visualization Package

    Science.gov (United States)

    Domik, Gitta O.; Mickus-Miceli, Kristina D.

    Data visualization has become an important component in the analysis of scientific data. This paper describes two common goals in extending data analysis environments with visualization: offering visualization tools that fit the mind-set of the targeted scientists, and integrating visualization tools into the complexity of existing data analysis tools. A case study carried out by astrophysicists and computer scientists describes the design and development of STAR, a user-centered and integrated data analysis and visualization system.

  15. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    CERN Document Server

    Deller, A T; Bailes, M; West, C

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software and the fact that the highly parallel and scalable nature of the correlation task is well suited to a multi-processor computing environment. We suggest scientific applications where such an approach to VLBI correlation is most suited and will give the best returns. We report detailed results from the Distributed FX (DiFX) software correlator, running on the Swinburne supercomputer (a Beowulf cluster of approximately 300 commodity processors), including measures of the performance of the system. For example, to correla...
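
    The 'FX' designation means Fourier transform first (F), then cross-multiplication (X). Stripped of everything that makes DiFX a real correlator, the per-baseline kernel can be sketched as follows (synthetic data; this is not DiFX code):

      # Sketch of the FX correlation kernel: FFT each station's segment,
      # multiply one spectrum by the conjugate of the other, and average
      # over segments to estimate the visibility spectrum for a baseline.
      import numpy as np

      def fx_correlate(s1, s2, nchan=64):
          nseg = len(s1) // (2 * nchan)
          acc = np.zeros(nchan, dtype=complex)
          for k in range(nseg):
              seg = slice(k * 2 * nchan, (k + 1) * 2 * nchan)
              X1 = np.fft.rfft(s1[seg])[:nchan]    # F step
              X2 = np.fft.rfft(s2[seg])[:nchan]
              acc += X1 * np.conj(X2)              # X step
          return acc / nseg

      rng = np.random.default_rng(1)
      common = rng.normal(size=1 << 16)            # shared (correlated) signal
      v = fx_correlate(common + rng.normal(size=1 << 16),
                       common + rng.normal(size=1 << 16))
      print(np.abs(v[:4]))                         # visibility amplitudes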

  16. Challenge in Numerical Software for Microcomputers

    Energy Technology Data Exchange (ETDEWEB)

    Cody, W J

    1977-09-02

    Microcomputers are now capable of serious numerical computation using programmed floating-point arithmetic and Basic compilers. Unless numerical software designers for these machines exploit the experience gained in providing software for larger machines, history will repeat itself with the initial spread of treacherous software. This paper discusses good software, especially for the elementary functions, in terms of reliability and robustness. The emphasis is on insight rather than detailed algorithms, to show why certain things are important and how they may be achieved.

  17. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data. Not only because data is knowledge, but it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
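
    One concrete form the 'appropriate metadata' can take - shown here as a hypothetical illustration, not a scheme from the abstract - is a machine-readable snapshot of the technology stack, captured at archive time alongside the software itself:

      # Sketch: record the technology stack of a running Python analysis
      # as JSON metadata for the archive (standard library only).
      import json
      import platform
      import sys
      from importlib import metadata

      stack = {
          "os": platform.platform(),
          "machine": platform.machine(),
          "python": sys.version,
          "packages": {d.metadata["Name"]: d.version
                       for d in metadata.distributions()},
      }
      print(json.dumps(stack, indent=2)[:400])   # preview of the record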

  18. Trace Software Pipelining

    Institute of Scientific and Technical Information of China (English)

    王剑; Andreas Krall; et al.

    1995-01-01

    Global software pipelining is a complex but efficient compilation technique to exploit instruction-level parallelism for loops with branches. This paper presents a novel global software pipelining technique, called trace software pipelining, targeted to instruction-level parallel processors such as Very Long Instruction Word (VLIW) and superscalar machines. Trace software pipelining applies a global code scheduling technique to compact the original loop body. The resulting loop is called a trace software pipelined (TSP) code. The trace software pipelined code can be directly executed with special architectural support or can be transformed into a globally software pipelined loop for current VLIW and superscalar processors. Thus, exploiting parallelism across all iterations of a loop can be accomplished by compacting the original loop body with any global code scheduling technique. This makes our new technique very promising for practical compilers. Finally, we also present preliminary experimental results to support our new approach.

  19. CROSS-SECTION GENERATION OF VARIOUS GEO-SCIENTIFIC FEATURES WITHOUT CONTOUR DIGITIZATION USING A VISUAL C++ BASED SOFTWARE APPLICATION ‘VIGAT 2005’

    OpenAIRE

    Dasgupta A. R.; Solanki Ajay M.; Rathod Brijesh G.; Srivastava Naveenchandra N.; Patel Vivek R.; Machhar Suresh P.

    2007-01-01

    Cross-section can be described as a two-dimensional dataset where the horizontal distances are represented on the x-axis and the depth on the y-axis. A cross-section is a window into the subsurface. This work presents the construction of cross sections with the help of 'Vigat 2005', a Visual C++ based software application. Its main purpose is to provide cross-section views of geoscientific features and to interpret their variation within the area of study. In geologica...

  20. Cross-section generation of various geo-scientific features without contour digitization using a Visual C++ based software application 'Vigat 2005'

    OpenAIRE

    Srivastava, Naveenchandra N.; Rathod, Brijesh G.; Solanki, Ajay M.; Machhar, Suresh P.; Patel, Vivek R.; Dasgupta, A R

    2011-01-01

    Cross-section can be described as a two-dimensional dataset where the horizontal distances are represented on the x-axis and the depth on the y-axis. A cross-section is a window into the subsurface. This work presents the construction of cross sections with the help of 'Vigat 2005', a Visual C++ based software application. Its main purpose is to provide cross-section views of geoscientific features and to interpret their variation within the area of study. In geological context, profile or cr...

  1. Machine Vision Handbook

    CERN Document Server

    2012-01-01

    The automation of visual inspection is becoming more and more important in modern industry as a consistent, reliable means of judging the quality of raw materials and manufactured goods. The Machine Vision Handbook equips the reader with the practical details required to engineer integrated mechanical-optical-electronic-software systems. Machine vision is first set in the context of basic information on light, natural vision, colour sensing and optics. The physical apparatus required for mechanized image capture – lenses, cameras, scanners and light sources – are discussed followed by detailed treatment of various image-processing methods including an introduction to the QT image processing system. QT is unique to this book, and provides an example of a practical machine vision system along with extensive libraries of useful commands, functions and images which can be implemented by the reader. The main text of the book is completed by studies of a wide variety of applications of machine vision in insp...

  2. Numerical software: science or alchemy

    Energy Technology Data Exchange (ETDEWEB)

    Gear, C.W.

    1979-06-01

    This is a summary of the Forsythe lecture presented at the Computer Science Conference, Dayton, Ohio, in February 1979. It examines the activity called numerical software, first to see what distinguishes numerical software from any other form of software and why numerical software is so much more difficult. It then examines the scientific basis of such software and discusses what is lacking in that basis.

  3. Computer Software Cataloging: Techniques and Examples.

    Science.gov (United States)

    Holzberlein, Deanne

    1986-01-01

    Examples of catalog entries for microcomputer software data files are given in three sections: educational software (elementary and secondary level, college level); educational game software; business-related software. Catalog record elements, simplification methods for cataloging of machine-readable data files, and future considerations are…

  4. Mini lathe machine converted to CNC

    Directory of Open Access Journals (Sweden)

    Alexandru Morar

    2012-06-01

    Full Text Available This paper presents the adaptation of a mechanical mini-lathe to a computerized numerical control (CNC) lathing machine. The machine is composed of an ASIST mini-lathe and a two-degrees-of-freedom XZ stage designed specifically for this application. The whole system is controlled from a PC using adequate CNC control software.

  5. In praise of open software

    CERN Multimedia

    2000-01-01

    Much scientific software is proprietary and beyond the reach of poorer scientific communities. This issue will become critical as companies build bioinformatics tools for genomics. The principal of open-source software needs to be defended by academic research institutions (1/2 p).

  6. Study of Virtual Machine and its application

    Directory of Open Access Journals (Sweden)

    Rohaan Chandra

    2013-07-01

    Full Text Available A virtual machine is software that's capable of executing programs as if it were a physical machine—it's a computer within a computer. A virtual machine (VM) is a software-implemented abstraction of the underlying hardware, which is presented to the application layer of the system. Virtual machines may be based on specifications of a hypothetical computer or emulate the computer architecture and functions of a real-world computer.

  7. Design of Maritime Satellite Navigation Equipment Man-Machine Interactive Software Based on the ReWorks Operating System

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    For the man-machine interactive software of maritime satellite navigation equipment, the ReWorks operating system is introduced and application software development methods based on ReWorks are studied. The man-machine interactive software of the maritime satellite navigation equipment is designed and developed using the ReDe development environment and DirectX tools. Engineering practice demonstrates its correctness and effectiveness.

  8. Mr Lars Leijonborg, Minister for Higher Education and Research of Sweden, visiting the ATLAS cavern, the ATLAS control room and the LHC machine at Point 1 with Collaboration Spokesperson P. Jenni and Dr. Jos Engelen, Chief Scientific Officer of CERN.

    CERN Multimedia

    Maximilien Brice

    2008-01-01

    Mr Lars Leijonborg, Minister for Higher Education and Research of Sweden, visiting the ATLAS cavern, the ATLAS control room and the LHC machine at Point 1 with Collaboration Spokesperson P. Jenni and Dr. Jos Engelen, Chief Scientific Officer of CERN.

  9. Advanced fingerprint verification software

    Science.gov (United States)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door-locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  10. Scalable Machine Learning for Massive Astronomical Datasets

    Science.gov (United States)

    Ball, Nicholas M.; Gray, A.

    2014-04-01

    We present the ability to perform data mining and machine learning operations on a catalog of half a billion astronomical objects. This is the result of the combination of robust, highly accurate machine learning algorithms with linear scalability that renders the application of these algorithms to massive astronomical data tractable. We demonstrate the core algorithms: kernel density estimation, K-means clustering, linear regression, nearest neighbors, random forest and gradient-boosted decision tree, singular value decomposition, support vector machine, and the two-point correlation function. Each of these is relevant for astronomical applications such as finding novel astrophysical objects, characterizing artifacts in data, object classification (including for rare objects), object distances, finding the important features describing objects, density estimation of distributions, probabilistic quantities, and exploring the unknown structure of new data. The software, Skytree Server, runs on any UNIX-based machine, a virtual machine, or cloud-based and distributed systems including Hadoop. We have integrated it on the cloud computing system of the Canadian Astronomical Data Centre, the Canadian Advanced Network for Astronomical Research (CANFAR), creating the world's first cloud computing data mining system for astronomy. We demonstrate results showing the scaling of each of our major algorithms on large astronomical datasets, including the full 470,992,970 objects of the 2 Micron All-Sky Survey (2MASS) Point Source Catalog. We demonstrate the ability to find outliers in the full 2MASS dataset utilizing multiple methods, e.g., nearest neighbors. This is likely of particular interest to the radio astronomy community given, for example, that survey projects contain groups dedicated to this topic. 2MASS is used as a proof-of-concept dataset due to its convenience and availability. These results are of interest to any astronomical project with large and/or complex
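
    The nearest-neighbour outlier search mentioned above can be sketched in a few lines with scikit-learn in place of Skytree Server (synthetic points standing in for the 2MASS catalog):

      # Sketch: score each point by the distance to its k-th nearest
      # neighbour; the largest scores flag candidate outliers.
      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      rng = np.random.default_rng(2)
      X = np.vstack([rng.normal(0, 1, (1000, 3)),   # bulk population
                     rng.uniform(5, 8, (5, 3))])    # a few planted outliers

      k = 10
      dist, _ = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
      score = dist[:, -1]               # distance to the k-th neighbour
      print(np.argsort(score)[-5:])     # indices of the strongest outliers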

  11. Machine Translation

    Institute of Scientific and Technical Information of China (English)

    张严心

    2015-01-01

    As an ancillary translation tool, machine translation has long received increasing attention and many kinds of study from a great number of researchers and scholars. Knowing the definition of machine translation and analysing its benefits and problems is significant for translators who want to make good use of machine translation, and helpful for developing and perfecting machine translation systems in the future.

  12. Sustainable machining

    CERN Document Server

    2017-01-01

    This book provides an overview on current sustainable machining. Its chapters cover the concept in economic, social and environmental dimensions. It provides the reader with proper ways to handle several pollutants produced during the machining process. The book is useful on both undergraduate and postgraduate levels and it is of interest to all those working with manufacturing and machining technology.

  13. Man - Machine Communication

    CERN Document Server

    Petersen, Peter; Nielsen, Henning

    1984-01-01

    This report describes a Man-to-Machine Communication module which, together with a STAC, can take care of all operator inputs from the touch-screen, tracker balls and mechanical buttons. The MMC module can also contain a G64 card, which could be a GPIB driver, but many other G64 cards could be used. The software services the input devices and makes the results accessible from the CAMAC bus. NODAL functions for Man-Machine Communication are implemented in the STAC and in the ICC.

  14. Case Analysis of the Application of SPSS Software in Forestry Production and Scientific Research

    Institute of Scientific and Technical Information of China (English)

    琚松苗

    2012-01-01

    SPSS (Statistical Package for the Social Sciences), a statistical analysis tool, provides functions for data management, statistical analysis, trend study, tabulation and drawing, word processing and so on. In this paper the application of the statistical software SPSS in forestry production and scientific research is introduced with typical cases, from the perspective of forestry science and technology promotion.

  15. Modern Analysis of Marx's Scientific Thoughts: Taking Machine. The Forces of Nature and Scientific Applications as the Text

    Institute of Scientific and Technical Information of China (English)

    刘庆炬

    2012-01-01

    Machine. The Forces of Nature and Scientific Applications, which contains rich scientific thought, was an important preparation by Marx for the writing of Das Kapital. In it, Marx points out the trend toward the integration of science and technology, reveals the interaction between science and economic activities and the law of development of science, and, through his analysis of the law of motion of capital, gains insight into the alienation of natural forces and science. These insights from the mid-to-late 19th century still have important significance in the contemporary age.

  16. Electrical machines diagnosis

    CERN Document Server

    Trigeassou, Jean-Claude

    2013-01-01

    Monitoring and diagnosis of electrical machine faults is a scientific and economic issue, motivated by objectives for reliability and serviceability in electrical drives. This book provides a survey of the techniques used to detect the faults occurring in electrical drives: electrical, thermal and mechanical faults of the electrical machine, faults of the static converter and faults of the energy storage unit. Diagnosis of faults occurring in electrical drives is an essential part of a global monitoring system used to improve reliability and serviceability. This diagnosis is perf

  17. Performance and portability of the SciBy virtual machine

    DEFF Research Database (Denmark)

    Andersen, Rasmus; Vinter, Brian

    2010-01-01

    The Scientific Bytecode Virtual Machine is a virtual machine designed specifically for performance, security, and portability of scientific applications deployed in a Grid environment. The performance overhead normally incurred by virtual machines is mitigated using native optimized scientific libraries, and security is obtained by sandboxing techniques. Lastly, by executing platform-independent bytecodes, the machine is highly portable. To evaluate the machine, we demonstrate several use-case scenarios from some of the intended application domains. Further, we show the ease of porting the machine...

  18. CROSS-SECTION GENERATION OF VARIOUS GEO-SCIENTIFIC FEATURES WITHOUT CONTOUR DIGITIZATION USING A VISUAL C++ BASED SOFTWARE APPLICATION ‘VIGAT 2005’

    Directory of Open Access Journals (Sweden)

    Dasgupta A. R.

    2007-06-01

    Full Text Available Cross-section can be described as a two-dimensional dataset where the horizontal distances are represented on the x-axis and the depth on the y-axis. A cross-section is a window into the subsurface. This work presents the construction of cross sections with the help of 'Vigat 2005', a Visual C++ based software application. Its main purpose is to provide cross-section views of geoscientific features and to interpret their variation within the area of study. In a geological context, a profile or cross section is an exposure of the ground showing depositional strata. Geological cross sections are very powerful means of conveying structural geometries. They are planar, usually vertical, graphical representations of earth sections showing the stratigraphical successions, age, structure, and rock types present in the subsurface. Geological cross sections allow a better conceptualization of the 3-D geometry of the structures. By using 'Vigat 2005', a cross-section graphic can be displayed by the user with a simple click of the mouse. It offers much easy-to-use functionality to facilitate the completion of the desired tasks. Specific boundary conditions representing the movement of a rock block over a fault can be displayed using the graphical user interface. Relief or slope variation of the study area can also be viewed. A topographical map provides an aerial (overhead) view of a landscape. It is possible to create a more pictorial representation of the landscape by making a topographic profile of the region. A topographic profile is a cross section showing elevations and slopes along a given line. A precise method to determine slope variations is to construct a profile or cross section through the topography. The most important advantage of 'Vigat 2005' is that users do not need to digitize contours. This work focuses on the design and implementation of an optimized interpretive environment that has been built using Visual C++ tools
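
    At its core, a topographic profile of the kind described above is just elevation sampled against horizontal distance along a section line; a bare-bones stand-in (synthetic data, matplotlib in place of 'Vigat 2005') looks like this:

      # Sketch: plot a topographic profile -- horizontal distance on the
      # x-axis, elevation on the y-axis -- along a section line.
      import numpy as np
      import matplotlib.pyplot as plt

      distance = np.linspace(0, 5000, 200)           # metres along section
      elevation = (300 + 80 * np.sin(distance / 700)
                   + 15 * np.random.default_rng(3).normal(size=200))

      plt.plot(distance, elevation)
      plt.fill_between(distance, elevation, elevation.min(), alpha=0.3)
      plt.xlabel("Distance along section (m)")
      plt.ylabel("Elevation (m)")
      plt.title("Topographic profile along section line A-A'")
      plt.show()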

  19. The Research of CNC Machining Cutter Choice Based on CAXA

    Institute of Scientific and Technical Information of China (English)

    RUAN Xiao-guang; YUAN Si-cong; CAI An-jiang; ZHANG Dang-fei

    2011-01-01

    The article introduces the unique characteristics of CNC machining-center cutters compared to traditional cutters, and analyzes the choice of CNC machining cutters and the factors affecting that choice. Meanwhile, as demonstrated by examples using the manufacturing software CAXA 2004, the correct choice of CNC machining-center cutter can give full play to the advantages of CNC machining and improve the economic efficiency and production level of enterprises.

  20. Foundations of microprogramming architecture, software and applications

    CERN Document Server

    Agrawala, Ashok K

    1976-01-01

    Foundations of Microprogramming: Architecture, Software, and Applications discusses the foundations and trends in microprogramming, focusing on the architectural, software, and application aspects of microprogramming. The book reviews microprocessors, microprogramming concepts, and characteristics, as well as the architectural features in microprogrammed computers. The text explains support software and the different hierarchies or levels of languages. These include assembler languages which are mnemonic or symbolic representation of machine commands; the procedure oriented machine-dependent;

  1. Machine Learning for Security

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Applied statistics, aka 'Machine Learning', offers a wealth of techniques for answering security questions. It's a much hyped topic in the big data world, with many companies now providing machine learning as a service. This talk will demystify these techniques, explain the math, and demonstrate their application to security problems. The presentation will include how-to's on classifying malware, looking into encrypted tunnels, and finding botnets in DNS data. About the speaker: Josiah is a security researcher with HP TippingPoint DVLabs Research Group. He has over 15 years of professional software development experience. Josiah used to do AI, with work focused on graph theory, search, and deductive inference on large knowledge bases. As rules only get you so far, he moved from AI to using machine learning techniques to identify failure modes in email traffic. There followed digressions into clustered data storage and later integrated control systems. Current ...
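
    As a hypothetical illustration of the 'botnets in DNS data' use case (toy data and features, nothing taken from the talk itself): algorithmically generated domains tend to show unusual lexical statistics, which even a tiny classifier can pick up:

      # Sketch: classify domain names as benign vs. generated (DGA-like)
      # using simple lexical features and a random forest.
      import math
      from collections import Counter
      from sklearn.ensemble import RandomForestClassifier

      def features(domain):
          counts = Counter(domain)
          entropy = -sum(c / len(domain) * math.log2(c / len(domain))
                         for c in counts.values())
          return [len(domain), entropy]   # length + character entropy

      benign = ["google.com", "wikipedia.org", "example.net", "cern.ch"]
      dga = ["xkqjvzpw.info", "qzvmwrtk.biz", "plvkqzxj.net", "wrtqzkvm.org"]

      X = [features(d) for d in benign + dga]
      y = [0] * len(benign) + [1] * len(dga)
      clf = RandomForestClassifier(random_state=0).fit(X, y)
      print(clf.predict([features("zxqwvkpj.com")]))  # likely flagged: 1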

  2. Future database machine architectures

    OpenAIRE

    Hsiao, David K.

    1984-01-01

    There are many software database management systems available on many general-purpose computers, ranging from micros to super-mainframes. Database machines as backend computers can offload the database management work from the mainframe so that we can retain the same mainframe longer. However, the database backend must also demonstrate lower cost, higher performance, and newer functionality. Some of the fundamental architecture issues in the design of high-performance and great-capacity datab...

  3. Quantum Virtual Machine (QVM)

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low level quantum assembly codes and returns the results of such executions.
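
    The abstract does not spell out the QVM's interfaces, but the plug-in idea - QPU backends behind a common "execute this quantum assembly" entry point - can be caricatured with a toy two-qubit state-vector backend (purely hypothetical, not the actual QVM API):

      # Sketch: a pluggable 'QPU' that interprets a toy quantum assembly
      # (one gate per line: "H q" or "X q") on a 2-qubit state vector.
      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
      X = np.array([[0, 1], [1, 0]])

      class VirtualQPU:
          def __init__(self):
              self.state = np.zeros(4, dtype=complex)
              self.state[0] = 1.0                   # start in |00>

          def _apply(self, gate, qubit):
              ops = [gate if q == qubit else np.eye(2) for q in range(2)]
              self.state = np.kron(ops[0], ops[1]) @ self.state

          def execute(self, asm):
              for line in asm.strip().splitlines():
                  op, arg = line.split()
                  self._apply({"H": H, "X": X}[op], int(arg))
              return np.abs(self.state) ** 2        # outcome probabilities

      qpu = VirtualQPU()              # a physical backend could plug in here
      print(qpu.execute("H 0\nX 1"))  # -> [0, 0.5, 0, 0.5] over |00>..|11>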

  4. Evaluation & Optimization of Software Engineering

    Directory of Open Access Journals (Sweden)

    Asaduzzaman Noman

    2016-06-01

    Full Text Available The term is made of two words, software and engineering. Software is more than just program code. A program is an executable code which serves some computational purpose. Software is considered to be a collection of executable programming code, associated libraries and documentation. Software made for a specific requirement is called a software product. Engineering, on the other hand, is all about developing products using well-defined scientific principles and methods. The outcome of software engineering is an efficient and reliable software product. IEEE defines software engineering as: the application of a systematic, disciplined, quantifiable approach to the development, operation and maintenance of software; that is, the application of engineering to software.

  5. Simple machines

    CERN Document Server

    Graybill, George

    2007-01-01

    Just how simple are simple machines? With our ready-to-use resource, they are simple to teach and easy to learn! Chock-full of information and activities, we begin with a look at force, motion and work, and examples of simple machines in daily life are given. With this background, we move on to different kinds of simple machines including: levers, inclined planes, wedges, screws, pulleys, and wheels and axles. An exploration of some compound machines follows, such as the can opener. Our resource is a real time-saver, as all the reading passages and student activities are provided. Presented in s

  6. MACHINING OPTIMISATION AND OPERATION ALLOCATION FOR NC LATHE MACHINES IN A JOB SHOP MANUFACTURING SYSTEM

    Directory of Open Access Journals (Sweden)

    MUSSA I. MGWATU

    2013-08-01

    Full Text Available Numerical control (NC) machines in a job shop may not be cost and time effective if the assignment of cutting operations and the optimisation of machining parameters are overlooked. In order to justify better utilisation and higher productivity of invested NC machine tools, it is necessary to determine the optimum machining parameters and realize effective assignment of cutting operations on machines. This paper presents two mathematical models for optimising machining parameters and effectively allocating turning operations on NC lathe machines in a job shop manufacturing system. The models are developed as non-linear programming problems and solved using a commercial LINGO software package. The results show that the decisions on machining optimisation and operation allocation on NC lathe machines can be made simultaneously while minimising both production cost and cycle time. In addition, the results indicate that production cost and cycle time can be minimised while significantly reducing or totally eliminating idle times among machines.
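
    The paper's models were solved in LINGO; as a rough analogue of the machining-parameter half only, a single-operation unit-cost model with an assumed Taylor-type tool-life law (all coefficients invented for illustration) can be set up with SciPy:

      # Sketch: minimise unit production cost over cutting speed v (m/min)
      # and feed f (mm/rev); tool life follows an assumed Taylor-type law.
      from scipy.optimize import minimize

      def unit_cost(x, c0=0.5, ct=2.5, t_change=1.0):
          v, f = x
          t_cut = 1.0e3 / (v * f)              # machining time per part (toy)
          life = 4.0e7 / (v ** 3 * f ** 1.6)   # assumed tool life (min)
          tool_share = t_cut / life            # tools consumed per part
          return c0 * (t_cut + t_change * tool_share) + ct * tool_share

      res = minimize(unit_cost, x0=[150.0, 0.25],
                     bounds=[(50, 400), (0.05, 0.8)])
      print(res.x, res.fun)                    # optimal (v, f) and unit cost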

  7. The economics of information systems and software

    CERN Document Server

    Veryard, Richard

    2014-01-01

    The Economics of Information Systems and Software focuses on the economic aspects of information systems and software, including advertising, evaluation of information systems, and software maintenance. The book first elaborates on value and values, software business, and scientific information as an economic category. Discussions focus on information products and information services, special economic properties of information, culture and convergence, hardware and software products, materiality and consumption, technological progress, and software flexibility. The text then takes a look at a

  8. Electric machine

    Science.gov (United States)

    El-Refaie, Ayman Mohamed Fawzi [Niskayuna, NY]; Reddy, Patel Bhageerath [Madison, WI]

    2012-07-17

    An interior permanent magnet electric machine is disclosed. The interior permanent magnet electric machine comprises a rotor comprising a plurality of radially placed magnets each having a proximal end and a distal end, wherein each magnet comprises a plurality of magnetic segments and at least one magnetic segment towards the distal end comprises a high resistivity magnetic material.

  9. Software Design Improvements. Part 1; Software Benefits and Limitations

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be applied to large and small software products to improve their design, and how can software be verified?

  10. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  11. Machine speech and speaking about machines

    Energy Technology Data Exchange (ETDEWEB)

    Nye, A. [Univ. of Wisconsin, Whitewater, WI (United States)]

    1996-12-31

    Current philosophy of language prides itself on scientific status. It boasts of being no longer contaminated with queer mental entities or idealist essences. It theorizes language as programmable variants of formal semantic systems, reimaginable either as the properly epiphenomenal machine functions of computer science or the properly material neural networks of physiology. Whether or not such models properly capture the physical workings of a living human brain is a question that scientists will have to answer. I, as a philosopher, come at the problem from another direction. Does contemporary philosophical semantics, in its dominant truth-theoretic and related versions, capture actual living human thought as it is experienced, or does it instead reflect, regardless of its (perhaps dubious) scientific credentials, a pathology of thought, a pathology with a disturbing social history?

  12. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  13. Software Engineering Tools for Scientific Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We design and demonstrate the feasibility of extending the open source Eclipse integrated development environment (IDE) to support the full range of capabilities now...

  14. Usability in Scientific Databases

    Directory of Open Access Journals (Sweden)

    Ana-Maria Suduc

    2012-07-01

    Full Text Available Usability, most often defined as the ease of use and acceptability of a system, affects users' performance and their job satisfaction when working with a machine. Therefore, usability is a very important aspect which must be considered in the process of system development. The paper presents several numerical data related to the history of scientific research on the usability of information systems, as reflected in the information provided by three important scientific databases (Science Direct, ACM Digital Library and IEEE Xplore Digital Library) for different queries related to this field.

  15. Scientific workflows for bibliometrics.

    Science.gov (United States)

    Guler, Arzu Tugce; Waaijer, Cathelijn J F; Palmblad, Magnus

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows can be made publicly available through the myExperiment community and then used in other workflows. Here we illustrate how scientific workflows, and the Taverna workbench in particular, can be used in bibliometrics. We discuss the specific capabilities of Taverna that make this software a powerful tool in this field, such as automated data import via Web services, data extraction from XML by XPath, and statistical analysis and visualization with R. The support of the latter is particularly relevant, as it allows integration of a number of recently developed R packages specifically for bibliometrics. Examples are used to illustrate the possibilities of Taverna in the fields of bibliometrics and scientometrics.
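
    Taverna itself is a Java workbench, but the flavour of a single workflow step it automates can be suggested in a few lines of Python: XML input standing in for a Web-service response, field extraction by XPath-style queries, then a small statistics step standing in for the R stage. The payload and field names are invented.

        import xml.etree.ElementTree as ET

        # Hypothetical payload from a bibliographic web service; in Taverna this
        # import step would be an automated Web-service call.
        xml_payload = """<records>
          <record><year>2012</year><citations>14</citations></record>
          <record><year>2013</year><citations>9</citations></record>
          <record><year>2013</year><citations>21</citations></record>
        </records>"""

        root = ET.fromstring(xml_payload)
        # XPath-style extraction of one field per record, as a Taverna XPath step.
        years = [int(e.text) for e in root.findall(".//record/year")]
        cites = [int(e.text) for e in root.findall(".//record/citations")]

        # The statistics step (done with R inside Taverna) reduced here to Python:
        per_year = {}
        for yr, c in zip(years, cites):
            per_year.setdefault(yr, []).append(c)
        print({yr: sum(cs) / len(cs) for yr, cs in sorted(per_year.items())})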

  16. Scientific publication

    Directory of Open Access Journals (Sweden)

    Getulio Teixeira Batista

    2007-06-01

    Full Text Available The work necessary to develop a scientific publication is sometimes underestimated; it requires the effective participation of many players to obtain a result of good standard. Initially it depends upon the determination of the authors who decide to write the scientific article. Scientific writing is a very challenging and time-consuming task, but at the same time essential for any scientist. A published scientific article is unquestionably one of the main indicators of scientific production, especially if published in a qualified scientific journal with a highly qualified editorial committee and a strict peer-review procedure. Looking at the evaluation criteria for scientific production of the several Thematic Scientific Committees of the Brazilian Council for Scientific and Technological Development (CNPq), it becomes clear that publication in scientific journals of certified quality is the most important item in the evaluation of a scientist's production.

  17. Machine Learning

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Machine learning, which builds on ideas in computer science, statistics, and optimization, focuses on developing algorithms to identify patterns and regularities in data, and using these learned patterns to make predictions on new observations. Boosted by its industrial and commercial applications, the field of machine learning is quickly evolving and expanding. Recent advances have seen great success in the realms of computer vision, natural language processing, and broadly in data science. Many of these techniques have already been applied in particle physics, for instance for particle identification, detector monitoring, and the optimization of computer resources. Modern machine learning approaches, such as deep learning, are only just beginning to be applied to the analysis of High Energy Physics data to approach more and more complex problems. These classes will review the framework behind machine learning and discuss recent developments in the field.

  18. Reusable State Machine Code Generator

    Science.gov (United States)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of implementation artefacts such as the middleware. This allows the generator to be used in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even makes it possible to automatically create tests for a generated state machine, using techniques from software testing, such as path coverage.
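
    A sketch of the shape such generated code might take: the model-specific part reduces to a transition table, while the dispatch logic and the notification hook to the environment stay fixed. The states, events and class names here model a hypothetical device and are not the ESO/UTFSM generator's actual output.

        # Transition table of the kind a UML State Chart model boils down to.
        TRANSITIONS = {
            ("Off",     "power_on"):  "Standby",
            ("Standby", "start"):     "Running",
            ("Running", "stop"):      "Standby",
            ("Standby", "power_off"): "Off",
        }

        class GeneratedStateMachine:
            """Shape of generated code: fixed dispatch logic, model-specific table."""
            def __init__(self, initial="Off", notify=print):
                self.state = initial
                self.notify = notify  # project-specific mapping-layer hook

            def handle(self, event):
                key = (self.state, event)
                if key not in TRANSITIONS:
                    raise ValueError(f"{event!r} not allowed in state {self.state!r}")
                self.state = TRANSITIONS[key]
                self.notify(f"-> {self.state}")  # transition notification

        sm = GeneratedStateMachine()
        for ev in ("power_on", "start", "stop", "power_off"):  # one covered path
            sm.handle(ev)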

  19. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    Empirical software engineering aims at making software engineering claims measurable, i.e., to analyze and understand phenomena in software engineering and to evaluate software engineering approaches and solutions. Due to the involvement of humans and the multitude of fields for which software...... is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...

  20. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software has spread along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. The issues surrounding open source software are not entirely accurate, so it is necessary to introduce the concept of open source software, covering its history, its licenses and how to choose a license, as well as the considerations in choosing among the available open source software. Keywords: License, Open Source, HAKI

  1. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed includes (1) "VISIFROG: Vertebrate Anatomy" (grade seven to adult); (2) "Fraction Bars Computer Program" (grades three to six); and (3) four telecommunications utilities. (JN)

  2. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  3. Applying Software Engineering Methodology for Designing Biomedical Software Devoted To Electronic Instrumentation

    OpenAIRE

    2012-01-01

    Problem statement: Significant effort goes into the development of biomedical software, which is integrated with computers/processors, sensors and electronic instrumentation devoted to a specific application. However, the scientific work on electronic instrumentation controlled by biomedical software has not emphasized software development, instead focusing mainly on electronics engineering. The development team is rarely composed of Software Engineering (SE) experts. Usually, a commercial au...

  4. Quantum adiabatic machine learning

    CERN Document Server

    Pudenz, Kristen L

    2011-01-01

    We develop an approach to machine learning and anomaly detection via quantum adiabatic evolution. In the training phase we identify an optimal set of weak classifiers, to form a single strong classifier. In the testing phase we adiabatically evolve one or more strong classifiers on a superposition of inputs in order to find certain anomalous elements in the classification space. Both the training and testing phases are executed via quantum adiabatic evolution. We apply and illustrate this approach in detail to the problem of software verification and validation.

  5. Diamond turning machine controller implementation

    Energy Technology Data Exchange (ETDEWEB)

    Garrard, K.P.; Taylor, L.W.; Knight, B.F.; Fornaro, R.J.

    1988-12-01

    The standard controller for a Pneumo ASG 2500 Diamond Turning Machine, an Allen Bradley 8200, has been replaced with a custom high-performance design. This controller consists of four major components. Axis position feedback information is provided by a Zygo Axiom 2/20 laser interferometer with 0.1 micro-inch resolution. Hardware interface logic couples the computer's digital and analog I/O channels to the diamond turning machine's analog motor controllers, the laser interferometer, and other machine status and control information. It also provides front panel switches for operator override of the computer controller and implements the emergency stop sequence. The remaining two components, the control computer hardware and software, are discussed in detail below.

  6. Model Checking Software Systems: A Case Study.

    Science.gov (United States)

    1995-03-10

    gained. We suggest a radically different tack: model checking. The two formal objects compared are a finite state machine model of the software...simply terminates. 3.1.1. State Machine Model Let’s consider a simplified model with just one client, one server, and one file. The top graph

  7. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  8. Software platform virtualization in chemistry research and university teaching

    Directory of Open Access Journals (Sweden)

    Kind Tobias

    2009-11-01

    Full Text Available Abstract Background Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single guest operating system to execute multiple other operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Results Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages we have confirmed that the computational speed penalty for using virtual machines is low and around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Conclusion Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide.

  9. Software platform virtualization in chemistry research and university teaching.

    Science.gov (United States)

    Kind, Tobias; Leamy, Tim; Leary, Julie A; Fiehn, Oliver

    2009-11-16

    Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single guest operating system to execute multiple other operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages we have confirmed that the computational speed penalty for using virtual machines is low and around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide.

  10. Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.; Carroll, Thomas E.; Muller, George

    2017-04-21

    The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine-learning-based, data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as examples of widely used data-driven classification/modeling strategies.
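
    As a small illustration of the Hidden Markov Model machinery mentioned above, the sketch below scores an observation sequence with the scaled forward algorithm. The two hidden host states and all probabilities are invented for illustration.

        import numpy as np

        # Toy two-state HMM for host behaviour: hidden states {normal, compromised};
        # observations are coarse event categories. All numbers are invented.
        A = np.array([[0.95, 0.05],        # state transition matrix
                      [0.10, 0.90]])
        B = np.array([[0.70, 0.25, 0.05],  # P(observation | state)
                      [0.20, 0.30, 0.50]])
        pi = np.array([0.99, 0.01])        # initial state distribution

        def forward_loglik(obs):
            """Scaled forward algorithm: log-likelihood of an observation sequence."""
            alpha = pi * B[:, obs[0]]
            s = alpha.sum()
            loglik, alpha = np.log(s), alpha / s
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
                s = alpha.sum()               # rescale to avoid underflow
                loglik, alpha = loglik + np.log(s), alpha / s
            return loglik

        # A run heavy in 'rare' events scores far below an ordinary run.
        print(forward_loglik([0, 0, 1, 0, 0]), forward_loglik([2, 2, 2, 2, 2]))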

  11. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach: conceiving of testing as a science. This is because the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. This paper examines the similarities between testing and science, and the characteristics of testing as a science.

  12. Fiscal 1997 project on the R and D of industrial scientific technology under consignment from NEDO. Report on the results of the R and D of new software structuring models (R and D of micromachine cooperative control use software); 1997 nendo sangyo kagaku gijutsu kenkyu kaihatsu jigyo Shin Energy Sangyo Gijutsu Sogo Kaihatsu Kiko itaku. Shin software kozoka model no kenkyu kaihatsu (bisho kikai kyocho seigyoyo software no kenkyu kaihatsu) seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    An R and D effort was conducted on software structuring models which ease the development and maintenance of software systems and meet the diversification of needs. As for the study of the programming language for cooperative control, R and D on the agent-oriented language Flage was carried out, covering expansion of the language functions, arrangement of the network functions, development of exercises, etc. As for the formulation of agent knowledge, processes to derive a program from its specifications were proposed, as was EVA, a mechanism for responding to changes in the specifications of existing programs. In relation to the basic theory of cooperation systems, a study was made mainly of the object-oriented attribute grammar OOAG, a model representing cooperative computation in the software process as a group of rules. Concerning the study of the situation recognition mechanism, models of communication and reasoning among cooperating agents were researched. 187 refs., 107 figs., 23 tabs.

  13. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)]

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  14. On-machine dimensional verification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Rendulic, W.

    1993-08-01

    General technology for automating in-process verification of machined products has been studied and implemented on a variety of machines and products at AlliedSignal Inc., Kansas City Division (KCD). Tests have been performed to establish system accuracy and probe reliability on two numerically controlled machining centers. Commercial software has been revised, and new cycles such as skew check and skew machining, have been developed to enhance and expand probing capabilities. Probe benefits have been demonstrated in the area of setup, cycle time, part quality, tooling cost, and product sampling.

  15. Machine testing

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with a laboratory exercise of 3 hours' duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercise includes a series of tests carried out by the student on a conventional and a numerically controlled lathe, respectively. This document

  16. Representational Machines

    DEFF Research Database (Denmark)

    Petersson, Dag; Dahlgren, Anna; Vestberg, Nina Lager

    to the enterprises of the medium. This is the subject of Representational Machines: How photography enlists the workings of institutional technologies in search of establishing new iconic and social spaces. Together, the contributions to this edited volume span historical epochs, social environments, technological...

  17. Software piracy

    OpenAIRE

    Kráčmer, Stanislav

    2011-01-01

    The objective of the present thesis is to clarify the term of software piracy and to determine responsibility of individual entities as to actual realization of software piracy. First, the thesis focuses on a computer programme, causes, realization and pitfalls of its inclusion under copyright protection. Subsequently, it observes methods of legal usage of a computer programme. This is the point of departure for the following attempt to define software piracy, accompanied with methods of actu...

  18. The relationships between software publications and software systems

    Science.gov (United States)

    Hogg, David W.

    2017-01-01

    When we build software systems or software tools for astronomy, we sometimes do and sometimes don't also write and publish standard scientific papers about those software systems. I will discuss the pros and cons of writing such publications. There are impacts of writing such papers immediately (they can affect the design and structure of the software project itself), in the short term (they can promote adoption and legitimize the software), in the medium term (they can provide a platform for all the literature's mechanisms for citation, criticism, and reuse), and in the long term (they can preserve ideas that are embodied in the software, possibly on timescales much longer than the lifetime of any software context). I will argue that as important as pure software contributions are to astronomy—and I am both a preacher and a practitioner—software contributions are even more valuable when they are associated with traditional scientific publications. There are exceptions and complexities of course, which I will discuss.

  19. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2017-01-01

    is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...

  20. The purely functional software deployment model

    NARCIS (Netherlands)

    Dolstra, E.

    2006-01-01

    Software deployment is the set of activities related to getting software components to work on the machines of end users. It includes activities such as installation, upgrading, uninstallation, and so on. Many tools have been developed to support deployment, but they all have serious limitations wi

  1. The VELO High Voltage System Control Software

    CERN Document Server

    Rakotomiaramanana, B; Eklund, L; De Capua, S

    2010-01-01

    This note describes the VELO high voltage control software. The implementation of its structure as a PVSS Finite State Machine is emphasized. The main error conditions that may occur during operation are also discussed. The VELO HV software conforms to the specification of the VELO.

  2. Introduction: Minds, Bodies, Machines

    Directory of Open Access Journals (Sweden)

    Deirdre Coleman

    2008-10-01

    Full Text Available This issue of 19 brings together a selection of essays from an interdisciplinary conference on 'Minds, Bodies, Machines' convened last year by Birkbeck's Centre for Nineteenth-Century Studies, University of London, in partnership with the English programme, University of Melbourne and software developers Constraint Technologies International (CTI). The conference explored the relationship between minds, bodies and machines in the long nineteenth century, with a view to understanding the history of our technology-driven, post-human visions. It is in the nineteenth century that the relationship between the human and the machine under post-industrial capitalism becomes a pervasive theme. From Blake on the mills of the mind by which we are enslaved, to Carlyle's and Arnold's denunciation of the machinery of modern life, from Dickens's sooty fictional locomotive Mr Pancks, who 'snorted and sniffed and puffed and blew, like a little labouring steam-engine', and 'shot out […] cinders of principles, as if it were done by mechanical revolvency', to the alienated historical body of the late-nineteenth-century factory worker under Taylorization, whose movements and gestures were timed, regulated and rationalised to maximize efficiency; we find a cultural preoccupation with the mechanisation of the nineteenth-century human body that uncannily resonates with modern dreams and anxieties around technologies of the human.

  3. Controls and Machine Protection Systems

    CERN Document Server

    Carrone, E

    2016-01-01

    Machine protection, as part of accelerator control systems, can be managed with a 'functional safety' approach, which takes into account product life cycle, processes, quality, industrial standards and cybersafety. This paper will discuss strategies to manage such complexity and the related risks, with particular attention to fail-safe design and safety integrity levels, software and hardware standards, testing, and verification philosophy. It will also discuss an implementation of a machine protection system at the SLAC National Accelerator Laboratory's Linac Coherent Light Source (LCLS).

  4. Adding machine and calculating machine

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    In 1642 the French mathematician Blaise Pascal (1623-1662) invented a machine that could add and subtract. It had wheels that each had 1 to 10 marked off along their circumference. When the wheel at the right, representing units, made one complete circle, it engaged the wheel to its left, representing tens, and moved it forward one notch.

  5. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  6. Design and manufacturing of abrasive jet machine for drilling operation

    Directory of Open Access Journals (Sweden)

    Mittal Divyansh

    2016-01-01

    Full Text Available Wide application of the Abrasive Jet Machine (AJM) is found in machining hard and brittle materials. Machining of brittle materials by AJM proceeds through brittle fracture and the removal of micro chips from the workpiece. Embedment of the abrasive particles in brittle materials results in a decrease of machining efficiency. In this paper the design and manufacturing of an AJM are presented. Various parts of the AJM have been designed using the ANSYS 16.2 software. The parts were then manufactured indigenously as per the designed parameters. The machine fabricated in this work will be used further for process optimization of AJM parameters for the machining of glass and ceramics.

  7. Genesis machines

    CERN Document Server

    Amos, Martyn

    2014-01-01

    Silicon chips are out. Today's scientists are using real, wet, squishy, living biology to build the next generation of computers. Cells, gels and DNA strands are the 'wetware' of the twenty-first century. Much smaller and more intelligent, these organic computers open up revolutionary possibilities. Tracing the history of computing and revealing a brave new world to come, Genesis Machines describes how this new technology will change the way we think not just about computers - but about life itself.

  8. Multivariate Statistical Analysis Software Technologies for Astrophysical Research Involving Large Data Bases

    Science.gov (United States)

    Djorgovski, S. G.

    1994-01-01

    We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications and has
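
    The GID3* induction step in SKICAT can be suggested with a modern stand-in: the sketch below trains scikit-learn's CART decision tree on synthetic catalog attributes to separate stars from galaxies. The features, their distributions and the classifier are illustrative substitutes, not SKICAT's actual attributes or algorithm.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for measured image attributes
        # (e.g. magnitude and ellipticity); labels: 0 = star, 1 = galaxy.
        n = 2000
        stars = np.column_stack([rng.normal(18, 2, n), rng.normal(0.05, 0.03, n)])
        gals  = np.column_stack([rng.normal(20, 2, n), rng.normal(0.30, 0.10, n)])
        X, y = np.vstack([stars, gals]), np.repeat([0, 1], n)

        # GID3*-style induction replaced here by scikit-learn's CART decision tree.
        clf = DecisionTreeClassifier(max_depth=4).fit(X[::2], y[::2])  # train half
        print("held-out accuracy:", clf.score(X[1::2], y[1::2]))       # test rest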

  9. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as 'autoconf', 'automake', 'libtool', and the de facto standard build tool, 'make'. This ambitious project is born out of the realization that, as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high-performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third-party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  10. Mining software specifications methodologies and applications

    CERN Document Server

    Lo, David

    2011-01-01

    An emerging topic in software engineering and data mining, specification mining tackles software maintenance and reliability issues that cost economies billions of dollars each year. The first unified reference on the subject, Mining Software Specifications: Methodologies and Applications describes recent approaches for mining specifications of software systems. Experts in the field illustrate how to apply state-of-the-art data mining and machine learning techniques to address software engineering concerns. In the first set of chapters, the book introduces a number of studies on mining finite

  11. Etiquette in scientific publishing.

    Science.gov (United States)

    Krishnan, Vinod

    2013-10-01

    Publishing a scientific article in a journal with a high impact factor and a good reputation is considered prestigious among one's peer group and an essential achievement for career progression. In the drive to get their work published, researchers can forget, either intentionally or unintentionally, the ethics that should be followed in scientific publishing. In an environment where "publish or perish" rules the day, some authors might be tempted to bend or break rules. This special article is intended to raise awareness among orthodontic journal editors, authors, and readers about the types of scientific misconduct in the current publishing scenario and to provide insight into the ways these misconducts are managed by the Committee of Publishing Ethics. Case studies are presented, and various plagiarism detection software programs used by publishing companies are briefly described.

  12. VIRTUAL MACHINES IN EDUCATION – CNC MILLING MACHINE WITH SINUMERIK 840D CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    Ireneusz Zagórski

    2014-11-01

    Full Text Available The machining process nowadays could not be conducted without its inseparable elements: the cutting edge and, frequently, numerically controlled milling machines. Milling and lathe machining centres comprise standard equipment in many companies of the machinery industry, e.g. automotive or aircraft. It is for that reason that tertiary education should account for this rising demand. This entails introducing into the curricula forms which enable visualisation of machining and of the milling process, virtual production, and the simulation of virtual machining centres. The Siemens Virtual Machine (Virtual Workshop) sets an example of such software, whose high functionality offers a range of learning experiences, such as: learning the design of machine tools, their configuration, basic operation functions, as well as the basics of CNC.

  13. Scientific news

    NARCIS (Netherlands)

    NN,

    1994-01-01

    The Rijksherbarium/Hortus Botanicus acquired funds through NWO (Netherlands Organisation for Scientific Research) to participate in a 7-year interdisciplinary cooperative programme of Indonesian and Dutch scientific institutions aiming at research in Irian Jaya, Cenderawasih province (the Bird’s Hea

  14. Configurable software for satellite graphics

    Energy Technology Data Exchange (ETDEWEB)

    Hartzman, P D

    1977-12-01

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during real-time operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  15. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements (and an effective system for managing them), the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  16. Simulating Turing machines on Maurer machines

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2008-01-01

    In a previous paper, we used Maurer machines to model and analyse micro-architectures. In the current paper, we investigate the connections between Turing machines and Maurer machines with the purpose of gaining insight into computability issues relating to Maurer machines. We introduce ways to

  17. Software Reviews.

    Science.gov (United States)

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)

  18. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  19. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  20. Environmentally Friendly Machining

    CERN Document Server

    Dixit, U S; Davim, J Paulo

    2012-01-01

    Environment-Friendly Machining provides an in-depth overview of environmentally-friendly machining processes, covering numerous different types of machining in order to identify which practice is the most environmentally sustainable. The book discusses three systems at length: machining with minimal cutting fluid, air-cooled machining and dry machining. Also covered is a way to conserve energy during machining processes, along with useful data and detailed descriptions for developing and utilizing the most efficient modern machining tools. Researchers and engineers looking for sustainable machining solutions will find Environment-Friendly Machining to be a useful volume.

  1. Machine learning phases of matter

    Science.gov (United States)

    Carrasquilla, Juan; Melko, Roger G.

    2017-02-01

    Condensed-matter physics is the study of the collective behaviour of infinitely complex assemblies of electrons, nuclei, magnetic moments, atoms or qubits. This complexity is reflected in the size of the state space, which grows exponentially with the number of particles, reminiscent of the 'curse of dimensionality' commonly encountered in machine learning. Despite this curse, the machine learning community has developed techniques with remarkable abilities to recognize, classify, and characterize complex sets of data. Here, we show that modern machine learning architectures, such as fully connected and convolutional neural networks, can identify phases and phase transitions in a variety of condensed-matter Hamiltonians. Readily programmable through modern software libraries, neural networks can be trained to detect multiple types of order parameter, as well as highly non-trivial states with no conventional order, directly from raw state configurations sampled with Monte Carlo.
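
    A toy version of the training loop described above, assuming scikit-learn: a small fully connected network learns to separate 'ordered' from 'disordered' spin configurations. The i.i.d. magnetization-biased sampler is a crude stand-in for genuine Monte Carlo sampling of an Ising model.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        L = 8  # linear lattice size; a raw configuration is L*L spins

        def sample(p_up, n):
            """Crude stand-in for Monte Carlo sampling: i.i.d. biased spins."""
            return np.where(rng.random((n, L * L)) < p_up, 1, -1)

        # 'Ordered' (large |magnetization|) vs 'disordered' (magnetization ~ 0)
        X = np.vstack([sample(0.95, 500), sample(0.05, 500), sample(0.5, 1000)])
        y = np.array([1] * 1000 + [0] * 1000)

        net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
        net.fit(X[::2], y[::2])
        print("phase classification accuracy:", net.score(X[1::2], y[1::2]))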

  2. Reusable Software.

    Science.gov (United States)

    1984-03-01

    overseeing reusable software, the Reusable Software Organization (RUSO). This author does not feel at this time that establishment of such a specific...[49] have not been accompanied by establishment of RUSO-like activities. There is need, however, for assurance that functions which a RUSO might be...assurance 6. establishment and maintenance of reuse archival facilities and activities. Actual establishment of a RUSO is best dictated by size of the

  3. Software Epistemology

    Science.gov (United States)

    2016-03-01

    comprehensive approach for determining software epistemology which significantly advances the state of the art in automated vulnerability discovery...straightforward. First, internet-based repositories of open source software (e.g., FreeBSD ports, GitHub, SourceForge, etc.) are mined...the fix delta, we attempted to perform the same process to determine if the firmware release present in an Internet-of-Things (IoT) streaming camera

  4. Machine learning phases of matter

    OpenAIRE

    Carrasquilla, Juan; Melko, Roger G.

    2016-01-01

    Neural networks can be used to identify phases and phase transitions in condensed matter systems via supervised machine learning. Readily programmable through modern software libraries, we show that a standard feed-forward neural network can be trained to detect multiple types of order parameter directly from raw state configurations sampled with Monte Carlo. In addition, they can detect highly non-trivial states such as Coulomb phases, and if modified to a convolutional neural network, topol...

  5. Machine Transliteration

    CERN Document Server

    Knight, Kevin; Graehl, Jonathan

    1997-01-01

    It is challenging to translate names and technical terms across languages with different alphabets and sound inventories. These items are commonly transliterated, i.e., replaced with approximate phonetic equivalents. For example, "computer" in English comes out as "konpyuutaa" in Japanese. Translating such items from Japanese back to English is even more challenging, and of practical interest, as transliterated items make up the bulk of text phrases not found in bilingual dictionaries. We describe and evaluate a method for performing backwards transliterations by machine. This method uses a generative model, incorporating several distinct stages in the transliteration process.
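
    A drastically reduced sketch of the staged generative idea: each Japanese sound unit maps to candidate English substrings with rough weights, compositions are enumerated, and a lexicon filters the candidates. The sound-unit table, weights and lexicon are invented and far smaller than the learned models in the paper.

        import itertools

        # Tiny sound-unit table (invented; the paper learns such mappings).
        UNITS = {
            "ko":  [("co", 0.6), ("ko", 0.4)],
            "n":   [("n", 0.6), ("m", 0.4)],   # nasal often assimilates to 'm'
            "pyu": [("pu", 0.7), ("pew", 0.3)],
            "u":   [("", 0.6), ("u", 0.4)],    # long vowels often vanish in English
            "taa": [("ter", 0.7), ("tor", 0.3)],
        }
        LEXICON = {"computer", "commuter"}     # stand-in for a bilingual dictionary

        def back_transliterate(units):
            """Enumerate stagewise compositions; keep candidates in the lexicon."""
            best = {}
            for combo in itertools.product(*(UNITS[u] for u in units)):
                word = "".join(s for s, _ in combo)
                p = 1.0
                for _, w in combo:
                    p *= w
                if word in LEXICON:
                    best[word] = max(best.get(word, 0.0), p)
            return sorted(best.items(), key=lambda kv: -kv[1])

        print(back_transliterate(["ko", "n", "pyu", "u", "taa"]))  # "konpyuutaa"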

  6. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  7. Study of on-machine error identification and compensation methods for micro machine tools

    Science.gov (United States)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-08-01

    Micro machining plays an important role in the manufacturing of miniature products which are made of various materials with complex 3D shapes and tight machining tolerances. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by re-installment of the workpiece, the measurement and compensation should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very tiny, the measurement method should be non-contact. By integrating image reconstruction, camera pixel correction, coordinate transformation, an error identification algorithm, and a trajectory auto-correction method, a vision-based error measurement and compensation method was developed in this study that can inspect micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to reconstruct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual and theoretical cutting points were calculated and used to correct the NC program. With the use of the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results
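
    Assuming OpenCV 4's Python bindings, the sketch below mimics the measurement chain on a synthetic image: Canny edge detection, contour reconstruction, and per-point error computation against the theoretical contour. Pixel calibration and the moving matching window are omitted, and all geometry is invented.

        import cv2
        import numpy as np

        # Synthetic stand-in for an on-machine camera image: a machined square
        # whose contour is slightly off the nominal geometry.
        img = np.zeros((200, 200), dtype=np.uint8)
        cv2.rectangle(img, (52, 49), (148, 152), 255, thickness=-1)

        edges = cv2.Canny(img, 50, 150)  # edge map of the machined contour
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        actual = contours[0].reshape(-1, 2).astype(float)  # actual contour points

        # Theoretical contour from the NC program: the nominal 100x100 square.
        theory = np.array([(x, 50) for x in range(50, 151)] +
                          [(150, y) for y in range(51, 151)] +
                          [(x, 150) for x in range(149, 49, -1)] +
                          [(50, y) for y in range(149, 50, -1)], dtype=float)

        # Error at each theoretical cutting point = distance to nearest actual
        # point; these offsets would feed the error-corrected NC program.
        d = np.linalg.norm(theory[:, None, :] - actual[None, :, :], axis=2).min(axis=1)
        print("max error (px):", d.max(), "mean error (px):", round(d.mean(), 2))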

  8. Model-Driven Robot-Software Design using integrated Models and Co-Simulation

    NARCIS (Netherlands)

    Broenink, Johannes F.; Ni, Yunyun; McAllister, J.; Bhattacharyya, S.

    2012-01-01

    The work presented here is on a methodology for design of hard real-time embedded control software for robots, i.e. mechatronic products. The behavior of the total robot system (machine, control, software and I/O) is relevant, because the dynamics of the machine influences the robot software.

  9. Software Defect Detection with Rocus

    Institute of Scientific and Technical Information of China (English)

    Yuan Jiang; Ming Li; Zhi-Hua Zhou

    2011-01-01

    Software defect detection aims to automatically identify defective software modules for efficient software testing in order to improve the quality of a software system. Although many machine learning methods have been successfully applied to the task, most of them fail to consider two practical yet important issues in software defect detection. First, it is rather difficult to collect a large amount of labeled training data for learning a well-performing model; second, in a software system there are usually far fewer defective modules than defect-free modules, so learning would have to be conducted over an imbalanced data set. In this paper, we address these two practical issues simultaneously by proposing a novel semi-supervised learning approach named Rocus. This method exploits the abundant unlabeled examples to improve the detection accuracy, as well as employs under-sampling to tackle the class-imbalance problem in the learning process. Experimental results on real-world software defect detection tasks show that Rocus is effective for software defect detection. Its performance is better than a semi-supervised learning method that ignores the class-imbalance nature of the task and a class-imbalance learning method that does not make effective use of unlabeled data.
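
    A generic sketch of the two ingredients, not the actual Rocus algorithm: under-sampling the majority class before training, then a simple self-training loop that adopts confident pseudo-labels from the unlabeled pool. Data, thresholds and the base learner are illustrative.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        # Synthetic software-metric data: few defective modules (label 1),
        # and only 10% of all modules carry labels.
        n, d = 2000, 10
        X = rng.normal(size=(n, d))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n) > 1.8).astype(int)
        labeled = rng.random(n) < 0.10

        def undersample(idx):
            """Balance classes by dropping part of the majority (defect-free) class."""
            pos = idx[y[idx] == 1]
            neg = rng.choice(idx[y[idx] == 0], size=len(pos), replace=False)
            return np.concatenate([pos, neg])

        # Self-training: fit on balanced labeled data, adopt confident pseudo-
        # labels, refit. (A faithful Rocus would also rebalance the pseudo-labels.)
        train = undersample(np.flatnonzero(labeled))
        clf = LogisticRegression().fit(X[train], y[train])
        for _ in range(3):
            proba = clf.predict_proba(X[~labeled])[:, 1]
            conf = (proba > 0.9) | (proba < 0.1)
            Xa = np.vstack([X[train], X[~labeled][conf]])
            ya = np.concatenate([y[train], (proba[conf] > 0.5).astype(int)])
            clf = LogisticRegression().fit(Xa, ya)
        print("accuracy on all modules:", round(clf.score(X, y), 3))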

  10. Machine Protection

    CERN Document Server

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an ...

  11. Heterogeneous versus Homogeneous Machine Learning Ensembles

    Directory of Open Access Journals (Sweden)

    Petrakova Aleksandra

    2015-12-01

    The research demonstrates the efficiency of applying a heterogeneous model ensemble to a cancer diagnostic procedure. The machine learning methods used to train the ensemble are neural networks, random forest, support vector machines and an offspring selection genetic algorithm. Training of the models and design of the ensemble are performed by means of the HeuristicLab software. The data used in the research were provided by the General Hospital of Linz, Austria.
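
    The study used HeuristicLab (a C#/.NET framework); a minimal scikit-learn stand-in for a heterogeneous soft-voting ensemble of a neural network, a random forest and an SVM might look like the sketch below (the bundled breast-cancer dataset substitutes for the Linz hospital data, and the offspring selection genetic algorithm is omitted):

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)   # stand-in diagnostic data

    # heterogeneous ensemble: three different model families vote on each case
    ensemble = VotingClassifier([
        ("nn", make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000))),
        ("rf", RandomForestClassifier(n_estimators=200)),
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
    ], voting="soft")

    print(cross_val_score(ensemble, X, y, cv=5).mean())
    ```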

  12. Development of Machine Learning Tools in ROOT

    Science.gov (United States)

    Gleyzer, S. V.; Moneta, L.; Zapata, Omar A.

    2016-10-01

    ROOT is a framework for large-scale data analysis that provides basic and advanced statistical methods used by the LHC experiments. These include machine learning algorithms from the ROOT-integrated Toolkit for Multivariate Analysis (TMVA). We present several recent developments in TMVA, including a new modular design, new algorithms for variable importance and cross-validation, interfaces to other machine-learning software packages and integration of TMVA with Jupyter, making it accessible with a browser.
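
    A minimal TMVA classification job using the modular DataLoader design mentioned above might look like the following PyROOT sketch (file names, tree names and option strings are placeholders trimmed to essentials, not a recommended configuration):

    ```python
    import ROOT

    ROOT.TMVA.Tools.Instance()
    out = ROOT.TFile("TMVA.root", "RECREATE")
    factory = ROOT.TMVA.Factory("job", out, "AnalysisType=Classification")

    loader = ROOT.TMVA.DataLoader("dataset")
    for var in ("var1", "var2"):
        loader.AddVariable(var, "F")

    data = ROOT.TFile.Open("input.root")          # hypothetical input file
    loader.AddSignalTree(data.Get("TreeS"), 1.0)
    loader.AddBackgroundTree(data.Get("TreeB"), 1.0)
    loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random")

    factory.BookMethod(loader, ROOT.TMVA.Types.kBDT, "BDT", "NTrees=200")
    factory.TrainAllMethods()
    factory.TestAllMethods()
    factory.EvaluateAllMethods()
    out.Close()
    ```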

  13. A Scientific Cloud Computing Platform for Condensed Matter Physics

    Science.gov (United States)

    Jorissen, K.; Johnson, W.; Vila, F. D.; Rehr, J. J.

    2013-03-01

    Scientific Cloud Computing (SCC) makes possible calculations with high performance computational tools, without the need to purchase or maintain sophisticated hardware and software. We have recently developed an interface dubbed SC2IT that controls on-demand virtual Linux clusters within the Amazon EC2 cloud platform. Using this interface we have developed a more advanced, user-friendly SCC Platform configured especially for condensed matter calculations. This platform contains a GUI, based on a new Java version of SC2IT, that permits calculations of various materials properties. The cloud platform includes Virtual Machines preconfigured for parallel calculations and several precompiled and optimized materials science codes for electronic structure and x-ray and electron spectroscopy. Consequently this SCC makes state-of-the-art condensed matter calculations easy to access for general users. Proof-of-principle performance benchmarks show excellent parallelization and communication performance. Supported by NSF grant OCI-1048052
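
    SC2IT itself is a Java interface; as a hedged illustration of the underlying idea, provisioning an on-demand virtual cluster inside EC2, here is a boto3 sketch (the AMI ID, instance type and key name are placeholders):

    ```python
    import boto3

    ec2 = boto3.resource("ec2")

    # start a small on-demand virtual cluster (all identifiers are hypothetical)
    nodes = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",   # a preconfigured scientific VM image
        InstanceType="c5.xlarge",
        MinCount=4, MaxCount=4,
        KeyName="my-cluster-key",
    )
    for node in nodes:
        node.wait_until_running()
    print([node.id for node in nodes])
    ```

    The platform described above additionally ships VM images preconfigured for parallel runs of the materials-science codes; that provisioning step is omitted here.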

  14. A Robust, Format-Agnostic Scientific Data Transfer Framework

    Directory of Open Access Journals (Sweden)

    James Hester

    2016-09-01

    The olog approach of Spivak and Kent (PLoS ONE 7(1), 2012, e24274) is applied to the practical development of data transfer frameworks, yielding simple rules for the construction and assessment of data transfer standards. The simplicity, extensibility and modularity of such descriptions allow discipline experts unfamiliar with complex ontological constructs or toolsets to synthesise multiple pre-existing standards, potentially including a variety of file formats, into a single overarching ontology. These ontologies nevertheless capture all scientifically relevant prior knowledge, and when expressed in machine-readable form are sufficiently expressive to mediate translation between legacy and modern data formats. A format-independent programming interface informed by this ontology consists of six functions, of which only two handle data. Demonstration software implementing this interface is used to translate between two common diffraction image formats using such an ontology in place of an intermediate format.
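
    The abstract does not enumerate the six functions, so the following Python sketch is purely hypothetical: it only shows the shape of a format-independent interface in which two of six methods handle data values while the rest manage traversal of a file:

    ```python
    from abc import ABC, abstractmethod

    class DataTransfer(ABC):
        """Hypothetical format-independent adapter; each file format supplies
        one concrete subclass, and translation is read_value -> write_value."""

        @abstractmethod
        def open(self, handle): ...          # bind to a file or stream

        @abstractmethod
        def close(self): ...                 # release the underlying resource

        @abstractmethod
        def next_entry(self): ...            # advance to the next data unit

        @abstractmethod
        def entry_id(self): ...              # identify the current data unit

        @abstractmethod
        def read_value(self, name): ...      # data-handling function 1

        @abstractmethod
        def write_value(self, name, value): ...  # data-handling function 2
    ```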

  15. Recent results from the Swinburne supercomputer software correlator

    Science.gov (United States)

    Tingay, Steven; et al.

    I will describe the development of software correlators on the Swinburne Beowulf supercomputer and recent work using the Cray XD-1 machine. I will also describe recent Australian and global VLBI experiments that have been processed on the Swinburne software correlator, along with imaging results from these data. The role of the software correlator in Australia's eVLBI project will be discussed.

  16. Numerical modeling and optimization of machining duplex stainless steels

    Directory of Open Access Journals (Sweden)

    Rastee D. Koyee

    2015-01-01

    The shortcomings of analytical and empirical machining models must be overcome if the demands of industry are to be fulfilled. Three-dimensional finite element modeling (FEM) introduces an attractive alternative that bridges the gap between purely empirical and fundamental scientific quantities and meets industry needs. However, the challenging aspects that hinder the successful adoption of FEM in the machining sector of the manufacturing industry have to be solved first. One of the greatest challenges is the identification of the correct set of machining simulation input parameters. This study presents a new methodology to inversely calculate the input parameters when simulating the machining of standard duplex EN 1.4462 and super duplex EN 1.4410 stainless steels. JMatPro software is first used to model the elastic-viscoplastic and physical work-material behavior. In order to effectively obtain an optimum set of inversely identified friction coefficients, thermal contact conductance, Cockcroft-Latham critical damage value, percentage reduction in flow stress, and Taylor-Quinney coefficient, Taguchi-VIKOR coupled with a Firefly Algorithm Neural Network System is applied. The optimization procedure effectively minimizes the overall differences between experimentally measured performances, such as cutting forces, tool nose temperature and chip thickness, and the numerically obtained ones at any specified cutting condition. The optimum set of input parameters is verified and used for the next step of the 3D-FEM application. In the next stage of the study, design of experiments, numerical simulations, and fuzzy rule modeling approaches are employed to optimize types of chip breaker, insert shapes, process conditions, cutting parameters, and tool orientation angles based on many important performances. Through this study, not only a new methodology for defining the optimal set of controllable parameters for turning simulations is introduced, but also ...
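
    The inverse identification step can be pictured as a generic simulation-in-the-loop optimization. The sketch below (SciPy; the surrogate `run_fem` function, the reduced parameter set and the measured values are hypothetical, and Nelder-Mead stands in for the Taguchi-VIKOR/firefly machinery used in the paper) minimizes the normalized mismatch between measured and simulated outputs:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # measured performances: force [N], tool nose temperature [degC], chip thickness [mm]
    measured = np.array([310.0, 650.0, 0.21])

    def run_fem(params):
        """Placeholder for one 3D cutting simulation returning (force, temperature,
        chip thickness); in practice this call would drive the FE solver."""
        mu, h, dcrit = params
        return np.array([250.0 + 400.0 * mu,       # toy surrogate responses
                         500.0 + 0.01 * h,
                         0.15 + 0.30 * dcrit])

    def objective(params):
        sim = run_fem(params)
        return np.sum(((sim - measured) / measured) ** 2)  # normalized squared error

    res = minimize(objective, x0=[0.4, 10000.0, 0.2], method="Nelder-Mead")
    mu, h, dcrit = res.x   # inversely identified friction, conductance, damage value
    ```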

  17. Strategies for measuring machine consciousness

    OpenAIRE

    2009-01-01

    The accurate measurement of the level of consciousness of a creature remains a major scientific challenge; nevertheless, a number of new accounts that attempt to address this problem have been proposed recently. In this paper we analyze the principles of these new measures of consciousness along with other classical approaches, focusing on their applicability to Machine Consciousness (MC). Furthermore, we propose a set of requirements for what we think a suitable measure for MC should be, and discuss ...

  18. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today's markets. The authors address traditional machining topics such as single- and multiple-point cutting processes, grinding, component accuracy and metrology, shear stress in cutting, cutting temperature and analysis, and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re ...

  19. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    Science.gov (United States)

    Djorgovski, S. George

    1994-01-01

    We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology in order to classify the detected objects objectively and uniformly, and to facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* decision-tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complete database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of STATPROG, a package for multivariate statistical analysis of small and moderate-size data sets. The package was tested extensively on a number of real scientific applications and has produced real, published results.
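
    GID3* is not available in common open-source libraries; a generic decision-tree stand-in for the catalog classification step (scikit-learn, with synthetic features in place of the survey's measured image attributes) would look like:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # toy catalog: per-object photometric/morphological features with
    # star-vs-galaxy labels (synthetic stand-in for plate measurements)
    X, y = make_classification(n_samples=2000, n_features=8, random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

    tree = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
    print(tree.score(X_te, y_te))   # classification accuracy on held-out objects
    ```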

  20. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows the user to analyze an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, to mark all pathologies on the images and report their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes in their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result ...

  1. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules of computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text comprises six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc ...

  2. Terra Harvest software architecture

    Science.gov (United States)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  3. Languages for computer-controlled machines

    OpenAIRE

    MUCHKA, Martin

    2012-01-01

    The work deals with the options for, and the description of, the programming languages of computer-controlled machine tools. The introductory part presents the history and development of numerical control and describes certain control systems, with an emphasis on learning the concepts so that the text can also be used for study. The next section describes the school's CNC milling machine, both hardware and software, and gives example theses on the CNC machine in practice. In the context of the work of a well-orga ...

  4. Modal Analysis of Drilling Machine Derrick

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In this paper, the finite element analysis software ANSYS is applied to the modal analysis of a ZJ30/1700CZ drilling machine derrick under a natural condition and a loaded condition, respectively. The first nine natural frequencies and the corresponding mode shapes of the derrick are calculated. By comparing the natural frequencies of the derrick with the design work frequency of the drilling machine and analyzing the corresponding mode shapes, the drilling machine derrick structure design is shown to be correct.
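
    The computation behind such a modal analysis is the generalized eigenproblem K x = w^2 M x. A toy lumped-parameter version (NumPy/SciPy; the three masses and the stiffness value are hypothetical and unrelated to the actual derrick model) shows how natural frequencies and mode shapes are extracted:

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # toy 3-DOF lumped model: mass matrix M [kg] and stiffness matrix K [N/m]
    M = np.diag([4000.0, 3000.0, 2000.0])
    k = 5e7
    K = np.array([[2 * k, -k, 0.0],
                  [-k, 2 * k, -k],
                  [0.0, -k, k]])

    w2, modes = eigh(K, M)               # generalized eigenproblem K x = w^2 M x
    freqs_hz = np.sqrt(w2) / (2 * np.pi)
    print(freqs_hz)                      # natural frequencies; columns of `modes`
                                         # are the corresponding mode shapes
    ```

    In the paper's workflow, each computed frequency would then be compared against the drilling machine's design work frequency to check for resonance.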

  5. Photonometers for coating and sputtering machines

    Directory of Open Access Journals (Sweden)

    Václavík J.

    2013-05-01

    The concept of photonometers (an alternative name for optical monitors of a vacuum deposition process) for coating and sputtering machines is based on photonometers produced by companies such as SATIS or HV Dresden. Photonometers have been developed in the TOPTEC centre and its predecessor VOD (Optical Development Workshop of the Institute of Plasma Physics AS CR) for more than 10 years. The article describes the current status of the technology and the ideas that will be incorporated in the next development steps. The hardware and software used on the coating machines B63D and VNA600 and the sputtering machine UPM810 are presented.

  6. BADMINTON TRAINING MACHINE WITH IMPACT MECHANISM

    OpenAIRE

    B.F. Yousif; KOK SOON YEH

    2011-01-01

    In the current work, a new machine was designed and fabricated for badminton training purposes. In the design process, CATIA software was used to design and simulate the machine components. The design was based on a direct impact method to launch the shuttle, using a spring as the source of the impact. Hooke's law was used theoretically to determine the initial and maximum lengths of the springs. The main feature of the machine is that it can move in two axes (up and down, left and right). For ...

  7. Operating System For Numerically Controlled Milling Machine

    Science.gov (United States)

    Ray, R. B.

    1992-01-01

    OPMILL program is operating system for Kearney and Trecker milling machine providing fast easy way to program manufacture of machine parts with IBM-compatible personal computer. Gives machinist "equation plotter" feature, which plots equations that define movements and converts equations to milling-machine-controlling program moving cutter along defined path. System includes tool-manager software handling up to 25 tools and automatically adjusts to account for each tool. Developed on IBM PS/2 computer running DOS 3.3 with 1 MB of random-access memory.
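
    The "equation plotter" idea, turning a parametric equation directly into cutter moves, can be sketched in a few lines (Python; the G-code dialect, feed rate and circle example are illustrative, not OPMILL's actual output):

    ```python
    import math

    def equation_to_gcode(fx, fy, t0, t1, steps, feed=200.0):
        """Sample a parametric curve x=fx(t), y=fy(t) and emit linear G1 moves,
        in the spirit of OPMILL's equation-plotter feature (toy sketch)."""
        lines = ["G21 G90",                                # mm units, absolute moves
                 f"G0 X{fx(t0):.4f} Y{fy(t0):.4f}"]        # rapid to start point
        for i in range(1, steps + 1):
            t = t0 + (t1 - t0) * i / steps
            lines.append(f"G1 X{fx(t):.4f} Y{fy(t):.4f} F{feed:.0f}")
        return "\n".join(lines)

    # example: approximate a 20 mm radius circle with 72 chords
    print(equation_to_gcode(lambda t: 20 * math.cos(t),
                            lambda t: 20 * math.sin(t),
                            0.0, 2 * math.pi, 72))
    ```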

  9. Vibration-assisted machining of single crystal

    Science.gov (United States)

    Zahedi, S. A.; Roy, A.; Silberschmidt, V. V.

    2013-07-01

    Vibration-assisted machining offers a solution to the expanding need for improved machining, especially where accuracy and precision are of importance, such as in the micromachining of single crystals of metals and alloys. Crystallographic anisotropy plays a crucial role in determining the overall response to machining. In this study, we address ultra-precision machining of material at the micron scale using computational modelling. A hybrid modelling approach is implemented that combines two discrete schemes: smoothed particle hydrodynamics and continuum finite elements. The model is implemented in the commercial software ABAQUS/Explicit employing a user-defined subroutine (VUMAT) and used to elucidate the effect of crystallographic anisotropy on the response of face-centred cubic (f.c.c.) metals to machining.

  10. [Software version and medical device software supervision].

    Science.gov (United States)

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of software versioning in medical device software supervision does not currently receive enough attention. First, the role of software versions in medical device software supervision is discussed, and then the necessity of version control is analyzed, based on a discussion of common misunderstandings of software versioning. Finally, concrete suggestions are proposed on software version naming rules, supervision of the software versions embedded in medical devices, and a software version supervision scheme.

  11. Human Factors Analysis in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Xu Ren-zuo; Ma Ruo-feng; Liu Li-na; Xiong Zhong-wei

    2004-01-01

    General human factors analysis analyzes human functions, effects and influence in a system. In a narrower sense, it analyzes human influence upon the reliability of a system; it includes traditional human reliability analysis, human error analysis, man-machine interface analysis, human character analysis, and others. Whether a software development project in software engineering succeeds is largely determined by human factors. In this paper, we discuss what human factors encompass, demonstrate the importance of human factors analysis for software engineering by listing some instances, and finally take a preliminary look at the mentality that a practitioner in software engineering should possess.

  12. Software Systems

    Institute of Scientific and Technical Information of China (English)

    崔涛; 周淼

    1996-01-01

    The information used with computers is known as software and includes programs and data. Programs are sets of instructions telling the computer what operations have to be carried out and in what order they should be done. Specialised programs which enable the computer to be used for particular purposes are called applications programs. A collection of these programs kept ...

  13. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  14. Scientific communication

    Directory of Open Access Journals (Sweden)

    Aleksander Kobylarek

    2017-09-01

    The article tackles the problem of models of communication in science. The formal division of communication processes into oral and written does not resolve the problem of attitude. The author defines successful communication as a win-win game based on the respect and equality of the partners, regardless of their position in the world of science. The core characteristics of the process of scientific communication are indicated, such as openness, fairness, support, and creation. The task of creating the right atmosphere for science communication belongs to moderators, who should not allow privilege and differentiation of position to affect scientific communication processes.

  15. Mendeley Impact on Scientific Writing: Thematic Analysis

    National Research Council Canada - National Science Library

    Salija, Kisman; Hidayat, Rahmat; Patak, Andi Anto

    2016-01-01

    ... This study aims at examining the impact of Mendeley on scientific writing. Thematic analysis was applied in this study to transcribe the recordings and to translate and analyze the transcripts using NVivo 8 software ...

  16. Automation of printing machine

    OpenAIRE

    Sušil, David

    2016-01-01

    This bachelor thesis focuses on the automation of a printing machine and on comparing two types of printing machines. The first chapter deals with the history of printing, typesetting, printing techniques and various kinds of bookbinding. The second chapter describes the difference between sheet-fed printing machines and offset printing machines, the difference between two representative rotary machines, the technological process of producing products on these machines, and the description of the mac ...

  17. Hyperspectral Soil Mapper (HYSOMA) software interface: Review and future plans

    Science.gov (United States)

    Chabrillat, Sabine; Guillaso, Stephane; Eisele, Andreas; Rogass, Christian

    2014-05-01

    ... code was developed as standalone IDL software to allow easy implementation in the hyperspectral and non-hyperspectral communities. Indeed, within the hyperspectral community the IDL language is very widely used, and non-expert users who do not have an ENVI license can execute such software as a binary version using the free IDL virtual machine under various operating systems. Based on the growing interest of users in the software interface, the experimental software was adapted for a public release version in 2012, and since then ~80 users of hyperspectral soil products have downloaded the soil algorithms at www.gfz-potsdam.de/hysoma. The software interface was distributed for free as IDL plug-ins under the IDL virtual machine. Up to now, distribution of HYSOMA has been based on a closed-source license model, for non-commercial and educational purposes. HYSOMA is currently under further development in the context of the EnMAP satellite mission, for extension and implementation in the EnMAP Box as EnSoMAP (EnMAP SOil MAPper). The EnMAP Box is a freely available, platform-independent software distributed under an open source license. In the presentation we will focus on an update of the HYSOMA software interface status and the upcoming implementation in the EnMAP Box. Scientific software validation, the associated publication record and user responses, as well as software management and the transition to open source, will be discussed.

  18. VERICUT-Based Virtual NC Machining Simulation for Double-Enveloping Hourglass Worm Processing

    Institute of Scientific and Technical Information of China (English)

    陈小静; 李文星

    2012-01-01

    The machining of a double-enveloping hourglass worm is complex because of its spatial spiral tooth surface. As double-enveloping hourglass worm gearing is characterized by multi-tooth, double-line contact, the manufacturing errors and load-induced elastic deformations present under actual working conditions have a very sensitive influence, and they increase manufacturing cost and difficulty. By using VERICUT software to perform virtual NC machining simulation during the development of the processing equipment, the machine-tool motions and machining performance can be verified, and development efficiency can be greatly improved.

  19. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
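
    A minimal concrete instance of the probabilistic framework is posterior inference by Markov chain Monte Carlo. The sketch below (NumPy; synthetic data, a random-walk Metropolis sampler, and arbitrary prior and proposal settings) infers the mean of noisy observations together with its uncertainty:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(2.0, 1.0, size=50)       # synthetic observations

    def log_post(mu):
        # log prior N(0, 10^2) plus log likelihood N(mu, 1), up to a constant
        return -0.5 * (mu / 10.0) ** 2 - 0.5 * np.sum((data - mu) ** 2)

    samples, mu = [], 0.0
    for _ in range(5000):
        prop = mu + rng.normal(0, 0.5)         # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
            mu = prop                          # Metropolis accept step
        samples.append(mu)

    posterior = np.array(samples[1000:])       # discard burn-in
    print(posterior.mean(), posterior.std())   # estimate and its uncertainty
    ```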

  1. Diamond Measuring Machine

    Energy Technology Data Exchange (ETDEWEB)

    Krstulic, J.F.

    2000-01-27

    The fundamental goal of this project was to develop additional capabilities to the diamond measuring prototype, work out technical difficulties associated with the original device, and perform automated measurements which are accurate and repeatable. For this project, FM and T was responsible for the overall system design, edge extraction, and defect extraction and identification. AccuGem provided a lab and computer equipment in Lawrence, 3D modeling, industry expertise, and sets of diamonds for testing. The system executive software which controls stone positioning, lighting, focusing, report generation, and data acquisition was written in Microsoft Visual Basic 6, while data analysis and modeling were compiled in C/C++ DLLs. All scanning parameters and extracted data are stored in a central database and available for automated analysis and reporting. The Phase 1 study showed that data can be extracted and measured from diamond scans, but most of the information had to be manually extracted. In this Phase 2 project, all data required for geometric modeling and defect identification were automatically extracted and passed to a 3D modeling module for analysis. Algorithms were developed which automatically adjusted both light levels and stone focus positioning for each diamond-under-test. After a diamond is analyzed and measurements are completed, a report is printed for the customer which shows carat weight, summarizes stone geometry information, lists defects and their size, displays a picture of the diamond, and shows a plot of defects on a top view drawing of the stone. Initial emphasis of defect extraction was on identification of feathers, pinpoints, and crystals. Defects were plotted color-coded by industry standards for inclusions (red), blemishes (green), and unknown defects (blue). Diamonds with a wide variety of cut quality, size, and number of defects were tested in the machine. Edge extraction, defect extraction, and modeling code were tested for

  2. Machine musicianship

    Science.gov (United States)

    Rowe, Robert

    2002-05-01

    The training of musicians begins by teaching basic musical concepts, a collection of knowledge commonly known as musicianship. Computer programs designed to implement musical skills (e.g., to make sense of what they hear, perform music expressively, or compose convincing pieces) can similarly benefit from access to a fundamental level of musicianship. Recent research in music cognition, artificial intelligence, and music theory has produced a repertoire of techniques that can make the behavior of computer programs more musical. Many of these were presented in a recently published book/CD-ROM entitled Machine Musicianship. For use in interactive music systems, we are interested in those which are fast enough to run in real time and that need only make reference to the material as it appears in sequence. This talk will review several applications that are able to identify the tonal center of musical material during performance. Beyond this specific task, the design of real-time algorithmic listening through the concurrent operation of several connected analyzers is examined. The presentation includes discussion of a library of C++ objects that can be combined to perform interactive listening and a demonstration of their capability.

  3. A Supplement to Journey to the West, Faust and the Time Traveling Machine: Winding Their Way through Literary Fantasy and Scientific Time

    Institute of Scientific and Technical Information of China (English)

    张一方

    2013-01-01

    Comparing three different literary works from China and abroad - A Supplement to Journey to the West, Faust and the Time Traveling Machine - we find that their principal common feature is winding their way through time. This represents the development from mythological novel and romanticism to science fiction, and it also traces humanity's path from fantasy to modern science.

  4. Finite Element Analysis of Reciprocating Screw for Injection Molding Machine

    Directory of Open Access Journals (Sweden)

    Nagsen B. Nagrale

    2011-06-01

    This paper deals with the solution of a problem that occurred in the reciprocating screw of an injection molding machine, identifying and solving it with modeling and simulation techniques. The problem was wearing of the screw threads caused by the temperature of the mold materials (flow materials, i.e. nylon, low-density polypropylene, polystyrene, PVC, etc.). The main work was to model the components of the machine with dimensions, assemble those components, and then simulate the whole assembly for rotation of the screw. The modeling software used is Pro-E Wildfire 4.0, for modeling machine components such as the body, movable platen, fixed platen, barrel, screw and nozzle. The analysis software ANSYS is used to analyze the reciprocating screws. The objectives involved are: to model all the components using the modeling software Pro-E 4.0; to assemble all the components of the machine in the software; to make the assembly run in the Pro-E software; to analyze the screw of the machine using ANSYS 11.0 software; and to identify the wearing of threads and provide possible solutions. This problem is a major one for all industrial injection molding machines; the industries facing it need a permanent solution, and if a better solution is achieved they will consider implementing it. The industries have a temporary solution, but it affects the life of the screw, because the stresses are higher in a screw machined on a lathe than in a normal screw. Also, if the screw fails after some years of operation, a new screw available on the market will have the same problem. In addition, the cost associated with a new screw and its mounting is considerable, as it is the main component of the machine.

  5. Dynamic Modal Analysis of Vertical Machining Centre Components

    Directory of Open Access Journals (Sweden)

    Anayet U. Patwari

    2009-01-01

    The paper presents a systematic procedure and details of the use of experimental and analytical modal analysis techniques for the structural dynamic evaluation of a vertical machining centre. The main results deal with assessment of the mode shapes of the different components of the vertical machining centre. A simplified experimental modal analysis of different components of the milling machine was carried out. A model of each machine tool structure was made in design software and analyzed by finite element simulation using ABAQUS software to extract the theoretical mode shapes of the components. The model is evaluated and corrected against experimental results obtained by modal testing of the machine components, in which the natural frequencies and the shapes of the vibration modes are analyzed. The analysis resulted in determination of the direction of maximal compliance of each machine component.

  6. Software based controls module development

    Energy Technology Data Exchange (ETDEWEB)

    Graves, V.B.; Kelley, G.; Welch, J.C.

    1999-12-10

    A project was initiated at the Oak Ridge Y-12 Plant to implement software geometric error compensation within a PC-based machine tool controller from Manufacturing Data Systems, Inc. This project may be the first in which this type of compensation system was implemented in a commercially available machine tool controller totally in software. Previous implementations typically required using an external computer and hardware to interface through the position feedback loop of the controller because direct access to the controller software was not available. The test-bed machine for this project was a 2-axis Excello 921 T-base lathe. A mathematical error model of the lathe was created using homogeneous transformation matrices to relate the positions of the machine's slides to each other and to a world reference system. Equations describing the effects of the geometric errors were derived from the model. A software architecture was developed to support geometric error compensation for machine tools with up to 3 linear axes. Rotary axes were not supported in this implementation, but the developed architecture would not preclude their support in the future. Specific implementations will be dependent upon the configuration of the machine tool. A laser measuring system from Automated Precision, Inc. was used to characterize the lathe's geometric errors as functions of axis position and direction of motion. Multiple data files generated by the laser system were combined into a single Error File that was read at system startup and used by the compensation system to provide real-time position adjustments to the axis servos. A Renishaw Ballbar was used to evaluate the compensation system. Static positioning tests were conducted in an attempt to observe improved positioning accuracy with the compensation system enabled. These tests gave inconsistent results due to the lathe's inability to position the tool repeatably. The development of the architecture and
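
    The error model described above can be sketched with small-angle homogeneous transformation matrices (NumPy; all commanded moves and error values below are hypothetical - in the real system the error terms came from the laser-measured Error File as functions of axis position and direction of motion):

    ```python
    import numpy as np

    def htm(tx=0, ty=0, tz=0, rx=0, ry=0, rz=0):
        """Homogeneous transform: translation plus linearized (small-angle) rotation."""
        T = np.eye(4)
        T[:3, 3] = [tx, ty, tz]
        T[:3, :3] += np.array([[0, -rz, ry],
                               [rz, 0, -rx],
                               [-ry, rx, 0]])
        return T

    tool = np.array([0.0, 0.0, 50.0, 1.0])   # tool point in the cross-slide frame

    # commanded slide moves, each followed by its (hypothetical) error motions
    T_z = htm(tz=100.0) @ htm(tx=0.004, rz=2e-5)
    T_x = htm(tx=40.0) @ htm(tz=-0.002, ry=-1e-5)

    actual = (T_z @ T_x @ tool)[:3]
    nominal = (htm(tz=100.0) @ htm(tx=40.0) @ tool)[:3]
    correction = nominal - actual            # adjustment fed to the axis servos
    print(correction)
    ```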

  7. Writing software for the clinic.

    Science.gov (United States)

    Rosen, I I

    1998-03-01

    Medical physicists often write computer programs to support scientific, educational, and clinical endeavors. Errors in scientific and educational software can waste time and effort by producing meaningless results, but errors in clinical software can contribute to patient injuries. Although the ultimate goal of error-free software is impossible to achieve except in very small programs, there are many good design, implementation, and testing practices that can be used by small development groups to significantly reduce errors, improve quality, and reduce maintenance. The software development process should include four basic steps: specifications, design, implementation, and testing. A specifications document defining what the software is intended to do is valuable for clearly delimiting the scope of the project and providing a benchmark for evaluating the final product. Keep the software design simple and straightforward. Document assumptions, and check them. Emphasize maintainability, portability, and reliability rather than speed. Use layers to isolate the application from hardware and the operating system. Plan for upgrades. Expect the software to be used in unplanned ways. Whenever possible, be generous with RAM and disk storage; hardware is cheaper than development and maintenance. During implementation, use well-known algorithms whenever possible. Use prototypes to try out ideas. Use generic modules, version numbering, unique file names, defensive programming, and operating system and language/compiler defaults. Avoid binary data files and clever tricks. Remember that real numbers are not exact in a computer. Get it right before making it faster. Document the software extensively. Test continuously during development; the later a problem is found, the more it costs to fix. Use a written procedure to test the final product exactly as a typical user would run it. Allow no changes after clinical release. Expect to spend at least an additional 50% of the initial
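
    One of the points above, that real numbers are not exact in a computer, is worth a two-line demonstration (Python; the same caution applies in any language used for clinical code):

    ```python
    import math

    # never compare floating-point values with ==
    print(0.1 + 0.2 == 0.3)              # False: binary floats cannot represent 0.1 exactly
    print(math.isclose(0.1 + 0.2, 0.3))  # True: compare with a tolerance instead
    ```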

  8. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  9. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology-Lausanne (EPFL), Solar Energy and Building Physics Laboratory (LESO-PB), Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Institute of Meteorology and Physics of Atmospheric Environment, Group Energy Conservation, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Division of Energy and Indoor Environment, Hoersholm, (Denmark)

    2000-07-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (author)

  10. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - the dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, in which all relevant aspects are covered in proper balance. The following questions are asked: why preserve computer programs permanently at all; who should do this and for whom; when should we think about permanent program preservation; what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...); and where and how? To illustrate the theoretical concepts presented, the idea of a virtual national museum of electronic banking is also introduced.

  11. Software Engineering to Professionalize Software Development

    Directory of Open Access Journals (Sweden)

    Juan Miguel Alonso

    2011-12-01

    The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of software engineers. Not only is society's dependence on software increasing, but the character of software development is also changing, and with it the demand for certified software developers. In this paper some challenges and aspirations are proposed that guide the learning processes of Software Engineering and help to identify the need to train professionals in software development.

  12. Electrical machines mathematical fundamentals of machine topologies

    CERN Document Server

    Gerling, Dieter

    2015-01-01

    Electrical Machines and Drives play a powerful role in industry, with ever-increasing importance. This fact requires the understanding of machine and drive principles by engineers of many different disciplines. Therefore, this book is intended to give a comprehensive deduction of these principles. Special attention is given to the precise mathematical derivation of the necessary formulae to calculate machines and drives, and to the discussion of simplifications (if applied) with the associated limits. The book shows how the different machine topologies can be deduced from general fundamentals, and how they are linked together. This book addresses graduate students, researchers, and developers of Electrical Machines and Drives who are interested in understanding the principles of machine and drive operation and the mathematical and engineering specialties of the different machine and drive topologies, together with their mutual links. The detailed - but nevertheless compact - mat ...

  13. Prediction in Marketing Using the Support Vector Machine

    OpenAIRE

    Dapeng Cui; David Curry

    2005-01-01

    Many marketing problems require accurately predicting the outcome of a process or the future state of a system. In this paper, we investigate the ability of the support vector machine to predict outcomes in emerging environments in marketing, such as automated modeling, mass-produced models, intelligent software agents, and data mining. The support vector machine (SVM) is a semiparametric technique with origins in the machine-learning literature of computer science. Its approach to prediction...
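
    A minimal stand-in for such an SVM prediction task (scikit-learn, with synthetic features replacing real customer data and an RBF-kernel classifier as one representative configuration) is:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # toy stand-in for a marketing outcome (e.g., respond / not respond)
    X, y = make_classification(n_samples=1000, n_features=12, random_state=0)

    # scaling matters for SVMs; the pipeline applies it inside each CV fold
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    print(cross_val_score(model, X, y, cv=5).mean())
    ```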

  14. Writing references and using citation management software.

    Science.gov (United States)

    Sungur, Mukadder Orhan; Seyhan, Tülay Özkan

    2013-09-01

    The correct citation of references is obligatory to gain scientific credibility, to honor the original ideas of previous authors and to avoid plagiarism. Currently, researchers can easily find, cite and store references using citation management software. In this review, two popular citation management software programs (EndNote and Mendeley) are summarized.

  15. Laser machining of advanced materials

    CERN Document Server

    Dahotre, Narendra B

    2011-01-01

    Contents: Advanced materials: introduction; applications; structural ceramics; biomaterials; composites; intermetallics. Machining of advanced materials: introduction; fabrication techniques; mechanical machining; chemical machining (CM); electrical machining; radiation machining; hybrid machining. Laser machining: introduction; absorption of laser energy and multiple reflections; thermal effects. Laser machining of structural ceramics: introdu ...

  16. Scientific Crossbreeding

    DEFF Research Database (Denmark)

    Hvidtfeldt, Rolf

    This thesis presents an alternative approach to the analysis of interdisciplinarity. One of the basic reasons for developing an alternative method for evaluation of interdisciplinary activities is that epistemic issues are insufficiently dealt with in the existing literature on the topic. To develop a more adequate way of capturing what is at stake in interdisciplinarity, I suggest drawing inspiration from the contemporary philosophical literature on scientific representation. The development of a representation-based approach to the analysis of interdisciplinarity, and the discussion of the concept of "scientific discipline" and disciplinary difference. This chapter provides reasons to assume that conventional scientific taxonomies do not provide a good basis for analysing epistemic aspects of interdisciplinary science. On this background it is argued that the concept of "approaches ...

  17. Tensor Network Quantum Virtual Machine (TNQVM)

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    There is a lack of state-of-the-art quantum computing simulation software that scales on heterogeneous systems like Titan. Tensor Network Quantum Virtual Machine (TNQVM) provides a quantum simulator that leverages a distributed network of GPUs to simulate quantum circuits in a manner that leverages recent results from tensor network theory.
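
    The core operation such a simulator performs is contracting gate tensors into a state tensor. A toy NumPy version (a full state vector rather than a distributed tensor network, so it only illustrates the contraction pattern, not TNQVM's architecture or API) is:

    ```python
    import numpy as np

    def apply_gate(state, gate, targets, n):
        """Contract a 2^k x 2^k gate into an n-qubit state vector by treating the
        state as a rank-n tensor (targets must be in ascending order)."""
        psi = state.reshape([2] * n)
        k = len(targets)
        g = gate.reshape([2] * (2 * k))        # first k legs: output, last k: input
        psi = np.tensordot(g, psi, axes=(list(range(k, 2 * k)), targets))
        psi = np.moveaxis(psi, list(range(k)), targets)  # restore qubit ordering
        return psi.reshape(-1)

    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                     [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)

    state = np.zeros(4, dtype=complex)
    state[0] = 1.0                             # |00>
    state = apply_gate(state, H, [0], 2)
    state = apply_gate(state, CNOT, [0, 1], 2) # Bell state (|00> + |11>)/sqrt(2)
    print(state)
    ```

    Tensor-network methods go further by keeping the state factorized into small tensors so that circuits too large for an explicit state vector can still be contracted across many GPUs.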

  18. Improving Energy Efficiency in CNC Machining

    Science.gov (United States)

    Pavanaskar, Sushrut S.

    We present our work on analyzing and improving the energy efficiency of multi-axis CNC milling process. Due to the differences in energy consumption behavior, we treat 3- and 5-axis CNC machines separately in our work. For 3-axis CNC machines, we first propose an energy model that estimates the energy requirement for machining a component on a specified 3-axis CNC milling machine. Our model makes machine-specific predictions of energy requirements while also considering the geometric aspects of the machining toolpath. Our model - and the associated software tool - facilitate direct comparison of various alternative toolpath strategies based on their energy-consumption performance. Further, we identify key factors in toolpath planning that affect energy consumption in CNC machining. We then use this knowledge to propose and demonstrate a novel toolpath planning strategy that may be used to generate new toolpaths that are inherently energy-efficient, inspired by research on digital micrography -- a form of computational art. For 5-axis CNC machines, the process planning problem consists of several sub-problems that researchers have traditionally solved separately to obtain an approximate solution. After illustrating the need to solve all sub-problems simultaneously for a truly optimal solution, we propose a unified formulation based on configuration space theory. We apply our formulation to solve a problem variant that retains key characteristics of the full problem but has lower dimensionality, allowing visualization in 2D. Given the complexity of the full 5-axis toolpath planning problem, our unified formulation represents an important step towards obtaining a truly optimal solution. With this work on the two types of CNC machines, we demonstrate that without changing the current infrastructure or business practices, machine-specific, geometry-based, customized toolpath planning can save energy in CNC machining.
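
    A deliberately crude version of such a machine-specific energy model (Python; the base power and specific cutting energy constants are hypothetical and would be calibrated per machine) makes the toolpath comparison concrete:

    ```python
    def toolpath_energy(length_mm, feed_mm_s, removed_mm3,
                        p_base_w=1500.0, sec_j_mm3=2.5):
        """Toy energy estimate: constant machine base power times cutting time
        plus specific cutting energy times removed volume. Both constants are
        hypothetical and machine-specific."""
        return p_base_w * (length_mm / feed_mm_s) + sec_j_mm3 * removed_mm3

    # same part, two candidate toolpaths removing the same volume:
    # the shorter path spends less time under base power and wins on energy
    print(toolpath_energy(12000.0, 40.0, 9000.0))
    print(toolpath_energy(9500.0, 40.0, 9000.0))
    ```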

  19. Man-machine interactions 3

    CERN Document Server

    Czachórski, Tadeusz; Kozielski, Stanisław

    2014-01-01

    Man-Machine Interaction is an interdisciplinary field of research that covers many aspects of science focused on humans and machines in conjunction. The basic goal of the study is to improve and invent new ways of communication between users and computers, and many different subjects are involved in reaching the long-term research objective of an intuitive, natural and multimodal way of interacting with machines. A rapid evolution of the methods by which humans interact with computers is observed nowadays, and new approaches allow computing technologies to support people on a daily basis, making computers more usable and receptive to users' needs. This monograph is the third edition in the series and presents important ideas, current trends and innovations in the man-machine interaction area. The aim of this book is to introduce not only hardware and software interfacing concepts, but also to give insights into the related theoretical background. The reader is provided with a compilation of high ...

  20. Design of Waste Shredder Machine

    Directory of Open Access Journals (Sweden)

    Asst. Prof. S.Nithyananth

    2014-03-01

    Conventional agro-waste disposal is the oldest and most traditional method of waste disposal, in which agricultural wastes are simply dumped in a particular place to decompose. As the wastes are dumped as they are, they take more time to degrade and cause environmental pollution. The waste shredder machine aims to reduce the agro-waste and convert it into useful, nourishing fertilizer. It decreases manual work, keeping the farm neat and clean, reduces the large amount of polluting, disease-causing agro-waste, and produces a better fertilizer with vermicompost. The waste shredder machine is an attachment, much like a ploughing attachment. Input power and rigid support are provided by a KAMCO Tera-trac 4W tractor by means of the PTO (power take-off) shaft and a three-point linkage: the PTO shaft of the tractor acts as the basic power input, and the three-point linkage provides rigid support to the machine. Various kinds of blades are used for chipping and powdering operations, such as sawing blades, rotary blades, and triangular blades, mounted on the shaft. Power is transmitted to another shaft by means of a pulley and belt. The waste shredder machine was designed using Creo Parametric 1.0 software.

  1. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    ... production. In Kafka: Toward a Minor Literature, Deleuze and Guattari gave the most comprehensive explanation of the abstract machine in the work of art. Like the war-machines of Virilio, the Kafka-machine operates in three gears or speeds. Furthermore, the machine is connected to spatial diagrams ...

  2. Promoting Science Software Best Practices: A Scientist's Perspective (Invited)

    Science.gov (United States)

    Blanton, B. O.

    2013-12-01

    Software is at the core of most modern scientific activities, and as societal awareness of, and impacts from, extreme weather, disasters, and climate and global change continue to increase, the roles that scientific software play in analyses and decision-making are brought more to the forefront. Reproducibility of research results (particularly those that enter into the decision-making arena) and open access to the software is essential for scientific and scientists' credibility. This has been highlighted in a recent article by Joppa et al (Troubling Trends in Scientific Software Use, Science Magazine, May 2013) that describes reasons for particular software being chosen by scientists, including that the "developer is well-respected" and on "recommendation from a close colleague". This reliance on recommendation, Joppa et al conclude, is fraught with risks to both sciences and scientists. Scientists must frequently take software for granted, assuming that it performs as expected and advertised and that the software itself has been validated and results verified. This is largely due to the manner in which much software is written and developed; in an ad hoc manner, with an inconsistent funding stream, and with little application of core software engineering best practices. Insufficient documentation, limited test cases, and code unavailability are significant barriers to informed and intelligent science software usage. This situation is exacerbated when the scientist becomes the software developer out of necessity due to resource constraints. Adoption of, and adherence to, best practices in scientific software development will substantially increase intelligent software usage and promote a sustainable evolution of the science as encoded in the software. We describe a typical scientist's perspective on using and developing scientific software in the context of storm surge research and forecasting applications that have real-time objectives and regulatory constraints

  3. Key characteristics for software for open architecture controllers

    Science.gov (United States)

    Pfeffer, Lawrence E.; Tran, Hy D.

    1997-01-01

    Software development time, cost, and ease of (re)use are now among the major issues in the development of advanced machines, whether for machine tools, automation systems, or process systems. Two keys to reducing development time are powerful, user-friendly development tools and software architectures that provide clean, well-documented interfaces to the various real-time functions that such machines require. Examples of essential functions are signal conditioning, servo control, trajectory generation, calibration/registration, coordination of asynchronous events, task sequencing, communication with external systems, and user interfaces. There are a number of existing standards that can help with software development, such as the IEEE POSIX standards for operating systems and real-time services; software tools to complement these standards are beginning to see use. This paper will detail some of the existing standards, some new tools, and development activities relevant to advanced, 'smart' machines.

  4. Application of Design Patterns in the Human-Machine Interface Design of Software Systems

    Institute of Scientific and Technical Information of China (English)

    黄文军; 王以群

    2009-01-01

    This paper introduces the basic concepts and classification of object-oriented software design patterns and, taking the Composite and Command patterns as examples, describes the implementation structure and operational mechanism of applying design patterns in human-machine interface design, after a functional analysis of the software system's interface has been carried out.
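
    As a generic illustration of one of the two patterns named (the textbook Command pattern, sketched in Python rather than the paper's implementation), a UI element can trigger an undoable action while knowing only the abstract command interface:

    ```python
    from dataclasses import dataclass

    class Command:
        """Abstract command: UI widgets depend only on this interface."""
        def execute(self): ...
        def undo(self): ...

    @dataclass
    class PasteCommand(Command):
        document: list
        text: str
        def execute(self):
            self.document.append(self.text)
        def undo(self):
            self.document.remove(self.text)

    class MenuItem:
        """Interface element: holds a Command, knows nothing about editing."""
        def __init__(self, command: Command):
            self.command = command
        def click(self):
            self.command.execute()

    doc: list = []
    item = MenuItem(PasteCommand(doc, "hello"))
    item.click()      # the widget is fully decoupled from the edit operation
    print(doc)        # ['hello']
    ```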

  5. CernVM WebAPI - Controlling Virtual Machines from the Web

    Science.gov (United States)

    Charalampidis, I.; Berzano, D.; Blomer, J.; Buncic, P.; Ganis, G.; Meusel, R.; Segal, B.

    2015-12-01

    Lately, there is a trend in scientific projects to look for computing resources in the volunteering community. In addition, to reduce the development effort required to port the scientific software stack to all the known platforms, the use of Virtual Machines (VMs) is becoming increasingly popular. Unfortunately, their use further complicates software installation and operation, restricting the volunteer audience to sufficiently expert people. CernVM WebAPI is a software solution addressing this specific case in a way that opens up wide new application opportunities. It offers a very simple API for setting up, controlling, and interfacing with a VM instance in the user's computer, while at the same time relieving the user of the burden of downloading, installing, and configuring the hypervisor. WebAPI comes with a lightweight JavaScript library that guides the user through the application installation process. Malicious usage is prevented by a per-domain PKI validation mechanism. In this contribution we overview this new technology, discuss its security features, and examine some test cases where it is already in use.

  6. Space Software

    Science.gov (United States)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  7. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologies.

  8. Analysis and Research of Scientific and Technological Innovation on the Green Technology and Its Application Technology of the Multi-Layer Co-Extrusion Blow Plastic Molding Machine%多层共挤中空塑料成型机绿色技术的科技创新及其应用技术的分析研究

    Institute of Scientific and Technical Information of China (English)

    张友根; 冯刚; 张朝阁; 齐继宝

    2013-01-01

    This paper sets out the scientific and technological system and the green design principles of green technology for the multi-layer co-extrusion blow plastic molding machine. It studies the scientific and technological innovations, and the key points for their development, in the green technology of the machine's co-extrusion die head, plasticizing extrusion, clamping mechanism, power drive, heating, blowing, and control components and systems, covering improved energy efficiency, reduced resource consumption, energy-saving processing, cleaner production, and the recovery and reuse of energy and material. It analyzes the key research and development points for achieving the scientific development of green technology through intelligent multi-layer co-extrusion blow molding machines, and briefly reviews progress in green molding technology that enables extrusion blow molded containers to meet both "actual demand" and "potential demand" in terms of resource saving, improved barrier properties, and environmental performance. Finally, it examines how scientific and technological innovation in the green technology of multi-layer co-extrusion blow molding machines promotes the greening of hollow container processing technology.

  9. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  10. Vane Pump Casing Machining of Dumpling Machine Based on CAD/CAM

    Science.gov (United States)

    Huang, Yusen; Li, Shilong; Li, Chengcheng; Yang, Zhen

    Automatic dumpling forming machine is also called dumpling machine, which makes dumplings through mechanical motions. This paper adopts a stuffing delivery mechanism featuring an improved, specially designed vane pump casing, which contributes to the formation of dumplings. Its 3D modeling in Pro/E software, machining process planning, milling path optimization, simulation based on UG, and compilation of the post-processing program are introduced and verified. The results indicated that adoption of CAD/CAM offers firms the potential to pursue new innovative strategies.

  11. Experimental Investigation of process parameters influence on machining Inconel 800 in the Electrical Spark Eroding Machine

    Science.gov (United States)

    Karunakaran, K.; Chandrasekaran, M.

    2016-11-01

    Electrical spark eroding machining is an established, sophisticated machining process for producing complex geometry with close tolerances in hard materials, such as superalloys, that are extremely difficult to machine by conventional processes. It is sometimes offered as a better alternative, or the only alternative, for generating accurate 3D complex shapes with macro-, micro- and nano-features in such difficult-to-machine materials among other advanced machining processes. The accomplishment of such a challenging task by electrical spark eroding machining, or electrical discharge machining (EDM), depends upon the selection of apt process parameters. This paper analyzes the influence of process parameters in electrical eroding machining of Inconel 800 with electrolytic copper as the tool. The experimental runs were performed under various input conditions to process the Inconel 800 nickel-based superalloy, analyzing the responses of material removal rate, surface roughness, and tool wear rate. These are the measures of performance for individual experimental values of parameters such as pulse-on time, pulse-off time, and peak current. A Taguchi full factorial design using Minitab release 14 software was employed to meet the manufacturing requirement of preparing a process parameter selection card for Inconel 800 jobs. The individual parameters' contributions towards surface roughness were observed to range from 13.68% to 64.66%.
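
    The percent-contribution figures quoted above follow from a standard main-effect sum-of-squares analysis of a balanced factorial design. A minimal sketch on made-up data (the coded design, factor names, and responses are illustrative, not the paper's):

        import itertools
        import numpy as np

        # Balanced 3-level full factorial over three coded factors (27 runs).
        design = np.array(list(itertools.product(range(3), repeat=3)))
        rng = np.random.default_rng(3)
        response = (1.5 * design[:, 0] + 0.4 * design[:, 1]
                    + 0.9 * design[:, 2] + rng.normal(0, 0.2, len(design)))

        grand = response.mean()
        ss_total = ((response - grand) ** 2).sum()
        for i, name in enumerate(["pulse-on time", "pulse-off time", "peak current"]):
            # Main-effect sum of squares: runs per level times the squared
            # deviation of each level mean from the grand mean.
            ss = sum((design[:, i] == lv).sum()
                     * (response[design[:, i] == lv].mean() - grand) ** 2
                     for lv in range(3))
            print(f"{name}: {100 * ss / ss_total:.1f}% contribution")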

  12. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    Science.gov (United States)

    1991-04-01

    Industrial Reliability Program. ABSTRACT: Program is utilized for analysis of industrial equipment for which military requirements are not applicable [...] Can communicate with other TECNASA software and has British or Portuguese menus. MACHINES: IBM PC. POC: TECNASA, Attn: Jose L. Barletta, Electronica [...] termed "performability". Models both repairable and nonrepairable systems. MACHINES: No data. POC: Industrial Technology Institute. NAME: METFAC

  13. Predicting Software Suitability Using a Bayesian Belief Network

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
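
    A minimal sketch of the idea (the network structure and all probabilities below are hypothetical stand-ins, not the paper's model): two driving factors influence suitability, the marginal is obtained by summing over the parents, and Bayes' rule inverts the network once an outcome is observed.

        # Hypothetical two-parent Bayesian network for product suitability.
        P_skill = {"high": 0.6, "low": 0.4}          # development team skill
        P_maturity = {"high": 0.5, "low": 0.5}       # software process maturity
        P_suitable = {                               # P(suitable | skill, maturity)
            ("high", "high"): 0.9, ("high", "low"): 0.7,
            ("low", "high"): 0.6,  ("low", "low"): 0.3,
        }

        # Marginal probability of a suitable product: sum over the parents.
        p_yes = sum(P_skill[s] * P_maturity[m] * P_suitable[(s, m)]
                    for s in P_skill for m in P_maturity)
        print(f"P(suitable) = {p_yes:.3f}")                    # 0.660

        # Posterior of team skill given an observed suitable product.
        p_yes_given_high = sum(P_maturity[m] * P_suitable[("high", m)]
                               for m in P_maturity)
        post = p_yes_given_high * P_skill["high"] / p_yes      # Bayes' rule
        print(f"P(skill=high | suitable) = {post:.3f}")        # 0.727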

  14. Decomposition of forging die for high speed machining

    CERN Document Server

    Tapie, Laurent

    2009-01-01

    Today's forging die manufacturing process must be adapted to several evolutions in machining process generation: CAD/CAM models, CAM software solutions and High Speed Machining (HSM). In this context, the adequacy between die shape and HSM process is at the core of machining preparation and process planning approaches. This paper deals with an original approach to machining preparation that integrates this adequacy into the main tasks carried out. In this approach, the design of the machining process is based on two levels of decomposition of the geometrical model of a given die with respect to HSM cutting conditions (cutting speed and feed rate) and technological constraints (tool selection, feature accessibility). This decomposition assists the machining planner in generating an HSM process. The result of this decomposition is the identification of machining features.

  15. Vending machine assessment methodology. A systematic review.

    Science.gov (United States)

    Matthews, Melissa A; Horacek, Tanya M

    2015-07-01

    The nutritional quality of food and beverage products sold in vending machines has been implicated as a contributing factor to the development of an obesogenic food environment. How comprehensive, reliable, and valid are the current assessment tools for vending machines to support or refute these claims? A systematic review was conducted to summarize, compare, and evaluate the current methodologies and available tools for vending machine assessment. A total of 24 relevant research studies published between 1981 and 2013 met inclusion criteria for this review. The methodological variables reviewed in this study include assessment tool type, study location, machine accessibility, product availability, healthfulness criteria, portion size, price, product promotion, and quality of scientific practice. There were wide variations in the depth of the assessment methodologies and product healthfulness criteria utilized among the reviewed studies. Of the reviewed studies, 39% evaluated machine accessibility, 91% evaluated product availability, 96% established healthfulness criteria, 70% evaluated portion size, 48% evaluated price, 52% evaluated product promotion, and 22% evaluated the quality of scientific practice. Of all reviewed articles, 87% reached conclusions that provided insight into the healthfulness of vended products and/or vending environment. Product healthfulness criteria and complexity for snack and beverage products was also found to be variable between the reviewed studies. These findings make it difficult to compare results between studies. A universal, valid, and reliable vending machine assessment tool that is comprehensive yet user-friendly is recommended.

  16. Accelerator Operators and Software Development

    Energy Technology Data Exchange (ETDEWEB)

    April Miller; Michele Joyce

    2001-11-01

    At Thomas Jefferson National Accelerator Facility, accelerator operators perform tasks in their areas of specialization in addition to their machine operations duties. One crucial area in which operators contribute is software development. Operators with programming skills are uniquely qualified to develop certain controls applications because of their expertise in the day-to-day operation of the accelerator. Jefferson Lab is one of the few laboratories that utilize the skills and knowledge of operators to create software that enhances machine operations. Through the programs written by operators, Jefferson Lab has improved machine efficiency and beam availability. Because many of these applications involve automation of procedures and need graphical user interfaces, the scripting language Tcl and the Tk toolkit have been adopted. In addition to automation, some operator-developed applications are used for information distribution. For this purpose, several standard web development tools such as perl, VBScript, and ASP are used. Examples of applications written by operators include injector steering, spin angle changes, system status reports, magnet cycling routines, and quantum efficiency measurements. This paper summarizes how the unique knowledge of accelerator operators has contributed to the success of the Jefferson Lab control system. *This work was supported by the U.S. DOE contract No. DE-AC05-84-ER40150.

  17. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for the measurement and assessment of software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the characteristics of the behaviours of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of the complexity of software trustworthiness, and proposes to study software trustworthiness measurement in terms of that complexity. Using dynamical statistical analysis methods, the paper advances an invariant-measure-based assessment of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method for software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  18. Use of machine learning techniques for modeling of snow depth

    Directory of Open Access Journals (Sweden)

    G. V. Ayzel

    2017-01-01

    Snow exerts a significant regulating effect on the land hydrological cycle, since it controls the intensity of heat and water exchange between the soil-vegetation cover and the atmosphere. Estimating spring flood runoff or rain floods on mountainous rivers requires understanding of the snow cover dynamics on a watershed. In our work, the problem of modeling snow cover depth is solved using both available databases of hydro-meteorological observations and easily accessible scientific software, which allows complete reproduction of the results and further development of this theme by the scientific community. In this research we used daily observational data on snow cover and surface meteorological parameters obtained at three stations situated in different geographical regions: Col de Porte (France), Sodankyla (Finland), and Snoqualmie Pass (USA). Statistical modeling of snow cover depth is based on a set of freely distributed, present-day machine learning models: decision trees, adaptive boosting, and gradient boosting. It is demonstrated that combining modern machine learning methods with available meteorological data provides good accuracy in snow cover modeling. For every investigated site, the best results were obtained by the ensemble method of gradient boosting over decision trees; this model reproduces well both the periods of snow cover accumulation and of its melting. The directed character of the learning process for gradient-boosting models, their ensemble character, and the use of held-out test samples in the learning procedure make this type of model a good and sustainable research tool. The results obtained can be used for estimating snow cover characteristics for river basins where hydro-meteorological information is absent or insufficient.
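
    A minimal sketch of the modeling setup described above, using scikit-learn's gradient boosting over decision trees; the file name and predictor columns are hypothetical placeholders for a station's daily series:

        import pandas as pd
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.metrics import mean_squared_error
        from sklearn.model_selection import train_test_split

        df = pd.read_csv("station_daily.csv")             # hypothetical data file
        X = df[["air_temp", "precip", "rel_humidity"]]    # illustrative predictors
        y = df["snow_depth"]

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                                  random_state=0)
        model = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05,
                                          max_depth=3)
        model.fit(X_tr, y_tr)
        rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
        print(f"test RMSE: {rmse:.2f}")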

  19. BADMINTON TRAINING MACHINE WITH IMPACT MECHANISM

    Directory of Open Access Journals (Sweden)

    B. F. YOUSIF

    2011-02-01

    In the current work, a new machine was designed and fabricated for badminton training purposes. In the design process, CATIA software was used to design and simulate the machine components. The design was based on a direct impact method to launch the shuttle, using a spring as the source of the impact. Hooke's law was used theoretically to determine the initial and maximum lengths of the springs. The main feature of the machine is that it can move in two axes (up and down, left and right). For the control system, an infra-red sensor and a touch switch were adapted to a microcontroller. The final product was locally fabricated, and it was proved that the machine can operate properly.
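
    One plausible way to apply Hooke's law in the spring-sizing step is to equate the stored spring energy to the shuttle's launch kinetic energy, (1/2)kx^2 = (1/2)mv^2; all numbers below are assumed for illustration, not taken from the paper:

        import math

        k = 800.0   # spring constant, N/m (assumed)
        m = 0.005   # shuttlecock mass, kg (typical ~5 g)
        v = 20.0    # desired launch speed, m/s (assumed)

        x = v * math.sqrt(m / k)          # required compression, energy balance
        print(f"required compression: {100 * x:.1f} cm")   # 5.0 cm
        print(f"release force F = kx: {k * x:.0f} N")      # Hooke's law, 40 N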

  20. Scientific Growth

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    As one of the world's largest grain consumers, food security has always been a major concern for the Chinese nation. China must confront the challenge of feeding a fifth of the world's population with less than 9 percent of the planet's arable land. In 2011, China's grain output recorded growth for the eighth successive year, and total production reached an all-time high of 571 million tons. In terms of food security, China's goal is to maintain a self-sufficiency rate of above 95 percent. However, an annual net population growth of 7.39 million and the effective decline of the area of farmland in the country, as a result of urbanization, make achieving such self-sufficiency a serious challenge. Given the heavy burden placed on Chinese agriculture, constantly raising productivity by relying on scientific and technological progress has become a priority for China's agricultural sector. The Ministry of Agriculture, for example, has worked to raise China's annual grain yield per unit area by 1 percent, on average, over the past decade. Last year, the contributory rate of scientific and technological development to China's agriculture reached 52 percent, surpassing the contribution made by land, labor and other production factors for the first time in history.

  1. Redefining Earthquakes and the Earthquake Machine

    Science.gov (United States)

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  3. 医用统计学软件PPMS1.5在医学科学研究中的应用价值%APPLICATION OF MEDICAL STATISTICS SOFTWARE PPMS 1.5 IN MEDICAL SCIENTIFIC RESEARCH

    Institute of Scientific and Technical Information of China (English)

    周晓彬

    2011-01-01

    Objective: To explain how to use the Chinese-language medical statistical software PPMS 1.5 in medical research. Methods: The functions and characteristics of the Delphi-based PPMS 1.5 were analyzed. Results: PPMS 1.5, with its fully Chinese, user-friendly interface, is operated entirely through menus and dialog boxes. It provides file management and text/number editing functions consistent with Office software, complete data management functions, and powerful statistical analysis functions. It can process not only raw data but also, where the raw data have been lost, perform t-tests and one-way ANOVA on data available only as "mean ± standard deviation"; it is also a powerful tool for statistical design. Conclusion: PPMS 1.5 is a practical, convenient, and easy-to-learn medical statistical software package with high application value in medical research.

  4. Compendium of Scientific Linacs

    Energy Technology Data Exchange (ETDEWEB)

    Clendenin, James E

    2003-05-16

    The International Committee supported the proposal of the Chairman of the XVIII International Linac Conference to issue a new compendium of linear accelerators; the last one was published in 1976. The Local Organizing Committee of Linac96 decided to set up a sub-committee for this purpose. Contrary to the catalogues of high-energy accelerators, which compile accelerators with energies above 1 GeV, we have not defined a specific limit in energy. Microtrons and cyclotrons are not in this compendium, and data from the thousands of medical and industrial linacs have not been collected; only scientific linacs are listed in the present compendium. Every linac found in this research and involved in a physics context was considered. It could be used, for example, as an injector for high-energy accelerators, or in nuclear physics, materials physics, free-electron lasers or synchrotron light machines. Linear accelerators are developed on three continents only: America, Asia, and Europe. This geographical distribution is kept as the basis of organization. The compendium contains the parameters and status of scientific linacs. Most of these linacs are operational; however, many facilities under construction or in design study are also included. A special mention is made at the end of studies for future linear colliders.

  5. Memoised Garbage Collection for Software Model Checking

    NARCIS (Netherlands)

    Nguyen, V.Y.; Ruys, T.C.; Kowalewski, S.; Philippou, A.

    Virtual machine based software model checkers like JPF and MoonWalker spend up to half of their verification time on garbage collection. This is no surprise, as after nearly each transition the heap has to be cleaned of garbage. To improve this, this paper presents memoised garbage collection.

  6. Software Engineering for Multi-core Platforms

    NARCIS (Netherlands)

    Arbab, F.; Jongmans, S.-S.T.Q.

    2012-01-01

    Decades after Turing proposed his model of computation, we still lack suitable means to tackle the complexity of getting more than a few Turing Machines to interact with one another in a verifiably coherent manner. This dearth currently hampers software engineering in unleashing the full potential of multi-core platforms.

  7. Software-defined anything challenges status quo

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Wayne; Borders, Tammie

    2015-01-01

    INL successfully developed a proof of concept for "Software Defined Anything" by emulating the laboratory's business applications that run on virtual machines. The work INL conducted demonstrates to industry how this methodology can be used to improve security, automate and repeat processes, and improve consistency.

  8. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  9. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    In software measurement, validating software metrics is a very difficult task due to the lack of both theoretical and empirical methodologies [41, 44, 45]. During recent years, a number of researchers have addressed the issue of validating software metrics; at present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must accurately represent those attributes they purport to quantify, so validation is critical to the success of software measurement. Validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions; its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In the case of software engineering, validation is one of the disciplines that help build quality into software: the major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodologies, techniques, and the different properties of measures that are used for software metrics validation in software engineering [1-50].

  10. Chemical datuments as scientific enablers

    Directory of Open Access Journals (Sweden)

    Rzepa Henry S

    2013-01-01

    This article is an attempt to construct a chemical datument as a means of presenting insights into chemical phenomena in a scientific journal. An exploration of the interactions present in a small fragment of duplex Z-DNA and the nature of the catalytic centre of a carbon-dioxide/alkene epoxide alternating co-polymerisation is presented in this datument, with examples of the use of three software tools, one based on Java, the other two using Javascript and HTML5 technologies. The implications for the evolution of scientific journals are discussed.

  11. Chemical datuments as scientific enablers.

    Science.gov (United States)

    Rzepa, Henry S

    2013-01-23

    This article is an attempt to construct a chemical datument as a means of presenting insights into chemical phenomena in a scientific journal. An exploration of the interactions present in a small fragment of duplex Z-DNA and the nature of the catalytic centre of a carbon-dioxide/alkene epoxide alternating co-polymerisation is presented in this datument, with examples of the use of three software tools, one based on Java, the other two using Javascript and HTML5 technologies. The implications for the evolution of scientific journals are discussed.

  12. Software Complexity Threatens Performance Portability

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2015-09-11

    Modern HPC software packages are rarely self-contained. They depend on a large number of external libraries, and many spend large fractions of their runtime in external subroutines. Performance portability depends not only on the effort of application teams, but also on the availability of well-tuned libraries. At most sites, the burden of maintaining libraries is shared by code teams and facilities. Facilities typically provide well-tuned default versions, but code teams frequently build with bleeding-edge compilers to achieve high performance. For this reason, HPC has no “standard” software stack, unlike other domains where performance is not critical. Incompatibilities among compilers and software versions force application teams and facility staff to re-build custom versions of libraries for each new toolchain. Because the number of potential configurations is combinatorial, and because HPC software is notoriously difficult to port to new machines [3, 7, 8], the tuning effort required to support and maintain performance-portable libraries outstrips the available manpower at most sites. Software complexity is a growing obstacle to performance portability for HPC.

  13. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration conditions.

  14. A co-design approach for embedded control software of cyber-physical systems

    NARCIS (Netherlands)

    Broenink, Jan F.; Vos, Peter-Jan D.; Lu, Zhou; Bezemer, Maarten M.

    2016-01-01

    This work is about an approach for designing control software for mechatronic and robotic machines. As all system parts (control algorithms, software infrastructure, I/O, and machine) influence each other, the total behaviour of the system needs to be taken into account. Therefore, we use appropriate modelling formalisms.

  15. Selecting and effectively using a computer aided software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, D.L.

    1989-01-01

    Software engineering is a science by which user requirements are translated into a quality software product. Computer Aided Software Engineering (CASE) is the scientific application of a set of tools and methods to software engineering, resulting in high-quality, defect-free, and maintainable software products. The Computer Systems Engineering (CSE) group of Separations Technology at the Savannah River Site has successfully used CASE tools to produce high-quality, reliable, and maintainable software products. This paper details the selection process CSE used to acquire a commonly available CASE product and how the CSE group effectively used this CASE tool to consistently produce quality software. 9 refs.

  16. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages, focusing on the IMSL library (including the PROTRAN system) and ACM Algorithms.

  17. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  18. DiFX: A Software Correlator for Very Long Baseline Interferometry Using Multiprocessor Computing Environments

    Science.gov (United States)

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-03-01

    We describe the development of an FX-style correlator for very long baseline interferometry (VLBI), implemented in software and intended to run in multiprocessor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high-performance computing, such as multiprocessor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software and the fact that the highly parallel and scalable nature of the correlation task is well suited to a multiprocessor computing environment. We suggest scientific applications where such an approach to VLBI correlation is most suited and will give the best returns. We report detailed results from the Distributed FX (DiFX) software correlator running on the Swinburne supercomputer (a Beowulf cluster of ~300 commodity processors), including measures of the performance of the system. For example, to correlate all Stokes products for a 10 antenna array with an aggregate bandwidth of 64 MHz per station, and using typical time and frequency resolution, currently requires on the order of 100 desktop-class compute nodes. Due to the effect of Moore's law on commodity computing performance, the total number and cost of compute nodes required to meet a given correlation task continue to decrease rapidly with time. We show detailed comparisons between DiFX and two existing hardware-based correlators: the Australian Long Baseline Array S2 correlator and the NRAO Very Long Baseline Array correlator. In both cases, excellent agreement was found between the correlators. Finally, we describe plans for the future operation of DiFX on the Swinburne supercomputer for both astrophysical and geodetic science.
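
    A minimal NumPy sketch of the FX principle the correlator implements: Fourier-transform each station's signal (F), then cross-multiply and accumulate (X) to form visibilities. The array sizes, noise levels, and 3-sample delay are arbitrary; a real correlator adds delay tracking, fringe rotation, and other corrections.

        import numpy as np

        rng = np.random.default_rng(0)
        nchan, nspec = 64, 1000                  # channels, spectra to average
        common = rng.standard_normal((nspec, 2 * nchan))     # shared sky signal
        sig_a = common + 0.1 * rng.standard_normal((nspec, 2 * nchan))
        sig_b = (np.roll(common, 3, axis=1)                  # delayed at station B
                 + 0.1 * rng.standard_normal((nspec, 2 * nchan)))

        A = np.fft.rfft(sig_a, axis=1)[:, :nchan]   # F step, station A
        B = np.fft.rfft(sig_b, axis=1)[:, :nchan]   # F step, station B
        vis = (A * np.conj(B)).mean(axis=0)         # X step: averaged cross-power
        # The inter-station delay shows up as a phase slope across channels.
        print(np.angle(vis[:8]))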

  19. Design of Demining Machines

    CERN Document Server

    Mikulic, Dinko

    2013-01-01

    In a constant effort to eliminate mine danger, the international mine action community has been developing the safety, efficiency and cost-effectiveness of clearance methods. Demining machines have become necessary when conducting humanitarian demining, where the mechanization of demining provides greater safety and productivity. Design of Demining Machines describes the development and testing of modern demining machines in humanitarian demining. Relevant data for the design of demining machines are included to explain the machinery implemented and some innovative and inspiring development solutions. Development technologies, companies and projects are discussed to provide a comprehensive estimate of the effects of various design factors and to support proper selection of optimal parameters for designing demining machines. Covering the dynamic processes occurring in machine assemblies and their components to give a broader understanding of the demining machine as a whole, Design of Demining Machines is primarily tailored as a textbook.

  20. Applied machining technology

    CERN Document Server

    Tschätsch, Heinz

    2010-01-01

    Machining and cutting technologies are still crucial for many manufacturing processes. This reference presents all important machining processes in a comprehensive and coherent way. It includes many examples of concrete calculations, problems and solutions.

  1. Machining with abrasives

    CERN Document Server

    Jackson, Mark J

    2011-01-01

    Abrasive machining is key to obtaining the desired geometry and surface quality in manufacturing. This book discusses the fundamentals and advances in the abrasive machining processes. It provides a complete overview of developing areas in the field.

  2. Women, Men, and Machines.

    Science.gov (United States)

    Form, William; McMillen, David Byron

    1983-01-01

    Data from the first national study of technological change show that proportionately more women than men operate machines, are more exposed to machines that have alienating effects, and suffer more from the negative effects of technological change. (Author/SSH)

  3. Machine medical ethics

    CERN Document Server

    Pontier, Matthijs

    2015-01-01

    The essays in this book, written by researchers from both humanities and sciences, describe various theoretical and experimental approaches to adding medical ethics to a machine in medical settings. Medical machines are in close proximity with human beings, and getting closer: with patients who are in vulnerable states of health, who have disabilities of various kinds, with the very young or very old, and with medical professionals. In such contexts, machines are undertaking important medical tasks that require emotional sensitivity, knowledge of medical codes, human dignity, and privacy. As machine technology advances, ethical concerns become more urgent: should medical machines be programmed to follow a code of medical ethics? What theory or theories should constrain medical machine conduct? What design features are required? Should machines share responsibility with humans for the ethical consequences of medical actions? How ought clinical relationships involving machines to be modeled? Is a capacity for e...

  4. Brain versus Machine Control.

    Directory of Open Access Journals (Sweden)

    Jose M Carmena

    2004-12-01

    Dr. Octopus, the villain of the movie "Spiderman 2", is a fusion of man and machine. Neuroscientist Jose Carmena examines the facts behind this fictional account of a brain-machine interface.

  5. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills.

  6. NASA-Enhanced Version Of Automatically Programmed Tool Software (APT)

    Science.gov (United States)

    Purves, L. R.

    1989-01-01

    APT code is one of the most widely used software tools for complex numerically-controlled machining. It is both a programming language and the software that processes that language. Upgrades include a super pocket capability for concave polygon pockets and an editor to reprocess cutter location coordinates according to user-supplied commands.

  7. Towards a software evolution benchmark

    OpenAIRE

    Demeyer, Serge; Mens, Tom; Wermelinger, Michel

    2001-01-01

    Case studies are extremely popular in rapidly evolving research disciplines such as software engineering, because they allow for a quick but fair assessment of new techniques. Unfortunately, a proper experimental set-up is rarely in place: all too often case studies are based on a single small toy example chosen to favour the technique under study. Such lack of scientific rigor prevents fair evaluation and has disastrous consequences for the credibility of our field. In this paper, we propose ...

  8. Scientific notations for the digital era

    CERN Document Server

    Hinsen, Konrad

    2016-01-01

    Computers have profoundly changed the way scientific research is done. Whereas the importance of computers as research tools is evident to everyone, the impact of the digital revolution on the representation of scientific knowledge is not yet widely recognized. An ever increasing part of today's scientific knowledge is expressed, published, and archived exclusively in the form of software and electronic datasets. In this essay, I compare these digital scientific notations to the traditional scientific notations that have been used for centuries, showing how digital notations optimized for computerized processing are often an obstacle to scientific communication and to creative work by human scientists. I analyze the causes and propose guidelines for the design of more human-friendly digital scientific notations.

  9. Software Metrics to Estimate Software Quality using Software Component Reusability

    Directory of Open Access Journals (Sweden)

    Prakriti Trivedi

    2012-03-01

    Today most applications are developed using existing libraries, code, open sources, etc. When such code is accessed in a program, it is represented as a software component; Java beans and .NET ActiveX controls, for example, are software components. These components are ready-to-use programming code or controls that accelerate code development. A component-based software system embodies the concept of software reusability. When using these components, the main question that arises is whether using such components is beneficial or not. In this proposed work we try to answer that question. We present a set of software metrics that measure the interconnection between a software component and the application; the strength of this relation indicates the software quality after the component is used. The overall metrics return the final result in terms of how tightly the component is bound to the application.

  10. Software-Reconfigurable Processors for Spacecraft

    Science.gov (United States)

    Farrington, Allen; Gray, Andrew; Bell, Bryan; Stanton, Valerie; Chong, Yong; Peters, Kenneth; Lee, Clement; Srinivasan, Jeffrey

    2005-01-01

    A report presents an overview of an architecture for a software-reconfigurable network data processor for a spacecraft engaged in scientific exploration. When executed on suitable electronic hardware, the software performs the functions of a physical layer (in effect, acts as a software radio in that it performs modulation, demodulation, pulse-shaping, error correction, coding, and decoding), a data-link layer, a network layer, a transport layer, and application-layer processing of scientific data. The software-reconfigurable network processor is undergoing development to enable rapid prototyping and rapid implementation of communication, navigation, and scientific signal-processing functions; to provide a long-lived communication infrastructure; and to provide greatly improved scientific-instrumentation and scientific-data-processing functions by enabling science-driven in-flight reconfiguration of computing resources devoted to these functions. This development is an extension of terrestrial radio and network developments (e.g., in the cellular-telephone industry) implemented in software running on such hardware as field-programmable gate arrays, digital signal processors, traditional digital circuits, and mixed-signal application-specific integrated circuits (ASICs).
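
    As an illustration of the physical-layer functions listed above, here is a hedged sketch of just one of them, BPSK modulation and demodulation, done entirely in software; it is illustrative only, not the report's flight architecture:

        import numpy as np

        rng = np.random.default_rng(1)
        bits = rng.integers(0, 2, 64)
        sps = 8                                 # samples per symbol
        symbols = 2.0 * bits - 1.0              # BPSK mapping: 0 -> -1, 1 -> +1
        waveform = np.repeat(symbols, sps)      # rectangular pulse shaping
        rx = waveform + 0.3 * rng.standard_normal(waveform.size)  # noisy channel

        # Demodulate: integrate over each symbol period and take the sign.
        decisions = rx.reshape(-1, sps).sum(axis=1) > 0
        print("bit errors:", int(np.sum(decisions != bits.astype(bool))))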

  11. 一种基于机器指纹的可信软件水印技术研究%The Research on Trusted Software Watermarking Based on Machine Fingerprint

    Institute of Scientific and Technical Information of China (English)

    王伟; 张毅; 王刘程; 朱健伟

    2014-01-01

    Considering software watermarks that are generated entirely from the software's own attributes, and analyzing their disadvantages in the face of distortion attacks and additive attacks, this paper proposes a new method of generating digital watermarks with the help of the trusted platform module (TPM) of a trusted computing system. The proposed digital watermark generation method has strong security, undetectability and robustness, avoids damage to the computer, and can be used much more broadly.

  12. A Universal Reactive Machine

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Mørk, Simon; Sørensen, Morten U.

    1997-01-01

    Turing showed the existence of a model universal for the set of Turing machines, in the sense that given an encoding of any Turing machine as input, the universal Turing machine simulates it. We introduce the concept of universality for reactive systems and construct a CCS process universal...

  13. PC controlled toothbrush/dentifrice abrasion machine.

    Science.gov (United States)

    Bal, G; Uçtaşli, S; Bekiroğlu, E

    1999-02-01

    A toothbrush/dentifrice abrasion machine was developed for use in a dental research laboratory. The mechanism was designed as a hexagonal block driven by two stepping motors, which move the mechanism in four directions. In order to control the stepping motors, speed, position and direction commands were generated by software written in the C programming language and applied to the stepping motor drives through the parallel port of a personal computer. The toothbrush/dentifrice abrasion machine was finally used to measure the longevity of different toothbrushes. It was experimentally shown that the mechanism can be used for highly accurate position and speed applications.
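
    The original controller was written in C against the PC parallel port; the sketch below shows the same stepping logic in Python, with the port write left as a hypothetical placeholder (the half-step coil sequence is a common one, and the pin layout is assumed):

        import time

        HALF_STEP = [0b0001, 0b0011, 0b0010, 0b0110,
                     0b0100, 0b1100, 0b1000, 0b1001]   # coil energize patterns

        def move(steps, direction, delay_s=0.002, phase=0):
            """Step the motor; direction is +1 or -1, delay_s sets the speed."""
            for _ in range(steps):
                phase = (phase + direction) % len(HALF_STEP)
                pattern = HALF_STEP[phase]
                # port.write(pattern)   # hypothetical parallel-port write
                time.sleep(delay_s)
            return phase

        phase = move(200, +1)                  # stroke in one direction
        phase = move(200, -1, phase=phase)     # and back, reversing direction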

  14. Dynamic study of synchronous machine electric drive

    Directory of Open Access Journals (Sweden)

    Dimitar Spirov

    2005-10-01

    The dynamic behaviour of a fan blower synchronous machine drive has been studied in the paper. The equations for the voltages of the synchronous machine windings are presented in a coordinate system which rotates at the angular speed of the rotor. The mechanical equipment is represented by means of a single-mass dynamic model. The derived system of differential equations is transformed and solved using a suitable software product. The results obtained for the rotation frequency and the electromagnetic torque of the motor, for different values of the rated supply voltage and of the initial resistant moment of the mechanism, have been presented graphically. Conclusions have been drawn from the results obtained.
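
    The kind of simulation described can be reproduced with a single-mass rotor model and an ODE solver. A minimal sketch (all parameters are assumed; the paper's full winding-voltage equations are reduced here to a torque/load-angle balance):

        import numpy as np
        from scipy.integrate import solve_ivp

        J = 0.6        # rotor plus fan inertia, kg*m^2 (assumed)
        D = 2.0        # damping coefficient, N*m*s/rad (assumed)
        T_MAX = 40.0   # peak synchronizing torque, N*m (assumed)
        T_LOAD = 15.0  # fan load torque, simplified as constant (assumed)

        def rhs(t, y):
            delta, w = y                       # rotor lag angle, speed deviation
            T_e = T_MAX * np.sin(delta)        # electromagnetic torque
            dw = (T_e - T_LOAD - D * w) / J    # net torque accelerates the rotor
            return [-w, dw]                    # speeding up reduces the lag angle

        sol = solve_ivp(rhs, (0.0, 6.0), [0.0, 0.0], max_step=0.01)
        print(f"steady-state load angle: {np.degrees(sol.y[0, -1]):.1f} deg")
        # Settles near asin(T_LOAD / T_MAX), about 22 degrees.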

  15. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book comprises 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software development.

  16. High performance in software development

    CERN Document Server

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior across these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures, due to the deep memory hierarchies of modern general-purpose computers. As a result...
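
    A small, self-contained illustration of the memory-locality and vector-processing point: the same inner product computed element by element in the interpreter and as a vectorized operation over contiguous memory.

        import time
        import numpy as np

        x = np.random.default_rng(2).standard_normal(1_000_000)

        t0 = time.perf_counter()
        s_loop = 0.0
        for v in x:                   # one element at a time, poor throughput
            s_loop += v * v
        t1 = time.perf_counter()
        s_vec = float(np.dot(x, x))   # vectorized, cache-friendly inner product
        t2 = time.perf_counter()

        print(f"loop {t1 - t0:.3f}s  vectorized {t2 - t1:.4f}s  "
              f"match: {np.isclose(s_loop, s_vec)}")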

  17. Data storage technology: Hardware and software, Appendix B

    Science.gov (United States)

    Sable, J. D.

    1972-01-01

    This project involves the development of more economical ways of integrating and interfacing new storage devices and data processing programs into a computer system. It involves developing interface standards and a software/hardware architecture which will make it possible to develop machine-independent devices and programs. These will interface with the machine-dependent operating systems of particular computers. The development project will not be to develop the software which would ordinarily be the responsibility of the manufacturer to supply, but to develop the standards with which that software is expected to conform in providing an interface with the user or storage system.

  18. Tank monitor and control system (TMACS) software configuration management plan

    Energy Technology Data Exchange (ETDEWEB)

    GLASSCOCK, J.A.

    1999-05-13

    This Software Configuration Management Plan (SCMP) describes the methodology for control of computer software developed and supported by the Systems Development and Integration (SD and I) organization of Lockheed Martin Services, Inc. (LMSI) for the Tank Monitor and Control System (TMACS). This plan controls changes to the software and configuration files used by TMACS. The controlled software includes the Gensym software package, Gensym knowledge base files developed for TMACS, C-language programs used by TMACS, the operating system on the production machine, language compilers, and all Windows NT commands and functions which affect the operating environment. The configuration files controlled include the files downloaded to the Acromag and Westronic field instruments.

  19. Asynchronized synchronous machines

    CERN Document Server

    Botvinnik, M M

    1964-01-01

    Asynchronized Synchronous Machines focuses on the theoretical research on asynchronized synchronous (AS) machines, which are "hybrids" of synchronous and induction machines that can operate with slip. Topics covered in this book include the initial equations; the vector diagram of an AS machine; regulation in cases of deviation from the law of full compensation; parameters of the excitation system; and the schematic diagram of an excitation regulator. The possible applications of AS machines and their calculation in certain cases are also discussed. This publication is beneficial for students and individuals interested in asynchronized synchronous machines.

  20. Precision machine design

    CERN Document Server

    Slocum, Alexander H

    1992-01-01

    This book is a comprehensive engineering exploration of all the aspects of precision machine design - both component and system design considerations for precision machines. It addresses both theoretical analysis and practical implementation providing many real-world design case studies as well as numerous examples of existing components and their characteristics. Fast becoming a classic, this book includes examples of analysis techniques, along with the philosophy of the solution method. It explores the physics of errors in machines and how such knowledge can be used to build an error budget for a machine, how error budgets can be used to design more accurate machines.

  1. Safety-Critical Software: Status Report and Annotated Bibliography

    Science.gov (United States)

    1993-06-01

    [...] software in place of hardware in safety-critical systems are the Therac 25 (a therapeutic linear accelerator) and nuclear reactor shutdown systems [...] Leveson and Turner [141], is the Therac 25 radiation treatment machine. A predecessor to the Therac 25, the Therac 20, had a number of hardware interlocks [...] to stop an undesirable behavior. Much of the software in the Therac 25 was similar to that of the Therac 20, and the software in both cases contained [...]

  2. A Method for Design of Modular Reconfigurable Machine Tools

    Directory of Open Access Journals (Sweden)

    Zhengyi Xu

    2017-02-01

    Presented in this paper is a method for the design of modular reconfigurable machine tools (MRMTs). An MRMT is capable of using a minimal number of modules, through reconfiguration, to perform the required machining tasks for a family of parts. The proposed method consists of three steps: module identification, module determination, and layout synthesis. In the first step, module components are collected from a family of general-purpose machines to establish a module library. In the second step, for a given family of parts to be machined, a set of needed modules is selected from the module library to construct the desired reconfigurable machine tool. In the third step, the final machine layout is decided through evaluation against a number of performance indices. Based on this method, a software package has been developed that can design an MRMT for a given part family.
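
    An illustrative sketch of the module-determination step as a simple set-cover selection: greedily pick library modules until every machining task required by the part family is covered. The module names and capabilities are made up, not taken from the paper.

        # Hypothetical module library: module -> machining capabilities.
        library = {
            "3-axis base":  {"milling"},
            "rotary table": {"indexing", "4th-axis milling"},
            "drill head":   {"drilling", "tapping"},
            "probe":        {"inspection"},
        }
        required = {"milling", "drilling", "tapping", "inspection"}

        chosen, uncovered = [], set(required)
        while uncovered:
            # Pick the module covering the most still-uncovered tasks.
            best = max(library, key=lambda m: len(library[m] & uncovered))
            if not library[best] & uncovered:
                raise ValueError(f"no module covers: {uncovered}")
            chosen.append(best)
            uncovered -= library[best]
        print("selected modules:", chosen)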

  3. Promoting Scientific Spirit to Cultivate Scientific Culture

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Scientific culture is an advanced culture that is based on scientific knowledge and supported by the scientific method, with scientific thinking as its core and scientific spirit as its soul. During the process of modernization, it has profound impacts on human society in terms of values, ethics, mode of thinking, lifestyle and code of conduct, offering human civilization an important ideological source, physical foundation, technological tool and effective carrier.

  4. Modelling and simulation of multitechnological machine systems

    Energy Technology Data Exchange (ETDEWEB)

    Holopainen, T. (ed.) [VTT Manufacturing Technology, Espoo (Finland)]

    2001-07-01

    The Smart Machines and Systems 2010 (SMART) technology programme 1997-2000 aimed at supporting the machine and electromechanical industries in incorporating modern technology into their products and processes. The public research projects in this programme were planned to accumulate the latest research results and transfer them for the benefit of industrial product development. The major research topic in the SMART programme was called Modelling and Simulation of Multitechnological Mechatronic Systems. The behaviour of modern machine systems and subsystems involves many different types of physical phenomena and their mutual interactions: the mechanical behaviour of structures, electromagnetic effects, hydraulics, vibrations and acoustics, etc., together with the associated control systems and software. The actual research was carried out in three separate projects called Modelling and Simulation of Mechatronic Machine Systems for Product Development and Condition Monitoring Purposes (MASI), Virtual Testing of Hydraulically Driven Machines (HYSI), and Control of Low Frequency Vibration of a Mobile Machine (AKSUS). This publication contains the papers presented at the final seminar of these three research projects, held on November 30th at Otaniemi, Espoo. (orig.)

  5. OPENING REMARKS: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2006-01-01

    as the national and regional electricity grid, carbon sequestration, virtual engineering, and the nuclear fuel cycle. The successes of the first five years of SciDAC have demonstrated the power of using advanced computing to enable scientific discovery. One measure of this success could be found in the President’s State of the Union address in which President Bush identified ‘supercomputing’ as a major focus area of the American Competitiveness Initiative. Funds were provided in the FY 2007 President’s Budget request to increase the size of the NERSC-5 procurement to between 100-150 teraflops, to upgrade the LCF Cray XT3 at Oak Ridge to 250 teraflops and acquire a 100 teraflop IBM BlueGene/P to establish the Leadership computing facility at Argonne. We believe that we are on a path to establish a petascale computing resource for open science by 2009. We must develop software tools, packages, and libraries as well as the scientific application software that will scale to hundreds of thousands of processors. Computer scientists from universities and the DOE’s national laboratories will be asked to collaborate on the development of the critical system software components such as compilers, light-weight operating systems and file systems. Standing up these large machines will not be business as usual for ASCR. We intend to develop a series of interconnected projects that identify cost, schedule, risks, and scope for the upgrades at the LCF at Oak Ridge, the establishment of the LCF at Argonne, and the development of the software to support these high-end computers. The critical first step in defining the scope of the project is to identify a set of early application codes for each leadership class computing facility. These codes will have access to the resources during the commissioning phase of the facility projects and will be part of the acceptance tests for the machines. Applications will be selected, in part, by breakthrough science, scalability, and

  6. Application of the object-oriented paradigm for scientific experiment monitoring & control

    Science.gov (United States)

    Racaud, Thierry; Assis-Arantes, Patrick

    1994-12-01

    This paper presents a new approach to the monitoring and control of scientific experiments. This new approach is based on an object-oriented environment composed of three elements: (a) a graphical environment that allows the creation of an object-oriented model of the experiment based on objects, attributes and methods; (b) a language for writing procedures that access the model by sending messages in order to operate the experiment; and (c) a man-machine interface based on an interactive graphical layer above the object-oriented representation for controlling and monitoring the experiment. This new approach has been prototyped in a project called "Man-Machine Interface Software for Ground User Terminal", or User Terminal for short. The project is carried out by SPACEBEL Informatique on behalf of the European Space Research and Technology Centre (ESTEC). Although this project has been undertaken for the operation of scientific experiments in space, User Terminal can naturally be used for the monitoring and control of ground-based experiments. This article presents the User Terminal system as well as one of the first practical exercises performed in the context of the teleoperation of a liquid science experiment to be shipped into space.
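
    The sketch below illustrates, with invented names, the flavour of elements (a) and (b): an object model of an experiment device built from objects, attributes and methods, operated by sending messages. It is not SPACEBEL's User Terminal code, only a minimal stand-in.

      # Minimal sketch of an object-oriented experiment model operated by
      # message passing. Device names, attributes and handlers are invented.

      class ExperimentObject:
          """An object in the experiment model: attributes plus named methods."""
          def __init__(self, name, **attributes):
              self.name = name
              self.attributes = attributes
              self.methods = {}

          def define_method(self, message, handler):
              self.methods[message] = handler

          def send(self, message, *args):
              """Operate the experiment by sending a message to the object."""
              return self.methods[message](self, *args)

      pump = ExperimentObject("fluid_pump", flow_rate=0.0, running=False)
      pump.define_method("start", lambda obj, rate: obj.attributes.update(
          running=True, flow_rate=rate))
      pump.define_method("status", lambda obj: dict(obj.attributes))

      pump.send("start", 1.5)        # a procedure operating the experiment
      print(pump.send("status"))     # a monitoring layer reading attributes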

  7. Understanding dental CAD/CAM for restorations - dental milling machines from a mechanical engineering viewpoint. Part A: chairside milling machines.

    Science.gov (United States)

    Lebon, Nicolas; Tapie, Laurent; Duret, Francois; Attal, Jean-Pierre

    2016-01-01

    The dental milling machine is an important device in the dental CAD/CAM chain. Nowadays, dental numerically controlled (NC) milling machines are available for dental surgeries (the chairside solution). This article provides a mechanical engineering approach to NC milling machines to help dentists understand the role of this technology in digital dentistry practice. First, some technical concepts and definitions associated with NC milling machines are described from a mechanical engineering viewpoint. The technical and economic criteria of four chairside dental NC milling machines that are available on the market are then described. The technical criteria focus on the capacity of the embedded technologies of these milling machines to mill both the various prosthetic materials and the various restoration shapes. The economic criteria focus on investment costs and interoperability with third-party software. The clinical relevance of the technology is assessed in terms of the accuracy and integrity of the restoration.

  8. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    evolutionary series of personal software engineering techniques that an engineer learns and ... began to realize that software process, plans and methodologies for ... Executive Strategy. Addison-Wesley ...

  9. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…
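
    The record does not list the features used; as a hedged illustration of the machine-learning idea, the sketch below classifies text lines from a paper's first page into metadata roles using invented layout features. It is not the CRIS UNS system.

      # Hedged sketch: classify extracted text lines as title/author/abstract/
      # other from simple layout features. Features, labels and the tiny
      # training set are placeholders, not the paper's actual system.

      from sklearn.ensemble import RandomForestClassifier

      # Each line: [font_size, is_bold, y_position_on_page, n_tokens]
      X_train = [
          [18.0, 1, 0.95, 9],    # title-like line
          [11.0, 0, 0.88, 4],    # author-like line
          [9.5,  0, 0.60, 40],   # abstract-like line
          [9.5,  0, 0.20, 35],   # ordinary body text
      ]
      y_train = ["title", "author", "abstract", "other"]

      clf = RandomForestClassifier(n_estimators=50, random_state=0)
      clf.fit(X_train, y_train)
      print(clf.predict([[17.0, 1, 0.93, 7]]))   # -> likely "title"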

  10. LIFE AND SCIENTIFIC ACHIEVEMENTS OF TRYPHON MAKSYMOVYCH BASHTA

    Directory of Open Access Journals (Sweden)

    Olena Bashta

    2016-03-01

    Full Text Available The article considers life and scientific achievements of T. M. Bashta (1904 -1987, a product manager of aircraft industry and the founder of home scientific school of industrial hydraulics. Two strategic branches – machine tool-building and aeronautic – are indebted exactly to him for its advance.

  12. Dynamical characteristics of software trustworthiness and their evolutionary complexity

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; WEI Wei; JIANG Xin; ZHANG ZhanLi; GUO BingHui

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications, and software trustworthiness modeling has become a prerequisite and necessary means. To discuss and explain the basic scientific problems in software trustworthiness and to establish theoretical foundations for software trustworthiness measurement, combining the ideas of dynamical system study, this paper studies evolutionary laws of software trustworthiness and the dynamical mechanism under the effect of various internal and external factors, and proposes dynamical models for software trustworthiness; thus, software trustworthiness can be considered as the statistical characteristics of behaviors of software systems in a dynamical and open environment. By analyzing two simple examples, the paper explains the relationship between the limit evolutionary behaviors of software trustworthiness attributes and dynamical system characteristics, and interprets the dynamical characteristics of software trustworthiness and their evolutionary complexity.

  13. Motion Simulation Analysis of Rail Weld CNC Fine Milling Machine

    Science.gov (United States)

    Mao, Huajie; Shu, Min; Li, Chao; Zhang, Baojun

    The CNC fine milling machine is a new type of advanced equipment for the precision machining of rail welds, offering high precision, high efficiency, low environmental pollution and other technical advantages. The motion performance of this machine directly affects its machining accuracy and stability, which makes it an important consideration in its design. Based on the design drawings, this article completed 3D modeling of a 60 kg/m rail weld CNC fine milling machine using Solidworks. The geometry was then imported into Adams to carry out the motion simulation analysis. The displacement, velocity, angular velocity and other kinematic parameter curves of the main components were obtained in post-processing; these provide a scientific basis for the design and development of this machine.

  14. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  15. Perspex machine: VII. The universal perspex machine

    Science.gov (United States)

    Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of projective geometry with the Turing machine. It uses a total arithmetic, called transreal arithmetic, that contains real arithmetic and allows division by zero. Transreal arithmetic is redefined here. The new arithmetic has both a positive and a negative infinity which lie at the extremes of the number line, and a number nullity that lies off the number line. We prove that nullity, 0/0, is a number. Hence a number may have one of four signs: negative, zero, positive, or nullity. It is, therefore, impossible to encode the sign of a number in one bit, as floating-point arithmetic attempts to do, resulting in the difficulty of having both positive and negative zeros and NaNs. Transrational arithmetic is consistent with Cantor arithmetic. In an extension to real arithmetic, the product of zero, an infinity, or nullity with its reciprocal is nullity, not unity. This avoids the usual contradictions that follow from allowing division by zero. Transreal arithmetic has a fixed algebraic structure and does not admit options as IEEE, floating-point arithmetic does. Most significantly, nullity has a simple semantics that is related to zero. Zero means "no value" and nullity means "no information." We argue that nullity is as useful to a manufactured computer as zero is to a human computer. The perspex machine is intended to offer one solution to the mind-body problem by showing how the computable aspects of mind and, perhaps, the whole of mind relates to the geometrical aspects of body and, perhaps, the whole of body. We review some of Turing's writings and show that he held the view that his machine has spatial properties. In particular, that it has the property of being a 7D lattice of compact spaces. Thus, we read Turing as believing that his machine relates computation to geometrical bodies. We simplify the perspex machine by substituting an augmented Euclidean geometry for projective geometry. This leads to a general
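
    The division rules stated in the abstract are easy to make concrete. The toy sketch below (not the author's implementation) uses Python floats for the transreal infinities at the extremes of the number line and a sentinel object for nullity, which lies off it.

      # Toy sketch of total (transreal) division as described above:
      # a/0 is +inf or -inf depending on the sign of a, and 0/0 is the
      # number nullity, meaning "no information".

      NULLITY = object()   # off the number line: "no information"

      def transreal_div(a, b):
          """Total division: defined for every pair of real operands."""
          if b != 0:
              return a / b
          if a > 0:
              return float("inf")     # positive infinity, one extreme
          if a < 0:
              return float("-inf")    # negative infinity, the other
          return NULLITY              # 0/0 is the number nullity

      print(transreal_div(1, 0))              # inf
      print(transreal_div(-2, 0))             # -inf
      print(transreal_div(0, 0) is NULLITY)   # True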

  16. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  17. Multilanguage parallel programming of heterogeneous machines

    Energy Technology Data Exchange (ETDEWEB)

    Bisiani, R.; Forin, A.

    1988-08-01

    The authors designed and implemented a system, Agora, that supports the development of multilanguage parallel applications for heterogeneous machines. Agora hinges on two ideas: the first is that shared memory can be a suitable abstraction for programming concurrent, multilanguage modules running on heterogeneous machines. The second is that a shared memory abstraction can be efficiently supported across different computer architectures that are not connected by a physical shared memory, for example local area network workstations or ensemble machines. Agora has been in use for more than a year. This paper describes the Agora shared memory and its software implementation on both tightly and loosely coupled architectures. Measurements of the current implementation are also included.

  18. Load Balancing Scientific Applications

    Energy Technology Data Exchange (ETDEWEB)

    Pearce, Olga Tkachyshyn [Texas A & M Univ., College Station, TX (United States)]

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
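
    A minimal version of the underlying decision can be written down directly: rebalance only if the projected saving from evening out the load exceeds the cost of running the balancing algorithm itself. The sketch below uses invented numbers and is a caricature of the dissertation's model, not the model itself.

      # Illustrative cost model for the rebalancing decision: in an SPMD
      # code every processor waits for the slowest one at each sync point.

      def should_rebalance(loads, rebalance_cost, steps_until_next_balance):
          """loads: per-processor work per step."""
          t_imbalanced = max(loads)             # everyone waits for the max
          t_balanced = sum(loads) / len(loads)  # ideal even distribution
          saving = (t_imbalanced - t_balanced) * steps_until_next_balance
          return saving > rebalance_cost

      loads = [1.0, 1.1, 0.9, 2.4]   # one overloaded processor
      print(should_rebalance(loads, rebalance_cost=3.0,
                             steps_until_next_balance=10))   # True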

  19. Study of an NC system of machining crown gears

    Science.gov (United States)

    Xu, Xiaogang; Wang, Huaqing; Yan, Jian; Gao, Shenyou

    2005-12-01

    Crown gear couplings are commonly used in metallurgical and steel-rolling equipment and are usually manufactured by copy machining. That method is labor-intensive and inefficient; the machining precision is limited by the shape of the mold, and the movement of the machine table is difficult to control. This work presents an NC system for a gear hobbing machine. It consists of an industrial control computer, a grating sensor, a servomotor and its driver, a servo driver card and other I/O equipment. The grating sensor is installed in the axial direction to trace the instantaneous position of the hob rest. The radial movement of the machine table is controlled by a servomotor. When the computer captures the axial signal, the system moves the machine table forwards or backwards according to the value calculated by the interpolation algorithm. Thus, coordinated two-axis (axial and radial) movement is realized while the crown gear is processed. A feature of the system is that a grating sensor in the axial direction replaces a servomotor, so NC can be implemented with only a small change to the machine's mechanism and a very low redesign cost. The design software provides interpolation functions for circular arcs and lines. The system has been used on a Y1380 gear hobbing machine, and the corresponding software for machining crown gears has been developed as well. Satisfactory results have been obtained, showing ease of use and reliability in practical operation.
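
    As an illustration of the circular-arc interpolation coordinating the two axes (geometry invented; this is not the Y1380 retrofit code), the sketch below computes, for each axial grating reading, the radial infeed that keeps the cutter on a circular crown profile.

      # Sketch of axial-radial coordination: for each axial position z
      # reported by the grating sensor, command the radial offset that
      # places the tooth crest on a circular arc of the given crown radius.

      import math

      def radial_target(z_axial, crown_radius):
          """Radial drop from the arc apex at axial position z_axial."""
          if abs(z_axial) > crown_radius:
              raise ValueError("axial position outside the crown arc")
          # circle: r_offset^2 + z^2 = R^2, measured as drop from the apex
          return crown_radius - math.sqrt(crown_radius**2 - z_axial**2)

      # As the grating sensor reports axial positions, drive the servo:
      for z in [-20.0, -10.0, 0.0, 10.0, 20.0]:        # mm
          print(f"z={z:+.1f} mm -> radial infeed "
                f"{radial_target(z, 200.0):.4f} mm")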

  20. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A.O. [Westinghouse Savannah River Company, Aiken, SC (United States)

    1995-12-31

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Computer Program Development, ANS-10.5.

  1. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  2. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  3. Kinematic modelling of a 3-axis NC machine tool in linear and circular interpolation

    CERN Document Server

    Pessoles, Xavier; Rubio, Walter; 10.1007/s00170-009-2236-z

    2010-01-01

    Machining time is a major performance criterion when it comes to high-speed machining. CAM software can help in estimating that time for a given strategy. But in practice, CAM-programmed feed rates are rarely achieved, especially where complex surface finishing is concerned. This means that machining time forecasts are often more than one step removed from reality. The reason behind this is that CAM routines do not take either the dynamic performances of the machines or their specific machining tolerances into account. The present article seeks to improve simulation of high-speed NC machine dynamic behaviour and machining time prediction, offering two models. The first contributes through enhanced simulation of three-axis paths in linear and circular interpolation, taking high-speed machine accelerations and jerks into account. The second model allows transition passages between blocks to be integrated in the simulation by adding in a polynomial transition path that caters for the true machining environment t...
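
    The gap between CAM-programmed and achieved feed rates is easy to see in a one-block calculation: under an acceleration limit, a short linear block may never reach the programmed feed. The toy sketch below makes the point; it is not the authors' model, which also accounts for jerk, circular interpolation and block transitions.

      # Time for one linear block under an acceleration limit, assuming the
      # axis enters and leaves the block at a slow transition speed.
      # All parameter values are invented (units: mm, mm/s, mm/s^2).

      import math

      def block_time(length, v_feed, v_transition, a_max):
          d_accel = (v_feed**2 - v_transition**2) / (2 * a_max)
          if 2 * d_accel >= length:
              # Block too short: triangular profile, feed never reached.
              v_peak = math.sqrt(a_max * length + v_transition**2)
              return 2 * (v_peak - v_transition) / a_max
          t_ramps = 2 * (v_feed - v_transition) / a_max
          return t_ramps + (length - 2 * d_accel) / v_feed

      # 5 mm block, 200 mm/s programmed feed, 20 mm/s corner speed:
      print(block_time(5.0, 200.0, 20.0, 2000.0))  # vs naive 5/200 = 0.025 s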

  4. An open-source solution for advanced imaging flow cytometry data analysis using machine learning.

    Science.gov (United States)

    Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew

    2017-01-01

    Imaging flow cytometry (IFC) enables the high-throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high-content, information-rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Raw image files (.rif) from an imaging flow cytometer, compensated and corrected into the proprietary .cif file format, are imported into the open-source software CellProfiler, where an image processing pipeline identifies cells and subcellular compartments, allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches in "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye, including subtle measured differences in label-free detection channels such as bright-field and dark-field imagery.
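
    The supervised step of such a pipeline can be sketched in a few lines. The example below trains a classifier on synthetic per-cell features and is only a stand-in for the CellProfiler/CellProfiler Analyst workflow; the data and feature names are invented.

      # Hedged sketch: after an image pipeline has measured per-cell
      # morphological features, train a classifier to separate cell classes.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      # 200 cells x 3 features (e.g. area, eccentricity, bright-field texture)
      X = rng.normal(size=(200, 3))
      y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # two synthetic classes

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      clf = GradientBoostingClassifier().fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")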

  5. InSAR Scientific Computing Environment on the Cloud

    Science.gov (United States)

    Rosen, P. A.; Shams, K. S.; Gurrola, E. M.; George, B. A.; Knight, D. S.

    2012-12-01

    In response to the needs of the international scientific and operational Earth observation communities, spaceborne Synthetic Aperture Radar (SAR) systems are being tasked to produce enormous volumes of raw data daily, with availability to scientists to increase substantially as more satellites come online and data becomes more accessible through more open data policies. The availability of these unprecedentedly dense and rich datasets has led to the development of sophisticated algorithms that can take advantage of them. In particular, interferometric time series analysis of SAR data provides insights into the changing earth and requires substantial computational power to process data across large regions and over large time periods. This poses challenges for existing infrastructure, software, and techniques required to process, store, and deliver the results to the global community of scientists. The current state-of-the-art solutions employ traditional data storage and processing applications that require download of data to the local repositories before processing. This approach is becoming untenable in light of the enormous volume of data that must be processed in an iterative and collaborative manner. We have analyzed and tested new cloud computing and virtualization approaches to address these challenges within the context of InSAR in the earth science community. Cloud computing is democratizing computational and storage capabilities for science users across the world. The NASA Jet Propulsion Laboratory has been an early adopter of this technology, successfully integrating cloud computing in a variety of production applications ranging from mission operations to downlink data processing. We have ported a new InSAR processing suite called ISCE (InSAR Scientific Computing Environment) to a scalable distributed system running in the Amazon GovCloud to demonstrate the efficacy of cloud computing for this application. We have integrated ISCE with Polyphony to

  6. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  7. Scientific Misconduct in India: Causes and Perpetuation.

    Science.gov (United States)

    Patnaik, Pratap R

    2016-08-01

    Along with economic strength, space technology and software expertise, India is also a leading nation in fraudulent scientific research. The problem is worsened by vested interests working in concert for their own benefits. These self-promoting cartels, together with biased evaluation methods and weak penal systems, combine to perpetuate scientific misconduct. Some of these issues are discussed in this commentary, with supporting examples and possible solutions.

  8. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    ""Software and Systems Traceability"" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  9. Scalable Scientific Workflows Management System SWFMS

    Directory of Open Access Journals (Sweden)

    M. Abdul Rahman

    2016-11-01

    Full Text Available In today's electronic world, conducting scientific experiments, especially in the natural sciences, has become more and more challenging for domain scientists, since "science" today involves a two-dimensional complexity: first, assorted and complex computational (analytical) applications; and second, the increasingly large volume and heterogeneity of the scientific data products processed by these applications. Furthermore, the involvement of an increasingly large number of scientific instruments, such as sensors and machines, makes scientific data management even more challenging, since the data generated by such instruments are highly complex. To reduce the complexity of conducting scientific experiments as much as possible, an integrated framework that transparently implements the conceptual separation between the two dimensions is direly needed. To facilitate scientific experiments, 'workflow' technology has in recent years emerged in scientific disciplines like biology, bioinformatics, geology, environmental science, and eco-informatics, and much research work has been done to develop scientific workflow systems. However, our analysis of these existing systems shows that they lack a well-structured conceptual modeling methodology to deal with the two complex dimensions in a transparent manner. This paper presents a scientific workflow framework that properly addresses these two dimensional complexities.

  10. Machinability of advanced materials

    CERN Document Server

    Davim, J Paulo

    2014-01-01

    Machinability of Advanced Materials addresses the level of difficulty involved in machining a material, or multiple materials, with the appropriate tooling and cutting parameters.  A variety of factors determine a material's machinability, including tool life rate, cutting forces and power consumption, surface integrity, limiting rate of metal removal, and chip shape. These topics, among others, and multiple examples comprise this research resource for engineering students, academics, and practitioners.

  11. Pattern recognition & machine learning

    CERN Document Server

    Anzai, Y

    1992-01-01

    This is the first text to provide a unified and self-contained introduction to visual pattern recognition and machine learning. It is useful as a general introduction to artificial intelligence and knowledge engineering, and no previous knowledge of pattern recognition or machine learning is necessary. It covers the basics of various pattern recognition and machine learning methods. Translated from Japanese, the book also features chapter exercises, keywords, and summaries.

  12. Support vector machines applications

    CERN Document Server

    Guo, Guodong

    2014-01-01

    Support vector machines (SVM) have both a solid mathematical background and good performance in practical applications. This book focuses on the recent advances and applications of the SVM in different areas, such as image processing, medical practice, computer vision, pattern recognition, machine learning, applied statistics, business intelligence, and artificial intelligence. The aim of this book is to create a comprehensive source on support vector machine applications, especially some recent advances.

  13. Machining of titanium alloys

    CERN Document Server

    2014-01-01

    This book presents a collection of examples illustrating the recent research advances in the machining of titanium alloys. These materials have excellent strength and fracture toughness as well as low density and good corrosion resistance; however, machinability is still poor due to their low thermal conductivity and high chemical reactivity with cutting tool materials. This book presents solutions to enhance machinability in titanium-based alloys and serves as a useful reference for professionals and researchers in the aerospace, automotive and biomedical fields.

  14. Machine learning for identifying botnet network traffic

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2013-01-01

    Due to the promise of non-invasive and resilient detection, botnet detection based on network traffic analysis has drawn special attention from the research community. Furthermore, many authors have turned their attention to the use of machine learning algorithms as a means of inferring botnet-related knowledge from the monitored traffic. This paper presents a review of contemporary botnet detection methods that use machine learning as a tool for identifying botnet-related traffic. The main goal of the paper is to provide a comprehensive overview of the field by summarizing current scientific efforts. The contribution of the paper is three-fold. First, the paper provides a detailed insight into the existing detection methods by investigating which bot-related heuristics were assumed by the detection systems and how different machine learning techniques were adapted in order to capture botnet-related knowledge...

  15. Data Mining and Machine Learning in Astronomy

    CERN Document Server

    Ball, Nicholas M

    2009-01-01

    We review the current state of data mining and machine learning in Astronomy. 'Data Mining' can have a somewhat mixed connotation from the point of view of a researcher in this field. On the one hand, it is a powerful approach, holding the potential to fully exploit the exponentially increasing amount of available data, which promises almost limitless scientific advances. On the other, it can be the application of black-box computing algorithms that at best give little physical insight, and at worst provide questionable results. Here, we give an overview of the entire data mining process, from data collection through the interpretation of results. We cover common machine learning algorithms, such as artificial neural networks and support vector machines; applications from a broad range of Astronomy, with an emphasis on those where data mining resulted in improved physical insights, and important current and future directions, including the construction of full probability density functions, parallel algorithm...

  16. Maximizing ROI on software development

    CERN Document Server

    Sikka, Vijay

    2004-01-01

    A brief review of software development history. Software complexity crisis. Software development ROI. The case for global software development and testing. Software quality and test ROI. How do you implement global software development and testing. Case studies.

  17. Rotating electrical machines

    CERN Document Server

    Le Doeuff, René

    2013-01-01

    In this book a general matrix-based approach to modeling electrical machines is promulgated. The model uses instantaneous quantities for key variables and enables the user to easily take into account associations between rotating machines and static converters (such as in variable speed drives). General equations of electromechanical energy conversion are established early in the treatment of the topic and then applied to synchronous, induction and DC machines. The primary characteristics of these machines are established for steady state behavior as well as for variable speed scenarios.

  18. Chaotic Boltzmann machines.

    Science.gov (United States)

    Suzuki, Hideyuki; Imura, Jun-ichi; Horio, Yoshihiko; Aihara, Kazuyuki

    2013-01-01

    The chaotic Boltzmann machine proposed in this paper is a chaotic pseudo-billiard system that works as a Boltzmann machine. Chaotic Boltzmann machines are shown numerically to have computing abilities comparable to conventional (stochastic) Boltzmann machines. Since no randomness is required, efficient hardware implementation is expected. Moreover, the ferromagnetic phase transition of the Ising model is shown to be characterised by the largest Lyapunov exponent of the proposed system. In general, a method to relate probabilistic models to nonlinear dynamics by derandomising Gibbs sampling is presented.

  19. Tribology in machine design

    CERN Document Server

    Stolarski, Tadeusz

    1999-01-01

    ""Tribology in Machine Design is strongly recommended for machine designers, and engineers and scientists interested in tribology. It should be in the engineering library of companies producing mechanical equipment.""Applied Mechanics ReviewTribology in Machine Design explains the role of tribology in the design of machine elements. It shows how algorithms developed from the basic principles of tribology can be used in a range of practical applications within mechanical devices and systems.The computer offers today's designer the possibility of greater stringen

  20. Electrical machines & drives

    CERN Document Server

    Hammond, P

    1985-01-01

    Containing approximately 200 problems (100 worked), the text covers a wide range of topics concerning electrical machines, placing particular emphasis upon electrical-machine drive applications. The theory is concisely reviewed and focuses on features common to all machine types. The problems are arranged in order of increasing levels of complexity and discussions of the solutions are included where appropriate to illustrate the engineering implications. This second edition includes an important new chapter on mathematical and computer simulation of machine systems and revised discussions o

  1. Machine learning with R

    CERN Document Server

    Lantz, Brett

    2013-01-01

    Written as a tutorial to explore and understand the power of R for machine learning, this practical guide covers all of the need-to-know topics in a very systematic way. For each machine learning approach, each step in the process is detailed, from preparing the data for analysis to evaluating the results. These steps will build the knowledge you need to apply them to your own data science tasks. Intended for those who want to learn how to use R's machine learning capabilities and gain insight from their data. Perhaps you already know a bit about machine learning, but have never used R; or

  2. Induction machine handbook

    CERN Document Server

    Boldea, Ion

    2002-01-01

    Often called the workhorse of industry, the advent of power electronics and advances in digital control are transforming the induction motor into the racehorse of industrial motion control. Now, the classic texts on induction machines are nearly three decades old, while more recent books on electric motors lack the necessary depth and detail on induction machines.The Induction Machine Handbook fills industry's long-standing need for a comprehensive treatise embracing the many intricate facets of induction machine analysis and design. Moving gradually from simple to complex and from standard to

  3. Human and machine diagnosis of scientific problem-solving abilities

    Science.gov (United States)

    Good, Ron; Kromhout, Robert; Bandler, Wyllis

    Diagnosis of the problem-solving state of a novice student in science, by an accomplished teacher, is studied in order to build a computer system that will simulate the process. Although such expert systems have been successfully developed in medicine (MYCIN, INTERNIST/CADUCEUS), very little has been accomplished in science education, even though there is a reasonably close parallel between expert medical diagnosis of patients with physiological problems and expert instructional diagnosis of students with learning problems. The system described in this paper, DIPS: Diagnosis for Instruction in Problem Solving, involves a new line of research for science educators interested in interdisciplinary efforts and ways in which computer technology might be used to better understand how to improve science learning. The basic architecture of the DIPS system is outlined and explained in terms of instruction and research implications, and the role of such intelligent computer systems in science education of the future is considered.

  4. A Methodology for Software Cost Estimation Using Machine Learning Techniques

    Science.gov (United States)

    1993-09-03

    The correlation coefficient, R-squared, of 0.726 indicates that this estimate has a fairly strong relationship with the actual project effort. ... networks could be considered truly accurate, the results of this experiment indicate that networks are worth strong consideration. The best indication that ... exponents 1.20, 1.12, 1.05 (Boehm, 1981, p. 117) ... This comparison between the genetic-algorithm-derived values and Boehm's values for the coefficients and

  5. Model of Pulsed Electrical Discharge Machining (EDM using RL Circuit

    Directory of Open Access Journals (Sweden)

    Ade Erawan Bin Minhat

    2014-10-01

    Full Text Available This article presents a model of pulsed Electrical Discharge Machining (EDM) using an RL circuit. Several mathematical models have been successfully developed based on the initial, ignition and discharge phases of the gap current and voltage. Based on these models, the circuit schematic of a transistor pulse power generator has been designed using an electrical model in Matlab Simulink software to identify the voltage and current profiles during the machining process. The simulation results are then compared with experimental results.
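
    As a hedged illustration of such an RL model (all component values invented; the article's own model is built in Matlab Simulink, not reproduced here), the sketch below integrates L di/dt = V - Ri over the transistor's on/off cycle with forward Euler.

      # Toy pulsed-EDM RL circuit: during the transistor's on-time the gap
      # current follows L di/dt = (V_s - V_gap) - R i; during off-time it
      # decays. Parameters are invented for illustration only.

      def simulate_pulse(V_s=80.0, V_gap=25.0, R=2.0, L=50e-6,
                         t_on=100e-6, t_off=50e-6, dt=1e-6, n_pulses=2):
          t, i, trace = 0.0, 0.0, []
          period = t_on + t_off
          for _ in range(int(round(n_pulses * period / dt))):
              on = (t % period) < t_on
              v_drive = (V_s - V_gap) if on else 0.0
              di = (v_drive - R * i) / L * dt   # L di/dt = v_drive - R i
              i = max(i + di, 0.0)              # diode-like: no reverse current
              trace.append((t, i))
              t += dt
          return trace

      for t, i in simulate_pulse()[::25]:
          print(f"t={t*1e6:6.1f} us  i={i:6.2f} A")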

  6. Design Criteria and Machine Integration of the Ignitor Experiment

    Science.gov (United States)

    Bianchi, A.; Coppi, B.

    2010-11-01

    High field, high density compact experiments are the only ones capable of producing, on the basis of available technology and knowledge of plasma physics, plasmas that can reach ignition conditions. The Ignitor machine (R0 ≅ 1.32 m, a×b ≅ 0.47×0.83 m², BT ≤ 13 T) design has been integrated using the CATIA-V software. A complete structural analysis has verified that the machine can withstand the forces produced in all the main operational scenarios.

  7. A Control System Retrofit for a Plastic Bag Making Machine

    OpenAIRE

    DR S S ADAMU

    2011-01-01

    This work presents the development of a microcontroller system to replace the problematic mechanical control system of a plastic bag making machine. After a detailed study of the existing system, the theory of finite state machines is used to model the proposed retrofit, using the Simulink and Stateflow toolboxes of MATLAB. Using the model, the retrofit system is partitioned into hardware and software components. The retrofit is implemented using Microchip's PIC16F84A 8-bit microcontroller. The developed retro...
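
    A finite-state-machine model of the kind described can be expressed as a simple transition table; the sketch below uses an invented bag-making cycle (states and events are placeholders, not the machine's actual behavior) of the same structure one would later split into hardware and software.

      # Toy finite state machine for a bag-making cycle, written as a
      # transition table. All states and events are invented.

      TRANSITIONS = {
          ("idle",      "start_button"):   "feed_film",
          ("feed_film", "length_reached"): "seal",
          ("seal",      "seal_time_done"): "cut",
          ("cut",       "cut_done"):       "feed_film",   # next bag
          ("feed_film", "stop_button"):    "idle",
      }

      def step(state, event):
          """Advance the machine; unknown events leave the state unchanged."""
          return TRANSITIONS.get((state, event), state)

      state = "idle"
      for event in ["start_button", "length_reached",
                    "seal_time_done", "cut_done"]:
          state = step(state, event)
          print(event, "->", state)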

  8. SIM_EXPLORE: Software for Directed Exploration of Complex Systems

    Science.gov (United States)

    Burl, Michael; Wang, Esther; Enke, Brian; Merline, William J.

    2013-01-01

    Physics-based numerical simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. While such codes may provide the highest-fidelity representation of system behavior, they are often so slow to run that insight into the system is limited. Trying to understand the effects of inputs on outputs by conducting an exhaustive grid-based sweep over the input parameter space is simply too time-consuming. An alternative approach called "directed exploration" has been developed to harvest information from numerical simulators more efficiently. The basic idea is to employ active learning and supervised machine learning to choose cleverly at each step which simulation trials to run next, based on the results of previous trials. SIM_EXPLORE is a new computer program that uses directed exploration to efficiently explore complex systems represented by numerical simulations. The software sequentially identifies and runs the simulation trials that it believes will be most informative given the results of previous trials. The results of new trials are incorporated into the software's model of the system behavior. The updated model is then used to pick the next round of new trials. This process, implemented as a closed-loop system wrapped around existing simulation code, provides a means to improve the speed and efficiency with which a set of simulations can yield scientifically useful results. The software focuses on the case in which the feedback from the simulation trials is binary-valued, i.e., the learner is only informed of the success or failure of the simulation trial to produce a desired output. The software offers a number of choices for the supervised learning algorithm (the method used to model the system behavior given the results so far) and a number of choices for the active learning strategy (the method used to choose which new simulation trials to run given the current behavior model). The software
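
    A minimal directed-exploration loop looks like the following. All names are invented, and where SIM_EXPLORE offers several learners and strategies, this sketch hard-codes one of the simplest pairings: a logistic surrogate with uncertainty sampling over binary success/failure feedback.

      # Toy directed exploration: a cheap surrogate models the binary
      # outcome of an expensive simulation, and the next trial is the
      # candidate the surrogate is least certain about.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def expensive_simulation(x):
          """Stand-in for the slow physics code: succeeds inside a band."""
          return int(0.3 < x[0] < 0.7)

      rng = np.random.default_rng(1)
      candidates = rng.uniform(size=(200, 1))          # input parameter space
      X = list(candidates[:5])                         # a few seed trials
      y = [expensive_simulation(x) for x in X]

      for _ in range(15):                              # budget of 15 more trials
          model = LogisticRegression().fit(X, y)
          probs = model.predict_proba(candidates)[:, 1]
          pick = int(np.argmin(np.abs(probs - 0.5)))   # most uncertain candidate
          X.append(candidates[pick])
          y.append(expensive_simulation(candidates[pick]))

      print(f"ran {len(y)} trials, {sum(y)} successes")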

  9. Strengthening Software Authentication with the ROSE Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  10. Virtual machine vs Real Machine: Security Systems

    Directory of Open Access Journals (Sweden)

    Dr. C. Suresh Gnana Das

    2009-08-01

    Full Text Available This paper argues that the operating system and applications currently running on a real machine should relocate into a virtual machine. This structure enables services to be added below the operating system without trusting or modifying the operating system or applications. To demonstrate the usefulness of this structure, we describe three services that take advantage of it: secure logging, intrusion prevention and detection, and environment migration. In particular, we can provide services below the guest operating system without trusting or modifying it; we believe that providing services at this layer is especially useful for enhancing security and mobility. This position paper describes the general benefits and challenges that arise from running most applications in a virtual machine, and then describes some example services and alternative ways to provide those services.

  11. Advances in software science and technology

    CERN Document Server

    Kakuda, Hiroyasu; Ohno, Yoshio

    1992-01-01

    Advances in Software Science and Technology, Volume 3 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 11 chapters, this volume begins with an overview of the development of a system of writing tools called SUIKOU that analyzes a machine-readable Japanese document textually. This text then presents the conditioned attribute grammars (CAGs) and a system for evaluating them that can be applied to natural-language processing. Other chapters c

  12. Advances in software science and technology

    CERN Document Server

    Kamimura, Tsutomu

    1994-01-01

    This serial is a translation of the original works within the Japan Society of Software Science and Technology. A key source of information for computer scientists in the U.S., the serial explores the major areas of research in software and technology in Japan. These volumes are intended to promote worldwide exchange of ideas among professionals.This volume includes original research contributions in such areas as Augmented Language Logic (ALL), distributed C language, Smalltalk 80, and TAMPOPO-an evolutionary learning machine based on the principles of Realtime Minimum Skyline Detection.

  13. Pattern recognition, machine intelligence and biometrics

    CERN Document Server

    Wang, Patrick S P

    2012-01-01

    ""Pattern Recognition, Machine Intelligence and Biometrics"" covers the most recent developments in Pattern Recognition and its applications, using artificial intelligence technologies within an increasingly critical field. It covers topics such as: image analysis and fingerprint recognition; facial expressions and emotions; handwriting and signatures; iris recognition; hand-palm gestures; and multimodal based research. The applications span many fields, from engineering, scientific studies and experiments, to biomedical and diagnostic applications, to personal identification and homeland secu

  14. TMARKER: A free software toolkit for histopathological cell counting and staining estimation

    Directory of Open Access Journals (Sweden)

    Peter J Schüffler

    2013-01-01

    Full Text Available Background: Histological tissue analysis often involves manual cell counting and staining estimation of cancerous cells. These assessments are extremely time consuming, highly subjective and prone to error, since immunohistochemically stained cancer tissues usually show high variability in cell sizes, morphological structures and staining quality. To facilitate reproducible analysis in clinical practice as well as for cancer research, objective computer-assisted staining estimation is highly desirable. Methods: We employ machine learning algorithms such as randomized decision trees and support vector machines for nucleus detection and classification. Superpixels forming a segmentation over the tissue image are classified into foreground and background, and thereafter into malignant and benign, learning from the user's feedback. As a fast alternative without nucleus classification, the existing color deconvolution method is incorporated. Results: Our program TMARKER connects already available workflows for computational pathology and immunohistochemical tissue rating with modern active learning algorithms from machine learning and computer vision. On a test dataset of human renal clear cell carcinoma and prostate carcinoma, the performance of the used algorithms is equivalent to that of two independent pathologists for nucleus detection and classification. Conclusion: We present a novel, free and operating-system-independent software package for computational cell counting and staining estimation, supporting IHC-stained tissue analysis in the clinic and for research. Proprietary toolboxes for similar tasks are expensive, bound to specific commercial hardware (e.g. a microscope) and mostly not quantitatively validated in terms of performance and reproducibility. We are confident that the presented software package will prove valuable for the scientific community, and we anticipate a broader application domain due to the possibility of interactively learning models for new

  15. Timing system control software in the SLC

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, K.; Phinney, N.

    1985-04-01

    A new timing system that allows precision (approx.1 to 2 ns) control of the trigger times of klystrons, beam position monitors, and other devices on a pulse-to-pulse basis at up to 360 Hz is in operation in the first third of the SLAC linear accelerator. The control software is divided between a central host VAX and local Intel 8086-based microprocessor clusters. Facilities exist to set up and adjust the timing of devices or groups of devices independently for beam pulses having different destinations and purposes, which are run in an interlaced fashion during normal machine operation. Upgrading of the system is currently underway, using a new version of the Programmable Delay Unit CAMAC module to allow pipelining of timing information for three machine pulses. An overview of the current state of the system is presented in this paper, with an emphasis on software control.

  16. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  17. Fully automatic CNC machining production system

    Directory of Open Access Journals (Sweden)

    Lee Jeng-Dao

    2017-01-01

    Full Text Available Customized manufacturing has been increasing year by year, and changing consumption habits have shortened product life cycles. Therefore, many countries view Industry 4.0 as a way to achieve more efficient and more flexible automated production. Developing an automatic loading and unloading CNC machining system with vision inspection is the first step in this industrial upgrade. In this study a CNC controller is adopted as the main controller, commanding the robot, the conveyor, and other equipment. Machine vision systems are used to detect the position of material on the conveyor and the edge of the machining material. In addition, Open CNC and SCADA software are utilized to provide real-time monitoring, remote control, alarm e-mail notification, and parameter collection. Furthermore, RFID has been added for employee classification and management. Machine handshaking has been successfully implemented to achieve automatic vision detection, edge-tracing measurement, machining, and collection of system parameters for data analysis, accomplishing the integration of an industrial automation system with real-time monitoring.

  18. Component-Based QoS-Driven Synthesis of High Assurance Embedded Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Software is an integral part of many complex embedded systems, such as avionics, scientific exploration, and on-board systems. However, poor software reliability is...

  19. Support vector machine classifiers for large data sets.

    Energy Technology Data Exchange (ETDEWEB)

    Gertz, E. M.; Griffin, J. D.

    2006-01-31

    This report concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. Several methods are proposed based on interior point methods for convex quadratic programming. Software implementations are developed by adapting the object-oriented packaging OOQP to the problem structure and by using the software package PETSc to perform time-intensive computations in a distributed setting. Linear systems arising from classification problems with moderately large numbers of features are solved by using two techniques--one a parallel direct solver, the other a Krylov-subspace method incorporating novel preconditioning strategies. Numerical results are provided, and computational experience is discussed.
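
    For orientation, the convex quadratic program underlying soft-margin SVM training, which interior point methods of this kind solve, has the standard primal form below (standard textbook form, not quoted from the report):

      % Soft-margin SVM training QP over weights w, bias b, slacks xi,
      % for labelled data (x_i, y_i), y_i in {-1, +1}, penalty C > 0.
      \begin{aligned}
      \min_{w,\,b,\,\xi}\quad & \tfrac{1}{2}\,w^{\top}w \;+\; C \sum_{i=1}^{n} \xi_i \\
      \text{s.t.}\quad & y_i\,(w^{\top}x_i + b) \;\ge\; 1 - \xi_i, \qquad i = 1,\dots,n, \\
      & \xi_i \;\ge\; 0, \qquad i = 1,\dots,n.
      \end{aligned}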

  20. Semantic representation of scientific literature: bringing claims, contributions and named entities onto the Linked Open Data cloud

    Directory of Open Access Journals (Sweden)

    Bahar Sateli

    2015-12-01

    Full Text Available Motivation. Finding relevant scientific literature is one of the essential tasks researchers are facing on a daily basis. Digital libraries and web information retrieval techniques provide rapid access to a vast amount of scientific literature. However, no further automated support is available that would enable fine-grained access to the knowledge ‘stored’ in these documents. The emerging domain of Semantic Publishing aims at making scientific knowledge accessible to both humans and machines, by adding semantic annotations to content, such as a publication’s contributions, methods, or application domains. However, despite the promises of better knowledge access, the manual annotation of existing research literature is prohibitively expensive for wide-spread adoption. We argue that a novel combination of three distinct methods can significantly advance this vision in a fully-automated way: (i) Natural Language Processing (NLP) for Rhetorical Entity (RE) detection; (ii) Named Entity (NE) recognition based on the Linked Open Data (LOD) cloud; and (iii) automatic knowledge base construction for both NEs and REs using semantic web ontologies that interconnect entities in documents with the machine-readable LOD cloud. Results. We present a complete workflow to transform scientific literature into a semantic knowledge base, based on the W3C standards RDF and RDFS. A text mining pipeline, implemented based on the GATE framework, automatically extracts rhetorical entities of type Claims and Contributions from full-text scientific literature. These REs are further enriched with named entities, represented as URIs into the linked open data cloud, by integrating the DBpedia Spotlight tool into our workflow. Text mining results are stored in a knowledge base through a flexible export process that provides for a dynamic mapping of semantic annotations to LOD vocabularies through rules stored in the knowledge base. We created a gold standard corpus from computer
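
    The knowledge-base export step can be pictured with a few RDF triples. The sketch below uses the rdflib library with invented URIs and an invented vocabulary; it is a stand-in for, not part of, the authors' GATE/DBpedia Spotlight pipeline.

      # Store a detected Contribution sentence as RDF and link it to a
      # DBpedia named entity. Namespace and URIs are illustrative only.

      from rdflib import Graph, Literal, Namespace, RDF, URIRef

      EX = Namespace("http://example.org/pubsem#")
      g = Graph()
      g.bind("ex", EX)

      doc = URIRef("http://example.org/papers/123")
      claim = URIRef("http://example.org/papers/123#contribution-1")

      g.add((claim, RDF.type, EX.Contribution))      # rhetorical entity
      g.add((claim, EX.fromDocument, doc))
      g.add((claim, EX.text, Literal(
          "We present a pipeline for semantic annotation of full texts.")))
      g.add((claim, EX.mentions,                     # named entity -> LOD
             URIRef("http://dbpedia.org/resource/Text_mining")))

      print(g.serialize(format="turtle"))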

  1. Software architecture of biomimetic underwater vehicle

    Science.gov (United States)

    Praczyk, Tomasz; Szymak, Piotr

    2016-05-01

    Autonomous underwater vehicles are vehicles that are entirely or partly independent of human decisions. To obtain operational independence, a vehicle has to be equipped with specialized software. The main task of the software is to move the vehicle along a trajectory while avoiding collisions. The software also has to manage the different devices installed on board, e.g. to start and stop cameras, sonars, etc. In addition to the software embedded on the vehicle, software enabling the operator to manage the vehicle is also necessary. Its tasks are to define the vehicle's mission, to start and stop the mission, to send emergency commands, to monitor vehicle parameters, and to control the vehicle in remotely operated mode. An important objective of the software is also to support the development and testing of other software components. To this end, a simulation environment is necessary, i.e. a simulation model of the vehicle and all its key devices, a model of the sea environment, and software to visualize the behavior of the vehicle. The paper presents the architecture of software designed for a biomimetic autonomous underwater vehicle (BAUV) that is being constructed within the framework of a scientific project financed by the Polish National Centre for Research and Development.
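
    The operator-side duties listed above (define, start, and stop a mission; send emergency commands) amount to a small command state machine. A skeleton of that idea, illustrative rather than the BAUV project's actual interface:

        from enum import Enum, auto

        class MissionState(Enum):
            IDLE = auto()
            RUNNING = auto()
            ABORTED = auto()

        class MissionController:
            """Hypothetical operator-console logic for an AUV mission."""
            def __init__(self):
                self.state = MissionState.IDLE
                self.waypoints = []

            def define_mission(self, waypoints):
                self.waypoints = list(waypoints)  # trajectory to follow

            def start(self):
                if self.state is MissionState.IDLE and self.waypoints:
                    self.state = MissionState.RUNNING

            def stop(self):
                self.state = MissionState.IDLE

            def emergency_surface(self):
                self.state = MissionState.ABORTED  # e.g. blow ballast, surface now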

  2. Encapsulating Software Platform Logic by Aspect-Oriented Programming: A Case Study in Using Aspects for Language Portability

    NARCIS (Netherlands)

    Kats, L.C.; Visser, E.

    2010-01-01

    Software platforms such as the Java Virtual Machine or the CLR .NET virtual machine have their own ecosystem of a core programming language or instruction set, libraries, and developer community. Programming languages can target multiple software platforms to increase interoperability or to boost performance.

  4. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people-oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  5. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  7. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of requirement changes at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  8. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US), Dept. of Computer Science; Oak Ridge National Lab., TN (US)]; Rowan, T.H. [Oak Ridge National Lab., TN (US)]; Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US), Dept. of Computer Science]

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  9. Image Processing Software

    Science.gov (United States)

    Bosio, M. A.

    1990-11-01

    ABSTRACT: A brief description of astronomical image-processing software is presented. This software was developed on a Digital MicroVAX II computer system. Keywords: DATA ANALYSIS - IMAGE PROCESSING

  11. Software productivity improvement through software engineering technology

    Science.gov (United States)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation, and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the others, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact the productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  12. Attacking Software Crisis: A Macro Approach.

    Science.gov (United States)

    1985-03-01

    ...blue-collar workers with a new "steel-collar" class. Even office workers will feel the crunch, as smart machines do more and more of the clerical work... In an attempt to circumvent the shortage of qualified software engineers, conscious efforts have been made to deskill programming. By creating the... If piloting was deskilled, an increase in the number of crashes would certainly be expected; or if civil engineering was deskilled, a drastic increase...

  13. The Perfect Science Machine

    Science.gov (United States)

    2008-05-01

    ESO celebrates 10 years since First Light of the VLT. Today marks the 10th anniversary since First Light with ESO's Very Large Telescope (VLT), the most advanced optical telescope in the world. Since then, the VLT has evolved into a unique suite of four 8.2-m Unit Telescopes (UTs) equipped with no fewer than 13 state-of-the-art instruments, and four 1.8-m moveable Auxiliary Telescopes (ATs). The telescopes can work individually, and they can also be linked together in groups of two or three to form a giant 'interferometer' (VLTI), allowing astronomers to see details corresponding to those from a much larger telescope. [ESO PR Photo 16a/08: Green Flash at Paranal; the VLT 10th anniversary poster] "The Very Large Telescope array is a flagship facility for astronomy, a perfect science machine of which Europe can be very proud," says Tim de Zeeuw, ESO's Director General. "We have built the most advanced ground-based optical observatory in the world, thanks to the combination of a long-term, adequately funded instrument and technology development plan with an approach where most of the instruments were built in collaboration with institutions in the member states, with in-kind contributions in labour compensated by guaranteed observing time." Sitting atop the 2600-m-high Paranal Mountain in the Chilean Atacama Desert, the VLT's design, suite of instruments, and operating principles set the standard for ground-based astronomy. It provides the European scientific community with a telescope array with collecting power significantly greater than any other facility available at present, offering imaging and spectroscopy capabilities at visible and infrared wavelengths. [ESO PR Photo 16b/08: Blue Flash at Paranal] A Universe of Discoveries: The first scientifically useful images, marking the official 'First Light' of the VLT, were obtained on the night of 25 to 26 May 1998, with a test camera attached to "Antu", Unit Telescope number 1. They were officially presented to the press on

  14. Smile (System/Machine-Independent Local Environment)

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, J.G.

    1988-04-01

    This document defines the characteristics of Smile, a System/machine-independent local environment. This environment consists primarily of a number of primitives (types, macros, procedure calls, and variables) that a program may use; these primitives provide facilities, such as memory allocation, timing, tasking and synchronization, beyond those typically provided by a programming language. The intent is that a program will be portable from system to system and from machine to machine if it relies only on the portable aspects of its programming language and on the Smile primitives. For this to be so, Smile itself must be implemented on each system and machine, most likely using non-portable constructions; that is, while the environment provided by Smile is intended to be portable, the implementation of Smile is not necessarily so. In order to make the implementation of Smile as easy as possible and thereby expedite the porting of programs to a new system or a new machine, Smile has been defined to provide a minimal portable environment; that is, simple primitives are defined, out of which more complex facilities may be constructed using portable procedures. The implementation of Smile can be any of the following: the underlying software environment for the operating system of an otherwise "bare" machine, a "guest" system environment built upon a preexisting operating system, an environment within a "user" process run by an operating system, or a single environment for an entire machine, encompassing both system and "user" processes. In the first three of these cases the tasks provided by Smile are "lightweight processes" multiplexed within preexisting processes or the system, while in the last case they also include the system processes themselves.
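
    The flavor of such a primitive layer can be suggested in a few lines. The sketch below is ours, written in Python rather than the systems-level constructions Smile actually targets: a program that uses only this module's names stays portable, while the module itself may be implemented non-portably on each platform.

        import threading
        import time

        def now_seconds():
            """Timing primitive."""
            return time.monotonic()

        def spawn_task(fn, *args):
            """Tasking primitive: a lightweight task multiplexed in-process."""
            t = threading.Thread(target=fn, args=args, daemon=True)
            t.start()
            return t

        def make_lock():
            """Synchronization primitive for the tasks above."""
            return threading.Lock()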

  15. Software Carpentry and the Hydrological Sciences

    Science.gov (United States)

    Ahmadia, A. J.; Kees, C. E.; Farthing, M. W.

    2013-12-01

    Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high-performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from its experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice based on the following principles: 1. Write programs for people, not computers. 2. Automate repetitive tasks. 3. Use the computer to record history. 4. Make incremental changes. 5. Use version control. 6. Don't repeat yourself (or others). 7. Plan for mistakes. 8. Optimize software only after it works. 9. Document design and purpose, not mechanics. 10. Collaborate. We discuss how these best practices, arising from solid foundations in research and experience, have been shown to help improve scientists' productivity and the reliability of their software.
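
    As a deliberately small illustration of principles 2 and 6, the hypothetical snippet below replaces one hand-edited script per gauge file with a single reusable function and a loop; the file names and the 'discharge' column are assumptions, not part of the abstract.

        import csv
        from pathlib import Path

        def mean_discharge(csv_path):
            """Average the 'discharge' column of one station file."""
            with open(csv_path, newline="") as f:
                values = [float(row["discharge"]) for row in csv.DictReader(f)]
            return sum(values) / len(values)

        # Automate the repetitive task instead of copy-pasting per station.
        for path in sorted(Path("data").glob("station_*.csv")):
            print(path.name, mean_discharge(path))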

  16. Stirling machine operating experience

    Energy Technology Data Exchange (ETDEWEB)

    Ross, B. [Stirling Technology Co., Richland, WA (United States); Dudenhoefer, J.E. [Lewis Research Center, Cleveland, OH (United States)

    1994-09-01

    Numerous Stirling machines have been built and operated, but the operating experience of these machines is not well known. It is important to examine this operating experience in detail, because it largely substantiates the claim that Stirling machines are capable of reliable and lengthy operating lives. The amount of data that exists is impressive, considering that many of the machines that have been built are developmental machines intended to show proof of concept, and are not expected to operate for lengthy periods of time. Some Stirling machines (typically free-piston machines) achieve long life through non-contact bearings, while other Stirling machines (typically kinematic) have achieved long operating lives through regular seal and bearing replacements. In addition to engine and system testing, life testing of critical components is also considered. The record in this paper is not complete, due to the reluctance of some organizations to release operational data and because several organizations were not contacted. The authors intend to repeat this assessment in three years, hoping for even greater participation.

  17. Perpetual Motion Machine

    Directory of Open Access Journals (Sweden)

    D. Tsaousis

    2008-01-01

    Full Text Available Ever since the first century A.D. there have been descriptions of devices, and actual constructions, intended to achieve perpetual motion. Although physics, through the two laws of thermodynamics, has led to the conclusion that a perpetual motion machine is impossible to build, inventors of every age and educational level continue to claim that they have invented something "entirely new" or have improved somebody else's invention, which "will henceforth function perpetually"! The failure, so far, of every attempt to build such a machine does not, however, make the countless historical accounts of these fictional machines uninteresting. Discussing each version of a perpetual motion machine offers the chance, on the one hand, to understand the level of knowledge and way of thinking of its period's inventors and, on the other, to locate the points where the proposed machine clashes with the laws of nature, which is why it could never have been built or functioned. The presentation of a new "perpetual motion machine" prompted our interest in locating its weak points. According to its designer, the machine is driven by the work produced by the buoyant force.
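
    A one-cycle energy balance (ours, not the paper's) shows why buoyancy-driven designs of this kind must fail. For a body of volume V and density \rho_b cycling through a height h in a fluid of density \rho_f, the work the buoyant force delivers on the way up is exactly repaid when the body is pushed back down to depth h:

        W_{\mathrm{up}} = (\rho_f - \rho_b)\,V g h, \qquad
        W_{\mathrm{down}} = (\rho_f - \rho_b)\,V g h, \qquad
        W_{\mathrm{net}} = W_{\mathrm{up}} - W_{\mathrm{down}} = 0.

    With friction and drag included, the net output per cycle is strictly negative, in agreement with the laws of thermodynamics.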

  18. Machine Intelligence and Explication

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1987-01-01

    This report is an MA ("doctoraal") thesis submitted to the Department of Philosophy, University of Amsterdam. It attempts to answer the question whether machines can think by means of conceptual analysis. Ideally, a conceptual analysis should give plausible explications of the concepts of "machine" and "inte

  19. Microsoft Azure machine learning

    CERN Document Server

    Mund, Sumit

    2015-01-01

    The book is intended for those who want to learn how to use Azure Machine Learning. Perhaps you already know a bit about Machine Learning, but have never used ML Studio in Azure; or perhaps you are an absolute newbie. In either case, this book will get you up-and-running quickly.

  20. Reactive Turing machines

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, B.; Tilburg, P.J.A. van

    2013-01-01

    We propose reactive Turing machines (RTMs), extending classical Turing machines with a process-theoretical notion of interaction, and use it to define a notion of executable transition system. We show that every computable transition system with a bounded branching degree is simulated modulo diverge