WorldWideScience

Sample records for evolving software reengineering

  1. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great: CASE TRENDS magazine reported re-engineering to be the primary concern of over four hundred of the top MIS executives.

  2. Evolving software reengineering technology for the emerging innovative-competitive era

    Science.gov (United States)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Projects Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher-quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex

  3. REENGINEERING OF THE AIR SIMULATORS LEGACY SOFTWARE

    Directory of Open Access Journals (Sweden)

    Nikolay O. Sidorov

    2008-02-01

    There are technical complexes consisting of components, some of which remain in active use while the rest have lost working capacity owing to obsolescence and physical deterioration. An example of such a complex is the aviation-flight complex "plane-simulator". The high cost of the components that continue to be used (the plane) makes restoring and supporting the out-of-order components (the simulator) the practical task. A considerable part of such complexes is software, which requires rework owing to the replacement of obsolete and physically worn-out hardware. The rework method is reengineering.

  4. Hospital reengineering: an evolving management innovation: history, current status and future direction.

    Science.gov (United States)

    Walston, S L; Urden, L D; Sullivan, P

    2001-01-01

    This article summarizes six years of research on reengineering in hospitals and is the result of two national surveys and eighteen site visits to hospitals that engaged in reengineering in the 1990s. The research shows that actual hospital reengineering differs substantially from what was initially proposed by early promoters of reengineering. However, this evolved reengineering continues to be implemented by the majority of hospitals in the United States. The authors illustrate how extensive the reductions of managers and changes of nursing models have been over the past six years. Data comparing changes in financial and cost competitiveness are also shown. The authors then explore the continued experiences of two early proponents of reengineering and find their competitive outcomes to be in contrast with their early statements. Finally, the authors suggest a number of reasons that may affect the success or failure of reengineering.

  5. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.
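
    The hierarchical, self-similar arrangement of NBFs and ruler nodes described above can be pictured with a small data-structure sketch. The Java fragment below is purely illustrative; the class and member names (NeuralBasisFunction, ruler, subordinates) are assumptions made for this example, not names taken from the ENSS implementation.

      import java.util.ArrayList;
      import java.util.List;

      // Illustrative sketch only: names and structure are assumptions, not ENSS code.
      class NeuralBasisFunction {
          final String name;
          NeuralBasisFunction ruler;                        // optional higher-level (ruler) NBF
          final List<NeuralBasisFunction> subordinates = new ArrayList<>();

          NeuralBasisFunction(String name) { this.name = name; }

          // A ruler NBF coordinates a set of subordinate NBFs, each of which may
          // itself rule a further set (the self-similar part of the structure).
          void addSubordinate(NeuralBasisFunction nbf) {
              nbf.ruler = this;
              subordinates.add(nbf);
          }

          // Distance of this NBF from the top-level ruler of the hierarchy.
          int depth() {
              return ruler == null ? 0 : 1 + ruler.depth();
          }
      }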

  6. Using a Foundational Ontology for Reengineering a Software Enterprise Ontology

    Science.gov (United States)

    Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo

    The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing the useful knowledge about software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving them real-world semantics. This paper presents an evaluation of a Software Enterprise Ontology that was reengineered using the Unified Foundational Ontology (UFO) as a basis.

  7. Simulation software: engineer processes before reengineering.

    Science.gov (United States)

    Lepley, C J

    2001-01-01

    People make decisions all the time using intuition. But what happens when you are asked: "Are you sure your predictions are accurate? How much will a mistake cost? What are the risks associated with this change?" Once a new process is engineered, it is difficult to analyze what would have been different if other options had been chosen. Simulating a process can help senior clinical officers solve complex patient flow problems and avoid wasted efforts. Simulation software can give you the data you need to make decisions. The author introduces concepts, methodologies, and applications of computer aided simulation to illustrate their use in making decisions to improve workflow design.

  8. IT & C Projects Duration Assessment Based on Audit and Software Reengineering

    Directory of Open Access Journals (Sweden)

    2009-01-01

    This paper analyses the effect of applying the core elements of software engineering and reengineering, probabilistic simulations and system development auditing to software development projects. Our main focus is reducing software development project duration. Due to the fast-changing economy, the need for efficiency and productivity is greater than ever. Optimal allocation of resources has proved to be the main element contributing to an increase in efficiency.

  9. The software-cycle model for re-engineering and reuse

    Science.gov (United States)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  10. Evolvability as a Quality Attribute of Software Architectures

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Duchien, Laurence; D'Hondt, Maja; Mens, Tom

    We review the definition of evolvability as it appears in the literature. In particular, the concept of software evolvability is compared with other system quality attributes, such as adaptability, maintainability and modifiability.

  11. Software SCMS re-engineering for an object-oriented language (JAVA) for use in construction of segmented phantoms

    International Nuclear Information System (INIS)

    Possani, Rafael Guedes

    2012-01-01

    Recent treatment planning systems depend strongly on CT images, and the tendency is for internal dosimetry procedures in nuclear medicine therapy to also be based on images, such as magnetic resonance imaging (MRI) and computed tomography (CT), to extract anatomical and histological information, as well as on functional imaging or activity maps such as PET and SPECT. This information, associated with a radiation transport simulation software, is used to estimate internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, which is an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient in treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo MCNP code to perform the simulation of radiation transport. Therefore, the user does not need to understand the complex process of inputting data to MCNP, as the SCMS is responsible for automatically constructing the anatomical data for the patient, as well as the radioactive source data. The SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (JAVA). New features and data options have also been incorporated into the software. Thus, the new software has a number of improvements, such as an intuitive GUI and a menu for the selection of the energy spectra corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)
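
    As a rough illustration of one feature described above, selecting a radioisotope's energy spectrum from an XML data bank, the following Java sketch reads a hypothetical spectra file with the standard DOM API. The file layout and the element and attribute names (spectra, isotope, line, energy, intensity) are assumptions for this example; the actual SCMS data format is not given in the abstract.

      import javax.xml.parsers.DocumentBuilderFactory;
      import org.w3c.dom.Document;
      import org.w3c.dom.Element;
      import org.w3c.dom.NodeList;
      import java.io.File;
      import java.util.ArrayList;
      import java.util.List;

      // Hypothetical layout:
      // <spectra><isotope id="I-131"><line energy="0.364" intensity="0.815"/>...</isotope></spectra>
      public class SpectrumBank {
          // Returns the (energy, intensity) pairs stored for one isotope.
          public static List<double[]> linesFor(File bank, String isotopeId) throws Exception {
              Document doc = DocumentBuilderFactory.newInstance()
                      .newDocumentBuilder().parse(bank);
              List<double[]> lines = new ArrayList<>();
              NodeList isotopes = doc.getElementsByTagName("isotope");
              for (int i = 0; i < isotopes.getLength(); i++) {
                  Element iso = (Element) isotopes.item(i);
                  if (!isotopeId.equals(iso.getAttribute("id"))) continue;
                  NodeList ls = iso.getElementsByTagName("line");
                  for (int j = 0; j < ls.getLength(); j++) {
                      Element line = (Element) ls.item(j);
                      lines.add(new double[] {
                              Double.parseDouble(line.getAttribute("energy")),      // assumed MeV
                              Double.parseDouble(line.getAttribute("intensity")) }); // assumed per decay
                  }
              }
              return lines;
          }
      }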

  12. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
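
    The process plug-in mechanism mentioned above is what gives the SBRT its evolvability: new computational techniques are added without modifying the core platform. The Java sketch below shows one minimal way such a mechanism can be structured; the interface and registry names are assumptions for illustration, not the SBRT's actual API.

      import java.util.HashMap;
      import java.util.Map;

      // Illustrative plug-in mechanism; names are assumptions, not the SBRT API.
      interface ProcessPlugin {
          String name();
          Map<String, Object> run(Map<String, Object> inputs);   // one analysis step
      }

      class PluginRegistry {
          private final Map<String, ProcessPlugin> plugins = new HashMap<>();

          // New computational techniques are added by registering plug-ins,
          // leaving the core platform untouched.
          void register(ProcessPlugin p) { plugins.put(p.name(), p); }

          Map<String, Object> execute(String name, Map<String, Object> inputs) {
              ProcessPlugin p = plugins.get(name);
              if (p == null) throw new IllegalArgumentException("Unknown plug-in: " + name);
              return p.run(inputs);
          }
      }

    In practice a registry like this is often populated by discovery, for example with java.util.ServiceLoader, rather than by explicit registration calls.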

  13. Reengineering Hanford

    Energy Technology Data Exchange (ETDEWEB)

    Badalamente, R.V.; Carson, M.L.; Rhoads, R.E.

    1995-03-01

    The Department of Energy Richland Operations Office is in the process of reengineering its Hanford Site operations. There is a need to fundamentally rethink and redesign environmental restoration and waste management processes to achieve dramatic improvements in the quality, cost-effectiveness, and timeliness of the environmental services and products that make cleanup possible. Hanford is facing the challenge of reengineering in a complex environment in which major processes cut across multiple government and contractor organizations and a variety of stakeholders and regulators have a great influence on cleanup activities. By doing the upfront work necessary to allow effective reengineering, Hanford is increasing the probability of its success.

  14. Reengineering Hanford

    International Nuclear Information System (INIS)

    Badalamente, R.V.; Carson, M.L.; Rhoads, R.E.

    1995-03-01

    The Department of Energy Richland Operations Office is in the process of reengineering its Hanford Site operations. There is a need to fundamentally rethink and redesign environmental restoration and waste management processes to achieve dramatic improvements in the quality, cost-effectiveness, and timeliness of the environmental services and products that make cleanup possible. Hanford is facing the challenge of reengineering in a complex environment in which major processes cut across multiple government and contractor organizations and a variety of stakeholders and regulators have a great influence on cleanup activities. By doing the upfront work necessary to allow effective reengineering, Hanford is increasing the probability of its success.

  15. Enterprise Information Systems as a Service: Re-engineering Enterprise Software as Product-Service System

    NARCIS (Netherlands)

    Wortmann, Johan; Don, H.; Hasselman, J.; Wilbrink, J.; Frick, Jan; Laugen, Bjørge Timenes

    2012-01-01

    This paper draws an analogy between developments in enterprise software and in capital goods manufacturing industry. Many branches of manufacturing industry, especially automotive industry, have grown in maturity by moving from craftsmanship to mass production. These industries subsequently move

  16. Analytical Design of Evolvable Software for High-Assurance Computing

    Science.gov (United States)

    2001-02-14

    [The text extracted from this report is garbled. Recoverable fragments concern an analytical partition of components, the question of whether the research approach yields evolvable components in less mathematically oriented applications such as multimedia and e-commerce, and an appendix describing a benchmark design for microwave oven software.]

  17. Pattern-Oriented Reengineering of a Network System

    Directory of Open Access Journals (Sweden)

    Chung-Horng Lung

    2004-08-01

    Reengineering is the reorganization and modification of existing systems to enhance them or to make them more maintainable. Reengineering is usually necessary as systems evolve due to changes in requirements, technologies, and/or personnel. Design patterns capture recurring structures and dynamics among software participants to facilitate reuse of successful designs. Design patterns are common and well studied in network systems. In this project, we reengineer part of a network system with some design patterns to support future evolution and performance improvement. We start with a reverse engineering effort to understand the system and recover its high-level architecture. Then we apply concurrent and networked design patterns to restructure the main sub-system. Those patterns include Half-Sync/Half-Async, Monitor Object, and the Scoped Locking idiom. The resulting system is more maintainable and has better performance.
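
    Of the patterns named above, Monitor Object together with the Scoped Locking idiom is the easiest to show compactly. The Java sketch below is a generic illustration of a monitor object protecting a bounded queue; it is not code from the reengineered network system described in the paper.

      import java.util.ArrayDeque;
      import java.util.Deque;

      // Generic Monitor Object: all state is accessed only inside synchronized
      // methods, so the lock is scoped to each method call (Scoped Locking).
      class BoundedQueue<T> {
          private final Deque<T> items = new ArrayDeque<>();
          private final int capacity;

          BoundedQueue(int capacity) { this.capacity = capacity; }

          synchronized void put(T item) throws InterruptedException {
              while (items.size() == capacity) wait();   // block producers when full
              items.addLast(item);
              notifyAll();                               // wake waiting consumers
          }

          synchronized T take() throws InterruptedException {
              while (items.isEmpty()) wait();            // block consumers when empty
              T item = items.removeFirst();
              notifyAll();                               // wake waiting producers
              return item;
          }
      }

    In a Half-Sync/Half-Async arrangement, a queue of this kind typically sits between the asynchronous, event-driven layer and the synchronous worker threads.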

  18. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited. The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within the DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  19. Evolving impact of Ada on a production software environment

    Science.gov (United States)

    Mcgarry, F.; Esker, L.; Quimby, K.

    1988-01-01

    Many aspects of software development with Ada have evolved as our Ada development environment has matured and personnel have become more experienced in the use of Ada. The Software Engineering Laboratory (SEL) has seen differences in the areas of cost, reliability, reuse, size, and use of Ada features. A first Ada project can be expected to cost about 30 percent more than an equivalent FORTRAN project. However, the SEL has observed significant improvements over time as a development environment progresses to second and third uses of Ada. The reliability of Ada projects is initially similar to what is expected in a mature FORTRAN environment. However, with time, one can expect to gain improvements as experience with the language increases. Reuse is one of the most promising aspects of Ada. The proportion of reusable Ada software on our Ada projects exceeds the proportion of reusable FORTRAN software on our FORTRAN projects. This result was noted fairly early in our Ada projects, and experience shows an increasing trend over time.

  20. Evolving software products, the design of a water-related modeling software ecosystem

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2017-01-01

    more than 50 years ago. However, a radical change in which software products evolve in both the software engineering and the organizational and business aspects in a disruptive manner is rather rare. In this paper, we report on the transformation of one of the market-leading product series in water-related calculation and modeling from a traditional business-as-usual series of products to an evolutionary software ecosystem. We do so by relying on existing concepts of software ecosystem analysis to analyze the future ecosystem. We report and elaborate on the main focus points necessary for this transition, and we argue for the generalization of our focus points to the transition from traditional business-as-usual software products to software ecosystems.

  1. How can usability measurement affect the re-engineering process of clinical software procedures?

    Science.gov (United States)

    Terazzi, A; Giordano, A; Minuco, G

    1998-01-01

    As a consequence of the dramatic improvements achieved in information technology standards in terms of single hardware and software components, efforts in the evaluation processes have been focused on the assessment of critical human factors, such as work-flow organisation, man-machine interaction and, in general, quality of use, or usability. This trend is particularly valid when applied to medical informatics, since the human component is the basis of the information processing system in the health care context. With the aim of establishing an action-research project on the evaluation and assessment of the clinical software procedures which constitute an integrated hospital information system, the authors adopted this strategy and considered the measurement of perceived usability as one of the main goals of the project itself: the paper reports the results of this experience.

  2. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...

  3. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    Science.gov (United States)

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  4. EVOLVE

    CERN Document Server

    Deutz, André; Schütze, Oliver; Legrand, Pierrick; Tantar, Emilia; Tantar, Alexandru-Adrian

    2017-01-01

    This book comprises nine selected works on numerical and computational methods for solving multiobjective optimization, game theory, and machine learning problems. It provides extended versions of selected papers from various fields of science such as computer science, mathematics and engineering that were presented at EVOLVE 2013 held in July 2013 at Leiden University in the Netherlands. The internationally peer-reviewed papers include original work on important topics in both theory and applications, such as the role of diversity in optimization, statistical approaches to combinatorial optimization, computational game theory, and cell mapping techniques for numerical landscape exploration. Applications focus on aspects including robustness, handling multiple objectives, and complex search spaces in engineering design and computational biology.

  5. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    Science.gov (United States)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing the negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open-source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such
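
    As a generic illustration of the object-oriented process-model coupling mentioned above, the sketch below shows an abstract process-model interface that a coupler iterates over. PFLOTRAN itself is written in object-oriented Fortran, so this Java fragment and its names (ProcessModel, CoupledSolver) are only an analogy assumed for this example, not PFLOTRAN code.

      import java.util.Arrays;
      import java.util.List;

      // Generic sketch of process-model coupling: each process model contributes
      // its terms to a shared residual; new processes are coupled by adding
      // another implementation of the interface.
      interface ProcessModel {
          void setup(double[] state);                  // allocate/initialize internal data
          void residual(double[] state, double[] r);   // contribute to the coupled residual
      }

      class CoupledSolver {
          private final List<ProcessModel> models;

          CoupledSolver(List<ProcessModel> models) { this.models = models; }

          void assembleResidual(double[] state, double[] residual) {
              Arrays.fill(residual, 0.0);
              for (ProcessModel m : models) {
                  double[] r = new double[residual.length];
                  m.residual(state, r);
                  for (int i = 0; i < residual.length; i++) residual[i] += r[i];
              }
          }
      }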

  6. The Evolving Role of Open Source Software in Medicine and Health Services

    Directory of Open Access Journals (Sweden)

    Sevket Seref Arikan

    2013-01-01

    The past five decades have witnessed immense coevolution of methods and tools of information technology, and their practical and experimental application within the medical and healthcare domain. Healthcare itself continues to evolve in response to changes in healthcare needs, progress in the scientific foundations of treatments, and in the professional and managerial organization of affordable and effective services, in which patients and their families and carers increasingly participate. Taken together, these trends impose highly complex underlying challenges for the design, development, and sustainability of the quality of supporting information services and software infrastructure that are needed. The challenges are multidisciplinary and multiprofessional in scope, and they require deeper study and learning to inform policy and promote public awareness of the problems health services have faced in this area for many years. The repeating pattern of failure to live up to expectations of policy-driven national health IT initiatives has proved very costly and remains frustrating and unproductive for all involved. In this article, we highlight the barriers to progress and discuss the dangers of pursuing a standardization framework devoid of empirical testing and iterative development. We give the example of the openEHR Foundation, which was established at University College London (UCL) in London, England, with members in 80 countries. The Foundation is a not-for-profit company providing open specifications and working for generic standards for electronic records, informed directly by a wide range of implementation experience. We also introduce the Opereffa open source framework, which was developed at UCL based on these specifications and which has been downloaded in some 70 countries. We argue that such an approach is now essential to support good discipline, innovation, and governance at the heart of medicine and health services, in line with the

  7. BUSINESS PROCESS REENGINEERING

    Directory of Open Access Journals (Sweden)

    Magdalena LUCA (DEDIU)

    2014-06-01

    Business process reengineering changes organizational functions from an orientation focused on operations to a multidimensional approach. Employees who were formerly mere executors are now expected to take their own decisions, and as a result the functional departments lose their reason to exist. Managers no longer act as supervisors but mainly as mentors, while employees focus more attention on customer needs and less on those of their superiors. Under these conditions, new organizational paradigms are required, the most important being that of learning organizations. Information technology plays a decisive role in implementing a reengineering of economic processes and promoting a new organizational paradigm. The article presents some results obtained in a research theme funded by ANSTI under contract no. 501/2000. Economic and financial analysis is performed in order to understand the current situation and achieve better results in the future. One of its objectives is production, analyzed as a labour process together with the interacting elements of this process. The indicators investigated in the analysis of the financial and economic activity of production reflect the development directions, the means and resources to accomplish predetermined objectives, and express the results and effectiveness of what is expected.

  8. Systems, methods and apparatus for developing and maintaining evolving systems with software product lines

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.

  9. BUSINESS PROCESS REENGINEERING: CONCEPTUAL REVIEW AND METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Lena Ellitan

    1999-01-01

    Business process reengineering is the fundamental rethinking and radical redesign of an organization's business processes that leads the organization to achieve dramatic improvement in business performance. Many firms have successfully embraced this new innovation paradigm to achieve order-of-magnitude improvements in cost, efficiency, quality and value. Even more firms are seeking opportunities to apply reengineering and methodologies to assist them in doing so. The recognition of reengineering as a new management paradigm emerged in the 1990s, though it may be argued that the principles of reengineering had been applied well before then. The early 1990s saw worldwide interest in reengineering, and consequently many organizations have reported their first-cycle experiences with it. Reengineering practice in the 1990s was largely characterized by application to operational processes and an emphasis on operational measures of time, cost, and quality. Quite recently, a more strategic flavor of reengineering has been advocated. One of the hopes of this new thinking is that, by transcending the microscopic concerns of operational strategy, it would help the organization derive significantly greater value from the reengineering effort. This paper presents: 1. the concept of reengineering; 2. various problems in business process reengineering; 3. a rigorous methodology for organizing reengineering activities.

  10. A Framework for the Management of Evolving Requirements in Software Systems Supporting Network-Centric Warfare

    National Research Council Canada - National Science Library

    Reynolds, Linda K

    2006-01-01

    .... There are many sources of requirements for these software systems supporting NCO, which may increase in number as the Services continue to develop the capabilities necessary for the transformation...

  11. Evolving a Simulation Model Product Line Software Architecture from Heterogeneous Model Representations

    National Research Council Canada - National Science Library

    Greaney, Kevin

    2003-01-01

    .... Many of these large-scale, software-intensive simulation systems were autonomously developed over time, and subject to varying degrees of funding, maintenance, and life-cycle management practices...

  12. The evolving marriage of hardware and software, as seen from the openlab perspective

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    This talk will give an overview of the activities of the openlab Platform Competence Center, collaborating with Intel. The problem of making hardware and software talk to each other efficiently has been around since the concept of the computer first came up, and current times are no different. We will report on the related R&D activities of the openlab PCC, touching on topics ranging from hardware platforms, through compilers, to next-generation physics software. We will also relate this to relevant practice in the industry, which has made significant progress in the last decade.

  13. Computational Intelligence in Software Cost Estimation: Evolving Conditional Sets of Effort Value Ranges

    OpenAIRE

    Papatheocharous, Efi; Andreou, Andreas S.

    2008-01-01

    In this approach we aimed at addressing the problem of large variances found in available historical data that are used in software cost estimation. Project data is expensive to collect, manage and maintain. Therefore, if we wish to lower the dependence of the estimation to

  14. IDC Reengineering Phase 2 Project Scope.

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report provides a brief description of the scope of the IDC Reengineering Phase 2 project. It describes the goals and objectives of reengineering, the system definition, and the technical scope of the system. Revisions: version 1.0 (9/25/2014, SNL IDC Reengineering Team, unlimited release for I2, authorized by M. Harris); version 1.1 (28/01/2015, IDC Reengineering Team, aligned with previous IDC scope document, authorized by E. Tomuta).

  15. Reengineering Real-Time Software Systems

    Science.gov (United States)

    1993-09-09

    Advisor: Yutaka Kanayama. Approved for public release; distribution is unlimited. [The remainder of the extracted text consists of garbled fragments of the report documentation page and of source files Queue.c and ImmCmd.c.]

  16. Reengineering a cardiovascular surgery service.

    Science.gov (United States)

    Tunick, P A; Etkin, S; Horrocks, A; Jeglinski, G; Kelly, J; Sutton, P

    1997-04-01

    Reengineering, involving the radical redesign of business processes, has been used successfully in a variety of health care settings. In 1994 New York University (NYU) Medical Center (MC) launched its first reengineering team, whose purpose was to redesign the entire process of caring for patients, from referral to discharge, on the cardiovascular (CV) surgery service. REENGINEERING TEAM: The multidisciplinary CV Surgery Reengineering Team was charged with two goals: improving customer (patient, family, and referring physician) satisfaction and improving profitability. The methodology to be used was based on a reengineering philosophy: discarding basic assumptions and designing the patient care process from the ground up. THE TRANSFER-IN INITIATIVE: A survey of NYU cardiologists, distributed in April 1994, suggested that the organization was considered a difficult place to transfer patients. The team's recommendations led to a new, streamlined transfer-in policy. The average waiting time between when a referring physician requested a patient transfer and when an NYUMC physician accepted the transfer decreased from an average of 9 hours under the old system to immediate acceptance. Three customer satisfaction task forces implemented multiple programs to make the service more user-friendly. In addition, referrals increased and length of stay decreased, without an adverse impact on the mortality rate. For the first time at NYUMC, a multidisciplinary team was given the mandate to achieve major changes in an entire patient care process. Similar projects are now underway.

  17. Software engineering a practitioner's approach

    CERN Document Server

    Pressman, Roger S

    1997-01-01

    This indispensable guide to software engineering exploration enables practitioners to navigate the ins and outs of this rapidly changing field. Pressman's fully revised and updated Fourth Edition provides in-depth coverage of every important management and technical topic in software engineering. Moreover, readers will find the inclusion of the hottest developments in the field such as: formal methods and cleanroom software engineering, business process reengineering, and software reengineering.

  18. The present status of software engineering

    CERN Document Server

    Pressman, Roger S

    1991-01-01

    In this seminar, we will discuss the present status and future directions of software engineering and CASE. Key topics to be discussed include: new paradigms for software engineering; software metrics; process assessment; the current state of analysis and design methods; reusability and re-engineering; formal methods. Among the questions to be answered are: How will software engineering change as the 1990s progress? What are the "technology drivers"? What will analysis, design, coding, testing, quality assurance and software management look like in the year 2000? How will CASE tools evolve in the 1990s and will they be as "integrated" as many people believe? How can you position your organization to accommodate the coming changes?

  19. Reengineering the Project Design Process

    Science.gov (United States)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  20. Re-Engineering Marketing (RM)

    Directory of Open Access Journals (Sweden)

    Bozhidar Iv. Hadzhiev

    2010-12-01

    Globalization, the rise of the economy, the progress of the e-net economy, and the high dynamics of business relationships are developing as permanently rising progressive functions, predetermining the use of a few new opportunities for increasing the effectiveness of industrial companies. Against this background, and through the prism of re-engineering methods, a few basic problems of and opportunities for Re-engineering Marketing (RM) are presented in this paper.

  1. Reengineering the project design process

    Science.gov (United States)

    Kane Casani, E.; Metzger, Robert M.

    1995-01-01

    In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.

  2. Reengineering health care materials management.

    Science.gov (United States)

    Connor, L R

    1998-01-01

    Health care executives across the country, faced with intense competition, are being forced to consider drastic cost cutting measures as a matter of survival. The entire health care industry is under siege from boards of directors, management and others who encourage health care systems to take actions ranging from strategic acquisitions and mergers to simple "downsizing" or "rightsizing," to improve their perceived competitive positions in terms of costs, revenues and market share. In some cases, management is poorly prepared to work within this new competitive paradigm and turns to consultants who promise that following their methodologies can result in competitive advantage. One favored methodology is reengineering. Frequently, cost cutting attention is focused on the materials management budget because it is relatively large and is viewed as being comprised mostly of controllable expenses. Also, materials management is seldom considered a core competency for the health care system and the organization performing these activities does not occupy a strongly defensible position. This paper focuses on the application of a reengineering methodology to healthcare materials management.

  3. Reengineering in Australia: factors affecting success

    Directory of Open Access Journals (Sweden)

    Felicity Murphy

    1998-11-01

    Business process reengineering (BPR) is being used in many organisations worldwide to realign operations. Most of the research undertaken has been focused on North American or European practices. The study reported here replicates a US reengineering study in an Australian context by surveying large public and private sector Australian organisations. The study makes three main contributions by: (1) presenting a picture of BPR practices in Australia, (2) clarifying factors critical to the success of reengineering projects in Australia, and (3) providing a comparison of factors leading to success in Australian BPR projects with those found in the US.

  4. IDC Reengineering Phase 2 & 3 Project Scope

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Prescott, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-10-01

    Sandia National Laboratories has prepared a budgetary planning cost estimate for the IDC Reengineering Phase 2 & 3 effort. This report provides the cost estimate and describes the methodology, assumptions, and cost model details used to create it.

  5. CONCEPT OF REENGINEERING AGAIN RETURNS IN ACTUALITY

    Directory of Open Access Journals (Sweden)

    Vasile Ionel POPESCU

    2014-06-01

    Although it was introduced in the summer of 1990, the concept of reengineering has returned to relevance, because in the current social and economic conditions more and more companies must resort to redesigning their processes in order to face increasingly fierce competition. In this article, after a brief introduction, we present the factors that contributed to the emergence of reengineering, highlighting what the concept involves, the characteristics of processes resulting from reengineering, the importance of and methods for preparing a process map, and the way to launch a process redesign. Finally, we offer several opinions and make a number of recommendations intended to help companies that resort to reengineering achieve the qualitative leap they target.

  6. The Reengineering of Processes as a Tool in the Administration of Business: Case Cereales "Santiago"

    Directory of Open Access Journals (Sweden)

    Roberto René Moreno-García

    2015-12-01

    The article presents research results on the application of process reengineering in the company Cereales Santiago and the introduction of information technology through the PesajeVoz software. The research characterizes the main deficiencies of the strategic commercialization process that affect the economic results of the company and the satisfaction of its clients, owing to losses and delays in receiving raw materials. A study is also presented on the evolution of the process reengineering concept from its initial formulation, together with a characterization of some of the methodologies for its application. Reference is made to a generic in-house methodology for applying process reengineering in the Cuban system of enterprises, which was validated in the company under study and yielded quantitative and qualitative benefits for the company and its clients.

  7. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    Science.gov (United States)

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
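
    To make the description above concrete, the sketch below embeds a minimal reaction-network model written in the general style of SBML Level 2 (one compartment, two species, one reaction) as a Java text block. It is an illustrative example assembled for this summary, not a model taken from the article; real SBML models additionally carry kinetic laws, units, and annotations.

      // Requires Java 15+ for text blocks. The XML is a hand-written minimal example.
      public class MinimalSbmlExample {
          static final String MODEL = """
              <sbml xmlns="http://www.sbml.org/sbml/level2" level="2" version="1">
                <model id="example">
                  <listOfCompartments>
                    <compartment id="cell" size="1"/>
                  </listOfCompartments>
                  <listOfSpecies>
                    <species id="S1" compartment="cell" initialAmount="10"/>
                    <species id="S2" compartment="cell" initialAmount="0"/>
                  </listOfSpecies>
                  <listOfReactions>
                    <reaction id="R1" reversible="false">
                      <listOfReactants><speciesReference species="S1"/></listOfReactants>
                      <listOfProducts><speciesReference species="S2"/></listOfProducts>
                    </reaction>
                  </listOfReactions>
                </model>
              </sbml>
              """;

          public static void main(String[] args) {
              System.out.println(MODEL);   // print the model so it can be fed to an SBML-aware tool
          }
      }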

  8. MSiReader v1.0: Evolving Open-Source Mass Spectrometry Imaging Software for Targeted and Untargeted Analyses

    Science.gov (United States)

    Bokhart, Mark T.; Nazari, Milad; Garrard, Kenneth P.; Muddiman, David C.

    2018-01-01

    A major update to the mass spectrometry imaging (MSI) software MSiReader is presented, offering a multitude of newly added features critical to MSI analyses. MSiReader is free, open-source, vendor-neutral software written on the MATLAB platform and is capable of analyzing most common MSI data formats. A standalone version of the software, which does not require a MATLAB license, is also distributed. The newly incorporated data analysis features expand the utility of MSiReader beyond simple visualization of molecular distributions. The MSiQuantification tool allows researchers to calculate absolute concentrations from quantification MSI experiments exclusively through MSiReader software, significantly reducing data analysis time. An image overlay feature allows complementary imaging modalities to be displayed with the MSI data. A polarity filter has also been incorporated into the data loading step, allowing the facile analysis of polarity switching experiments without the need for data parsing prior to loading the data file into MSiReader. A quality assurance feature to generate a mass measurement accuracy (MMA) heatmap for an analyte of interest has also been added to allow for the investigation of MMA across the imaging experiment. Most importantly, as new features have been added, performance has not degraded; in fact, it has been dramatically improved. These new tools and the improvements to performance in MSiReader v1.0 enable the MSI community to evaluate their data in greater depth and in less time.
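
    The mass measurement accuracy (MMA) heatmap mentioned above is based on the usual parts-per-million error between a measured and a theoretical m/z value. The Java sketch below computes that quantity for each imaging pixel; it is an independent illustration of the formula, not MSiReader code (MSiReader itself is written in MATLAB).

      // Mass measurement accuracy in ppm: (measured - theoretical) / theoretical * 1e6.
      // Independent illustration of the standard formula; not MSiReader (MATLAB) code.
      public final class MassAccuracy {
          static double ppmError(double measuredMz, double theoreticalMz) {
              return (measuredMz - theoreticalMz) / theoreticalMz * 1.0e6;
          }

          // One MMA value per imaging pixel, given the measured m/z closest to the
          // analyte of interest at each pixel.
          static double[] mmaMap(double[] measuredMzPerPixel, double theoreticalMz) {
              double[] mma = new double[measuredMzPerPixel.length];
              for (int i = 0; i < mma.length; i++) {
                  mma[i] = ppmError(measuredMzPerPixel[i], theoreticalMz);
              }
              return mma;
          }
      }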

  9. The Topical Problems of Reengineering of Production Enterprises

    Directory of Open Access Journals (Sweden)

    Chumak Larysa F.

    2018-01-01

    The article is aimed at researching the problems of reengineering of industrial enterprises and determining efficient ways of solving them. The essence of the reengineering process and the conditions for and expediency of carrying out reengineering are examined. Elements of the system of reengineering principles, its stages, problems of implementation, and typical errors arising during reengineering are defined. It is determined that reengineering should be closely connected with the strategies of the industrial enterprise in order to achieve maximum efficiency of the enterprise's activity and to prevent additional costs. Reengineering of an industrial enterprise should be supported by an appropriate organizational structure, sound information technology, and contemporary strategic considerations.

  10. Distance Measures for Information System Reengineering

    NARCIS (Netherlands)

    Poels, G.; Viaene, S.; Dedene, G.; Wangler, B.; Bergman, L.

    2000-01-01

    We present an approach to assess the magnitude and impact of information system reengineering caused by business process change. This approach is based on two concepts: object-oriented business modeling and distance measurement. The former concept is used to visualize changes in the business layer

  11. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes an emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  12. Reengineering of Analytical Data Management for the Environmental Restoration Project at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Bolivar, S.; Dorries, A.; Nasser, K.; Scherma, S.

    2003-01-01

    The Environmental Restoration (ER) Project at Los Alamos National Laboratory (LANL) is responsible for the characterization, clean up, and monitoring of over 2,124 identified potential release sites (PRS). These PRSs have resulted from operations associated with weapons and energy related research which has been conducted at LANL since 1942. To accomplish mission goals, the ER Project conducts field sampling to determine possible types and levels of chemical contamination as well as their geographic extent. Last fiscal year, approximately 4000 samples were collected during ER Project field sampling campaigns. In the past, activities associated with field sampling such as sample campaign planning, paperwork, shipping and analytical laboratory tracking; verification and order fulfillment; validation and data quality assurance were performed by multiple groups working with a variety of software applications, databases and hard copy reports. This resulted in significant management and communication difficulties, data delivery delays, and inconsistent processes; it also represented a potential threat to overall data integrity. Creation of an organization, software applications and a data process that could provide for cost-effective management of the activities and data mentioned above became a management priority, resulting in a development of a reengineering task. This reengineering effort--currently nearing completion--has resulted in personnel reorganization, the development of a centralized data repository, and a powerful web-based sample management system that allows for an appreciably streamlined and more efficient data process. These changes have collectively cut data delivery times, allowed for larger volumes of samples and data to be handled with fewer personnel, and resulted in significant cost savings. This paper will provide a case study of the reengineering effort undertaken by the ER Project of its analytical data management process. It includes

  13. Reengineering of Analytical Data Management for the Environmental Restoration Project at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Bolivar, S.; Dorries, A.; Nasser, K.; Scherma, S.

    2003-02-27

    The Environmental Restoration (ER) Project at Los Alamos National Laboratory (LANL) is responsible for the characterization, clean up, and monitoring of over 2,124 identified potential release sites (PRS). These PRSs have resulted from operations associated with weapons and energy related research which has been conducted at LANL since 1942. To accomplish mission goals, the ER Project conducts field sampling to determine possible types and levels of chemical contamination as well as their geographic extent. Last fiscal year, approximately 4000 samples were collected during ER Project field sampling campaigns. In the past, activities associated with field sampling such as sample campaign planning, paperwork, shipping and analytical laboratory tracking; verification and order fulfillment; validation and data quality assurance were performed by multiple groups working with a variety of software applications, databases and hard copy reports. This resulted in significant management and communication difficulties, data delivery delays, and inconsistent processes; it also represented a potential threat to overall data integrity. Creation of an organization, software applications and a data process that could provide for cost-effective management of the activities and data mentioned above became a management priority, resulting in a development of a reengineering task. This reengineering effort--currently nearing completion--has resulted in personnel reorganization, the development of a centralized data repository, and a powerful web-based sample management system that allows for an appreciably streamlined and more efficient data process. These changes have collectively cut data delivery times, allowed for larger volumes of samples and data to be handled with fewer personnel, and resulted in significant cost savings. This paper will provide a case study of the reengineering effort undertaken by the ER Project of its analytical data management process. It includes

  14. ROMANIAN COMPANIES DILEMMAS - BUSINESS REENGINEERING OR KAIZEN

    Directory of Open Access Journals (Sweden)

    MIHAELA GHICAJANU

    2011-01-01

    Full Text Available This paper presents an analysis of two management strategies, American reengineering and Japanese Kaizen, which can also be used successfully by Romanian companies. Reengineering is the fundamental rethinking and radical redesign of business processes to achieve dramatic improvements in critical contemporary measures of performance such as cost, quality, service and speed. Kaizen is a philosophy of life addressed to those who want to be more and better. It is a never-ending process of improvement that yields many advantages. The Japanese leadership model has shown that progress in small but fast and reliable steps leads to long-term wins. The Kaizen method, also implemented in Romania, has brought people satisfaction and more money in their pockets.

  15. ORGANIZATIONAL CHANGE: BUSINESS PROCESS REENGINEERING OR OUTSOURCING?

    Directory of Open Access Journals (Sweden)

    Pellicelli Michela

    2012-12-01

    Full Text Available This article analyzes the logic behind the adoption of Business Process Reengineering and outsourcing. The first part analyzes Business Process Reengineering as a technique for analyzing and defining the business processes implemented by organizations in order to make the achievement of corporate objectives more efficient and effective. Nevertheless, this approach has some limits when the reengineering project aims solely at cost reduction. In any event, for several activities management must constantly evaluate the alternative of turning to outsourcing. In the second part we thus consider what management's evaluations should be in order to pursue the objectives of maximum efficiency, economic efficiency, and productivity. Starting from the methodological assumptions that aid our understanding of the outsourcing of processes and that represent the operational and conceptual framework for this approach, several models held to be significant are analyzed for determining which processes can be outsourced, from a “strategic” point of view, and for deciding on the shift from BPR to outsourcing.

  16. Innovative model of business process reengineering at machine building enterprises

    Science.gov (United States)

    Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.

    2017-10-01

    The paper provides consideration of business process reengineering viewed as a managerial innovation accepted by present day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described and is based on the process approach and other principles of company management.

  17. Re-engineering production systems: the Royal Netherlands Naval Dockyard

    NARCIS (Netherlands)

    Zijm, Willem H.M.

    1996-01-01

    Reengineering production systems in an attempt to meet tight cost, quality and leadtime standards has received considerable attention in the last decade. In this paper, we discuss the reengineering process at the Royal Netherlands Naval Dockyard. The process starts with a characterisation and a

  18. BUSINESS PROCESS REENGINEERING: CONCEPTS CAUSES AND EFFECT

    Directory of Open Access Journals (Sweden)

    Bernardo Nugroho Yahya

    2002-01-01

    Full Text Available Some people hold a mistaken concept of Business Process Reengineering (BPR), and the term itself is often misunderstood. On the other hand, many studies have been introduced to provide a better definition of BPR. Thinking about the concepts, causes, and effects of BPR creates a new perception of the term BPR itself as a better methodology than other quality management methodologies such as Total Quality Management (TQM), Just In Time (JIT), etc. This paper discusses the context of BPR through several published case studies.

  19. Reengineering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    I4 is an enterprise resource planning system which allows you to manage business processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of a part of the system was required. In this thesis we faced the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed on the XPDL file exported from the mo...
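
    The record above exchanges process definitions as XPDL, an XML vocabulary for workflow processes. As a rough illustration of what an import step has to handle (this is not the thesis code; the file name and the minimal element set are assumptions), the sketch below lists the activities and transitions found in an XPDL export:

# Minimal sketch (not the i4 implementation): list activities and transitions
# from an XPDL export such as one produced by a BPMN modelling tool.
# "process.xpdl" is a hypothetical file name; real schemas carry more detail.
import xml.etree.ElementTree as ET

def local_name(tag: str) -> str:
    """Strip the XML namespace, e.g. '{ns}Activity' -> 'Activity'."""
    return tag.rsplit('}', 1)[-1]

def summarize_xpdl(path: str) -> None:
    root = ET.parse(path).getroot()
    for elem in root.iter():
        if not isinstance(elem.tag, str):   # skip comments/processing instructions
            continue
        kind = local_name(elem.tag)
        if kind == "Activity":
            print("Activity:", elem.get("Id"), elem.get("Name"))
        elif kind == "Transition":
            print("Transition:", elem.get("From"), "->", elem.get("To"))

if __name__ == "__main__":
    summarize_xpdl("process.xpdl")          # hypothetical exported file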

  20. Reengineering GSM/GPRS Towards a Dedicated Network for Massive Smart Metering

    DEFF Research Database (Denmark)

    Madueño, Germán Corrales; Stefanovic, Cedomir; Popovski, Petar

    2014-01-01

    GSM is a synonym for a major success in wireless technology, achieving widespread use and high technology maturity. However, its future is questionable, as many stakeholders indicate that the GSM spectrum should be re-farmed for LTE. On the other hand, the advent of the smart grid and the ubiquity of smart meters will require reliable, long-lived wide area connections. This motivates investigating the potential of GSM to be evolved into a dedicated network for smart metering. We introduce simple mechanisms to reengineer the access control in GSM. The result is a system that offers excellent support...

  1. Re-engineering change in higher education

    Directory of Open Access Journals (Sweden)

    David Allen

    1999-01-01

    Full Text Available Business Process Re-engineering (BPR) is being used in a number of UK Higher Education Institutions (HEIs) as a change management strategy. Whilst the focus of these HEIs is on re-engineering administrative services, there are also tentative attempts to redesign teaching and learning. This paper adopts a case study approach to determine the applicability of BPR to HEIs. The research started from a broad research question: How does organisational culture in HEIs impact on the implementation of BPR programmes? The conclusions drawn from the research are that the organisational culture and structure of HEIs limit the degree of change sought from a BPR project: the focus of the case study HEIs was on incremental process improvement of administrative services. The projects in these institutions were not about radical change. BPR techniques are shown to have something to offer HEIs in terms of co-ordinating administrative activities, but the emphasis on IT and processes in project design means the human resources change necessary for significant gains in efficiency is unlikely.

  2. The Clean Development Mechanism Re-engineered

    DEFF Research Database (Denmark)

    Lütken, Søren

    2016-01-01

    for engineering such mechanism, or indeed reengineering the CDM itself, to make it a viable mitigation financing tool, providing receipts for payments in the form of certified emission reductions (CER). Two solutions are presented, both of which secure new financing for projects that deliver real and measurable...... emissions reduction benefits on the basis of prospective revenues from emissions reduction: one introduces up-front securitization of the emissions reductions; the other builds on a defined value of the CERs without the need for a carbon price or a market for trading. Most of us use simple heuristics...... time. Simply put CERs are not project finance and do not address project capital needs when most needed — upfront. CER based returns are available only after a project is operational. That is why only one third of registered CDM projects went as far as to get their carefully calculated CERs issued...

  3. Hospital Registration Process Reengineering Using Simulation Method

    Directory of Open Access Journals (Sweden)

    Qiang Su

    2010-01-01

    Full Text Available With increasing competition, many healthcare organizations have undergone tremendous reform in the last decade aiming to increase efficiency, decrease waste, and reshape the way that care is delivered. This study focuses on the operational efficiency improvement of hospital’s registration process. The operational efficiency related factors including the service process, queue strategy, and queue parameters were explored systematically and illustrated with a case study. Guided by the principle of business process reengineering (BPR, a simulation approach was employed for process redesign and performance optimization. As a result, the queue strategy is changed from multiple queues and multiple servers to single queue and multiple servers with a prepare queue. Furthermore, through a series of simulation experiments, the length of the prepare queue and the corresponding registration process efficiency was quantitatively evaluated and optimized.
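
    The redesign described above, replacing several independent windows with one shared queue feeding all servers, can be illustrated with a toy discrete-event simulation. The sketch below is not the paper's model; the arrival rate, service rate and number of servers are invented, and it only contrasts mean waiting time under the two queue strategies:

# A minimal sketch comparing "multiple queues, one server each" with
# "single queue, multiple servers"; all parameters are illustrative.
import heapq
import random

def simulate(n_customers, arrival_rate, service_rate, servers, single_queue, seed=42):
    rng = random.Random(seed)
    t = 0.0
    waits = []
    if single_queue:
        free_at = [0.0] * servers           # shared pool of servers
        heapq.heapify(free_at)
        for _ in range(n_customers):
            t += rng.expovariate(arrival_rate)
            start = max(t, free_at[0])      # earliest available server
            waits.append(start - t)
            heapq.heapreplace(free_at, start + rng.expovariate(service_rate))
    else:
        free_at = [0.0] * servers           # one dedicated queue per server
        for _ in range(n_customers):
            t += rng.expovariate(arrival_rate)
            q = rng.randrange(servers)      # customer picks a window at random
            start = max(t, free_at[q])
            waits.append(start - t)
            free_at[q] = start + rng.expovariate(service_rate)
    return sum(waits) / len(waits)

if __name__ == "__main__":
    for single in (False, True):
        label = "single queue " if single else "multiple queues"
        print(label, round(simulate(20000, 3.5, 1.0, 4, single), 2))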

  4. The effect of business process reengineering (BPR) on human ...

    African Journals Online (AJOL)

    The effect of business process reengineering (BPR) on human resource management in Addis Ababa City Administration.

  5. IDC Re-Engineering Phase 3 Development Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, John F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pollock, David L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    Sandia National Laboratories has prepared a project development plan that proposes how the parties interested in the IDC Re-Engineering system will coordinate its development, testing and transition to operations.

  6. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    Full Text Available The article is devoted to the analysis of the process management approach. The main understanding of the process management approach is examined, and definitions of process and process management are given. Methods of business process improvement are also analyzed, among them fast-analysis solution technology (FAST), benchmarking, reprojecting and reengineering. The main results of applying business process improvement are described in terms of reduced cycle time, costs and errors. The tasks and main stages of business process reengineering are outlined, and its main efficiency results and success factors are determined.

  7. IDC Re-Engineering Phase 2 Architecture Document.

    Energy Technology Data Exchange (ETDEWEB)

    Burns, John F.

    2015-12-01

    This document contains a description of the system architecture for the IDC Re-Engineering Phase 2 project. This is a draft version that primarily provides background information for understanding delivered Use Case Realizations.

  8. IDC Re-Engineering Phase 3 Development Plan

    International Nuclear Information System (INIS)

    Harris, James M.; Burns, John F.; Pollock, David L.

    2017-01-01

    Sandia National Laboratories has prepared a project development plan that proposes how the parties interested in the IDC Re-Engineering system will coordinate its development, testing and transition to operations.

  9. Environmental management compliance reengineering project, FY 1997 report

    International Nuclear Information System (INIS)

    VanVliet, J.A.; Davis, J.N.

    1997-09-01

    Through an integrated reengineering effort, the Idaho National Engineering and Environmental Laboratory (INEEL) is successfully implementing process improvements that will permit safe and compliant operations to continue during the next 5 years, even though $80 million was removed from the Environmental Management (EM) program budget. A 2-year analysis, design, and implementation project will reengineer compliance-related activities and reduce operating costs by approximately $17 million per year from Fiscal Year (FY) 1998 through 2002, while continuing to meet the INEEL's environment, safety, and health requirements and milestone commitments. Compliance reengineer's focus is improving processes, not avoiding full compliance with environmental, safety, and health laws. In FY 1997, compliance reengineering used a three-phase approach to analyze, design, and implement the changes that would decrease operating costs. Implementation for seven specific improvement projects was completed in FY 1997, while five projects will complete implementation in FY 1998. During FY 1998, the three-phase process will be repeated to continue reengineering the INEEL.

  10. Environmental management compliance reengineering project, FY 1997 report

    Energy Technology Data Exchange (ETDEWEB)

    VanVliet, J.A.; Davis, J.N.

    1997-09-01

    Through an integrated reengineering effort, the Idaho National Engineering and Environmental Laboratory (INEEL) is successfully implementing process improvements that will permit safe and compliant operations to continue during the next 5 years, even though $80 million was removed from the Environmental Management (EM) program budget. A 2-year analysis, design, and implementation project will reengineer compliance-related activities and reduce operating costs by approximately $17 million per year from Fiscal Year (FY) 1998 through 2002, while continuing to meet the INEEL's environment, safety, and health requirements and milestone commitments. Compliance reengineer's focus is improving processes, not avoiding full compliance with environmental, safety, and health laws. In FY 1997, compliance reengineering used a three-phase approach to analyze, design, and implement the changes that would decrease operating costs. Implementation for seven specific improvement projects was completed in FY 1997, while five projects will complete implementation in FY 1998. During FY 1998, the three-phase process will be repeated to continue reengineering the INEEL.

  11. Customer configuration updating in a software supply network

    NARCIS (Netherlands)

    Jansen, S.R.L.

    2007-01-01

    Product software development is the activity of development, modification, reuse, re-engineering, maintenance, or any other activities that result in packaged configurations of software components or software-based services that are released for and traded in a specific market [Xu and Brinkkemper].

  12. Defense programs business practices re-engineering QFD exercise

    International Nuclear Information System (INIS)

    Murray, C.; Halbleib, L.

    1996-03-01

    The end of the cold war has resulted in many changes for the Nuclear Weapons Complex (NWC). We now work in a smaller complex, with reduced resources, a smaller stockpile, and no new phase 3 weapons development programs. This new environment demands that we re-evaluate the way we design and produce nuclear weapons. The Defense Program (DP) Business Practices Re-engineering activity was initiated to improve the design and production efficiency of the DP Sector. The activity had six goals: (1) to identify DP business practices that are exercised by the Product Realization Process (PRP); (2) to determine the impact (positive, negative, or none) of these practices on defined, prioritized customer criteria; (3) to identify business practices that are candidates for elimination or re-engineering; (4) to select two or three business practices for re-engineering; (5) to re-engineer the selected business practices; and (6) to exercise the re-engineered practices on three pilot development projects. Business practices include technical as well as administrative procedures that are exercised by the PRP. A QFD exercise was performed to address (1)-(4). The customer that identified, defined, and prioritized the criteria to rate the business practices was the Block Change Advisory Group. Five criteria were identified: cycle time, flexibility, cost, product performance/quality, and best practices. Forty-nine business practices were identified and rated per the criteria. From this analysis, the group made preliminary recommendations as to which practices would be addressed in the re-engineering activity. Sixteen practices will be addressed in the re-engineering activity. These practices will then be piloted on three projects: (1) the Electronic Component Assembly (ECA)/Radar Project, (2) the B61 Mod 11, and (3) Warhead Protection Program (WPP)
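
    The QFD exercise described above boils down to rating each business practice against weighted customer criteria and ranking the results. The sketch below shows that arithmetic only; the criteria names follow the record, while the weights, practice names and ratings are invented:

# A minimal sketch of QFD-style weighted scoring of business practices against
# prioritized customer criteria. Weights and ratings are illustrative only.
criteria_weights = {
    "cycle time": 5,
    "flexibility": 4,
    "cost": 4,
    "product performance/quality": 3,
    "best practices": 2,
}

# Impact of each practice on each criterion: -2 (strongly hurts) .. +2 (strongly helps).
practice_ratings = {
    "practice A": {"cycle time": -2, "flexibility": -1, "cost": -2,
                   "product performance/quality": 0, "best practices": -1},
    "practice B": {"cycle time": 1, "flexibility": 0, "cost": 1,
                   "product performance/quality": 2, "best practices": 1},
}

def weighted_score(ratings):
    """Sum over criteria of (criterion weight) x (practice's impact on it)."""
    return sum(criteria_weights[c] * r for c, r in ratings.items())

# The most negative totals mark the strongest candidates for elimination or re-engineering.
for name, ratings in sorted(practice_ratings.items(), key=lambda kv: weighted_score(kv[1])):
    print(f"{name}: {weighted_score(ratings):+d}")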

  13. ORNL engineering design and construction reengineering report

    Energy Technology Data Exchange (ETDEWEB)

    McNeese, L.E.

    1998-01-01

    A team composed of individuals representing research and development (R&D) divisions, infrastructure support organizations, and Department of Energy (DOE)-Oak Ridge Operations was chartered to reengineer the engineering, design, and construction (ED&C) process at Oak Ridge National Laboratory (ORNL). The team recognized that ED&C needs of both R&D customers and the ORNL infrastructure program have to be met to maintain a viable and competitive national laboratory. Their goal was to identify and recommend implementable best-in-class ED&C processes that will efficiently and cost-effectively support the ORNL R&D staff by being responsive to their programmatic and infrastructure needs. The team conducted process mapping of current and potential ED&C approaches, developed idealized versions of ED&C processes, and identified potential barriers to an efficient ED&C process. Eight subteams were assigned to gather information and to evaluate the significance of potential barriers through benchmarking, surveys, interviews, and reviews of key topical areas in order to determine whether the perceived barriers were real and important and whether they resulted from laws or regulations over which ORNL has no control.

  14. Business process re-engineering in service operations

    International Nuclear Information System (INIS)

    McClintock, J.W.

    1995-01-01

    The concept of business process re-engineering, and how it was applied to the operations of the Consumers Gas Company, were discussed. Business process re-engineering was defined as the improvement of the efficiency of the customer-service process, and the overall improvement of practices and operations. The re-engineering project was said to involve a thorough analysis of information technology, current limitations, and business operational needs, undertaken on an enterprise-wide basis. Viewed generically, a re-engineering project was said to have six major components: (1) business drivers (i.e. the articulation of the Company's strategic issues); (2) benchmark measures; (3) future state process models; (4) cost/benefit analysis; (5) a change management plan; and (6) a development plan. Business improvements expected to result from the project include reduced cost of operation, reduction of waste, and a substantially complete re-design of the business process. Management of the project involved a team approach, and the help of a consultant to identify the scope of the re-design, its limitations, and future state. A life expectancy of approximately 10 years was given for the re-engineering plan, with annual benefits (in terms of cost reduction) of $4.6 million by the year 2000.

  15. Goal-Equivalent Secure Business Process Re-engineering

    DEFF Research Database (Denmark)

    Acosta, Hugo Andrés Lópes; Massacci, Fabio; Zannone, Nicola

    2008-01-01

    The introduction of information technologies in health care systems often requires to re-engineer the business processes used to deliver care. Obviously, the new and re-engineered processes are observationally different and thus we cannot use existing model-based techniques to argue that they are somehow "equivalent". In this paper we propose a method for passing from SI*, a modeling language for capturing and modeling functional, security, and trust organizational and system requirements, to business process specifications and vice versa. In particular, starting from an old secure business process, we reconstruct the functional and security requirements at organizational level that such a business process was supposed to meet (including the trust relations that existed among the members of the organization). To ensure that the re-engineered business process meets the elicited requirements...

  16. Application of information and communication technology in process reengineering

    Directory of Open Access Journals (Sweden)

    Đurović Aleksandar M.

    2014-01-01

    Full Text Available This paper examines the role of information and communication technologies in reengineering processes. A general analysis of a process shows that information and communication technologies improve its efficiency. A reengineering model based on the BPMN 2.0 standard is applied to the process of seeking an internship/job by students of the Faculty of Transport and Traffic Engineering. In the paper, after defining the technical characteristics and required functionalities, a web/mobile application is proposed, giving traffic engineers better visibility to companies seeking that education profile.

  17. Leadership processes for re-engineering changes to the health care industry.

    Science.gov (United States)

    Guo, Kristina L

    2004-01-01

    As health care organizations seek innovative ways to change financing and delivery mechanisms due to escalated health care costs and increased competition, drastic changes are being sought in the form of re-engineering. This study discusses the leader's role of re-engineering in health care. It specifically addresses the reasons for failures in re-engineering and argues that success depends on senior level leaders playing a critical role. Existing studies lack comprehensiveness in establishing models of re-engineering and management guidelines. This research focuses on integrating re-engineering and leadership processes in health care by creating a step-by-step model. Particularly, it illustrates the four Es: Examination, Establishment, Execution and Evaluation, as a comprehensive re-engineering process that combines managerial roles and activities to result in successfully changed and reengineered health care organizations.

  18. Conceptual Framework of Business Process Reengineering for Civil ...

    African Journals Online (AJOL)

    Tesfaye Deb

    endorsed Business Process Reengineering (BPR) as a foundation for strengthening Result Based ... Ethiopian government recognized the importance of improving ...... finance process. 4. Project process. (research and consultancy). Low frequency, request arrival is random, time interval between two requests can be very.

  19. Downsizing, reengineering and patient safety: numbers, newness and resultant risk.

    Science.gov (United States)

    Knox, G E; Kelley, M; Hodgson, S; Simpson, K R; Carrier, L; Berry, D

    1999-01-01

    Downsizing and reengineering are facts of life in contemporary healthcare organizations. In most instances, these organizational changes are undertaken in an attempt to increase productivity or cut operational costs with results measured in these terms. Less often considered are potential detrimental effects on patient safety or strategies, which might be used to minimize these risks.

  20. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    Science.gov (United States)

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  1. THEORETICAL ASPECTS OF REENGINEERING IN SMALL AND MEDIUM ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Slobodan Stefanović

    2014-01-01

    Full Text Available Reengineering is a fundamental rethinking and radical redesign of business processes - to achieve dramatic improvements in critical, important measures of performances, such as cost, quality, service and speed. This definition contains four keywords: fundamental, radical, dramatic and processes.

  2. Accountability-based reengineering of an order fulfillment process

    NARCIS (Netherlands)

    Zhang, L.; Jiao, J.; Ma, Q.

    2009-01-01

    In view of the dynamic changes in a supply chain network and the significance of order fulfillment processes (OFPs) for the successful implementation of supply chain management, this paper puts forward an accountability-based methodology for companies to reengineer OFPs while considering both

  3. THE PROJECT MANAGEMENT OF INDUSTRIAL BUILDINGS REENGINEERING (RECONSTRUCTION AND COMPLETION)

    Directory of Open Access Journals (Sweden)

    K. Kolesnikova

    2017-06-01

    Full Text Available The creative share of any activity never falls to zero: the turbulent environment in which these activities are carried out always prevents this. It is this environment that makes each building unique and thus provides the main basis of the project. When complex construction is concerned, the share of the creative component becomes very significant. First and foremost, this is explained by the duration of the construction work, during which risk events are likely to happen. The article analyses construction processes from the point of view of their conformity to the concept of project activities. It is shown that as the degree of difficulty or the duration of construction increases, the share of creative activities in the overall project grows. In recent times, work on the re-engineering of complex systems, for example building structures, has become more and more widespread. This means repair of the building, but not a simple repair that restores the original building elements and their interfaces as incorporated in the design; rather, it is a partial or full replacement of elements that fail or are outdated with new ones, which requires, first, a new design of their structures and production technologies, as well as the design of the accessories for installation and of the technology for reshaping the object. By combining the two above-mentioned factors of growth of the share of creative activities in the project management of the re-engineering of building structures, complexity and construction time, a cognitive model of such growth is obtained. The concept of "reengineering in construction" is introduced as a combination of the processes of reconstructing worn buildings and completing unfinished ones. It is proved that any re-engineering in construction is a project activity. The provisions are tested in a real reengineering of industrial buildings with a positive technical and economic effect.

  4. THE PROJECT MANAGEMENT OF INDUSTRIAL BUILDINGS REENGINEERING (RECONSTRUCTION AND COMPLETION)

    Directory of Open Access Journals (Sweden)

    Katerina Kolesikova

    2017-05-01

    Full Text Available The creative share of any activity never falls to zero: the turbulent environment in which these activities are carried out always prevents this. It is this environment that makes each building unique and thus provides the main basis of the project. When complex construction is concerned, the share of the creative component becomes very significant. First and foremost, this is explained by the duration of the construction work, during which risk events are likely to happen. The article analyses construction processes from the point of view of their conformity to the concept of project activities. It is shown that as the degree of difficulty or the duration of construction increases, the share of creative activities in the overall project grows. In recent times, work on the re-engineering of complex systems, for example building structures, has become more and more widespread. This means repair of the building, but not a simple repair that restores the original building elements and their interfaces as incorporated in the design; rather, it is a partial or full replacement of elements that fail or are outdated with new ones, which requires, first, a new design of their structures and production technologies, as well as the design of the accessories for installation and of the technology for reshaping the object. By combining the two above-mentioned factors of growth of the share of creative activities in the project management of the re-engineering of building structures, complexity and construction time, a cognitive model of such growth is obtained. The concept of "reengineering in construction" is introduced as a combination of the processes of reconstructing worn buildings and completing unfinished ones. It is proved that any re-engineering in construction is a project activity. The provisions are tested in a real reengineering of industrial buildings with a positive technical and economic effect.

  5. Maintaining evolvability.

    Science.gov (United States)

    Crow, James F

    2008-12-01

    Although molecular methods, such as QTL mapping, have revealed a number of loci with large effects, it is still likely that the bulk of quantitative variability is due to multiple factors, each with small effect. Typically, these have a large additive component. Conventional wisdom argues that selection, natural or artificial, uses up additive variance and thus depletes its supply. Over time, the variance should be reduced, and at equilibrium be near zero. This is especially expected for fitness and traits highly correlated with it. Yet, populations typically have a great deal of additive variance, and do not seem to run out of genetic variability even after many generations of directional selection. Long-term selection experiments show that populations continue to retain seemingly undiminished additive variance despite large changes in the mean value. I propose that there are several reasons for this. (i) The environment is continually changing so that what was formerly most fit no longer is. (ii) There is an input of genetic variance from mutation, and sometimes from migration. (iii) As intermediate-frequency alleles increase in frequency towards one, producing less variance (as p --> 1, p(1 - p) --> 0), others that were originally near zero become more common and increase the variance. Thus, a roughly constant variance is maintained. (iv) There is always selection for fitness and for characters closely related to it. To the extent that the trait is heritable, later generations inherit a disproportionate number of genes acting additively on the trait, thus increasing genetic variance. For these reasons a selected population retains its ability to evolve. Of course, genes with large effect are also important. Conspicuous examples are the small number of loci that changed teosinte to maize, and major phylogenetic changes in the animal kingdom. The relative importance of these along with duplications, chromosome rearrangements, horizontal transmission and polyploidy

  6. Development of a testlet generator in re-engineering the Indonesian physics national-exams

    Science.gov (United States)

    Mindyarto, Budi Naini; Mardapi, Djemari; Bastari

    2017-08-01

    The Indonesian physics national exams are end-of-course summative assessments that could be utilized to support assessment for learning in physics education. This paper discusses the development and evaluation of a testlet generator based on a re-engineering of the Indonesian physics national exams. The exam problems were dissected and decomposed into testlets that reveal deeper understanding of the underlying physical concepts by inserting a qualitative question and its scientific-reasoning question. A template-based generator was built to help teachers generate testlet variants that conform better to the development of students' scientific attitudes than the original simple multiple-choice format. The testlet generator was built using open source software technologies and was evaluated with a focus on black-box testing, exploring the generator's execution, inputs and outputs. The results showed that the developed testlet generator correctly performed its functions of validating inputs, generating testlet variants, and accommodating polytomous item characteristics.
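
    A template-based testlet generator of the kind described above can be sketched as a parameterized quantitative stem paired with a fixed qualitative reasoning prompt. The example below is not the authors' tool; the physics item, parameter values and output format are invented:

# A minimal sketch of template-based testlet generation: a quantitative stem with
# randomized values plus a qualitative reasoning question. The item is made up.
import random
from string import Template

STEM = Template(
    "A ball is dropped from a height of $h m. Taking g = 10 m/s^2, "
    "how long does it take to reach the ground?"
)
REASONING = (
    "Explain, without calculating, how the fall time would change "
    "if the height were doubled, and justify your answer."
)

def generate_testlet(seed: int) -> dict:
    rng = random.Random(seed)
    h = rng.choice([5, 20, 45, 80])          # heights giving round answers
    t = (2 * h / 10) ** 0.5                  # t = sqrt(2h/g)
    return {
        "quantitative": STEM.substitute(h=h),
        "answer_s": round(t, 2),
        "qualitative": REASONING,
    }

if __name__ == "__main__":
    for i in range(3):                       # three variants of the same testlet
        print(generate_testlet(i))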

  7. Reengineering a PC-based System into the Mobile Device Product Line

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stanislaw; Loughran, Neil

    2003-01-01

    There is a growing demand to port existing PC-based software systems to mobile device platforms. Systems running on mobile devices share basic characteristics with their PC-based counterparts, but differ from them in details of user interfaces, application models, etc. Systems running on mobile devices must also perform well using less memory than PC-based systems. Mobile devices themselves are different from each other in many ways, too. We describe how we made an existing PC-based City Guide System available on a wide range of mobile devices, in a cost-effective way. We applied a "reengineering into a product line architecture" approach to achieve the goal. Our product line architecture facilitates reuse via generation. We generate specific City Guide Systems for target platforms including PC, Pocket PC and other mobile devices, from generic meta-components that form the City Guide System product line...
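
    "Reuse via generation" from generic meta-components can be illustrated with a toy generator that specializes one component per target platform. The platform profiles and template below are invented and far simpler than a real product line infrastructure:

# A minimal sketch of generating platform-specific variants from a generic
# meta-component; profiles and template text are illustrative only.
PROFILES = {
    "pc":        {"MAP_WIDGET": "FullMapView", "CACHE_MB": 64, "TOUCH_UI": False},
    "pocket_pc": {"MAP_WIDGET": "TileMapView", "CACHE_MB": 8,  "TOUCH_UI": True},
}

META_COMPONENT = """\
class CityGuideApp:
    MAP_WIDGET = "{MAP_WIDGET}"     # UI component differs per device class
    CACHE_MB = {CACHE_MB}           # memory budget differs per device class
    TOUCH_UI = {TOUCH_UI}
"""

def generate(platform: str) -> str:
    """Produce a platform-specific variant of the generic component."""
    return META_COMPONENT.format(**PROFILES[platform])

if __name__ == "__main__":
    for target in PROFILES:
        print(f"# --- variant for {target} ---")
        print(generate(target))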

  8. Software Technology for E-Commerce Era

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The rapid growth of Internet usage and electronic commerce (e-commerce) applications will push traditional industries to transform their business models and to re-engineer their information systems. This direction will give the software industry either great opportunities for their business growth or crucial challenges to their existence. This article describes two essential challenges the software industry will face and presents relevant new technologies that will be helpful for overcoming those challenges.

  9. A Process Re-engineering Framework for Reverse Logistics based on a Case Study

    Directory of Open Access Journals (Sweden)

    Hing Kai Chan

    2010-09-01

    Full Text Available Reverse logistics has gained increasing attention in recent years as a channel for companies to achieve operational excellence. The process involves manipulation of returned materials, or even products, which forms a pivotal role in sustainable development throughout the whole supply chains. To make reverse logistics possible, process re-engineering may need to be carried out. However, the processes involved in reengineering are practically complicated. Objectives, benefits, and applicability of any process re-engineering require a careful and detailed strategic planning. This paper aims to propose an easy-to-follow step-by-step framework for practitioners to perform process re-engineering, to learn and identify the critical issues in each step, and to be successful in applying process re-engineering in order to enhance reverse logistics performance. A learner-centred approach is adopted based on a case study of process re-engineering, which is demonstrated in the paper for explanation.

  10. Business Process Reengineering, a Crisis Solution or a Necessity

    Directory of Open Access Journals (Sweden)

    Gabriela GHEORGHE

    2012-08-01

    Full Text Available This case study shows that the company decided to implement Business Process Reengineering (BPR not only because external environment had changed, but also due to its obsolete business processes and organizational structure. The article will highlight the importance of the organizations' focusing on sub-goals, in order to finally reach the desired result in the organization's main goals. When rapid evolution has become the fundamental contemporary coordinate, reengineering is a form of company innovative reaction in terms of intensifying competition and globalization. Remodeling the Company in phases of crisis, when time pressure reduces the type and number of solutions that can be adopted, without effective leadership, can lead in most cases to failure. The effect of redesigning the business processes depends on how well it is implemented, coordinated and monitored.

  11. E-learning and the Educational Organizations Structure Reengineering (EOSR

    Directory of Open Access Journals (Sweden)

    Osama Alshara

    2007-06-01

    Full Text Available There are many calls for innovative learning methods that utilize advanced technologies. However, we will raise fundamental questions that look deep into the future of the educational organization. Can the educational institute survive without adapting learning technologies? Would the educational institute succeed in adapting new learning technologies without changing its organizational structure and processes? We claim that the answer to both questions is no. Our research will present the need for educational institutes to incorporate learning technologies and focuses on the demand for the educational organization structure reengineering as a basic requirement for the success of incorporating learning technologies. Our study explores the faculty requirements and policies and procedures of educational institutes in the UAE. The paper concludes with some discussions on findings from a case study of the need of educational organization structure reengineering as a basic requirement for incorporating learning technologies.

  12. Business Process Reengineering, a Crises Solution or a Necessity

    Directory of Open Access Journals (Sweden)

    Gabriela GHEORGHE

    2011-11-01

    Full Text Available This case study shows that the company decided to implement Business Process Reengineering (BPR not only because external environment had changed, but also due to its obsolete business processes and organizational structure. The article will highlight the importance of the organizations' focusing on sub-goals, in order to finally reach the desired result in the organization's main goals. When rapid evolution has become the fundamental contemporary coordinate, reengineering is a form of company innovative reaction in terms of intensifying competition and globalization. Remodelling the Company in phases of crisis, when time pressure reduces the type and number of solutions that can be adopted, without effective leadership can lead in most cases to failure. The effect of redesigning the business processes depends on how well it is implemented, coordinated and monitored.

  13. Reengineering and health physics within the project Hanford management contract

    International Nuclear Information System (INIS)

    Atencio, E.M.

    1997-01-01

    The impending transition of the Hanford Site management and operations (M&O) contract to a management and integrating (M&I) contract format, together with weak radiological performance assessments by external organizations and reduced financial budgets, prompted the 're-engineering' of the previous Hanford prime contractor Radiological Control (Rad Con) organization. This paper presents the methodology, identified areas of improvement, and results of the re-engineering process. The conversion from the M&O to the M&I contract concept resulted in multiple independent Rad Con organizations reporting to separate major contractors who are managed by an integrating contractor. This brought significant challenges when establishing minimum site standards for sitewide consistency, developing roles and responsibilities, and maintaining site Rad Con goals. Championed by the previous contractor's Rad Con Director, Denny Newland, a five month planning effort was executed to address the challenges of the M&I and to address identified weaknesses. Fluor Daniel Hanford assumed the responsibility as integrator of the Project Hanford Management Contract on October 1, 1996. The Fluor Daniel Hanford Radiation Protection Director Jeff Foster presents the results of the re-engineering effort, including the significant cost savings, process improvements, field support improvements, and clarification of roles and responsibilities that have been achieved.

  14. Reengineering observatory operations for the time domain

    Science.gov (United States)

    Seaman, Robert L.; Vestrand, W. T.; Hessman, Frederic V.

    2014-07-01

    Observatories are complex scientific and technical institutions serving diverse users and purposes. Their telescopes, instruments, software, and human resources engage in interwoven workflows over a broad range of timescales. These workflows have been tuned to be responsive to concepts of observatory operations that were applicable when various assets were commissioned, years or decades in the past. The astronomical community is entering an era of rapid change increasingly characterized by large time domain surveys, robotic telescopes and automated infrastructures, and - most significantly - of operating modes and scientific consortia that span our individual facilities, joining them into complex network entities. Observatories must adapt and numerous initiatives are in progress that focus on redesigning individual components out of the astronomical toolkit. New instrumentation is both more capable and more complex than ever, and even simple instruments may have powerful observation scripting capabilities. Remote and queue observing modes are now widespread. Data archives are becoming ubiquitous. Virtual observatory standards and protocols and astroinformatics data-mining techniques layered on these are areas of active development. Indeed, new large-aperture ground-based telescopes may be as expensive as space missions and have similarly formal project management processes and large data management requirements. This piecewise approach is not enough. Whatever challenges of funding or politics facing the national and international astronomical communities it will be more efficient - scientifically as well as in the usual figures of merit of cost, schedule, performance, and risks - to explicitly address the systems engineering of the astronomical community as a whole.

  15. Re-engineering Nascom's network management architecture

    Science.gov (United States)

    Drake, Brian C.; Messent, David

    1994-01-01

    The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 Kbs) were developed following existing standards; but, there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as, X-windows, Motif, and Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network. The MSS, CAP, MACS, and Ecom projects have indicated

  16. Re-engineering caused by ISO-9000 certification

    DEFF Research Database (Denmark)

    Hvam, Lars; Nielsen, Anders Paarup; Bjarnø, Ole-Christian

    1997-01-01

    Based on a project performed at a medium-sized producer of medical utensils, reviews some of the problems which the company experienced in connection with the system built up during ISO 9001 certification, and the re-engineering efforts which were performed in order to relieve these problems....... Focuses in particular on a re-structuring of the company’s system for production documentation and its relation to the traceability of their products. This system was radically altered during the project without the traceability requirements being violated or reduced. These changes resulted in a marked...... increase in productivity....

  17. The role of business process reengineering in health care.

    Science.gov (United States)

    Kohn, D

    1994-02-01

    Business process reengineering (BPR) is a management philosophy capturing attention in health care. It combines some new, old, and recycled management philosophies, and, more often than not, is yielding positive results. BPR's emphasis is on the streamlining of cross-functional processes to significantly reduce time and/or cost, increase revenue, improve quality and service, and reduce risk. Therefore, it has many applications in health care. This article provides an introduction to the concept of BPR, including the definition of BPR, its origin, its champions, and factors for its success.

  18. Reengineering the Innovation Culture through Social media Crowdsourcing

    DEFF Research Database (Denmark)

    Scupola, Ada; Nicolajsen, Hanne Westh

    2012-01-01

    In this article we investigate how social media-based crowdsourcing systems can be used to reengineer the innovation culture in an organization. Based on a case study of a large engineering consultancy’s use of a social media crowdsourcing system we investigate the impact on the organizations...... innovation culture using theory on organizational culture and crowdsourcing. The analysis shows that the organizational crowdsourcing event has supported an innovation culture change in the case company towards a more including approach to innovation; creating a new and different awareness of innovation...

  19. Reengineering the Innovation Culture through Social media Crowdsourcing

    DEFF Research Database (Denmark)

    Scupola, Ada; Nicolajsen, Hanne Westh

    2012-01-01

    In this article we investigate how social media-based crowdsourcing systems can be used to reengineer the innovation culture in an organization. Based on a case study of a large engineering consultancy’s use of a social media crowdsourcing system we investigate the impact on the organizations innovation culture using theory on organizational culture and crowdsourcing. The analysis shows that the organizational crowdsourcing event has supported an innovation culture change in the case company towards a more including approach to innovation; creating a new and different awareness of innovation...

  20. Re-engineering pre-employment check-up systems: a model for improving health services.

    Science.gov (United States)

    Rateb, Said Abdel Hakim; El Nouman, Azza Abdel Razek; Rateb, Moshira Abdel Hakim; Asar, Mohamed Naguib; El Amin, Ayman Mohammed; Gad, Saad abdel Aziz; Mohamed, Mohamed Salah Eldin

    2011-01-01

    The purpose of this paper is to develop a model for improving health services provided by the pre-employment medical fitness check-up system affiliated to Egypt's Health Insurance Organization (HIO). Operations research, notably system re-engineering, is used in six randomly selected centers and findings before and after re-engineering are compared. The re-engineering model follows a systems approach, focusing on three areas: structure, process and outcome. The model is based on six main components: electronic booking, standardized check-up processes, protected medical documents, advanced archiving through an electronic content management (ECM) system, infrastructure development, and capacity building. The model originates mainly from customer needs and expectations. The centers' monthly customer flow increased significantly after re-engineering. The mean time spent per customer cycle improved after re-engineering--18.3 +/- 5.5 minutes as compared to 48.8 +/- 14.5 minutes before. Appointment delay was also significantly decreased from an average 18 to 6.2 days. Both beneficiaries and service providers were significantly more satisfied with the services after re-engineering. The model proves that re-engineering program costs are exceeded by increased revenue. Re-engineering in this study involved multiple structure and process elements. The literature review did not reveal similar re-engineering healthcare packages. Therefore, each element was compared separately. This model is highly recommended for improving service effectiveness and efficiency. This research is the first in Egypt to apply the re-engineering approach to public health systems. Developing user-friendly models for service improvement is an added value.

  1. WSC-07: Evolving the Web Services Challenge

    NARCIS (Netherlands)

    Blake, M. Brian; Cheung, William K.W.; Jaeger, Michael C.; Wombacher, Andreas

    Service-oriented architecture (SOA) is an evolving architectural paradigm where businesses can expose their capabilities as modular, network-accessible software services. By decomposing capabilities into modular services, organizations can share their offerings at multiple levels of granularity

  2. Design implications for task-specific search utilities for retrieval and re-engineering of code

    Science.gov (United States)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in the modern society and both individuals as well as enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context such as development language, technology framework, goal of the project, project complexity and developer's domain expertise. They also impose additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
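
    The implicit relevance feedback idea described above, using observed retention actions to re-rank later searches, can be sketched as a simple per-term score table. The action weights, identifiers and API below are invented for illustration:

# A minimal sketch of implicit relevance feedback for code search: retention
# actions (click, copy, long dwell) boost results for similar future queries.
from collections import defaultdict

ACTION_WEIGHTS = {"click": 1.0, "copy_snippet": 3.0, "dwell_60s": 2.0}

class FeedbackReRanker:
    def __init__(self):
        # (query term, result id) -> accumulated implicit relevance
        self.scores = defaultdict(float)

    def record(self, query: str, result_id: str, action: str) -> None:
        for term in query.lower().split():
            self.scores[(term, result_id)] += ACTION_WEIGHTS.get(action, 0.0)

    def rerank(self, query: str, results: list) -> list:
        def boost(result_id: str) -> float:
            return sum(self.scores[(t, result_id)] for t in query.lower().split())
        # Stable sort: ties keep the search engine's original ordering.
        return sorted(results, key=boost, reverse=True)

if __name__ == "__main__":
    rr = FeedbackReRanker()
    rr.record("parse xml java", "snippet:42", "copy_snippet")
    rr.record("parse xml java", "snippet:17", "click")
    print(rr.rerank("parse xml", ["snippet:17", "snippet:42", "snippet:99"]))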

  3. Impact of peculiar features of construction of transport infrastructure on the choice of tools for reengineering of business processes

    Science.gov (United States)

    Khripko, Elena

    2017-10-01

    In this article we study the issues of organizational resistance to the reengineering of business processes in the construction of transport infrastructure. Reengineering in a transport-sector company is, first and foremost, an innovative component of business strategy. We analyze the choice of forward and reverse reengineering tools and the terms of their application in connection with organizational resistance. Reengineering is defined taking into account four aspects: fundamentality, radicality, abruptness, and business process. We describe the stages of reengineering and analyze key requirements for newly created business processes.

  4. Re-engineering the urban drainage system for resource recovery and protection of drinking water supplies.

    Science.gov (United States)

    Gumbo, B

    2000-01-01

    The Harare metropolis in Zimbabwe, extending upstream from Manyame Dam in the Upper Manyame River Basin, consists of the City of Harare and its satellite towns: Chitungwiza, Norton, Epworth and Ruwa. The existing urban drainage system is typically a single-use-mixing system: water is used and discharged to "waste", excreta are flushed to sewers and eventually, after "treatment", the effluent is discharged to a drinking water supply source. Polluted urban storm water is evacuated as fast as possible. This system not only ignores the substantial value in "waste" materials, but it also exports problems to downstream communities and to vulnerable fresh-water sources. The question is how can the Harare metropolis urban drainage system, which is complex and has evolved over time, be rearranged to achieve sustainability (i.e. water conservation, pollution prevention at source, protection of the vulnerable drinking water sources and recovery of valuable materials)? This paper reviews current concepts regarding the future development of the urban drainage system in line with the new vision of "Sustainable Cities of the Future". The Harare Metropolis in Zimbabwe is taken as a case, and philosophical options for re-engineering the drainage system are discussed.

  5. Biocybrid systems and the re-engineering of life

    Science.gov (United States)

    Domingues, Diana; Ferreira da Rocha, Adson; Hamdan, Camila; Augusto, Leci; Miosso, Cristiano Jacques

    2011-03-01

    The re-engineering of life, expanded by perceptual experiences of the sense of presence in Virtual Reality and Augmented Reality, is the theme of our investigation in collaborative practices, confirming the artists' creativity alongside the inventiveness of scientists and their mutual capacity for the generation of biocybrid systems. We consider the enactive bodily interfaces for human existence being co-located in the continuum and symbiotic zone between body and flesh - cyberspace and data - and the hybrid properties of the physical world. That continuum generates a biocybrid zone (Bio+cyber+hybrid) and life is reinvented. Results reaffirm the creative reality of the coupled body and its mutual influences with environmental information, extending James Gibson's ecological perception theory. The ecosystem of life, in its dynamic relations between humans, animals, plants, landscapes, urban life and objects, brings questions and challenges for artworks and for the re-engineering of life discussed in our artworks in technoscience. Finally, we describe an implementation in which the immersion experience is enhanced by the data visualization of biological audio signals and by the use of wearable miniaturized devices for biofeedback.

  6. A Longitudinal BPR Study in a Danish Manufacturing Company - From Reengineering to Process Management

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Wieth, Christian; Domsten, Zenia Vittarp

    1998-01-01

    , Business Process Reengineering (BPR) is the most applied method for planning and carrying out projects. Novo Nordisk A/S is one of the largest companies in Denmark and the world's largest producer of industrial enzymes with a market share of more than 50%. This paper is a longitudinal study of BPR...... initiatives at Enzyme Business carried out within the time frame of January 1994 to March 1998. The paper provides empirical insight from a number of BPR-projects and related BPR-initiatives, e.g. Business System Reengineering projects. The results of the paper suggest that reengineering with the means...

  7. IDC reengineering Phase 2 & 3 US industry standard cost estimate summary

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Huelskamp, Robert M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, using a commercial software cost estimation tool calibrated to US industry performance parameters. This is not a cost estimate for Sandia to perform the project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer: Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  8. Evolvable synthetic neural system

    Science.gov (United States)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  9. Reengineering NHS Hospitals in Greece: Redistribution Leads to Rational Mergers.

    Science.gov (United States)

    Nikolentzos, Athanasios; Kontodimopoulos, Nick; Polyzos, Nikolaos; Thireos, Eleftherios; Tountas, Yannis

    2015-03-18

    The purpose of this study was to record and evaluate existing public hospital infrastructure of the National Health System (NHS), in terms of clinics and laboratories, as well as the healthcare workforce in each of these units and in every health region in Greece, in an attempt to optimize the allocation of these resources. An extensive analysis of raw data according to supply and performance indicators was performed to serve as a solid and objective scientific baseline for the proposed reengineering of the Greek public hospitals. Suggestions for "reshuffling" clinics and diagnostic laboratories, and their personnel, were made by using a best versus worst outcome indicator approach at a regional and national level. This study is expected to contribute to the academic debate about the gap between theory and evidence based decision-making in health policy.

  10. Fifteen years of Superfund at South Valley: Reengineering required

    International Nuclear Information System (INIS)

    Cormier, J.; Horak, F.

    1995-01-01

    It is no surprise to many of Superfund's practitioners that the law and its application are flawed. The South Valley Superfund Site in Albuquerque, New Mexico has not escaped Superfund's problems. The problems and issues arising out of the South Valley Superfund site have spurred the desire to seek a better way to administer and manage cleanup. This new method applies organizational and role changes that bring Superfund closer to an efficient business-like entity. This ''Reengineered'' Superfund strives for reorganization, contractor reduction, improved communication, reporting reduction, and teaming. In addition, modifications are made to the roles of regulators, potentially responsible parties (PRPs), and the public. Today the site encompasses roughly one square mile in area, includes six identified contaminant sources, and deals with solvent and petroleum by-product contamination

  11. IAEA safeguards information system re-engineering project (IRP)

    International Nuclear Information System (INIS)

    Whitaker, G.; Becar, J.-M.; Ifyland, N.; Kirkgoeze, R.; Koevesd, G.; Szamosi, L.

    2007-01-01

    The Safeguards Information System Re-engineering Project (IRP) was initiated to assist the IAEA in addressing current and future verification and analysis activities through the establishment of a new information technology framework for strengthened and integrated safeguards. The Project provides a unique opportunity to enhance all of the information services for the Department of Safeguards and will require project management 'best practices' to balance limited funds, available resources and Departmental priorities. To achieve its goals, the Project will require the participation of all stakeholders to create a comprehensive and cohesive plan that provides both a flexible and stable foundation for addressing changing business needs. The expectation is that high quality integrated information systems will be developed that incorporate state-of-the-art technical architectural standards, improved business processes and consistent user interfaces to store various data types in an enterprise data repository which is accessible on-line in a secure environment. (author)

  12. Re-Engineering Complex Legacy Systems at NASA

    Science.gov (United States)

    Ruszkowski, James; Meshkat, Leila

    2010-01-01

    The Flight Production Process (FPP) Re-engineering project has established a Model-Based Systems Engineering (MBSE) methodology and the technological infrastructure for the design and development of a reference, product-line architecture as well as an integrated workflow model for the Mission Operations System (MOS) for human space exploration missions at NASA Johnson Space Center. The design and architectural artifacts have been developed based on the expertise and knowledge of numerous Subject Matter Experts (SMEs). The technological infrastructure developed by the FPP Re-engineering project has enabled the structured collection and integration of this knowledge and further provides simulation and analysis capabilities for optimization purposes. A key strength of this strategy has been the judicious combination of COTS products with custom coding. The lean management approach that has led to the success of this project is based on having a strong vision for the whole lifecycle of the project and its progress over time, a goal-based design and development approach, a small team of highly specialized people in areas that are critical to the project, and an interactive approach for infusing new technologies into existing processes. This project, which has had a relatively small amount of funding, is on the cutting edge with respect to the utilization of model-based design and systems engineering. An overarching challenge that was overcome by this project was to convince upper management of the needs and merits of giving up more conventional design methodologies (such as paper-based documents and unwieldy and unstructured flow diagrams and schedules) in favor of advanced model-based systems engineering approaches.

  13. Re-engineering of Products and Processes How to Achieve Global Success in the Changing Marketplace

    CERN Document Server

    Rotini, Federico; Cascini, Gaetano

    2012-01-01

    Whilst previous methods for business process re-engineering have focused on time and cost reduction policies to preserve competitive services and products, Re-engineering of Products and Processes: How to Achieve Global Success in the Changing Marketplace presents a new approach which aims to include aspects that impact the customer perceived value. This method supports business re-engineering initiatives by identifying process bottlenecks as well as new products and services available to overcome market competition. This original approach is described step-by-step, explaining the theory through examples of performable tasks and the selection of relevant tools according to the nature of the problem. Supported by illustrations, tables and diagrams, Re-engineering of Products and Processes: How to Achieve Global Success in the Changing Marketplace clearly explains a method which is then applied to several case studies across different industrial sectors. Re-engineering of Products and Processes: How to Achieve...

  14. Business Process Reengineering: A Primer for the Marine Corps' Process Owner

    National Research Council Canada - National Science Library

    Brewster, Rollin

    1997-01-01

    .... Business Process Reengineering (BPR) is a technique used by the private sector to achieve order of magnitude improvements in organizational performance by leveraging information technology to enable the holistic redesign of business processes...

  15. Technologies and problems of reengineering of the business processes of company

    Science.gov (United States)

    Silka, Dmitriy

    2017-10-01

    Managing the combination of business processes is a modern approach in the field of business management. Together with many other management approaches, business processes allow all resultant actions to be identified. The article presents the modern view of the essence of business processes as well as general approaches to their allocation. Principles of constructing and re-engineering business processes are proposed. Recommendations on how to perform re-engineering under the highly cyclical dynamics of business activity are provided.

  16. A Framework for Process Reengineering in Higher Education: A case study of distance learning exam scheduling and distribution

    Directory of Open Access Journals (Sweden)

    M'hammed Abdous

    2008-10-01

    In this paper, we propose a conceptual and operational framework for process reengineering (PR) in higher education (HE) institutions. Using a case study aimed at streamlining exam scheduling and distribution in a distance learning (DL) unit, we outline a sequential and non-linear four-step framework designed to reengineer processes. The first two steps of this framework – initiating and analyzing – are used to initiate, document, and flowchart the process targeted for reengineering, and the last two steps – reengineering/implementing and evaluating – are intended to prototype, implement, and evaluate the reengineered process. Our early involvement of all stakeholders, and our in-depth analysis and documentation of the existing process, allowed us to avoid the traditional pitfalls associated with business process reengineering (BPR). Consequently, the outcome of our case study indicates a streamlined and efficient process with higher faculty satisfaction at a substantial cost reduction.

  17. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Core Flight Software (CFS) project's purpose is to analyze applicability, and evolve and extend the reusability of the CFS system originally developed by...

  18. BUSINESS BUZZWORDS: RIGHTSIZING, DOWNSIZING, RE-ENGINEERING, DE-LAYERING

    Directory of Open Access Journals (Sweden)

    Pop Anamaria Mirabela

    2011-07-01

    The paper attempts to analyse the rise and use of a new vocabulary (economic buzzwords) related to staff dismissal in the new economy of the world. In this new economy, the organizational boundaries between states and firms become unclear and a new vocabulary has been conceived in order to express the changes the firms are undergoing. The new rhetoric includes buzzwords like privatization, de-regulation, re-engineering, rightsizing, downsizing, de-layering, quality service or global sourcing. The research is based on the conclusions of bibliographical and direct research of the literature relevant in the field, trying to emphasise the importance of strategic language when it comes to human resources management. Concepts like freedom of speech, politically correct language or non-discriminatory language are brought to attention and analysed focusing on their importance during periods of change and uncertainty characterising the economic environment nowadays. Two trends are depicted in the paper: the first is that of the supporters of political correctness who attempt to homogenize the language and thought to enhance the self-esteem of minorities. One approach to reaching this goal is to eliminate discriminatory or offensive words and phrases and the substitution of harmless vocabulary at the expense of economy, clarity, and logic. Another approach is to deconstruct a word or phrase into its component parts, treat the component parts as wholes, and focus on secondary meanings of the component parts. On the other hand, reflecting upon the nature of large-scale organizational restructuring, there are the critics arguing that this type of language is a euphemistic form of phraseology. The analysis starts with the assumption that the economic lexis is not a rigid system of terms. Morphologically, there is a high degree of variety in productive types of compounding which exceeds the possibilities that exist in the common English vocabulary. In this

  19. Reengineering of Permanent Mould Casting with Lean Manufacturing Methods

    Directory of Open Access Journals (Sweden)

    R. Władysiak

    2007-07-01

    The work introduces the main areas of a production system project for castings produced in permanent moulds, which constitutes a re-engineering of the conventional production system according to Lean Manufacturing (LM) methods. A new solution for cooling dies with water mist was presented for casting car wheels made of aluminium alloys in the low-pressure casting process. It was implemented as part of a goal-oriented project in R.H. Alurad Sp.z o.o. in Gorzyce. Its use intensifies the solidification and self-cooling of castings, shortening the casting cycle time by 30%. The reorganization of casting stations into multi-machine production cells and the process of their rapid tool exchange using the SMED method are described. A system for controlling foundry production with a computer-aided light Kanban system is presented, and the casting production process is visualized using the value stream mapping method. The results proved that applying the new cooling method in the technology, together with LM methods, made it possible to eliminate downtime, reduce the level of stocks, and increase productivity and the flow of casting production.

  20. Brooklyn Union strategy: Re-engineering from outside in

    International Nuclear Information System (INIS)

    Parker, W.P. Jr.

    1997-01-01

    Five years ago, the management at Brooklyn Union embarked on a long, hard look at the way the company conducted business. In effect, they stepped into their customers' shoes. Business Process Improvement (BPI) is designed to construct a lasting corporate culture that can help Brooklyn Union meet its stated goal of becoming the premier energy company in the Northeast. A major component of that culture involves a dedication to service and cost management that is as solid as their credit ratings. To date, the bottom line on BPI has been impressive: By 1995, the customer satisfaction rating, which had been hovering in the '80s, had shot up to 95%. The management commitment has come in the form of resources, and a willingness to put its money where its mouth is (rewards for performance). The employee buy-in has shown up in those outstanding ratings from customers and in the financial results. Changing the culture of any long-established entity is never easy, whether it be on the micro-level (a family, for instance) or the macro-level (a country). It involves issues of trust, and a certain leap of faith that the new approach will bring results. Communication and education are two of the keys to gaining that participation. The company was able to impress upon employees the need for change--in particular the need for them to begin thinking like customers. The paper discusses the implementation of this re-engineering strategy

  1. The Organizational-Economic Provision of Reengineering of Marketing Activity of Ukrainian Machine-Building Enterprises

    Directory of Open Access Journals (Sweden)

    Kobyzskyi Denys S.

    2018-02-01

    The article is aimed at developing an organizational mechanism to support the reengineering of the marketing activities of a machine-building enterprise and at further developing the appropriate methodical recommendations. The meaning and role of organizational structure in the sphere of reengineering are disclosed, and the key aspects and principles of its construction are defined; the key elements, in particular business processes, their role in the organizational structure, and the properties of the organizational system are researched; the content of the basic components of the organizational mechanism, their role, and the peculiarities of communication between them are analyzed. A new attitude to the principles of construction, the functional content and the content of the constituents of the organization of enterprises makes it possible to realize the wide functional potential of organizational possibilities under the terms of reengineering, as well as to form an organizational mechanism for the post-reengineering company. Certain aspects of the development of the organizational mechanism create the preconditions and disclose a potential instrumentarium for effective and efficient methodical recommendations on reengineering the marketing activities of Ukrainian machine-building enterprises.

  2. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolution are related to multiple platforms as shown in our...... case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way....

  3. A systematic approach for component-based software development

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis

    2000-01-01

    Component-based software development enables the construction of software artefacts by assembling prefabricated, configurable and independently evolving building blocks, called software components. This paper presents an approach for the development of component-based software artefacts. This

  4. Spatial confidentiality and GIS: re-engineering mortality locations from published maps about Hurricane Katrina

    Directory of Open Access Journals (Sweden)

    Leitner Michael

    2006-10-01

    Background: Geographic Information Systems (GIS) can provide valuable insight into patterns of human activity. Online spatial display applications, such as Google Earth, can democratise this information by disseminating it to the general public. Although this is a generally positive advance for society, there is a legitimate concern involving the disclosure of confidential information through spatial display. Although guidelines exist for aggregated data, little has been written concerning the display of point level information. The concern is that a map containing points representing cases of cancer or an infectious disease could be re-engineered back to identify an actual residence. This risk is investigated using point mortality locations from Hurricane Katrina re-engineered from a map published in the Baton Rouge Advocate newspaper, and a field team validating these residences using search and rescue building markings. Results: We show that the residence of an individual, visualized as a generalized point covering approximately one and a half city blocks on a map, can be re-engineered back to identify the actual house location, or at least a close neighbour, even if the map contains little spatial reference information. The degree of re-engineering success is also shown to depend on the urban characteristic of the neighborhood. Conclusion: The results in this paper suggest a need to re-evaluate current guidelines for the display of point (address level) data. Examples of other point maps displaying health data extracted from the academic literature are presented where a similar re-engineering approach might cause concern with respect to violating confidentiality. More research is also needed into the role urban structure plays in the accuracy of re-engineering. We suggest that health and spatial scientists should be proactive and suggest a series of point level spatial confidentiality guidelines before governmental decisions are made

  5. Spatial confidentiality and GIS: re-engineering mortality locations from published maps about Hurricane Katrina.

    Science.gov (United States)

    Curtis, Andrew J; Mills, Jacqueline W; Leitner, Michael

    2006-10-10

    Geographic Information Systems (GIS) can provide valuable insight into patterns of human activity. Online spatial display applications, such as Google Earth, can democratise this information by disseminating it to the general public. Although this is a generally positive advance for society, there is a legitimate concern involving the disclosure of confidential information through spatial display. Although guidelines exist for aggregated data, little has been written concerning the display of point level information. The concern is that a map containing points representing cases of cancer or an infectious disease could be re-engineered back to identify an actual residence. This risk is investigated using point mortality locations from Hurricane Katrina re-engineered from a map published in the Baton Rouge Advocate newspaper, and a field team validating these residences using search and rescue building markings. We show that the residence of an individual, visualized as a generalized point covering approximately one and a half city blocks on a map, can be re-engineered back to identify the actual house location, or at least a close neighbour, even if the map contains little spatial reference information. The degree of re-engineering success is also shown to depend on the urban characteristic of the neighborhood. The results in this paper suggest a need to re-evaluate current guidelines for the display of point (address level) data. Examples of other point maps displaying health data extracted from the academic literature are presented where a similar re-engineering approach might cause concern with respect to violating confidentiality. More research is also needed into the role urban structure plays in the accuracy of re-engineering. We suggest that health and spatial scientists should be proactive and suggest a series of point level spatial confidentiality guidelines before governmental decisions are made which may be reactionary toward the threat of revealing
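
    The abstracts do not describe the authors' procedure in detail; the following is a generic Python illustration of the underlying risk. Once two reference features visible on a published, north-up map are matched to known coordinates, any plotted point can be mapped from pixel space back to geographic space with a simple per-axis linear transform. All control points and coordinates below are hypothetical.

      def make_georef(px1, geo1, px2, geo2):
          """Build a pixel -> (lon, lat) mapping from two control points.
          Assumes an unrotated (north-up) map; px = (col, row), geo = (lon, lat)."""
          sx = (geo2[0] - geo1[0]) / (px2[0] - px1[0])   # degrees longitude per pixel
          sy = (geo2[1] - geo1[1]) / (px2[1] - px1[1])   # degrees latitude per pixel
          def to_geo(px):
              return (geo1[0] + sx * (px[0] - px1[0]),
                      geo1[1] + sy * (px[1] - px1[1]))
          return to_geo

      # Hypothetical control points: two road intersections identified on the printed map.
      to_geo = make_georef((120, 85), (-91.15, 30.45), (640, 510), (-91.05, 30.40))
      print(to_geo((400, 300)))   # approximate location of a plotted point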

  6. A Case Study: Business Process Reengineering at Raymond W. Bliss Army Community Hospital

    Science.gov (United States)

    1997-05-01

    Among the causes listed are: inadequate management of resistance, attempting painless reengineering, and lack of understanding about reengineering. [Remainder of record: fragments of a DRG relative-weight and cost table from the source document (e.g., Parathyroid, Thyroid and Thyroglossal procedures), omitted here.]

  7. The complementariness of the business process reengineering and activity-based management

    Directory of Open Access Journals (Sweden)

    Violeta DOMANOVIC

    2010-05-01

    In order to sustain long term growth and development, an enterprise has to envisage and implement contemporary management innovations altogether. In transition economies, such as Serbia, it is of great importance to redesign business processes and activities, and to analyse activity profitability in order to select value-added activities and reduce non-value added ones. This paper considers the possibility for complementary implementation of business process reengineering and activity-based management in the process of long term efficiency improvement. Namely, the basic postulate of the business process reengineering concept might be established in the process of activity-based management implementation, and conversely.

  8. Improving Software Engineering on NASA Projects

    Science.gov (United States)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    Software Engineering Initiative: reduces the risk of software failure and increases mission safety; provides more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.

  9. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    'Software ecosystems' is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide...... an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature...... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field having evolved outside the existing definitions of software ecosystems and thus proposes an update of the definition of software ecosystems....

  10. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss...... one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  11. Software that meets its Intent

    NARCIS (Netherlands)

    Huisman, Marieke; Bos, Herbert; Brinkkemper, Sjaak; van Deursen, Arie; Groote, Jan Friso; Lago, Patricia; van de Pol, Jaco; Visser, Eelco; Margaria, Tiziana; Steffen, Bernhard

    2016-01-01

    Software is widely used, and society increasingly depends on its reliability. However, software has become so complex and it evolves so quickly that we fail to keep it under control. Therefore, we propose intents: fundamental laws that capture a software system's intended behavior (resilient,

  12. Evolving Capabilities for Virtual Globes

    Science.gov (United States)

    Glennon, A.

    2006-12-01

    Though thin-client spatial visualization software like Google Earth and NASA World Wind enjoy widespread popularity, a common criticism is their general lack of analytical functionality. This concern, however, is rapidly being addressed; standard and advanced geographic information system (GIS) capabilities are being developed for virtual globes--though not centralized into a single implementation or software package. The innovation is mostly originating from the user community. Three such capabilities relevant to the earth science, education, and emergency management communities are modeling dynamic spatial phenomena, real-time data collection and visualization, and multi-input collaborative databases. Modeling dynamic spatial phenomena has been facilitated through joining virtual globe geometry definitions--like KML--to relational databases. Real-time data collection uses short scripts to transform user-contributed data into a format usable by virtual globe software. Similarly, collaborative data collection for virtual globes has become possible by dynamically referencing online, multi-person spreadsheets. Examples of these functions include mapping flows within a karst watershed, real-time disaster assessment and visualization, and a collaborative geyser eruption spatial decision support system. Virtual globe applications will continue to evolve, adding further analytical capabilities, better temporal data handling, and support for scales from the nano to the intergalactic. This progression opens education and research avenues in all scientific disciplines.
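
    A minimal sketch of the pattern the abstract describes, joining rows in a relational database to virtual-globe geometry: records are rendered as KML placemarks that a client such as Google Earth can load, so the database rather than the globe client drives the dynamic content. The table and field names are assumptions made for illustration.

      import sqlite3

      def rows_to_kml(db_path: str) -> str:
          """Render an 'observations' table (name, lon, lat assumed) as a KML document."""
          conn = sqlite3.connect(db_path)
          placemarks = []
          for name, lon, lat in conn.execute("SELECT name, lon, lat FROM observations"):
              placemarks.append(
                  f"  <Placemark><name>{name}</name>"
                  f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
              )
          conn.close()
          return ('<?xml version="1.0" encoding="UTF-8"?>\n'
                  '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
                  + "\n".join(placemarks) + "\n</Document></kml>")

      # Example: regenerate the overlay whenever the monitoring database is updated.
      # print(rows_to_kml("karst_watershed.db"))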

  13. Sculpting carbon bonds for allotropic transformation through solid-state re-engineering of –sp2 carbon

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hyun Young; Araujo, Paulo T.; Kim, Young Lae; Jung, Sung Mi; Jia, Xiaoting; Hong, Sanghyun; Ahn, Chi Won; Kong, Jing; Dresselhaus, Mildred S.; Kar, Swastik; Jung, Yung Joon

    2014-09-15

    Carbon forms one of nature’s strongest chemical bonds; its allotropes having provided some of the most exciting scientific discoveries in recent times. The possibility of inter-allotropic transformations/hybridization of carbon is hence a topic of immense fundamental and technological interest. Such modifications usually require extreme conditions (high temperature, pressure and/or high-energy irradiations), and are usually not well controlled. Here we demonstrate inter-allotropic transformations/hybridizations of specific types that appear uniformly across large-area carbon networks, using moderate alternating voltage pulses. By controlling the pulse magnitude, small-diameter single-walled carbon nanotubes can be transformed predominantly into larger-diameter single-walled carbon nanotubes, multi-walled carbon nanotubes of different morphologies, multi-layered graphene nanoribbons or structures with sp3 bonds. This re-engineering of carbon bonds evolves via a coalescence-induced reconfiguration of sp2 hybridization, terminates with negligible introduction of defects and demonstrates remarkable reproducibility. This reflects a potential step forward for large-scale engineering of nanocarbon allotropes and their junctions.

  14. Gaming and simulation for transforming and reengineering government : Towards a research agenda

    NARCIS (Netherlands)

    Janssen, M.F.W.H.A.; Klievink, B.

    2010-01-01

    Purpose – In the process of transformation, governments have to deal with a host of stakeholders and complex organizational and technical issues. In this viewpoint paper, an argument is made in favour of using gaming and simulation as tools designed to aid the transformation and reengineering of

  15. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    Science.gov (United States)

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students but rather must be experienced and learned by them. This…

  16. As Easy as ABC: Re-engineering the Cost Accounting System.

    Science.gov (United States)

    Trussel, John M.; Bitner, Larry N.

    1996-01-01

    To be useful for management decision making, the college or university's cost accounting system must capture and measure improvements. Activity-based costing (ABC), which determines more accurately the full costs of services and products, tracks improvements and should proceed alongside reengineering of institutional accounting. Guidelines are…

  17. Reforms in Education: The Need for Re-Engineering Teacher Education for Sustainable Development

    Science.gov (United States)

    Ofoego, O. C.; Ebebe, I. E.

    2016-01-01

    The paper is concerned with reforms in Education and the need for re-engineering Teacher education in Nigeria for better professionalism and National Development. In the process, key concepts like Teacher Education and professionalism were explained. A brief review of the state of Teacher Education and Development in Nigeria revealed the…

  18. Refactoring, reengineering and evolution: paths to Geant4 uncertainty quantification and performance improvement

    International Nuclear Information System (INIS)

    Batič, M; Hoff, G; Pia, M G; Saracco, P; Begalli, M; Han, M; Kim, C H; Seo, H; Hauf, S; Kuster, M; Weidenspointner, G; Zoglauer, A

    2012-01-01

    Ongoing investigations for the improvement of Geant4 accuracy and computational performance resulting from refactoring and reengineering parts of the code are discussed. Issues in refactoring that are specific to the domain of physics simulation are identified and their impact is elucidated. Preliminary quantitative results are reported.

  19. Marketing Cooperatives' Re-engineering: Influences among Organizational Attributes, Strategic Attributes & Performance

    NARCIS (Netherlands)

    Benos, T.; Kalogeras, N.; Verhees, F.J.H.M.; Pennings, J.M.E.

    2009-01-01

    In this paper we expand the agribusiness co-op literature by studying the re-engineering process of marketing cooperatives (co-ops). More specifically we discuss and empirically examine organizational innovations adopted by marketing co-ops in Greece. We hypothesize three types of

  20. A business process modeling experience in a complex information system re-engineering.

    Science.gov (United States)

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  1. IDC Re-Engineering Phase 2 Glossary Version 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Young, Christopher J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    This document contains the glossary of terms used for the IDC Re-Engineering Phase 2 project. This version was created for Iteration E3. The IDC applies automatic processing methods in order to produce, archive, and distribute standard IDC products on behalf of all States Parties.

  2. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  3. Mentoring: An Evolving Relationship.

    Science.gov (United States)

    Block, Michelle; Florczak, Kristine L

    2017-04-01

    The column concerns itself with mentoring as an evolving relationship between mentor and mentee. The collegiate mentoring model, the transformational transcendence model, and the humanbecoming mentoring model are considered in light of a dialogue with mentors at a Midwest university and conclusions are drawn.

  4. Methods Evolved by Observation

    Science.gov (United States)

    Montessori, Maria

    2016-01-01

    Montessori's idea of the child's nature and the teacher's perceptiveness begins with amazing simplicity, and when she speaks of "methods evolved," she is unveiling a methodological system for observation. She begins with the early childhood explosion into writing, which is a familiar child phenomenon that Montessori has written about…

  5. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a certain number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on master and main development branches. The usage of CMake configuration tool
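
    One concrete, hypothetical example of the benchmark-driven workflow described above: every change to the code base is validated by re-running a published test case and comparing the new output against the archived reference results within a tolerance, with the continuous-integration job failing otherwise. The file names and tolerance are assumptions for illustration, not part of the GEOtop project.

      import csv, sys

      def compare_run(reference_csv: str, new_csv: str, rel_tol: float = 1e-6) -> bool:
          """Return True if every numeric field of the new run matches the reference run."""
          with open(reference_csv) as ref, open(new_csv) as new:
              for ref_row, new_row in zip(csv.reader(ref), csv.reader(new)):
                  for a, b in zip(ref_row, new_row):
                      try:
                          x, y = float(a), float(b)
                      except ValueError:
                          continue        # skip headers / non-numeric columns
                      if abs(x - y) > rel_tol * max(abs(x), 1.0):
                          return False
          return True

      if __name__ == "__main__":
          # Hypothetical benchmark and output files for one published test case.
          ok = compare_run("benchmark/soil_temperature_ref.csv", "output/soil_temperature.csv")
          sys.exit(0 if ok else 1)   # a non-zero exit fails the continuous-integration job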

  6. Greek Co-ops' Re-Engineering: Exploring the Influences among Organizational Attributes, Strategic Attributes, and Performance

    NARCIS (Netherlands)

    Benos, T.; Kalogeras, N.; Verhees, F.J.H.M.

    2007-01-01

    We develop an actual classification entailing traditional vs. reengineered cooperative organizational attributes. Using this classification, we conceptualize and empirically investigate three types of relationships: a) organizational (i.e., collective ownership, control and cost/benefit

  7. Evaluation Policy Alternatives for the Reengineering of the Department of Defense Personal Property Shipment and Storage Program - A Stakeholder Approach

    National Research Council Canada - National Science Library

    Lepson, Michael

    1999-01-01

    ...) to evaluate the personal property pilot programs as part of Management Reform Memorandum # 6. This thesis evaluates the policy alternatives for reengineering the DOD personal property program using a stakeholder approach...

  8. Reengineering the laboratory: strategic process and systems innovation to improve performance. Recreating our role on the health-care team.

    Science.gov (United States)

    Johnson, E

    1995-01-01

    The author describes reengineering efforts in the laboratory of a 550-bed hospital. Key benefits include reduced costs, improved turnaround time, and redirection of staff into new roles in information management and outreach.

  9. Re-Engineering Control Systems using Automatic Generation Tools and Process Simulation: the LHC Water Cooling Case

    CERN Document Server

    Booth, W; Bradu, B; Gomez Palacin, L; Quilichini, M; Willeman, D

    2014-01-01

    This paper presents the approach used at CERN (European Organization for Nuclear Research) to perform the re-engineering of the control systems dedicated to the LHC (Large Hadron Collider) water cooling systems.

  10. EVOLVE 2014 International Conference

    CERN Document Server

    Tantar, Emilia; Sun, Jian-Qiao; Zhang, Wei; Ding, Qian; Schütze, Oliver; Emmerich, Michael; Legrand, Pierrick; Moral, Pierre; Coello, Carlos

    2014-01-01

    This volume encloses research articles that were presented at the EVOLVE 2014 International Conference in Beijing, China, July 1–4, 2014.The book gathers contributions that emerged from the conference tracks, ranging from probability to set oriented numerics and evolutionary computation; all complemented by the bridging purpose of the conference, e.g. Complex Networks and Landscape Analysis, or by the more application oriented perspective. The novelty of the volume, when considering the EVOLVE series, comes from targeting also the practitioner’s view. This is supported by the Machine Learning Applied to Networks and Practical Aspects of Evolutionary Algorithms tracks, providing surveys on new application areas, as in the networking area and useful insights in the development of evolutionary techniques, from a practitioner’s perspective. Complementary to these directions, the conference tracks supporting the volume, follow on the individual advancements of the subareas constituting the scope of the confe...

  11. Evolving Procurement Organizations

    DEFF Research Database (Denmark)

    Bals, Lydia; Laine, Jari; Mugurusi, Godfrey

    Procurement has to find further levers and advance its contribution to corporate goals continuously. This places pressure on its organization in order to facilitate its performance. Therefore, procurement organizations constantly have to evolve in order to match these demands. A conceptual model...... and external contingency factors and having a more detailed look at the structural dimensions chosen, beyond the well-known characteristics of centralization, formalization, participation, specialization, standardization and size. From a theoretical perspective, it opens up insights that can be leveraged...

  12. Symbiotic Composition and Evolvability

    OpenAIRE

    Watson, Richard A.; Pollack, Jordan B.

    2001-01-01

    Several of the Major Transitions in natural evolution, such as the symbiogenic origin of eukaryotes from prokaryotes, share the feature that existing entities became the components of composite entities at a higher level of organisation. This composition of pre-adapted extant entities into a new whole is a fundamentally different source of variation from the gradual accumulation of small random variations, and it has some interesting consequences for issues of evolvability. In this paper we p...

  13. Evolved H II regions

    International Nuclear Information System (INIS)

    Churchwell, E.

    1975-01-01

    A probable evolutionary sequence of H II regions based on six distinct types of observed objects is suggested. Two examples which may deviate from this idealized sequence, are discussed. Even though a size-mean density relation of H II regions can be used as a rough indication of whether a nebula is very young or evolved, it is argued that such a relation is not likely to be useful for the quantitative assignment of ages to H II regions. Evolved H II regions appear to fit into one of four structural types: rings, core-halos, smooth structures, and irregular or filamentary structures. Examples of each type are given with their derived physical parameters. The energy balance in these nebulae is considered. The mass of ionized gas in evolved H II regions is in general too large to trace the nebula back to single compact H II regions. Finally, the morphological type of the Galaxy is considered from its H II region content. 2 tables, 2 figs., 29 refs

  14. The development and technology transfer of software engineering technology at NASA. Johnson Space Center

    Science.gov (United States)

    Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.

    1992-01-01

    The United State's big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends are outlined in CASE technology and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.

  15. Re-engineering quality related processes and activities

    International Nuclear Information System (INIS)

    Preisser, T.E.

    1995-01-01

    Given both desire and opportunity, improvements to program quality hinge upon a thorough understanding of what processes are currently performed, which are necessary to support the product or service, and what ideal processes should look like. Thorough understanding derives from process analysis, process mapping, and the use of other quality tools. Despite the level of knowledge any process team claims, there is likely to be at least one area that was hidden before the process was deeply analyzed. Finding that hidden element may mean the difference between evolving an improvement versus a breakthrough

  16. Educational Process Reengineering and Diffusion of Innovation in Formal Learning Environment

    DEFF Research Database (Denmark)

    Khalid, Md. Saifuddin; Hossain, Mohammad Shahadat; Rongbutsri, Nikorn

    2011-01-01

    In technology-mediated learning, while the relative advantages of technologies are proven, lack of contextualization, lack of process-centric change, and lack of user-driven change have kept intervention and adoption of educational technologies among individuals and organizations as challenges. Reviewing...... the formal, informal and non-formal learning environments, this study focuses on the formal part. This paper coins the term 'Educational Process Reengineering' (EPR), based on the established concept of 'Business Process Reengineering' (BPR), for process improvement of teaching-learning activities, academic administration, and evaluation and assessment. Educational environments are flexible and not governed by standard operating procedures, making technology use lithe. The theory of 'diffusion of innovations' is recommended to be integrated to reason about and measure acceptance or rejection of EPR-selected technology......

  17. Applying object technology principles to business reengineering in the oil, gas, and petrochemical industries

    International Nuclear Information System (INIS)

    Davis, J.M.

    1996-01-01

    The oil, gas, and petrochemical industries face a dilemma: to be financially competitive while complying with strict and expanding environmental, safety, and health regulation. Companies need new tools and techniques, indeed a completely new paradigm for organizing and performing work. They must build efficient and flexible business processes, ones that rely on advanced information systems for improved decision making and productivity. And they must adopt a culture of change and improvement to permit the business to change as the business climate changes. Fortunately, two industry developments are changing the traditional business paradigm in a dramatic way: business reengineering and object technology. Applying principles of object technology in the performance of business reengineering makes available a new form of business modeling that transforms the technique of modeling a business while directly supporting the development of its enabling information systems. This modeling technique is called Object Modeling and is becoming an important force in improving business competitiveness

  18. Proposal of an Embedded Methodology that uses Organizational Diagnosis and Reengineering: Case of bamboo panel company

    Directory of Open Access Journals (Sweden)

    Eva Selene Hernández Gress

    2017-08-01

    This work is an extension of the Proceedings of the International Conference on Industrial Engineering, Management Science and Applications, which presented some of the phases of Reengineering applied to a Bamboo Panel Company; the results were Strategic planning, Systemic Diagnosis and Performance Indicators through the Balanced Scorecard. Now, the main purpose of this article is to present a methodology embedding Organizational Diagnosis and Reengineering, which emphasizes the incorporation of culture, context, management style, and knowledge as well as inner and outer actors. The results of the proposed methodology applied to the case study are included, up to the moment of the writing of this article. Future work consists of the development of strategies for Innovation as a strategy planned in the Balanced Scorecard and derived from the embedded methodology.

  19. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.
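
    A minimal sketch of the factor/criterion/metric hierarchy mentioned in the abstract, rolled up as equal-weight averages; the specific criteria, scores, and weighting scheme are illustrative assumptions, not values from the report.

      # Quality factors -> criteria -> normalized metric scores in [0, 1] (all illustrative).
      quality_model = {
          "survivability": {"fault tolerance": 0.8, "reconfigurability": 0.6},
          "expandability": {"modularity": 0.7, "generality": 0.5},
          "evolvability":  {"self-descriptiveness": 0.9, "machine independence": 0.6},
      }

      def factor_score(criteria: dict[str, float]) -> float:
          """Equal-weight roll-up of criterion scores into a factor score."""
          return sum(criteria.values()) / len(criteria)

      for factor, criteria in quality_model.items():
          print(f"{factor:15s} {factor_score(criteria):.2f}")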

  20. Process Reengineering of Cold Chain Logistics of Agricultural Products Based on Low-carbon Economy

    OpenAIRE

    Guo, Hong-xia; Shao, Ming

    2012-01-01

    Through the process analysis of cold chain logistics of agricultural products, we find that cold chain logistics of agricultural products contradict the development model of low-carbon economy to some extent. We apply the development idea of low-carbon economy, introduce the third-party logistics companies, establish distribution center of cold chain logistics of agricultural products, and strengthen information sharing, to reengineer the process of cold chain logistics of agricultural produc...

  1. RE-ENGINEERING PRIMARY HEALTHCARE NURSING AS A FIRST CAREER CHOICE.

    Science.gov (United States)

    Wheeler, Emily; Govan, Linda

    2016-08-01

    In line with international models and critical to the primary healthcare nursing workforce, the Australian Primary Health Care Nursing Association (APNA) has been funded by the Commonwealth Department of Health to develop an Education and Career Framework and Toolkit for primary healthcare nurses. The aim of the project is to improve the recruitment and retention of nurses and to re-engineer primary healthcare as a first choice career option.

  2. Re-Engineering a High Performance Electrical Series Elastic Actuator for Low-Cost Industrial Applications

    Directory of Open Access Journals (Sweden)

    Kenan Isik

    2017-01-01

    Full Text Available Cost is an important consideration when transferring a technology from research to industrial and educational use. In this paper, we introduce the design of an industrial grade series elastic actuator (SEA) performed via re-engineering a research grade version of it. Cost-constrained design requires careful consideration of the key performance parameters for an optimal performance-to-cost component selection. To optimize the performance of the new design, we started by matching the capabilities of a high-performance SEA while cutting down its production cost significantly. Our posit was that performing a re-engineering design process on an existing high-end device will significantly reduce the cost without compromising the performance drastically. As a case study of design for manufacturability, we selected the University of Texas Series Elastic Actuator (UT-SEA), a high-performance SEA, for its high power density, compact design, high efficiency and high speed properties. We partnered with an industrial corporation in China to research the best pricing options and to exploit the retail and production facilities provided by the Shenzhen region. We succeeded in producing a low-cost industrial grade actuator at one-third of the cost of the original device by re-engineering the UT-SEA with commercial off-the-shelf components and reducing the number of custom-made parts. Subsequently, we conducted performance tests to demonstrate that the re-engineered product achieves the same high-performance specifications found in the original device. With this paper, we aim to raise awareness in the robotics community on the possibility of low-cost realization of low-volume, high performance, industrial grade research and education hardware.

  3. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Benchmarking of a commercial nuclear facility, a commercial research facility, and a DOE research facility was conducted to both validate the efficacy of these findings and seek additional ideas for improvement. The outcome of this evaluation is represented by the 15 final recommendations that are described in this report.

  4. Managing hospital supplies: process reengineering at Gujarat Cancer Research Institute, India.

    Science.gov (United States)

    Ramani, K V

    2006-01-01

    Aims to give an overview of the re-engineering of processes and structures at Gujarat Cancer Research Institute (GCRI), Ahmedabad. A general review of the design, development and implementation of reengineered systems in order to address concerns about the existing systems. Findings: GCRI is a comprehensive cancer care center with 550 beds and well equipped with modern diagnostic and treatment facilities. It serves about 200,000 outpatients and 16,000 inpatients annually. The approach to a better management of hospital supplies led to the design, development, and implementation of an IT-based reengineered and integrated purchase and inventory management system. The new system has given GCRI a saving of about 8 percent of its annual costs of purchases, and improved the availability of materials to the user departments. Shows that the savings obtained are used not only for buying more hospital supplies, but also to buy better quality of hospital supplies, and thereby satisfactorily address the GCRI responsibility towards meeting its social obligations for cancer care.

  5. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Volume 2 consists of nine appendices which contain the Process Team reports and Benchmarking reports.

  6. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 2

    International Nuclear Information System (INIS)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Volume 2 consists of nine appendices which contain the Process Team reports and Benchmarking reports

  7. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 1

    International Nuclear Information System (INIS)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Benchmarking of a commercial nuclear facility, a commercial research facility, and a DOE research facility was conducted to both validate the efficacy of these findings and seek additional ideas for improvement. The outcome of this evaluation is represented by the 15 final recommendations that are described in this report

  8. Outsourcing the development of specific application software using the ESA software engineering standards the SPS software Interlock System

    CERN Document Server

    Denis, B

    1995-01-01

    CERN is considering outsourcing as a solution to the reduction of staff. The need to re-engineer the SPS Software Interlock System provided an opportunity to explore the applicability of outsourcing to our specific controls environment, and the ESA PSS-05 standards were selected for the requirements specification, the development, the control and monitoring and the project management. The software produced by the contractor is now fully operational. After outlining the scope and the complexity of the project, a discussion on the ESA PSS-05 will be presented: the choice, the way these standards improve the outsourcing process, the quality induced, but also the need to adapt them and their limitations in the definition of the customer-supplier relationship. The success factors and the difficulties of development under contract will also be discussed. The maintenance aspect and the impact on in-house developments will finally be addressed.

  9. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

    It is necessary to assure the reliability of software in order to digitalize the RPS (Reactor Protection System). Since an RPS failure can cause fatal damage in accident cases, the RPS is classified as Safety Class 1E. We therefore propose an effective testing methodology to assure the reliability of embedded software in the DPPS (Digital Plant Protection System). To test the embedded software in the DPPS effectively, our methodology consists of two steps. The first is a re-engineering step that extracts classes from the structural source program, and the second is a testing-level step composed of unit testing, integration testing and system testing. At each testing step, we test the embedded software with test cases selected after a test item identification step. Using this testing methodology, the embedded software can be tested effectively while reducing cost and time
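
    The two-step methodology summarised above (re-engineering procedural code into classes, then unit/integration/system testing) can be pictured with a toy sketch; the TripLogic class, its setpoint logic, and the test names below are hypothetical illustrations, not code from the paper.

    ```python
    # Toy illustration only: step 1 wraps procedural trip-logic routines into a
    # class; step 2 exercises that class with selected unit-level test cases.
    import unittest


    # --- Step 1: class extracted from a structural (procedural) source program ---
    class TripLogic:
        """Hypothetical pressure-trip channel re-engineered into a class."""

        def __init__(self, setpoint: float):
            self.setpoint = setpoint

        def should_trip(self, measured_pressure: float) -> bool:
            return measured_pressure >= self.setpoint


    # --- Step 2: unit testing with test cases chosen after test item identification ---
    class TestTripLogic(unittest.TestCase):
        def test_trips_at_or_above_setpoint(self):
            self.assertTrue(TripLogic(setpoint=15.0).should_trip(15.2))

        def test_no_trip_below_setpoint(self):
            self.assertFalse(TripLogic(setpoint=15.0).should_trip(14.8))


    if __name__ == "__main__":
        unittest.main()
    ```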

  10. CORBA technology in reengineering the FTU data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Bertocchi, A; Buceti, G; Centioli, C; Di Muzio, D; Iannone, F.; Panella, M; Vitale, V

    2002-06-01

    In its early stages, Frascati tokamak upgrade DAS was essentially devoted to acquiring data from experiments in CAMAC standard, using a software system (code and database) entirely written by domestic professionals. In 15 years of life DAS has been growing in size and complexity, still preserving its original structure; at the same time new standards were introduced (VME, PCI) to take into account users' ever increasing demands for amount of data and acquisition frequency with which the existing code couldn't cope. Moreover, machines were getting old and the maintenance became troublesome. Finally, the data archive porting to Unix has definitely shown that the DAS system was ageing and a thorough redesign was needed. The system we are planning to introduce is founded on a standard CORBA bus: (i) to integrate heterogeneous platforms and define a standard layer for interactions between the different acquisition units; (ii) to grant, with open source tools (MySql) and interfaces (Html and Java), unified access to hardware and software configuration data. So, a dedicated PC server, connected via a suitable PCI serial highway driver card, will perform the CAMAC access for all the clients interacting through the CORBA layer. Up to now we have successfully tested CAMAC access, and we designed an acquisition unit, which will be the building block of the new system. The next step will be migrating to Alpha/VMS the software related to CAMAC data acquisition, which has been so far the cornerstone of the whole DAS; it will be completely redesigned to fit the 'acquisition unit' paradigm we have defined. Finally we will have a fully distributed data acquisition system with VME (at present six such units have been operating since 1999) and PCI stations, an Alpha/VMS client of the CAMAC/PC server and any possible platform interacting through a CORBA bus for getting data configuration, synchronisation and data archiving.

  11. CORBA technology in reengineering the FTU data acquisition system

    International Nuclear Information System (INIS)

    Bertocchi, A.; Buceti, G.; Centioli, C.; Di Muzio, D.; Iannone, F.; Panella, M.; Vitale, V.

    2002-01-01

    In its early stages, Frascati tokamak upgrade DAS was essentially devoted to acquiring data from experiments in CAMAC standard, using a software system (code and database) entirely written by domestic professionals. In 15 years of life DAS has been growing in size and complexity, still preserving its original structure; at the same time new standards were introduced (VME, PCI) to take into account users' ever increasing demands for amount of data and acquisition frequency with which the existing code couldn't cope. Moreover, machines were getting old and the maintenance became troublesome. Finally, the data archive porting to Unix has definitely shown that the DAS system was ageing and a thorough redesign was needed. The system we are planning to introduce is founded on a standard CORBA bus: (i) to integrate heterogeneous platforms and define a standard layer for interactions between the different acquisition units; (ii) to grant, with open source tools (MySql) and interfaces (Html and Java), unified access to hardware and software configuration data. So, a dedicated PC server, connected via a suitable PCI serial highway driver card, will perform the CAMAC access for all the clients interacting through the CORBA layer. Up to now we have successfully tested CAMAC access, and we designed an acquisition unit, which will be the building block of the new system. The next step will be migrating to Alpha/VMS the software related to CAMAC data acquisition, which has been so far the cornerstone of the whole DAS; it will be completely redesigned to fit the 'acquisition unit' paradigm we have defined. Finally we will have a fully distributed data acquisition system with VME (at present six such units have been operating since 1999) and PCI stations, an Alpha/VMS client of the CAMAC/PC server and any possible platform interacting through a CORBA bus for getting data configuration, synchronisation and data archiving
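
    The two records above describe an architecture in which every data source sits behind a common 'acquisition unit' contract carried over a CORBA bus. The minimal sketch below suggests, in Python rather than CORBA IDL, what such a contract and a CAMAC-server implementation of it might look like; the interface name and methods (AcquisitionUnit, configure, arm, readout) are assumptions for illustration, not the FTU system's actual API.

    ```python
    # Minimal sketch of the "acquisition unit" paradigm described above.
    # All names are hypothetical; the real system would define the contract in CORBA IDL
    # and generate language bindings from it.
    from abc import ABC, abstractmethod
    from typing import Dict, List


    class AcquisitionUnit(ABC):
        """Common contract each station type (CAMAC server, VME, PCI) would implement."""

        @abstractmethod
        def configure(self, settings: Dict[str, str]) -> None:
            """Load hardware/software configuration (e.g. fetched from the config database)."""

        @abstractmethod
        def arm(self, shot_number: int) -> None:
            """Prepare the unit for the next shot/trigger."""

        @abstractmethod
        def readout(self) -> List[float]:
            """Return the acquired samples after the shot."""


    class CamacServerUnit(AcquisitionUnit):
        """Stand-in for the PC server that fronts the CAMAC crates for all clients."""

        def __init__(self) -> None:
            self._settings: Dict[str, str] = {}
            self._armed_shot = -1

        def configure(self, settings: Dict[str, str]) -> None:
            self._settings = dict(settings)

        def arm(self, shot_number: int) -> None:
            self._armed_shot = shot_number

        def readout(self) -> List[float]:
            # Placeholder: a real unit would drive the PCI serial-highway card here.
            return [0.0] * int(self._settings.get("samples", "0"))


    unit: AcquisitionUnit = CamacServerUnit()
    unit.configure({"samples": "4"})
    unit.arm(shot_number=12345)
    print(unit.readout())
    ```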

  12. Evolving Procurement Organizations

    DEFF Research Database (Denmark)

    Bals, Lydia; Laiho, Aki; Laine, Jari

    Procurement has to find further levers and advance its contribution to corporate goals continuously. This places pressure on its organization in order to facilitate its performance. Therefore, Procurement organizations constantly have to evolve in order to match these demands. A conceptual model...... is presented and the results of a first case study are discussed. The findings highlight the importance of taking a contingency perspective on Procurement organization, understanding the internal and external contingency factors. From a theoretical perspective, it opens up insights that can be furthermore leveraged...... in future studies in the fields of hybrid procurement organizations, global sourcing organizations as well as international procurement offices (IPOs). From a practical standpoint, an assessment of external and internal contingencies provides the opportunity to consciously match organization to its...

  13. Diffusion between evolving interfaces

    International Nuclear Information System (INIS)

    Juntunen, Janne; Merikoski, Juha

    2010-01-01

    Diffusion in an evolving environment is studied by continuous-time Monte Carlo simulations. Diffusion is modeled by continuous-time random walkers on a lattice, in a dynamic environment provided by bubbles between two one-dimensional interfaces driven symmetrically towards each other. For one-dimensional random walkers constrained by the interfaces, the bubble size distribution dominates diffusion. For two-dimensional random walkers, it is also controlled by the topography and dynamics of the interfaces. The results of the one-dimensional case are recovered in the limit where the interfaces are strongly driven. Even with simple hard-core repulsion between the interfaces and the particles, diffusion is found to depend strongly on the details of the dynamical rules of particles close to the interfaces.

  14. Re-Engineering Alzheimer Clinical Trials: Global Alzheimer's Platform Network.

    Science.gov (United States)

    Cummings, J; Aisen, P; Barton, R; Bork, J; Doody, R; Dwyer, J; Egan, J C; Feldman, H; Lappin, D; Truyen, L; Salloway, S; Sperling, R; Vradenburg, G

    2016-06-01

    Alzheimer's disease (AD) drug development is costly, time-consuming, and inefficient. Trial site functions, trial design, and patient recruitment for trials all require improvement. The Global Alzheimer Platform (GAP) was initiated in response to these challenges. Four GAP work streams evolved in the US to address different trial challenges: 1) registry-to-cohort web-based recruitment; 2) clinical trial site activation and site network construction (GAP-NET); 3) adaptive proof-of-concept clinical trial design; and 4) finance and fund raising. GAP-NET proposes to establish a standardized network of continuously funded trial sites that are highly qualified to perform trials (with established clinical, biomarker, imaging capability; certified raters; sophisticated management system). GAP-NET will conduct trials for academic and biopharma industry partners using standardized instrument versions and administration. Collaboration with the Innovative Medicines Initiative (IMI) European Prevention of Alzheimer's Disease (EPAD) program, the Canadian Consortium on Neurodegeneration in Aging (CCNA) and other similar international initiatives will allow the conduct of global trials. GAP-NET aims to increase trial efficiency and quality, decrease trial redundancy, accelerate cohort development and trial recruitment, and decrease trial costs. The value proposition for sites includes stable funding and uniform training and trial execution; the value to trial sponsors is decreased trial costs, reduced time to execute trials, and enhanced data quality. The value for patients and society is the more rapid availability of new treatments for AD.

  15. Evolving PSTN to NGN

    Science.gov (United States)

    Wu, Liang T.

    2004-04-01

    The concept of Next Generation Network (NGN) was conceived around 1998 as an integrated solution to combine the quality and features of the PSTN with the low cost and routing flexibility of the Internet to provide a single infrastructure for the future public network. This carrier grade Internet solution calls for the creation of a consolidated, packet transport and switching infrastructure and the development of a flexible, open, software switch (softswitch) to handle voice telephony as well as multimedia services. Almost all the telecom equipment manufacturers as well as some Internet equipment vendors immediately subscribed to this vision and joined the race to create convergent products for the NGN market.

  16. Your Office software is evolving – use its full potential!

    CERN Multimedia

    Michal Kwiatek (IT-OIS) for the IT Department

    2011-01-01

    Microsoft Office 2010 has been available at CERN since May 2011. It is the default version installed on new NICE computers. The IT Department is now planning to migrate the remaining NICE Windows 7 computers running Office 2007 to this version, so that it becomes the only version of Microsoft Office on NICE Windows 7 and all users can benefit from the improvements that it brings. NICE Windows 7 computers in the IT Department will be migrated on 12 January and the migration in the other departments will begin on 21 February.  You can migrate earlier at your convenience according to the “Next steps” below. Windows XP users are not affected by this change. Until Windows XP is decommissioned from office use at the end of 2012, Microsoft Office 2007 will remain the only supported version of Microsoft Office on NICE Windows XP. Revolutionary benefits of the evolution Office 2010 is very similar to its predecessor, Office 2007. In particular, the file formats remain the same and th...

  17. Architecture-driven Migration of Legacy Systems to Cloud-enabled Software

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Babar, Muhammad Ali

    2014-01-01

    of legacy systems to cloud computing. The framework leverages the software reengineering concepts that aim to recover the architecture from legacy source code. Then the framework exploits the software evolution concepts to support architecture-driven migration of legacy systems to cloud-based architectures....... The Legacy-to-Cloud Migration Horseshoe comprises four processes: (i) architecture migration planning, (ii) architecture recovery and consistency, (iii) architecture transformation and (iv) architecture-based development of cloud-enabled software. We aim to discover, document and apply the migration...

  18. Why did heterospory evolve?

    Science.gov (United States)

    Petersen, Kurt B; Burd, Martin

    2017-08-01

    The primitive land plant life cycle featured the production of spores of unimodal size, a condition called homospory. The evolution of bimodal size distributions with small male spores and large female spores, known as heterospory, was an innovation that occurred repeatedly in the history of land plants. The importance of desiccation-resistant spores for colonization of the land is well known, but the adaptive value of heterospory has never been well established. It was an addition to a sexual life cycle that already involved male and female gametes. Its role as a precursor to the evolution of seeds has received much attention, but this is an evolutionary consequence of heterospory that cannot explain the transition from homospory to heterospory (and the lack of evolutionary reversal from heterospory to homospory). Enforced outcrossing of gametophytes has often been mentioned in connection to heterospory, but we review the shortcomings of this argument as an explanation of the selective advantage of heterospory. Few alternative arguments concerning the selective forces favouring heterospory have been proposed, a paucity of attention that is surprising given the importance of this innovation in land plant evolution. In this review we highlight two ideas that may lead us to a better understanding of why heterospory evolved. First, models of optimal resource allocation - an approach that has been used for decades in evolutionary ecology to help understand parental investment and other life-history patterns - suggest that an evolutionary increase in spore size could reach a threshold at which small spores yielding small, sperm-producing gametophytes would return greater fitness per unit of resource investment than would large spores and bisexual gametophytes. With the advent of such microspores, megaspores would evolve under frequency-dependent selection. This argument can account for the appearance of heterospory in the Devonian, when increasingly tall and complex

  19. Evolving a photosynthetic organelle

    Directory of Open Access Journals (Sweden)

    Nakayama Takuro

    2012-04-01

    Full Text Available Abstract The evolution of plastids from cyanobacteria is believed to represent a singularity in the history of life. The enigmatic amoeba Paulinella and its 'recently' acquired photosynthetic inclusions provide a fascinating system through which to gain fresh insight into how endosymbionts become organelles. The plastids, or chloroplasts, of algae and plants evolved from cyanobacteria by endosymbiosis. This landmark event conferred on eukaryotes the benefits of photosynthesis - the conversion of solar energy into chemical energy - and in so doing had a huge impact on the course of evolution and the climate of Earth [1]. From the present state of plastids, however, it is difficult to trace the evolutionary steps involved in this momentous development, because all modern-day plastids have fully integrated into their hosts. Paulinella chromatophora is a unicellular eukaryote that bears photosynthetic entities called chromatophores that are derived from cyanobacteria and has thus received much attention as a possible example of an organism in the early stages of organellogenesis. Recent studies have unlocked the genomic secrets of its chromatophore [2,3] and provided concrete evidence that the Paulinella chromatophore is a bona fide photosynthetic organelle [4]. The question is how Paulinella can help us to understand the process by which an endosymbiont is converted into an organelle.

  20. Fat: an evolving issue

    Directory of Open Access Journals (Sweden)

    John R. Speakman

    2012-09-01

    Work on obesity is evolving, and obesity is a consequence of our evolutionary history. In the space of 50 years, we have become an obese species. The reasons why can be addressed at a number of different levels. These include separating between whether the primary cause lies on the food intake or energy expenditure side of the energy balance equation, and determining how genetic and environmental effects contribute to weight variation between individuals. Opinion on whether increased food intake or decreased energy expenditure drives the obesity epidemic is still divided, but recent evidence favours the idea that food intake, rather than altered expenditure, is most important. There is more of a consensus that genetics explains most (probably around 65%) of weight variation between individuals. Recent advances in genome-wide association studies have identified many polymorphisms that are linked to obesity, yet much of the genetic variance remains unexplained. Finding the causes of this unexplained variation will be an impetus of genetic and epigenetic research on obesity over the next decade. Many environmental factors – including gut microbiota, stress and endocrine disruptors – have been linked to the risk of developing obesity. A better understanding of gene-by-environment interactions will also be key to understanding obesity in the years to come.

  1. Evolving a photosynthetic organelle.

    Science.gov (United States)

    Nakayama, Takuro; Archibald, John M

    2012-04-24

    The evolution of plastids from cyanobacteria is believed to represent a singularity in the history of life. The enigmatic amoeba Paulinella and its 'recently' acquired photosynthetic inclusions provide a fascinating system through which to gain fresh insight into how endosymbionts become organelles. The plastids, or chloroplasts, of algae and plants evolved from cyanobacteria by endosymbiosis. This landmark event conferred on eukaryotes the benefits of photosynthesis--the conversion of solar energy into chemical energy--and in so doing had a huge impact on the course of evolution and the climate of Earth [1]. From the present state of plastids, however, it is difficult to trace the evolutionary steps involved in this momentous development, because all modern-day plastids have fully integrated into their hosts. Paulinella chromatophora is a unicellular eukaryote that bears photosynthetic entities called chromatophores that are derived from cyanobacteria and has thus received much attention as a possible example of an organism in the early stages of organellogenesis. Recent studies have unlocked the genomic secrets of its chromatophore [2,3] and provided concrete evidence that the Paulinella chromatophore is a bona fide photosynthetic organelle [4]. The question is how Paulinella can help us to understand the process by which an endosymbiont is converted into an organelle.

  2. Communicability across evolving networks.

    Science.gov (United States)

    Grindrod, Peter; Parsons, Mark C; Higham, Desmond J; Estrada, Ernesto

    2011-04-01

    Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about "who phoned who" or "who came into contact with who" arise naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time's arrow is captured naturally through the noncommutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
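
    The abstract above describes computing communicability for a time-ordered sequence of networks via basic linear algebra. The short sketch below shows one realisation of that idea, assuming the usual resolvent-product construction (multiply (I - aA_k)^-1 over the snapshots in time order, then read broadcast and receive scores from row and column sums); the A-B / B-C example and the parameter a = 0.1 are illustrative choices, not data from the paper.

    ```python
    # Sketch of the matrix computation described above: the product of resolvents
    # over time-ordered snapshots counts dynamic walks; row sums rank nodes as
    # broadcasters, column sums as receivers. All data below is synthetic.
    import numpy as np


    def dynamic_communicability(adjacency_sequence, a=0.1):
        """a must be smaller than 1 / max spectral radius of the snapshots."""
        n = adjacency_sequence[0].shape[0]
        Q = np.eye(n)
        for A in adjacency_sequence:  # time order matters: A-B then B-C lets A reach C
            Q = Q @ np.linalg.inv(np.eye(n) - a * A)
        return Q


    # Morning: A-B interact; afternoon: B-C interact (nodes 0=A, 1=B, 2=C).
    A_morning = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
    A_afternoon = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]], dtype=float)

    Q = dynamic_communicability([A_morning, A_afternoon])
    print("A -> C weight:", Q[0, 2])   # nonzero: information can flow A -> B -> C
    print("C -> A weight:", Q[2, 0])   # zero: time's arrow blocks the reverse route
    broadcast = Q.sum(axis=1)          # each node's ability to send
    receive = Q.sum(axis=0)            # each node's ability to receive
    print("broadcast:", broadcast, "receive:", receive)
    ```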

  3. Evolving Concepts of Asthma

    Science.gov (United States)

    Ray, Anuradha; Wenzel, Sally E.

    2015-01-01

    Our understanding of asthma has evolved over time from a singular disease to a complex of various phenotypes, with varied natural histories, physiologies, and responses to treatment. Early therapies treated most patients with asthma similarly, with bronchodilators and corticosteroids, but these therapies had varying degrees of success. Similarly, despite initial studies that identified an underlying type 2 inflammation in the airways of patients with asthma, biologic therapies targeted toward these type 2 pathways were unsuccessful in all patients. These observations led to increased interest in phenotyping asthma. Clinical approaches, both biased and later unbiased/statistical approaches to large asthma patient cohorts, identified a variety of patient characteristics, but they also consistently identified the importance of age of onset of disease and the presence of eosinophils in determining clinically relevant phenotypes. These paralleled molecular approaches to phenotyping that developed an understanding that not all patients share a type 2 inflammatory pattern. Using biomarkers to select patients with type 2 inflammation, repeated trials of biologics directed toward type 2 cytokine pathways saw newfound success, confirming the importance of phenotyping in asthma. Further research is needed to clarify additional clinical and molecular phenotypes, validate predictive biomarkers, and identify new areas for possible interventions. PMID:26161792

  4. UKAEA'S evolving contract philosophy

    International Nuclear Information System (INIS)

    Nicol, R. D.

    2003-01-01

    The United Kingdom Atomic Energy Authority (UKAEA) has gone through fundamental change over the last ten years. At the heart of this change has been UKAEA's relationship with the contracting and supply market. This paper describes the way in which UKAEA actively developed the market to support the decommissioning programme, and how the approach to contracting has evolved as external pressures and demands have changed. UKAEA's pro-active approach to industry has greatly assisted the development of a healthy, competitive market for services supporting decommissioning in the UK. There have been difficult changes and many challenges along the way, and some retrenchment was necessary to meet regulatory requirements. Nevertheless, UKAEA has sustained a high level of competition - now measured in terms of competed spend as a proportion of competable spend - with annual out-turns consistently over 80%. The prime responsibility for market development will pass to the new Nuclear Decommissioning Authority (NDA) in 2005, as the owner, on behalf of the Government, of the UK's civil nuclear liabilities. The preparatory work for the NDA indicates that the principles established by UKAEA will be carried forward. (author)

  5. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  6. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  7. Applying Business Process Re-Engineering to Public Sector as A New Public Management Strategy

    Directory of Open Access Journals (Sweden)

    Ropinder Oberoi

    2013-08-01

    Full Text Available The introduction of Business Process Reengineering (BPR) to the public sector follows the much broader trend of New Public Management. BPR in the public sector mostly means amalgamation of business processes, computerization of various activities and removal of some unnecessary ones. BPR assimilates a radical premeditated scheme of business process reengineering and an additional progressive technique of uninterrupted process improvement with adequate information technology (IT) and e-business infrastructure strategies. Public organizations have specific and exclusive features that differentiate them from private sector organizations. Based on the literature review and examination of study findings, it is argued that a public sector organization can employ BPR to improve its processes and overall organizational performance, if it (1) has accrued a collection of BPR-relevant resources and capabilities; (2) has embarked on BPR with adequate depth and breadth; (3) is developing a post-BPR complementary set of skills, systems and technologies, which are essential to further develop the organizational impact of the BPR; and (4) has successfully mitigated the effects of BPR implementation problems. In addition to its effect on administration and service delivery processes through reduction of the processing time, work steps and cost of government processes, BPR also contributes to enhancing citizen/customer and employee satisfaction, increasing organizational transparency and responsiveness, which have also become essential objectives of New Public Management. Therefore, public sector BPR is emerging as indispensable to the performance of organizations in developing economies. The essential questions addressed in this paper are: What are the scenarios and impending problems of reengineering applications in the public sector? Can it be functional for the public sector in attending to frequent problems blockading bureaucracies of developed and

  8. Reengineering the picture archiving and communication system (PACS) process for digital imaging networks PACS.

    Science.gov (United States)

    Horton, M C; Lewis, T E; Kinsey, T V

    1999-05-01

    Prior to June 1997, military picture archiving and communications systems (PACS) were planned, procured, and installed with key decisions on the system, equipment, and even funding sources made through a research and development office called Medical Diagnostic Imaging Systems (MDIS). Beginning in June 1997, the Joint Imaging Technology Project Office (JITPO) initiated a collaborative and consultative process for planning and implementing PACS into military treatment facilities through a new Department of Defense (DoD) contract vehicle called digital imaging networks (DIN)-PACS. The JITPO reengineered this process incorporating multiple organizations and politics. The reengineered PACS process administered through the JITPO transformed the decision process and accountability from a single office to a consultative method that increased end-user knowledge, responsibility, and ownership in PACS. The JITPO continues to provide information and services that assist multiple groups and users in rendering PACS planning and implementation decisions. Local site project managers are involved from the outset and this end-user collaboration has made the sometimes difficult transition to PACS an easier and more acceptable process for all involved. Corporately, this process saved DoD sites millions by having PACS plans developed within the government and proposed to vendors second, and then having vendors respond specifically to those plans. The integrity and efficiency of the process have reduced the opportunity for implementing nonstandard systems while sharing resources and reducing wasted government dollars. This presentation will describe the chronology of changes, encountered obstacles, and lessons learned within the reengineering of the PACS process for DIN-PACS.

  9. High-tech organizations: What can they tell us about reengineering (grow and reproduce, or die)

    Energy Technology Data Exchange (ETDEWEB)

    Norton, F.J.

    1996-06-10

    Change is the norm of the 1990s, and it will continue to be a major factor in running a company and/or organization as the coming decades unfold. The former cycle of change followed by stability is gone; change as a continuous reality is the new cycle. The necessity to be customer-driven implies a fundamental transformation of the way organizations and their managers choose to do business. Much has been learned about the way people interact with information systems/engineering information (IS/EI) systems technologies. The cultures of the Department of Energy's (DOE) National Laboratories are built on a research and development (R and D) mentality that greatly increases the difficulty of building an effective IS/EI systems cross-functional group for various organizations. Classical planning approaches ignore cultural and organizational factors. These factors, however, are crucial in devising meaningful and relevant plans. Also, as more and more organizations strive to become competitive, the philosophy and concepts of total quality management (TQM) are receiving increased attention. This paper: discusses the possibility of applying manufacturing reengineering techniques to other industries to help them overcome the risk of failure; provides a comprehensive look at the changes that have occurred in the business environment since the advent of reengineering; discusses why reengineering is so important and how people and executives of organizations can play even more pivotal roles as long-term strategists in their organizations; introduces the concept of the core mission to planning; provides business process redesign that takes into consideration the interaction of humans and technology.

  10. IDC Re-Engineering Phase 2 System Specification Document Version 1.5

    Energy Technology Data Exchange (ETDEWEB)

    Satpathi, Meara Allena [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, John F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but does include requirements for the dissemination of radionuclide data and products.

  11. Rural district hospitals - essential cogs in the district health system - and primary healthcare re-engineering.

    Science.gov (United States)

    le Roux, K W D P; Couper, I

    2015-06-01

    The re-engineering of primary healthcare (PHC) is regarded as an essential precursor to the implementation of National Health Insurance in South Africa, but improvements in the provision of PHC services have been patchy. The authors contend that the role of well-functioning rural district hospitals as a hub from which PHC services can be most efficiently managed has been underestimated, and that the management of district hospitals and PHC clinics needs to be co-located at the level of the rural district hospital, to allow for proper integration of care and effective healthcare provision.

  12. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  13. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has since evolved (script kiddies, hackers, advanced persistent threats (APT), nation states, etc.), and the attack surface has expanded as networks have become interconnected. Security posture factors include the network layer (routers, firewalls, etc.), computer network defense (IPS/IDS, sensors, continuous monitoring, etc.), industrial control systems (ICS), and software security (COTS, FOSS, custom code, etc.).

  14. Chemical Reactive Anchoring Lipids with Different Performance for Cell Surface Re-engineering Application.

    Science.gov (United States)

    Vabbilisetty, Pratima; Boron, Mallorie; Nie, Huan; Ozhegov, Evgeny; Sun, Xue-Long

    2018-02-28

    Introduction of selectively chemical reactive groups at the cell surface enables site-specific cell surface labeling and modification opportunity, thus facilitating the capability to study the cell surface molecular structure and function and the molecular mechanism it underlies. Further, it offers the opportunity to change or improve a cell's functionality for interest of choice. In this study, two chemical reactive anchor lipids, phosphatidylethanolamine-poly(ethylene glycol)-dibenzocyclooctyne (DSPE-PEG 2000 -DBCO) and cholesterol-PEG-dibenzocyclooctyne (CHOL-PEG 2000 -DBCO) were synthesized and their potential application for cell surface re-engineering via lipid fusion were assessed with RAW 264.7 cells as a model cell. Briefly, RAW 264.7 cells were incubated with anchor lipids under various concentrations and at different incubation times. The successful incorporation of the chemical reactive anchor lipids was confirmed by biotinylation via copper-free click chemistry, followed by streptavidin-fluorescein isothiocyanate binding. In comparison, the cholesterol-based anchor lipid afforded a higher cell membrane incorporation efficiency with less internalization than the phospholipid-based anchor lipid. Low cytotoxicity of both anchor lipids upon incorporation into the RAW 264.7 cells was observed. Further, the cell membrane residence time of the cholesterol-based anchor lipid was evaluated with confocal microscopy. This study suggests the potential cell surface re-engineering applications of the chemical reactive anchor lipids.

  15. An IoT Knowledge Reengineering Framework for Semantic Knowledge Analytics for BI-Services

    Directory of Open Access Journals (Sweden)

    Nilamadhab Mishra

    2015-01-01

    Full Text Available In a progressive business intelligence (BI) environment, IoT knowledge analytics are becoming an increasingly challenging problem because of rapid changes of knowledge context scenarios along with increasing data production scales with business requirements that ultimately transform a working knowledge base into a superseded state. Such a superseded knowledge base lacks adequate knowledge context scenarios, and the semantics, rules, frames, and ontology contents may not meet the latest requirements of contemporary BI-services. Thus, reengineering a superseded knowledge base into a renovated knowledge base system can yield greater business value and is more cost effective and feasible than standardising a new system for the same purpose. Therefore, in this work, we propose an IoT knowledge reengineering (IKR) framework for implementation in a neurofuzzy system to build, organise, and reuse knowledge to provide BI-services to the things (man, machines, places, and processes) involved in business through the network of IoT objects. The analysis and discussion show that the IKR framework can be well suited to creating improved anticipation in IoT-driven BI-applications.

  16. Chemical Reactive Anchoring Lipids with Different Performance for Cell Surface Re-engineering Application

    Science.gov (United States)

    2018-01-01

    Introduction of selectively chemical reactive groups at the cell surface enables site-specific cell surface labeling and modification opportunity, thus facilitating the capability to study the cell surface molecular structure and function and the molecular mechanism it underlies. Further, it offers the opportunity to change or improve a cell’s functionality for interest of choice. In this study, two chemical reactive anchor lipids, phosphatidylethanolamine–poly(ethylene glycol)–dibenzocyclooctyne (DSPE–PEG2000–DBCO) and cholesterol–PEG–dibenzocyclooctyne (CHOL–PEG2000–DBCO) were synthesized and their potential application for cell surface re-engineering via lipid fusion were assessed with RAW 264.7 cells as a model cell. Briefly, RAW 264.7 cells were incubated with anchor lipids under various concentrations and at different incubation times. The successful incorporation of the chemical reactive anchor lipids was confirmed by biotinylation via copper-free click chemistry, followed by streptavidin-fluorescein isothiocyanate binding. In comparison, the cholesterol-based anchor lipid afforded a higher cell membrane incorporation efficiency with less internalization than the phospholipid-based anchor lipid. Low cytotoxicity of both anchor lipids upon incorporation into the RAW 264.7 cells was observed. Further, the cell membrane residence time of the cholesterol-based anchor lipid was evaluated with confocal microscopy. This study suggests the potential cell surface re-engineering applications of the chemical reactive anchor lipids. PMID:29503972

  17. Scope and prospects of re-engineering and retrofitting wind farms in India

    International Nuclear Information System (INIS)

    Rajsekhar, B.; Van Hulle, F.J.L.

    2001-09-01

    The paper starts with a brief analysis of the characteristics of the Indian wind energy programmes while enumerating the developments that have taken place so far. In view of the large scope for renewable energy based power generation and in order to boost the present upsurge in wind farm development, the authors investigate the possibilities that lie in the re-engineering of existing wind farms. Existing wind farm entrepreneurs are showing interest in improving the performance of their wind farms. New initiatives are suggested addressing the involved technical and commercial concerns of both the state-run utility (the principal customer of wind generated electricity) and the wind farm entrepreneurs, to spur development of economically competitive wind-power plants. In addition, inferences are drawn from a recently conducted detailed case study at a five-year-old large wind farm in the Muppandal area. The study involved conducting a detailed WAsP-based analysis using remote land use and land cover details interfaced with GIS. In addition, detailed site investigations were conducted to assess the health of the machines and the adequacy of the power evacuation facility together with the analysis of the machine down times. The paper highlights the benefits that can be expected from such undertakings for several parties both in India and in the EU. The paper finally outlines the possible business opportunities and economic benefits that exist for retrofitting and re-engineering in the country, which has over 700 individually designed wind farms. 2 refs

  18. USULAN PERBAIKAN PROSES BISNIS DENGAN KONSEP BUSINESS PROCESS REENGINEERING (STUDI KASUS : PERMATA GUEST HOUSE

    Directory of Open Access Journals (Sweden)

    Bhaswara Adhitya Wardhana

    2013-04-01

    Full Text Available Permata Guest House Semarang is a business operating in the lodging services sector. The rapid growth of Permata Guest House's business scale has not been matched by adequate structuring and management of its business processes, resulting in numerous complaints from stakeholders, namely customers and internal staff. The Key Performance Indicators (KPIs) used in this study of Permata Guest House are KPIs derived from Critical Success Factors (CSFs). Performance was measured against performance indicators defined during the initial observation, together with the measures and targets set for each indicator. The initial observation found gaps of 20% for the work morale and loyalty indicator, 16.67% for hospitality complaints, 16.67% for appearance and attitude complaints, 30% for the marketing sales occupancy rate, and 77.78% for service satisfaction. Analysis of the performance measurements and business processes identified the factors causing business process performance to fall short of its targets; one way to improve performance is to redesign the business processes using the Business Process Reengineering (BPR) method. Based on the BPR results, the business performance model needs to be rethought (rethinking), redesigned (redesign), and re-equipped (retool). The outcomes of the business process reengineering are a standardized proposal for the business processes, formulation of the company's vision and mission, design of the organizational structure and job descriptions, and preparation of Standard Operating Procedures. Keywords: critical success factor, key performance indicator, business process reengineering

  19. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading alongside current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to consider open source software. There are many notions about open source software, ranging from software that is free of charge to software that is unlicensed. These notions are not entirely correct, so the concept of open source software needs to be introduced, covering its history, licenses and how to choose a license, and the considerations in selecting among the available open source software. Keywords: license, open source, HAKI

  20. Disgust: Evolved function and structure

    NARCIS (Netherlands)

    Tybur, J.M.; Lieberman, D.; Kurzban, R.; DeScioli, P.

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and

  1. Natural selection promotes antigenic evolvability

    NARCIS (Netherlands)

    Graves, C.J.; Ros, V.I.D.; Stevenson, B.; Sniegowski, P.D.; Brisson, D.

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide

  2. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Abbreviations: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (Dataflow Graph)

  3. Re-engineering closing watersheds: The negotiated expansion of a dam-based irrigation system in Bolivia

    NARCIS (Netherlands)

    Rocha Lopez, R.F.; Vincent, L.F.; Rap, E.R.

    2015-01-01

    The expansion of the Totora Khocha dam-based irrigation system in the Pucara watershed is a case of planned re-engineering of a closing watershed. This article shows how, when irrigation systems expand in space and across boundaries to capture new water, they also involve new claims by existing and

  4. The Battle for the Soul of Management Denmark-The Shaping of the Danish Versions of Business Process Reengineering

    DEFF Research Database (Denmark)

    Koch, Christian; Vogelius, Peter

    1997-01-01

    Managerial theory distilled into tidy concepts is continually and almost ritually launched at international and national management audiences. The paper discusses the contemporary exemplar of such management concepts: Business Process Reengineering (BPR). By taking management fads seriously...... to the concept. And, on the other hand, the need for the consultants to differentiate their product (the concept) from others available on the market....

  5. Re-Engineering Vocational and Technical Education (VTE) for Sustainable Development in North Central Geo-Political Zone, Nigeria

    Science.gov (United States)

    Sofoluwe, Abayomi Olumade

    2013-01-01

    The purpose of the study is to re-engineer vocational and technical education for sustainable development in the North Central Geo-Political Zone in Nigeria. The research design adopted was a survey inferential type. Stratified random sampling was used to select 36 schools out of 98 schools, while 920 students out of 3,680 students were sampled. The data…

  6. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  7. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  8. Development of the re-engineered European decision support system for off-site nuclear and radiological emergencies - JRODOS. Application to air pollution transport modelling

    International Nuclear Information System (INIS)

    Ievdin, I.; Treebushny, D.; Raskob, W.; Zheleznyak, M.

    2008-01-01

    Full text: The European decision support system for nuclear and radiological emergencies RODOS includes a set of numerical models simulating the transport of radionuclides in the environment, estimating potential doses to the public, and simulating and evaluating the efficiency of countermeasures. The re-engineering of the RODOS system using Java technology has started recently; this will allow the new system, called JRODOS, to run on nearly any computational platform with a Java virtual machine. Modern software development approaches were used for the JRODOS system architecture and implementation: distributed system design (client, management server, computational server), geo-database utilization, a plug-in model structure and OpenMI-like compatibility to support seamless model inter-connection. Stable open source components such as an ORM solution (Hibernate), an OpenGIS component (Geotools) and a charting/reporting component (JFree, Pentaho) were utilized to optimize the development effort and allow a fast completion of the project. The architecture of the system is presented and illustrated for the atmospheric dispersion module ALSMC (Atmospheric Local Scale Model Chain) performing calculations of atmospheric pollution transport and the corresponding acute doses and dose rates. The example application is based on a synthetic scenario of a release from a nuclear power plant located in Europe. (author)

  9. A Practical Software Architecture for Virtual Universities

    Science.gov (United States)

    Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun

    2006-01-01

    This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…

  10. Spacetimes containing slowly evolving horizons

    International Nuclear Information System (INIS)

    Kavanagh, William; Booth, Ivan

    2006-01-01

    Slowly evolving horizons are trapping horizons that are "almost" isolated horizons. This paper reviews their definition and discusses several spacetimes containing such structures. These include certain Vaidya and Tolman-Bondi solutions as well as (perturbatively) tidally distorted black holes. Taking into account the mass scales and orders of magnitude that arise in these calculations, we conjecture that slowly evolving horizons are the norm rather than the exception in astrophysical processes that involve stellar-scale black holes.

  11. Natural selection promotes antigenic evolvability.

    Science.gov (United States)

    Graves, Christopher J; Ros, Vera I D; Stevenson, Brian; Sniegowski, Paul D; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish chronic infections.
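
    As a brief aside on the classical test mentioned in this record: the dN/dS ratio compares the rate of nonsynonymous substitution to the rate of synonymous substitution, and values above one are read as evidence of positive (diversifying) selection, which is what the authors report for the vls cassettes. A minimal statement of the standard interpretation:

```latex
% Standard dN/dS (omega) test for selection on protein-coding sequence
% d_N : nonsynonymous substitutions per nonsynonymous site
% d_S : synonymous substitutions per synonymous site
\omega = \frac{d_N}{d_S},\qquad
\begin{cases}
\omega > 1 & \text{positive (diversifying) selection}\\
\omega = 1 & \text{neutral evolution}\\
\omega < 1 & \text{purifying selection}
\end{cases}
```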

  12. Natural selection promotes antigenic evolvability.

    Directory of Open Access Journals (Sweden)

    Christopher J Graves

    Full Text Available The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish

  13. Post-Modern Software Development

    Science.gov (United States)

    Filman, Robert E.

    2005-01-01

    The history of software development includes elements of art, science, engineering, and fashion (though very little manufacturing). In all domains, old ideas give way or evolve to new ones: in the fine arts, the baroque gave way to rococo, romanticism, modernism, postmodernism, and so forth. What is the postmodern programming equivalent? That is, what comes after object orientation?

  14. Downsizing, reengineering, and restructuring: long-term implications for healthcare organizations.

    Science.gov (United States)

    Leatt, P; Baker, G R; Halverson, P K; Aird, C

    1997-01-01

    This article provides a framework for analyzing how downsizing and reengineering have affected healthcare organizations. These approaches are reviewed, and key tools that have been used, such as across-the-board cuts, reorganizing, and redesigning, are described. Examples are drawn from healthcare as well as other business sectors. The consequences of cost reduction strategies for an organization's performance in terms of costs, quality of services, and satisfaction of consumers and employees are explored. The case is made that an organization's context--that is, its culture, level of trust, and leadership--is an important factor that influences the effect of cost-cutting strategies. Characteristics of organizations where downsizing has a better chance of succeeding also are described.

  15. Business Process Reengineering Of Funding On Indonesia’s Islamic Banks

    Directory of Open Access Journals (Sweden)

    Aslam Mei Nur Widigdo

    2016-02-01

    Full Text Available This research attempts to analyze the value chain of Islamic banking business processes and to develop a business process model for depositors' funds in order to improve the performance of Islamic banks. Four models of Islamic banking operating in Indonesia are used as the objects of the study. This research applies a qualitative (exploratory) approach and utilizes primary data obtained from questionnaires and interviews. These data are then processed by value stream mapping and process activity mapping. This study shows that the waiting time for services is the sub-stage of the business process that does not add value and is categorized as pure waste based on VSM criteria. The reengineering of the business process of third-party fundraising may reduce collection time by up to 1,490 minutes for corporate customers and 22 minutes for individual customers. DOI: 10.15408/aiq.v8i1.2506

  16. Managing your practice's first impression: the process of front-desk reengineering.

    Science.gov (United States)

    Walsh, Alison L

    2004-01-01

    Patients must be regarded as consumers. As such, they are increasingly informed, questioning, cost-conscious, technologically savvy, and demanding. Just as health plans have developed defined contribution products that offer consumers more control over how and where their health-care dollars are spent, practice success is linked to reengineering office operations to offer consumers and patients greater choice, control, autonomy, and service. Patients and consumers want practices that deliver clinical and business services that meet the criteria of reliability, efficiency, service offerings, patient focus, enthusiasm, customization, and trust. Physician practices must also take care to avoid destructive and disruptive behaviors and conditions such as noise, interference, excessive repetition, long waits, appointment delays, and staff rudeness. A successful patient-focused practice emerges when physicians and office staff begin to look at the clinical and service experience through the patient's eyes.

  17. Re-Engineering Biosafety Regulations In India: Towards a Critique of Policy, Law and Prescriptions

    Directory of Open Access Journals (Sweden)

    A. Damodaran

    2005-06-01

    Full Text Available This article surveys the structure and essence of India's biosafety regulations from an evolutionary perspective. After detailing the processes associated with the biosafety law and guidelines in the country, this article looks critically at recent efforts to re-engineer the regulations. It is argued that India's biosafety regulations should move towards a more inclusive approach, which will facilitate transparent and informed decision-making based on stakeholder convergence. It is also suggested that the entire spectrum of laws and regulations that have a direct or indirect bearing on biosafety in India needs to be explored so that greater coherence can be secured in the management of biotechnology products that are sensitive to the environment. Drawing from the experience of the Bt cotton case, the article advocates a greater role for civil society and grassroots organizations.

  18. Implementation of 5S tools as a starting point in business process reengineering

    Directory of Open Access Journals (Sweden)

    Vorkapić Miloš (ORCID: 0000-0002-3463-8665)

    2017-01-01

    Full Text Available The paper analyses the elements that represent a starting point in the implementation of business process reengineering. We have used Lean tools, analysed through the 5S model, in our research. On the example of the finalization of a finished transmitter in IHMT-CMT production, 5S tools were implemented with a focus on quality elements, although the theory shows that BPR and TQM are two opposite activities in an enterprise. We wanted to highlight the significance of employees' self-discipline, which helps the process of product finalization to proceed on time and without waste and losses. In addition, the employees keep their workplace clean, tidy and functional.

  19. Beyond the computer-based patient record: re-engineering with a vision.

    Science.gov (United States)

    Genn, B; Geukers, L

    1995-01-01

    In order to achieve real benefit from the potential offered by a Computer-Based Patient Record, the capabilities of the technology must be applied along with true re-engineering of healthcare delivery processes. University Hospital recognizes this and is using systems implementation projects as the catalyst for transforming the way we care for our patients. Integration is fundamental to the success of these initiatives, and this must be explicitly planned against an organized systems architecture whose standards are market-driven. University Hospital also recognizes that Community Health Information Networks will offer improved quality of patient care at a reduced overall cost to the system. All of these implementation factors are considered up front as the hospital makes its initial decisions on how to computerize its patient records. This improves our chances for success and will provide a consistent vision to guide the hospital's development of new and better patient care.

  20. Reengineering of the business process in the Serbian post's department for express parcel service

    Directory of Open Access Journals (Sweden)

    Lazarević Dragan M.

    2015-01-01

    Full Text Available This paper describes a model that addresses the problem of exceeded time limits in the express parcel shipping system of the Post of Serbia. The existing principle of organizing the service area is explained, as well as the problem of exceeded time limits, which leads to delays in service to the user. Two approaches to solving the problem are suggested. Through these two approaches, the existing business processes are reengineered to some extent, and the result is presented in BPMN notation. The first approach is based on the use of fuzzy set theory, i.e. fuzzy logic systems, while the other is based on the use of a 'zoning-routing' algorithm.

  1. Construction of RNA nanocages by re-engineering the packaging RNA of Phi29 bacteriophage

    Science.gov (United States)

    Hao, Chenhui; Li, Xiang; Tian, Cheng; Jiang, Wen; Wang, Guansong; Mao, Chengde

    2014-05-01

    RNA nanotechnology promises the rational design of RNA nanostructures with a wide array of structural diversities and functionalities. Such nanostructures could be used in applications such as small interfering RNA delivery and the organization of in vivo chemical reactions. Despite impressive development in recent years, RNA nanotechnology is still quite limited, and its programmability and complexity cannot yet rival those of its closely related cousin, DNA nanotechnology. Novel strategies are needed for programmed RNA self-assembly. Here, we have assembled RNA nanocages by re-engineering a natural, biological RNA motif: the packaging RNA of the phi29 bacteriophage. The resulting RNA nanostructures have been thoroughly characterized by gel electrophoresis, cryogenic electron microscopy imaging and dynamic light scattering.

  2. Organizational learning as a test-bed for business process reengineering

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Leinsdorff, Torben

    1998-01-01

    The fact that a company's learning ability may prevent strategic drift and the fact that many companies are undertaking BPR (business process reengineering) projects leads us to ask whether all these BPR activities promote organizational learning. Within this framework, we studied the extent to which BPR promotes organizational learning by focusing on the project group and the steering committee. This paper is based partly on a theoretical study of the significant characteristics of BPR and of organizational learning and partly on a field study carried out in cooperation with the business unit of Enzyme Business, Novo Nordisk A/S. The result of the analysis is that a correlation between BPR and organizational learning has been established, i.e. the BPR elements: customer focus, process orientation, high level of ambition, clean sheet principle, performance measuring, the business system diamond......

  3. Re-engineering the process of medical imaging physics and technology education and training.

    Science.gov (United States)

    Sprawls, Perry

    2005-09-01

    The extensive availability of digital technology provides an opportunity for enhancing both the effectiveness and efficiency of virtually all functions in the process of medical imaging physics and technology education and training. This includes degree granting academic programs within institutions and a wide spectrum of continuing education lifelong learning activities. Full achievement of the advantages of technology-enhanced education (e-learning, etc.) requires an analysis of specific educational activities with respect to desired outcomes and learning objectives. This is followed by the development of strategies and resources that are based on established educational principles. The impact of contemporary technology comes from its ability to place learners into enriched learning environments. The full advantage of a re-engineered and implemented educational process involves changing attitudes and functions of learning facilitators (teachers) and resource allocation and sharing both within and among institutions.

  4. Re-engineering the mission life cycle with ABC and IDEF

    Science.gov (United States)

    Mandl, Daniel; Rackley, Michael; Karlin, Jay

    1994-01-01

    The theory behind re-engineering a business process is to remove the non-value added activities thereby lowering the process cost. In order to achieve this, one must be able to identify where the non-value added elements are located which is not a trivial task. This is because the non-value added elements are often hidden in the form of overhead and/or pooled resources. In order to be able to isolate these non-value added processes from among the other processes, one must first decompose the overall top level process into lower layers of sub-processes. In addition, costing data must be assigned to each sub-process along with the value the sub-process adds towards the final product. IDEF0 is a Federal Information Processing Standard (FIPS) process-modeling tool that allows for this functional decomposition through structured analysis. In addition, it illustrates the relationship of the process and the value added to the product or service. The value added portion is further defined in IDEF1X which is an entity relationship diagramming tool. The entity relationship model is the blueprint of the product as it moves along the 'assembly line' and therefore relates all of the parts to each other and the final product. It also relates the parts to the tools that produce the product and all of the paper work that is used in their acquisition. The use of IDEF therefore facilitates the use of Activity Based Costing (ABC). ABC is an essential method in a high variety, product-customizing environment, to facilitate rapid response to externally caused change. This paper describes the work being done in the Mission Operations Division to re-engineer the development and operation life cycle of Mission Operations Centers using these tools.
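
    To make the costing step above concrete, the sketch below rolls activity-based costs up a functionally decomposed process tree and separates out the non-value-added portion; the activities, hours and rates are hypothetical illustrations, not figures from the paper.

```python
# Minimal sketch of Activity Based Costing over a functionally
# decomposed process tree (names and figures are hypothetical).
from dataclasses import dataclass, field
from typing import List


@dataclass
class Activity:
    name: str
    hours: float = 0.0          # resource usage driver
    rate: float = 0.0           # cost per hour of the driver
    value_added: bool = True    # does this step add value to the product?
    children: List["Activity"] = field(default_factory=list)

    def cost(self) -> float:
        """Total cost = own driver cost plus cost of all sub-activities."""
        return self.hours * self.rate + sum(c.cost() for c in self.children)

    def non_value_added_cost(self) -> float:
        """Cost hidden in steps that add no value (re-engineering targets)."""
        own = 0.0 if self.value_added else self.hours * self.rate
        return own + sum(c.non_value_added_cost() for c in self.children)


# Hypothetical decomposition of a mission-operations development process.
process = Activity("Develop Operations Center", children=[
    Activity("Requirements analysis", hours=120, rate=90),
    Activity("Status reporting overhead", hours=40, rate=90, value_added=False),
    Activity("Integration and test", hours=200, rate=110, children=[
        Activity("Rework due to late changes", hours=60, rate=110,
                 value_added=False),
    ]),
])

print(f"Total cost:           ${process.cost():,.0f}")
print(f"Non-value-added cost: ${process.non_value_added_cost():,.0f}")
```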

  5. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits.

    Science.gov (United States)

    Milano, Nicola; Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operation faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall our results indicate that the need to cope with operation faults leads to the selection of circuits that have a greater probability to generate better circuits as a result of genetic variation with respect to a control condition in which circuits are not subjected to faults.
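
    A toy illustration of the experimental contrast described in this abstract is sketched below: a simple evolutionary loop evaluates bit-string "circuits" either with or without randomly injected faults, then compares the champions under fault-free conditions. The representation and parameters are illustrative only, not the authors' actual circuit model.

```python
# Toy sketch of evolving bit-string "circuits" with and without injected
# operational faults (the representation and fitness are illustrative only).
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]


def fitness(circuit, fault_rate=0.0, trials=10):
    """Mean match with TARGET; each trial randomly flips bits to mimic faults."""
    total = 0
    for _ in range(trials):
        faulty = [b ^ (random.random() < fault_rate) for b in circuit]
        total += sum(int(a == b) for a, b in zip(faulty, TARGET))
    return total / trials


def evolve(fault_rate, generations=200, pop_size=30, mut_rate=0.05):
    """Truncation selection plus bit-flip mutation; faults only affect evaluation."""
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, fault_rate), reverse=True)
        parents = pop[: pop_size // 2]
        offspring = [[b ^ (random.random() < mut_rate) for b in p] for p in parents]
        pop = parents + offspring
    return max(pop, key=lambda c: fitness(c, 0.0))


# Compare champions evolved without faults vs. under a 10% fault rate.
for rate in (0.0, 0.1):
    best = evolve(fault_rate=rate)
    print(f"fault_rate={rate}: fault-free fitness = {fitness(best, 0.0):.1f}")
```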

  6. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  7. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  8. Virtual Immunology: Software for Teaching Basic Immunology

    Science.gov (United States)

    Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio

    2013-01-01

    As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available…

  9. Graph Based Verification of Software Evolution Requirements

    NARCIS (Netherlands)

    Ciraci, S.

    2009-01-01

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of the current software systems make it time consuming to incorporate changes. During our collaboration with the industry, we observed that the developers spend much time on the

  10. A neighbourhood evolving network model

    International Nuclear Information System (INIS)

    Cao, Y.J.; Wang, G.Z.; Jiang, Q.Y.; Han, Z.X.

    2006-01-01

    Many social, technological, biological and economic systems are best described by evolving network models. In this short Letter, we propose and study a new evolving network model. The model is based on the new concept of neighbourhood connectivity, which exists in many physical complex networks. The statistical properties and dynamics of the proposed model are analytically studied and compared with those of the Barabasi-Albert scale-free model. Numerical simulations indicate that this network model yields a transition between power-law and exponential scaling, while the Barabasi-Albert scale-free model is only one of its special (limiting) cases. In particular, this model can be used to enhance the evolving mechanism of complex networks in the real world, such as the development of some social networks.
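
    One plausible way to realise a neighbourhood-connectivity growth rule is sketched below: each new node attaches to a randomly chosen host and then, with some probability, to the host's neighbours. This is an illustration of the general idea only; the exact rule and parameters of the Letter may differ.

```python
# Sketch of a neighbourhood-based evolving network: each new node links to a
# randomly chosen host and, with probability p, to each of the host's
# neighbours (an illustrative rule, not the exact model of the Letter).
import random
from collections import defaultdict

def grow_network(n_nodes=1000, p_neighbour=0.5, seed=1):
    random.seed(seed)
    adj = defaultdict(set)
    adj[0].add(1); adj[1].add(0)              # seed edge
    for new in range(2, n_nodes):
        host = random.randrange(new)          # pick an existing node uniformly
        adj[new].add(host); adj[host].add(new)
        for nb in list(adj[host]):            # link to the host's neighbourhood
            if nb != new and random.random() < p_neighbour:
                adj[new].add(nb); adj[nb].add(new)
    return adj

adj = grow_network()
degrees = sorted((len(v) for v in adj.values()), reverse=True)
print("max degree:", degrees[0], " mean degree:", sum(degrees) / len(degrees))
```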

  11. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  12. Finding Security Patterns to Countermeasure Software Vulnerabilities

    OpenAIRE

    Borstad, Ole Gunnar

    2008-01-01

    Software security is an increasingly important part of software development as the risk from attackers is constantly evolving through increased exposure, threats and economic impact of security breaches. Emerging security literature describes expert knowledge such as secure development best practices. This knowledge is often not applied by software developers because they lack security awareness, security training and secure development methods and tools. Existing methods and tools require to...

  13. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  14. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  15. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  16. Software for medical image based phantom modelling

    International Nuclear Information System (INIS)

    Possani, R.G.; Massicano, F.; Coelho, T.S.; Yoriyaz, H.

    2011-01-01

    The latest treatment planning systems depend strongly on CT images, so the tendency is that dosimetry procedures in nuclear medicine therapy also be based on images, such as magnetic resonance imaging (MRI) or computed tomography (CT), to extract anatomical and histological information, as well as functional imaging or activity maps such as PET or SPECT. This information, combined with radiation transport simulation software, is used to estimate the internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, which is an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo MCNP code to perform the simulation of radiation transport. Therefore, the user does not need to understand the complex process of inputting data into MCNP, as SCMS is responsible for automatically constructing the anatomical data for the patient, as well as the radioactive source data. SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. Thus, the new software has a number of improvements, such as an intuitive GUI and a menu for the selection of the energy spectra corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)
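
    The abstract mentions a menu that selects a radioisotope's energy spectrum from an XML data bank. A minimal sketch of such a lookup is shown below; the XML layout and the helper function are hypothetical, since the actual SCMS file format is not described in the record (the I-131 and Lu-177 emission lines are standard published values used only as sample content).

```python
# Minimal sketch of selecting an isotope's energy spectrum from an XML bank.
# The XML layout below is hypothetical; it is not the real SCMS format.
import xml.etree.ElementTree as ET

SPECTRA_XML = """
<spectra>
  <isotope name="I-131">
    <line energy_keV="364.5" intensity="0.815"/>
    <line energy_keV="637.0" intensity="0.072"/>
  </isotope>
  <isotope name="Lu-177">
    <line energy_keV="208.4" intensity="0.104"/>
  </isotope>
</spectra>
"""

def spectrum_for(isotope_name, xml_text=SPECTRA_XML):
    """Return the (energy, intensity) pairs stored for one isotope."""
    root = ET.fromstring(xml_text)
    for iso in root.findall("isotope"):
        if iso.get("name") == isotope_name:
            return [(float(l.get("energy_keV")), float(l.get("intensity")))
                    for l in iso.findall("line")]
    raise KeyError(f"isotope {isotope_name!r} not in data bank")

print(spectrum_for("I-131"))
```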

  17. Evolving phenotypic networks in silico.

    Science.gov (United States)

    François, Paul

    2014-11-01

    Evolved gene networks are constrained by natural selection. Their structures and functions are consequently far from being random, as exemplified by the multiple instances of parallel/convergent evolution. One can thus ask if features of actual gene networks can be recovered from evolutionary first principles. I review a method for in silico evolution of small models of gene networks aiming at performing predefined biological functions. I summarize the current implementation of the algorithm, insisting on the construction of a proper "fitness" function. I illustrate the approach on three examples: biochemical adaptation, ligand discrimination and vertebrate segmentation (somitogenesis). While the structure of the evolved networks is variable, dynamics of our evolved networks are usually constrained and present many similar features to actual gene networks, including properties that were not explicitly selected for. In silico evolution can thus be used to predict biological behaviours without a detailed knowledge of the mapping between genotype and phenotype. Copyright © 2014 The Author. Published by Elsevier Ltd.. All rights reserved.
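
    The review stresses the construction of a proper fitness function. As a hedged illustration of what such a function can look like for the biochemical-adaptation example, the sketch below integrates a toy two-gene circuit and scores both the transient response to an input step and the return to baseline; the equations, parameters and weights are illustrative, not the author's implementation.

```python
# Sketch of an "adaptation" fitness for a toy two-gene circuit: reward a
# strong transient response to an input step and a return to the pre-step
# baseline (a common way such fitness functions are composed).

def simulate(params, t_end=200, dt=0.1):
    """Euler integration of a toy adaptive circuit driven by an input step."""
    k1, k2, k3, k4 = params
    a = b = 0.5
    trace = []
    for step in range(int(t_end / dt)):
        inp = 1.0 if step * dt > 50 else 0.5      # input steps up at t = 50
        da = k1 * inp - k2 * b * a                # b degrades the output a
        db = k3 * inp - k4 * b                    # input also produces b
        a, b = a + dt * da, b + dt * db
        trace.append(a)
    return trace

def adaptation_fitness(trace, dt=0.1):
    baseline = trace[int(45 / dt)]                 # value just before the step
    peak = max(trace[int(50 / dt):int(70 / dt)])   # transient response
    final = trace[-1]                              # long-term value
    response = abs(peak - baseline)
    precision = -abs(final - baseline)             # penalise imperfect return
    return response + 5.0 * precision

print(adaptation_fitness(simulate((1.0, 1.0, 1.0, 1.0))))
```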

  18. Ranking in evolving complex networks

    Science.gov (United States)

    Liao, Hao; Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng; Zhou, Ming-Yang

    2017-05-01

    Complex networks have emerged as a simple yet powerful framework to represent and analyze a wide range of complex systems. The problem of ranking the nodes and the edges in complex networks is critical for a broad range of real-world problems because it affects how we access online information and products, how success and talent are evaluated in human activities, and how scarce resources are allocated by companies and policymakers, among others. This calls for a deep understanding of how existing ranking algorithms perform, and which are their possible biases that may impair their effectiveness. Many popular ranking algorithms (such as Google's PageRank) are static in nature and, as a consequence, they exhibit important shortcomings when applied to real networks that rapidly evolve in time. At the same time, recent advances in the understanding and modeling of evolving networks have enabled the development of a wide and diverse range of ranking algorithms that take the temporal dimension into account. The aim of this review is to survey the existing ranking algorithms, both static and time-aware, and their applications to evolving networks. We emphasize both the impact of network evolution on well-established static algorithms and the benefits from including the temporal dimension for tasks such as prediction of network traffic, prediction of future links, and identification of significant nodes.
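
    One simple way to make a PageRank-style score time-aware, in the spirit of the algorithms this review surveys, is to damp each edge's weight by its age before running the usual power iteration. The sketch below does exactly that; the half-life decay rule and the toy edge list are illustrative choices, not a specific algorithm from the review.

```python
# Sketch of a time-aware PageRank-style ranking: edge weights decay with age
# before the usual power iteration (the decay rule here is illustrative).
from collections import defaultdict

def time_aware_rank(edges, now, half_life=30.0, d=0.85, iters=100):
    """edges: list of (source, target, timestamp); returns node -> score."""
    weights = defaultdict(dict)
    nodes = set()
    for src, dst, t in edges:
        w = 0.5 ** ((now - t) / half_life)     # older edges count for less
        weights[src][dst] = weights[src].get(dst, 0.0) + w
        nodes.update((src, dst))
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1.0 - d) / len(nodes) for n in nodes}
        for src, outs in weights.items():
            total = sum(outs.values())
            for dst, w in outs.items():
                new[dst] += d * rank[src] * (w / total)
        # nodes with no outgoing edges spread their rank uniformly
        dangling = sum(rank[n] for n in nodes if n not in weights)
        for n in nodes:
            new[n] += d * dangling / len(nodes)
        rank = new
    return rank

edges = [("a", "b", 100), ("b", "c", 95), ("c", "a", 10), ("a", "c", 20)]
print(time_aware_rank(edges, now=100))
```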

  19. Implementation plan for waste management reengineering at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Berry, J.B.

    1997-10-01

    An intensive reengineering evaluation of the Oak Ridge National Laboratory (ORNL) waste management program was conducted from February to July 1997, resulting in the following vision for ORNL waste management: ORNL Waste Management will become an integrated Waste Management/Generator function that: (1) Treats ORNL as a single generator for expert-based waste characterization and certification purposes; (2) Recognizes Generators, the Department of Energy (DOE), and the Management and Integration (M&I) contractor as equally important customers; (3) Focuses on pollution prevention followed by waste generation, collection, treatment, storage, and disposal operations that reflect more cost-effective commercial approaches; and (4) Incorporates new technology and outsourcing of services where appropriate to provide the lowest cost solutions. A cross-functional Core Team recommended 15 cost-effectiveness improvements that are expected to reduce the fiscal year (FY) 1996 ORNL waste management costs of $75M by $10-$15M annually. These efficiency improvements will be realized by both Research and Waste Management Organizations

  20. Re-engineering of Bacterial Luciferase; For New Aspects of Bioluminescence.

    Science.gov (United States)

    Kim, Da-Som; Choi, Jeong-Ran; Ko, Jeong-Ae; Kim, Kangmin

    2018-01-01

    Bacterial luminescence is the end-product of biochemical reactions catalyzed by the luciferase enzyme. Nowadays, this fascinating phenomenon has been widely used as reporter and/or sensors to detect a variety of biological and environmental processes. The enhancement or diversification of the luciferase activities will increase the versatility of bacterial luminescence. Here, to establish the strategy for luciferase engineering, we summarized the identity and relevant roles of key amino acid residues modulating luciferase in Vibrio harveyi, a model luminous bacterium. The current opinions on crystal structures and the critical amino acid residues involved in the substrate binding sites and unstructured loop have been delineated. Based on these, the potential target residues and/or parameters for enzyme engineering were also suggested in limited scale. In conclusion, even though the accurate knowledge on the bacterial luciferase is yet to be reported, the structure-guided site-directed mutagenesis approaches targeting the regulatory amino acids will provide a useful platform to re-engineer the bacterial luciferase in the future. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  1. Stakeholder Analysis as a Medium to Aid Change in Information System Reengineering Projects

    Directory of Open Access Journals (Sweden)

    Jean Davison

    2004-04-01

    Full Text Available The importance of involving stakeholders within a change process is well recognised, and successfully managed change is equally important. Information systems development and redesign is a form of change activity involving people and social issues, and therefore resistance to change may occur. A stakeholder identification and analysis (SIA) technique has been developed as an enhancement to PISO® (Process Improvement for Strategic Objectives), a method that engages the users of a system in the problem solving and reengineering of their own work-based problem areas. The SIA technique aids the identification and analysis of system stakeholders, and helps view the projected outcome of system changes and their effect on relevant stakeholders, with attention being given to change resistance to ensure smooth negotiation and achieve consensus. A case study is presented here describing the successful implementation of a direct appointment booking system for patients within the National Health Service in the UK, utilising the SIA technique, which resulted in a feeling of empowerment and ownership of the change among those involved.

  2. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.; Huelskamp, Robert M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  3. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  4. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  5. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  6. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  7. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  8. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  9. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  10. The 'E' factor -- evolving endodontics.

    Science.gov (United States)

    Hunter, M J

    2013-03-01

    Endodontics is a constantly developing field, with new instruments, preparation techniques and sealants competing with trusted and traditional approaches to tooth restoration. Thus general dental practitioners must question and understand the significance of these developments before adopting new practices. In view of this, the aim of this article, and the associated presentation at the 2013 British Dental Conference & Exhibition, is to provide an overview of endodontic methods and constantly evolving best practice. The presentation will review current preparation techniques, comparing rotary versus reciprocation, and question current trends in restoration of the endodontically treated tooth.

  11. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    Army Institute for Research in Management Information, Communications, and Computer Sciences (AIRMICS), Software Tools for Software Maintenance (ASQBG-1-89-001), October 1988. The remainder of the record is a partially legible list of maintenance tools, including a COBOL structuring facility, VS COBOL II, F-Scan, and a Fortran static code analyzer.

  12. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The EPIQR method is supported by a multimedia computer program. Several modules help the users of the method to process the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  13. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - the dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  14. Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.

    Science.gov (United States)

    Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth

    2017-03-01

    Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated are interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is: is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model, which gives the rates of biochemical reactions, to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models built using the results of reverse engineering opens up the possibility of harnessing the power pack of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house. C++ code templates are available in the Supplementary material. Contact: liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
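
    The pipeline described here turns population-level reaction rates into per-agent rules. A common conversion, sketched below, turns a first-order rate constant into a per-time-step firing probability for each agent; the rate constant and receptor counts are hypothetical, not parameters from the paper.

```python
# Sketch: converting a population-level rate constant into a per-agent,
# per-time-step rule, as agent-based re-implementations commonly do.
import math
import random

def per_step_probability(rate_per_second, dt):
    """Probability that a first-order event fires for one agent within dt."""
    return 1.0 - math.exp(-rate_per_second * dt)

def step_receptors(n_inactive, k_activation=0.05, dt=0.1):
    """Stochastically activate receptors; returns the number newly activated."""
    p = per_step_probability(k_activation, dt)
    return sum(1 for _ in range(n_inactive) if random.random() < p)

# Hypothetical run: 10,000 inactive receptors, 0.05 /s activation rate.
print(step_receptors(10_000))
```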

  15. Deriving a research agenda for a financial service industry's methodology for carrying out business process re-engineering

    Directory of Open Access Journals (Sweden)

    Kader, I. A.

    2016-05-01

    Full Text Available Why do projects fail? This is a question that has been researched across various project disciplines, including that of Business Process Re-engineering (BPR). This paper introduces a different angle on why BPR projects fail. An analysis of a case study conducted within a financial institution revealed new factors that could influence BPR project outcomes, but that have not been identified in the literature. The Organisation Ring of Influence model was developed to indicate the impact that organisation behaviours and structures had on the outcome of an executed BPR project. This model also helps to highlight which factors were more influential than others.

  16. Business Process Reengineering- Can a Management Strategy improve the Working Environment ?

    DEFF Research Database (Denmark)

    Koch, Christian

    1997-01-01

    Ergonomists need to adopt a more proactive approach to management concepts. It is insufficient to wait until the workplace examples have evolved. In this contribution, BPR is used as an exemplar of a typical contemporary management concept....

  17. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  18. Peripartum hysterectomy: an evolving picture.

    LENUS (Irish Health Repository)

    Turner, Michael J

    2012-02-01

    Peripartum hysterectomy (PH) is one of the obstetric catastrophes. Evidence is emerging that the role of PH in modern obstetrics is evolving. Improving management of postpartum hemorrhage and newer surgical techniques should decrease PH for uterine atony. Rising levels of repeat elective cesarean deliveries should decrease PH following uterine scar rupture in labor. Increasing cesarean rates, however, have led to an increase in the number of PHs for morbidly adherent placenta. In the case of uterine atony or rupture where PH is required, a subtotal PH is often sufficient. In the case of pathological placental localization involving the cervix, however, a total hysterectomy is required. Furthermore, the involvement of other pelvic structures may prospectively make the diagnosis difficult and the surgery challenging. If resources permit, PH for pathological placental localization merits a multidisciplinary approach. Despite advances in clinical practice, it is likely that peripartum hysterectomy will be more challenging for obstetricians in the future.

  19. Infrared spectroscopy of evolved objects

    International Nuclear Information System (INIS)

    Aitken, D.K.; Roche, P.F.

    1984-01-01

    In this review, the authors are concerned with spectroscopic observations of evolved objects made in the wavelength range 1-300μm. Spectroscopic observations can conveniently be divided into studies of narrow lines, bands and broader continua. The vibrational frequencies of molecular groups fall mainly in this spectral region and appear as vibration-rotation bands from the gas phase, and as less structured, but often broader, features from the solid state. Many ionic lines, including recombination lines of abundant species and fine structure lines of astrophysically important ions also appear in this region. The continuum can arise from a number of mechanisms - photospheric emission, radiation from dust, free-free transitions in ionized gas and non-thermal processes. (Auth.)

  20. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes, 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; 6) and a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404

  1. Challenges of reengineering into multi-tenant SaaS applications

    NARCIS (Netherlands)

    Bezemer, C.; Zaidman, A.

    2010-01-01

    Multi-tenancy is a relatively new software architecture principle in the realm of the Software as a Service (SaaS) business model. It allows full use of economies of scale, as multiple customers ("tenants") share the same application and database instance. All the while, the tenants enjoy a
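
    A minimal sketch of the shared-instance idea is given below: one database holds the rows of all tenants, and every query is scoped by a tenant identifier. The schema and helper are illustrative only, not the pattern analysed in the paper.

```python
# Minimal sketch of the shared-schema multi-tenancy pattern: one database
# instance, every row tagged and every query filtered by a tenant id.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, number TEXT, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)", [
    ("acme", "INV-1", 120.0),
    ("acme", "INV-2", 75.5),
    ("globex", "INV-1", 990.0),
])

def invoices_for(tenant_id):
    """All data access goes through a tenant-scoped query."""
    rows = conn.execute(
        "SELECT number, amount FROM invoices WHERE tenant_id = ?", (tenant_id,))
    return rows.fetchall()

print(invoices_for("acme"))     # only acme's rows, though storage is shared
```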

  2. Preparing for the future: a case study of role changing and reengineering. Recognize and seize the new opportunities.

    Science.gov (United States)

    Holland, C A

    1995-01-01

    Today's laboratory managers are caught in the midst of a tumultuous environment as a result of managed care, mergers and acquisitions, and downsizing. We must prepare ourselves through continuous learning, recognize the marketable value of our skills outside of the laboratory, and seize opportunities to expand into new roles. At Arkansas Children's Hospital, the Chief Executive Officer selected the Administrative Director of Laboratories to reengineer the General Pediatric Center. Our goals were to improve quality of care, efficiency, teamwork, clinic visit times, and satisfaction of patients, staff, and physicians. We developed ideal objectives from surveys, brainstorming sessions, and interviews to serve as guidelines for reengineering teams. Teams met the goals and 12 of 15 ideal objectives. Patient flow redesign resulted in different processes for different patient populations and a 35% decrease in the average clinic visit time. Patient, staff, and physician satisfaction improved, as did the clinic's financial status. The project's success confirms that our leadership and analytical skills are transferable from the laboratory to carry us to new heights in other health-care arenas.

  3. Evaluating the Implementation of the Re-Engineering Systems of Primary Care Treatment in the Military (RESPECT-Mil)

    Science.gov (United States)

    Wong, Eunice C.; Jaycox, Lisa H.; Ayer, Lynsay; Batka, Caroline; Harris, Racine; Naftel, Scott; Paddock, Susan M.

    2015-01-01

    Abstract A RAND team conducted an independent implementation evaluation of the Re-Engineering Systems of Primary Care Treatment in the Military (RESPECT-Mil) Program, a system of care designed to screen, assess, and treat posttraumatic stress disorder and depression among active duty service members in the Army's primary care settings. Evaluating the Implementation of the Re-Engineering Systems of Primary Care Treatment in the Military (RESPECT-Mil) presents the results from RAND's assessment of the implementation of RESPECT-Mil in military treatment facilities and makes recommendations to improve the delivery of mental health care in these settings. Analyses were based on existing program data used to monitor fidelity to RESPECT-Mil across the Army's primary care clinics, as well as discussions with key stakeholders. During the time of the evaluation, efforts were under way to implement the Patient Centered Medical Home, and uncertainties remained about the implications for the RESPECT-Mil program. Consideration of this transition was made in designing the evaluation and applying its findings more broadly to the implementation of collaborative care within military primary care settings. PMID:28083389

  4. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  5. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  6. CERN internal communication is evolving

    CERN Multimedia

    2016-01-01

    CERN news will now be regularly updated on the CERN People page (see here).      Dear readers, All over the world, communication is becoming increasingly instantaneous, with news published in real time on websites and social networks. In order to keep pace with these changes, CERN's internal communication is evolving too. From now on, you will be informed of what’s happening at CERN more often via the “CERN people” page, which will frequently be updated with news. The Bulletin is following this trend too: twice a month, we will compile the most important articles published on the CERN site, with a brand-new layout. You will receive an e-mail every two weeks as soon as this new form of the Bulletin is available. If you have interesting news or stories to share, tell us about them through the form at: https://communications.web.cern.ch/got-story-cern-website​. You can also find out about news from CERN in real time...

  7. Economies Evolve by Energy Dispersal

    Directory of Open Access Journals (Sweden)

    Stanley Salthe

    2009-10-01

    Full Text Available Economic activity can be regarded as an evolutionary process governed by the 2nd law of thermodynamics. The universal law, when formulated locally as an equation of motion, reveals that a growing economy develops functional machinery and organizes hierarchically in such a way as to tend to equalize energy density differences within the economy and in respect to the surroundings it is open to. Diverse economic activities result in flows of energy that will preferentially channel along the most steeply descending paths, leveling a non-Euclidean free energy landscape. This principle of 'maximal energy dispersal', equivalent to the maximal rate of entropy production, gives rise to economic laws and regularities. The law of diminishing returns follows from the diminishing free energy while the relation between supply and demand displays a quest for a balance among interdependent energy densities. Economic evolution is dissipative motion where the driving forces and energy flows are inseparable from each other. When there are multiple degrees of freedom, economic growth and decline are inherently impossible to forecast in detail. Namely, trajectories of an evolving economy are non-integrable, i.e. unpredictable in detail because a decision by a player will affect also future decisions of other players. We propose that decision making is ultimately about choosing from various actions those that would reduce most effectively subjectively perceived energy gradients.

  8. Recommendation in evolving online networks

    Science.gov (United States)

    Hu, Xiao; Zeng, An; Shang, Ming-Sheng

    2016-02-01

    Recommender system is an effective tool to find the most relevant information for online users. By analyzing the historical selection records of users, recommender system predicts the most likely future links in the user-item network and accordingly constructs a personalized recommendation list for each user. So far, the recommendation process is mostly investigated in static user-item networks. In this paper, we propose a model which allows us to examine the performance of the state-of-the-art recommendation algorithms in evolving networks. We find that the recommendation accuracy in general decreases with time if the evolution of the online network fully depends on the recommendation. Interestingly, some randomness in users' choice can significantly improve the long-term accuracy of the recommendation algorithm. When a hybrid recommendation algorithm is applied, we find that the optimal parameter gradually shifts towards the diversity-favoring recommendation algorithm, indicating that recommendation diversity is essential to keep a high long-term recommendation accuracy. Finally, we confirm our conclusions by studying the recommendation on networks with the real evolution data.
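
    To make the idea of a hybrid recommendation concrete, the Python sketch below mixes an accuracy-favoring term (item popularity) with a diversity-favoring term (inverse popularity) under a tunable weight. The scoring rule, data, and parameter are illustrative assumptions and are not the specific algorithms evaluated in the paper.

        # Sketch of a hybrid recommendation score on a user-item network:
        # lam = 0 favors popular (accuracy-oriented) items, lam = 1 favors
        # niche (diversity-oriented) items. Illustrative only.
        from collections import Counter

        def hybrid_recommend(history, user, lam=0.5, top_n=3):
            popularity = Counter(i for items in history.values() for i in items)
            candidates = set(popularity) - set(history[user])
            scores = {}
            for item in candidates:
                k = popularity[item]
                scores[item] = (1 - lam) * k + lam * (1.0 / k)
            return sorted(candidates, key=scores.get, reverse=True)[:top_n]

        history = {"u1": ["a", "b"], "u2": ["a", "c"], "u3": ["b", "c", "d"]}
        print(hybrid_recommend(history, "u1", lam=0.3))  # ['c', 'd']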

  9. A Change Impact Analysis to Characterize Evolving Program Behaviors

    Science.gov (United States)

    Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua

    2012-01-01

    Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case-study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, i.e., finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
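
    DiSE and iDiSE rest on program slicing and symbolic execution; a much-simplified way to convey the notion of an "impact set" is forward reachability over a dependence graph starting from the changed program elements. The Python sketch below uses that simplification and is a didactic stand-in, not the DiSE/iDiSE analysis itself; the graph is made up.

        # Simplified change impact: start from changed elements and collect
        # everything reachable along forward dependence edges. Didactic
        # stand-in for slicing-based impact analysis; toy graph.
        from collections import deque

        def impact_set(dependents, changed):
            """dependents maps a node to the nodes that depend on it."""
            impacted, queue = set(changed), deque(changed)
            while queue:
                node = queue.popleft()
                for succ in dependents.get(node, ()):
                    if succ not in impacted:
                        impacted.add(succ)
                        queue.append(succ)
            return impacted

        deps = {"parse": ["validate"], "validate": ["save", "log"]}
        print(sorted(impact_set(deps, {"parse"})))  # ['log', 'parse', 'save', 'validate']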

  10. Interactive scalable condensation of reverse engineered UML class diagrams for software comprehension

    NARCIS (Netherlands)

    Osman, Mohd Hafeez Bin

    2015-01-01

    Software design documentation is a valuable aid in software comprehension. However, keeping the software design up-to-date with evolving source code is challenging and time-consuming. Reverse engineering is one of the options for recovering software architecture from the implementation code.

  11. Idiopathic pulmonary fibrosis: evolving concepts.

    Science.gov (United States)

    Ryu, Jay H; Moua, Teng; Daniels, Craig E; Hartman, Thomas E; Yi, Eunhee S; Utz, James P; Limper, Andrew H

    2014-08-01

    Idiopathic pulmonary fibrosis (IPF) occurs predominantly in middle-aged and older adults and accounts for 20% to 30% of interstitial lung diseases. It is usually progressive, resulting in respiratory failure and death. Diagnostic criteria for IPF have evolved over the years, and IPF is currently defined as a disease characterized by the histopathologic pattern of usual interstitial pneumonia occurring in the absence of an identifiable cause of lung injury. Understanding of the pathogenesis of IPF has shifted away from chronic inflammation and toward dysregulated fibroproliferative repair in response to alveolar epithelial injury. Idiopathic pulmonary fibrosis is likely a heterogeneous disorder caused by various interactions between genetic components and environmental exposures. High-resolution computed tomography can be diagnostic in the presence of typical findings such as bilateral reticular opacities associated with traction bronchiectasis/bronchiolectasis in a predominantly basal and subpleural distribution, along with subpleural honeycombing. In other circumstances, a surgical lung biopsy may be needed. The clinical course of IPF can be unpredictable and may be punctuated by acute deteriorations (acute exacerbation). Although progress continues in unraveling the mechanisms of IPF, effective therapy has remained elusive. Thus, clinicians and patients need to reach informed decisions regarding management options including lung transplant. The findings in this review were based on a literature search of PubMed using the search terms idiopathic pulmonary fibrosis and usual interstitial pneumonia, limited to human studies in the English language published from January 1, 2000, through December 31, 2013, and supplemented by key references published before the year 2000. Copyright © 2014 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  12. Intrinsic Motivation in Open Source Software Development

    DEFF Research Database (Denmark)

    Bitzer, J.; W., Schrettl,; Schröder, Philipp

    2004-01-01

    This paper sheds light on the puzzling evidence that even though open source software (OSS) is a public good, it is developed for free by highly qualified, young and motivated individuals, and evolves at a rapid pace. We show that once OSS development is understood as the private provision...

  13. IDC Re-Engineering Phase 2 Iteration E1 Use Case Realizations version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Benjamin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, John F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Montoya, Mark Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sandoval, Rudy Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-12-01

    This document contains 4 use case realizations generated from the model contained in Rational Software Architect. These use case realizations are the current versions of the realizations originally delivered in Elaboration Iteration 1.

  14. IDC Re-Engineering Phase 2 Iteration E3 Use Case Realizations Version 1.2

    International Nuclear Information System (INIS)

    Hamlet, Benjamin R.; Harris, James M.; Burns, John F.

    2017-01-01

    This document contains 4 use case realizations generated from the model contained in Rational Software Architect. These use case realizations are the current versions of the realizations originally delivered in Elaboration Iteration 3.

  15. IDC Re-Engineering Phase 2 Iteration E2 Use Case Realizations Version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Benjamin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, John F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lober, Randall R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vickers, James Wallace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-12-01

    This document contains 4 use case realizations generated from the model contained in Rational Software Architect. These use case realizations are the current versions of the realizations originally delivered in Elaboration Iteration 2.

  16. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  17. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    software license, software usage, ELA, Software as a Service (SaaS), Software Asset... PaaS: Platform as a Service; SaaS: Software as a Service; SAM: Software Asset Management; SMS: System Management Server; SEWP: Solutions for Enterprise Wide... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service.

  18. Evolving expectations from international organisations

    International Nuclear Information System (INIS)

    Ruiz Lopez, C.

    2008-01-01

    The author stated that implementation of the geological disposal concept requires a strategy that provides national decision makers with sufficient confidence in the level of long-term safety and protection ultimately achieved. The concept of protection against harm has a broader meaning than radiological protection in terms of risk and dose. It includes the protection of the environment and socio-economic interests of communities. She recognised that a number of countries have established regulatory criteria already, and others are now discussing what constitutes a proper regulatory test and suitable time frame for judging the safety of long-term disposal. Each regulatory programme seeks to define reasonable tests of repository performance, using protection criteria and safety approaches consistent with the culture, values and expectations of the citizens of the country concerned. This means that there are differences in how protection and safety are addressed in national approaches to regulation and in the bases used for that. However, as was recognised in the Cordoba Workshop, it would be important to reach a minimum level of consistency and be able to explain the differences. C. Ruiz-Lopez presented an overview of the development of international guidance from ICRP, IAEA and NEA from the Cordoba workshop up to now, and positions of independent National Advisory Bodies. The evolution of these guidelines over time demonstrates an evolving understanding of long-term implications, with the recognition that dose and risk constraints should not be seen as measures of detriment beyond a few hundred years, the emphasis on sound engineering practices, and the introduction of new concepts and approaches which take into account social and economic aspects (e.g. constrained optimisation, BAT, managerial principles). In its new recommendations, ICRP (draft 2006) recognizes, in particular, that decision making processes may depend on other societal concerns and considers

  19. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  20. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Hansen, Jonas; Roetter, Daniel Enrique Lucani; Krigslund, Jeppe

    2015-01-01

    Software defined networking has garnered large attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant...... for decades, while applications and physical links have evolved. This article advocates for the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the- art store and forward Internet paradigm...

  1. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  2. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  3. Evolving Technologies: A View to Tomorrow

    Science.gov (United States)

    Tamarkin, Molly; Rodrigo, Shelley

    2011-01-01

    Technology leaders must participate in strategy creation as well as operational delivery within higher education institutions. The future of higher education--the view to tomorrow--is irrevocably integrated and intertwined with evolving technologies. This article focuses on two specific evolving technologies: (1) alternative IT sourcing; and (2)…

  4. Systematic profiling to monitor and specify the software refactoring process of the LHCb experiment

    CERN Document Server

    Couturier, Ben; Lohn, Stefan B

    2014-01-01

    The LHCb upgrade program implies a significant increase in data processing that will not be matched by additional computing resources. Furthermore, new architectures such as many-core platforms can currently not be fully exploited due to memory and I/O bandwidth limitations. A considerable refactoring effort will therefore be needed to vectorize and parallelize the LHCb software, to minimize hotspots and to reduce the impact of bottlenecks. It is crucial to guide refactoring with a profiling system that gives hints to regions in source-code for possible and necessary re-engineering and which kind of optimization could lead to final success. Software optimization is a sophisticated process where all parts, compiler, operating system, external libraries and chosen hardware play a role. Intended improvements can have different effects on different platforms. To obtain precise information of the general performance, to make profiles comparable, reproducible and to verify the progress of performance in the framewo...

  5. Evolvability Search: Directly Selecting for Evolvability in order to Study and Produce It

    DEFF Research Database (Denmark)

    Mengistu, Henok; Lehman, Joel Anthony; Clune, Jeff

    2016-01-01

    of evolvable digital phenotypes. Although some types of selection in evolutionary computation indirectly encourage evolvability, one unexplored possibility is to directly select for evolvability. To do so, we estimate an individual's future potential for diversity by calculating the behavioral diversity of its...... immediate offspring, and select organisms with increased offspring variation. While the technique is computationally expensive, we hypothesized that direct selection would better encourage evolvability than indirect methods. Experiments in two evolutionary robotics domains confirm this hypothesis: in both...... domains, such Evolvability Search produces solutions with higher evolvability than those produced with Novelty Search or traditional objective-based search algorithms. Further experiments demonstrate that the higher evolvability produced by Evolvability Search in a training environment also generalizes...
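
    The central quantity here is the evolvability estimate itself: the behavioral diversity of an individual's immediate, mutated offspring. The Python sketch below illustrates such an estimate with a made-up genome, mutation operator, and behavior descriptor; it is an illustration of the idea, not the authors' implementation. Selecting parents with larger values of this estimate is what makes the approach computationally expensive, since each evaluation requires a batch of offspring.

        # Sketch of an evolvability estimate: mutate a parent several times,
        # describe each offspring's behavior, and measure how spread out the
        # behaviors are (mean pairwise distance). Genome, mutation operator,
        # and behavior descriptor are made up for illustration.
        import random
        from itertools import combinations

        def mutate(genome, sigma=0.1):
            return [g + random.gauss(0, sigma) for g in genome]

        def behavior(genome):
            # stand-in behavior descriptor: a 2-D point derived from the genome
            return (sum(genome[::2]), sum(genome[1::2]))

        def evolvability(genome, n_offspring=20):
            descs = [behavior(mutate(genome)) for _ in range(n_offspring)]
            dists = [((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
                     for a, b in combinations(descs, 2)]
            return sum(dists) / len(dists)

        print(evolvability([0.0] * 6))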

  6. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  7. A Smart Mobile Lab-on-Chip-Based Medical Diagnostics System Architecture Designed For Evolvability

    DEFF Research Database (Denmark)

    Patou, François; Dimaki, Maria; Svendsen, Winnie Edith

    2015-01-01

    for this work. We introduce a smart-mobile and LoC-based system architecture designed for evolvability. By propagating LoC programmability, instrumentation, and control tools to the highlevel abstraction smart-mobile software layer, our architecture facilitates the realisation of new use...

  8. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  9. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of contents: 1.0 Introduction; 2.0 Responsibilities (2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review); 3.0 Software Announcement and Submission (3.1 STI Software Appropriate for Announcement; 3.2 ...)

  10. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS... 2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  11. The Impact of the Dimensions of the Administrative Decision Support Systems on the Re-engineering of the Systems of the Palestinian universities in Gaza Strip from the Employees’ Perspective

    Directory of Open Access Journals (Sweden)

    Mazen Jehad I. Al Shobaki

    2017-08-01

    Full Text Available This study aimed to identify the impact of the dimensions of the administrative decision support systems on the re-engineering of the systems of the Palestinian universities in Gaza Strip from the standpoint of employees. A descriptive approach was used through which a questionnaire was developed and distributed to a stratified random sample. 500 questionnaires were distributed and 449 were returned, a response rate of 89.8%. The study revealed these results: There was an effect of the potentials (physical, human, technical, and organizational design) available for the decision support systems on the re-engineering of the systems in the Palestinian higher education institutions in Gaza Strip. There were significant differences between the assessment means of the study sample about the impact of decision support systems to re-engineer the systems in the Palestinian higher education institutions in Gaza Strip due to the gender variable in favor of males. There were also differences due to the name of the university variable in favor of the Islamic University, Al Azhar University, and Al Aqsa University, respectively. It was recommended that Palestinian higher education institutions which intend to start re-engineering their systems should be encouraged to start the process immediately. These institutions should also develop the infrastructure of the decision support systems when re-engineering their operations. Keywords: Decision support systems, Re-engineering, Palestinian higher education institutions.

  12. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  13. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  14. The Legacy of Space Shuttle Flight Software

    Science.gov (United States)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  15. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  16. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  17. Designing Process Improvement of Finished Good On Time Release and Performance Indicator Tool in Milk Industry Using Business Process Reengineering Method

    Science.gov (United States)

    Dachyar, M.; Christy, E.

    2014-04-01

    To maintain its position as a major milk producer, the Indonesian milk industry should pursue business development aimed at increasing its customer service level. One strategy is to create on-time release conditions for finished goods that will be distributed to customers and distributors. To achieve this condition, the management information system for finished-goods on-time release needs to be improved. The focus of this research is to conduct business process improvement using Business Process Reengineering (BPR). The key deliverable of this study is a comprehensive business strategy that addresses the root problems. To achieve this goal, evaluation, reengineering, and improvement of the ERP system are conducted. To visualize the predicted implementation, a simulation model is built in Oracle BPM. The output of this simulation showed that the proposed solution could effectively reduce the process lead time and increase the number of quality releases.

  18. Sex determination: ways to evolve a hermaphrodite.

    OpenAIRE

    Braendle , Christian; Félix , Marie-Anne

    2006-01-01

    Most species of the nematode genus Caenorhabditis reproduce through males and females; C. elegans and C. briggsae, however, produce self-fertile hermaphrodites instead of females. These transitions to hermaphroditism evolved convergently through distinct modifications of germline sex determination mechanisms.

  19. Marshal: Maintaining Evolving Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — SIFT proposes to design and develop the Marshal system, a mixed-initiative tool for maintaining task models over the course of evolving missions. Marshal-enabled...

  20. Satcom access in the Evolved Packet Core

    NARCIS (Netherlands)

    Cano Soveri, M.D.; Norp, A.H.J.; Popova, M.P.

    2011-01-01

    Satellite communications (Satcom) networks are increasingly integrating with terrestrial communications networks, namely Next Generation Networks (NGN). In the area of NGN the Evolved Packet Core (EPC) is a new network architecture that can support multiple access technologies. When Satcom is

  1. Satcom access in the evolved packet core

    NARCIS (Netherlands)

    Cano, M.D.; Norp, A.H.J.; Popova, M.P.

    2012-01-01

    Satellite communications (Satcom) networks are increasingly integrating with terrestrial communications networks, namely Next Generation Networks (NGN). In the area of NGN the Evolved Packet Core (EPC) is a new network architecture that can support multiple access technologies. When Satcom is

  2. Evolving effective incremental SAT solvers with GP

    OpenAIRE

    Bader, Mohamed; Poli, R.

    2008-01-01

    Hyper-heuristics can simply be defined as heuristics that choose other heuristics; they are a way of combining existing heuristics to generate new ones. Here, a hyper-heuristic framework is used for evolving effective incremental (Inc*) solvers for SAT. We test the evolved heuristics (IncHH) against other known local search heuristics on a variety of benchmark SAT problems.
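
    As an illustration of the hyper-heuristic idea, searching over heuristics rather than directly over assignments, the Python sketch below evaluates candidate move-selection heuristics for a WalkSAT-style local search (here reduced to a single noise parameter) on a tiny SAT instance; a random search over that parameter stands in for the genetic programming layer described in the record. The instance, parameters, and heuristic encoding are assumptions for illustration.

        # Outer loop: search over heuristics (here just the noise parameter p).
        # Inner loop: apply the heuristic as a WalkSAT-style local search.
        # Random search stands in for GP; instance and settings are made up.
        import random

        CLAUSES = [[1, 2], [-1, 3], [-2, -3], [2, 3], [-1, -2]]  # 1 = x1, -1 = not x1
        NUM_VARS = 3

        def unsatisfied(assign, clauses):
            return [c for c in clauses
                    if not any((lit > 0) == assign[abs(lit)] for lit in c)]

        def local_search(p, clauses, num_vars, max_flips=200):
            assign = {v: random.random() < 0.5 for v in range(1, num_vars + 1)}
            for _ in range(max_flips):
                unsat = unsatisfied(assign, clauses)
                if not unsat:
                    return 0                      # all clauses satisfied
                clause = random.choice(unsat)
                if random.random() < p:           # random-walk move
                    var = abs(random.choice(clause))
                else:                             # greedy move
                    var = min((abs(lit) for lit in clause),
                              key=lambda v: len(unsatisfied(
                                  {**assign, v: not assign[v]}, clauses)))
                assign[var] = not assign[var]
            return len(unsatisfied(assign, clauses))

        def heuristic_cost(p, trials=20):
            return sum(local_search(p, CLAUSES, NUM_VARS) for _ in range(trials))

        best_p = min((random.random() for _ in range(10)), key=heuristic_cost)
        print("best noise parameter found:", round(best_p, 2))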

  3. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    of the development process. The main contributions presented in the thesis have evolved from work with two of the hypotheses: These address the problems of management of evolution, and overview, comprehension and navigation respectively. The first main contribution is the Architectural Software Configuration...... Management Model: A software configuration management model where the abstractions and hierarchy of the logical aspect of software architecture forms the basis for version control and configuration management. The second main contribution is the Geographic Space Architecture Visualisation Model......: A visualisation model where entities in a software architecture are organised geographically in a two-dimensional plane, their visual appearance determined by processing a subset of the data in the entities, and interaction with the project's underlying data performed by direct manipulation of the landscape...

  4. Radiobiology software for educational purpose

    International Nuclear Information System (INIS)

    Pandey, A.K.; Sharma, S.K.; Kumar, R.; Bal, C.S.; Nair, O.; Haresh, K.P.; Julka, P.K.

    2014-01-01

    To understand radionuclide therapy and the basis of radiation protection, it is essential to understand radiobiology. With limited time for classroom teaching and limited time and resources for radiobiology experiments, students do not acquire a firm grasp of the theoretical mathematical models or experimental knowledge of target theory and linear-quadratic models that explain the nature of cell survival curves. We believe that this issue might be addressed with numerical simulation of cell survival curves using mathematical models. Existing classroom teaching can be reoriented to cover the subject using the concepts of modeling, simulation, and virtual experiments. After completion of the lecture, students can practice with the simulation tool at a convenient time. In this study we have developed software that can help students acquire a firm grasp of theoretical and experimental radiobiology. The software was developed using FreeMat ver 4.0, an open-source package. Target theory, the linear-quadratic model, and cell killing based on a Poisson model have been included. The program displays a menu and then flows according to the user's choice; it is executed by typing 'Radiobiology' on the command line interface. Students can interactively investigate the effect of radiation dose on cells. They can practice drawing the cell survival curve from the input and output data and can compare their hand-drawn graphs with the graphs generated automatically by the program. This software is at an early stage of development and will evolve based on user feedback. We feel this simulation software will be quite useful for students entering the nuclear medicine, radiology, and radiotherapy disciplines. (author)
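
    The linear-quadratic model referred to above gives the surviving fraction after a single dose D as S(D) = exp(-(alpha*D + beta*D^2)); plotting log S against D yields the shouldered survival curve students are asked to reproduce. The record's tool was written in FreeMat; the sketch below shows the same formula in Python with made-up parameter values, purely for illustration.

        # Linear-quadratic (LQ) cell survival model:
        #   S(D) = exp(-(alpha*D + beta*D^2))
        # Parameter values below are made up for illustration; the software
        # described in the record was written in FreeMat, not Python.
        import math

        def surviving_fraction(dose_gy, alpha=0.3, beta=0.03):
            return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

        for d in range(0, 11, 2):
            print(f"{d:2d} Gy -> surviving fraction {surviving_fraction(d):.4f}")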

  5. Swarming Robot Design, Construction and Software Implementation

    Science.gov (United States)

    Stolleis, Karl A.

    2014-01-01

    This paper presents an overview of the hardware design, construction, software design, and software implementation of a small, low-cost robot to be used for swarming robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using the Robot Operating System (ROS) and its associated simulation. The eventual use of the robots will be the exploration of evolving behaviors via genetic algorithms, building on the work done at the University of New Mexico Biological Computation Lab.

  6. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    Science.gov (United States)

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  7. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  8. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  9. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  10. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F. ... J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software

  11. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  12. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  13. Software - Naval Oceanography Portal

    Science.gov (United States)

    USNO Earth Orientation Software portal. Earth Orientation Products: GPS-based Products, VLBI-based Products, EO Information Center, Publications about Products. Software: Auxiliary Software, Supporting Software, Earth Orientation Matrix Calculator.

  14. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R. ... Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony I. Software

  15. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  16. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  17. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  18. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  19. Delineating slowly and rapidly evolving fractions of the Drosophila genome.

    Science.gov (United States)

    Keith, Jonathan M; Adams, Peter; Stephen, Stuart; Mattick, John S

    2008-05-01

    Evolutionary conservation is an important indicator of function and a major component of bioinformatic methods to identify non-protein-coding genes. We present a new Bayesian method for segmenting pairwise alignments of eukaryotic genomes while simultaneously classifying segments into slowly and rapidly evolving fractions. We also describe an information criterion similar to the Akaike Information Criterion (AIC) for determining the number of classes. Working with pairwise alignments enables detection of differences in conservation patterns among closely related species. We analyzed three whole-genome and three partial-genome pairwise alignments among eight Drosophila species. Three distinct classes of conservation level were detected. Sequences comprising the most slowly evolving component were consistent across a range of species pairs, and constituted approximately 62-66% of the D. melanogaster genome. Almost all (>90%) of the aligned protein-coding sequence is in this fraction, suggesting much of it (comprising the majority of the Drosophila genome, including approximately 56% of non-protein-coding sequences) is functional. The size and content of the most rapidly evolving component was species dependent, and varied from 1.6% to 4.8%. This fraction is also enriched for protein-coding sequence (while containing significant amounts of non-protein-coding sequence), suggesting it is under positive selection. We also classified segments according to conservation and GC content simultaneously. This analysis identified numerous sub-classes of those identified on the basis of conservation alone, but was nevertheless consistent with that classification. Software, data, and results available at www.maths.qut.edu.au/-keithj/. Genomic segments comprising the conservation classes available in BED format.
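
    For reference, the standard Akaike Information Criterion that the authors' criterion is described as resembling is, for a model with k free parameters and maximized likelihood \hat{L},

        \mathrm{AIC} = 2k - 2\ln\hat{L},

    with smaller values preferred; the abstract notes only that the criterion used to choose the number of conservation classes is similar to, not identical with, the AIC.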

  20. Software Re-Engineering of the Human Factors Analysis and Classification System - (Maintenance Extension) Using Object Oriented Methods in a Microsoft Environment

    Science.gov (United States)

    2001-09-01

    replication) -- all from Visual Basic and VBA. In fact, we found that the SQL Server engine actually had a plethora of options, most formidable of... 2002, the new SQL Server 2000 database engine, and Microsoft Visual Basic.NET. This thesis describes our use of the Spiral Development Model to... versions of Microsoft products? Specifically, the pending release of Microsoft Office 2002, the new SQL Server 2000 database engine, and Microsoft

  1. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  2. Storage system software solutions for high-end user needs

    Science.gov (United States)

    Hogan, Carole B.

    1992-01-01

    Today's high-end storage user is one that requires rapid access to a reliable terabyte-capacity storage system running in a distributed environment. This paper discusses conventional storage system software and concludes that this software, designed for other purposes, cannot meet high-end storage requirements. The paper also reviews the philosophy and design of evolving storage system software. It concludes that this new software, designed with high-end requirements in mind, provides the potential for solving not only the storage needs of today but those of the foreseeable future as well.

  3. Tracking code patterns over multiple software versions with Herodotos

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Lawall, Julia; Muller, Gilles

    2010-01-01

    An important element of understanding a software code base is to identify the repetitive patterns of code it contains and how these evolve over time. Some patterns are useful to the software, and may be modularized. Others are detrimental to the software, such as patterns that represent defects...... pattern occurrences over multiple versions of a software project, independent of other changes in the source files. Guided by a user-provided configuration file, Herodotos builds various graphs showing the evolution of the pattern occurrences and computes some statistics. We have evaluated this approach...
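
    To give a flavor of what tracking a code pattern over versions involves, the Python sketch below counts occurrences of a regular-expression pattern in several already checked-out versions of a source tree. Herodotos additionally correlates individual occurrences across versions independently of unrelated changes, which this sketch does not attempt; the directory layout and pattern are assumptions for illustration.

        # Count occurrences of a code pattern (a regular expression) in several
        # checked-out versions of a project, e.g. versions/v1.0, versions/v1.1.
        # Layout and pattern are illustrative assumptions; Herodotos goes much
        # further and tracks individual occurrences across versions.
        import re
        from pathlib import Path

        PATTERN = re.compile(r"\bstrcpy\s*\(")   # toy "defect" pattern

        def count_occurrences(version_dir):
            total = 0
            for path in Path(version_dir).rglob("*.c"):
                total += len(PATTERN.findall(path.read_text(errors="ignore")))
            return total

        for version in sorted(Path("versions").iterdir()):
            print(version.name, count_occurrences(version))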

  4. Next-generation business intelligence software with Silverlight 3

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  5. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Schramm, Joachim; Dohrmann, Patrick; Kuhrmann, Marco

    2015-01-01

    families of processes and, as part of this, variability operations provide means to modify and reuse pre-defined process assets. Objective: Our goal is to evaluate the feasibility of variability operations to support the development of flexible software process lines. Method: We conducted a longitudinal......Context: Software processes evolve over time and several approaches were proposed to support the required flexibility. Yet, little is known whether these approaches sufficiently support the development of large software processes. A software process line helps to systematically develop and manage...

  6. Dynamic Capabilities and Project Management in Small Software Companies

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob; Nielsen, Peter Axel; Persson, John Stouby

    2017-01-01

    A small software company depends on its capability to adapt to rapid technological and other changes in its environment—its dynamic capabilities. In this paper, we argue that to evolve and maintain its dynamic capabilities a small software company must pay attention to the interaction between...... dynamic capabilities at different levels of the company — particularly between the project management and the company levels. We present a case study of a small software company and show how successful dynamic capabilities at the company level can affect project management in small software companies...

  7. Summary of the International Conference on Software and System Processes

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; O'Connor, Rory V.; Perry, Dewayne E.

    2016-01-01

    The International Conference on Software and Systems Process (ICSSP), continuing the success of Software Process Workshop (SPW), the Software Process Modeling and Simulation Workshop (ProSim) and the International Conference on Software Process (ICSP) conference series, has become the established...... premier event in the field of software and systems engineering processes. It provides a leading forum for the exchange of research outcomes and industrial best-practices in process development from software and systems disciplines. ICSSP 2016 was held in Austin, Texas, from 14-15 May 2016, co......-located with the 38th International Conference on Software Engineering (ICSE). The theme of ICSSP 2016 was studying "Process(es) in Action" by recognizing that the AS-Planned and AS-Practiced processes can be quite different in many ways including their flows, their complexity and the evolving needs of stakeholders...

  8. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research and collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  9. Re-engineering therapeutic antibodies for Alzheimer's disease as blood-brain barrier penetrating bi-specific antibodies.

    Science.gov (United States)

    Pardridge, William M

    2016-12-01

    Therapeutic antibodies are large molecule drugs that do not cross the blood-brain barrier (BBB). Therefore, drug development of therapeutic antibodies for Alzheimer's disease (AD) requires that these molecules be re-engineered to enable BBB delivery. This is possible by joining the therapeutic antibody with a transporter antibody, resulting in the engineering of a BBB-penetrating bispecific antibody (BSA). Areas covered: The manuscript covers transporter antibodies that cross the BBB via receptor-mediated transport systems on the BBB, such as the insulin receptor or transferrin receptor. Furthermore, it highlights therapeutic antibodies for AD that target the Abeta amyloid peptide, beta secretase-1, or the metabotropic glutamate receptor-1. BSAs are comprised of both the transporter antibody and the therapeutic antibody, as well as IgG constant region, which can induce immune tolerance or trigger transport via Fc receptors. Expert opinion: Multiple types of BSA molecular designs have been used to engineer BBB-penetrating BSAs, which differ in valency and spatial orientation of the transporter and therapeutic domains of the BSA. The plasma pharmacokinetics and dosing regimens of BSAs differ from that of conventional therapeutic antibodies. BBB-penetrating BSAs may be engineered in the future as new treatments of AD, as well as other neural disorders.

  10. Conversion of Sox17 into a pluripotency reprogramming factor by reengineering its association with Oct4 on DNA.

    Science.gov (United States)

    Jauch, Ralf; Aksoy, Irene; Hutchins, Andrew Paul; Ng, Calista Keow Leng; Tian, Xian Feng; Chen, Jiaxuan; Palasingam, Paaventhan; Robson, Paul; Stanton, Lawrence W; Kolatkar, Prasanna R

    2011-06-01

    Very few proteins are capable of inducing pluripotent stem (iPS) cells, and their biochemical uniqueness remains unexplained. For example, Sox2 cooperates with other transcription factors to generate iPS cells, but Sox17, despite binding to similar DNA sequences, cannot. Here, we show that Sox2 and Sox17 exhibit inverse heterodimerization preferences with Oct4 on the canonical versus a newly identified compressed sox/oct motif. We can swap the cooperativity profiles of Sox2 and Sox17 by exchanging single amino acids at the Oct4 interaction interface, resulting in Sox2KE and Sox17EK proteins. The reengineered Sox17EK now promotes reprogramming of somatic cells to iPS, whereas Sox2KE has lost this potential. Consistently, when Sox2KE is overexpressed in embryonic stem cells it forces endoderm differentiation similar to wild-type Sox17. Together, we demonstrate that strategic point mutations that facilitate Sox/Oct4 dimer formation on variant DNA motifs lead to a dramatic swap of the bioactivities of Sox2 and Sox17. Copyright © 2011 AlphaMed Press.

  11. Business process re-engineering in the logistics industry: a study of implementation, success factors, and performance

    Science.gov (United States)

    Shen, Chien-wen; Chou, Ching-Chih

    2010-02-01

    As business process re-engineering (BPR) is an important foundation to ensure the success of enterprise systems, this study would like to investigate the relationships among BPR implementation, BPR success factors, and business performance for logistics companies. Our empirical findings show that BPR companies outperformed non-BPR companies, not only on information processing, technology applications, organisational structure, and co-ordination, but also on all of the major logistics operations. Comparing the different perceptions of the success factors for BPR, non-BPR companies place greater emphasis on the importance of employee involvement while BPR companies are more concerned about the influence of risk management. Our findings also suggest that management attitude towards BPR success factors could affect performance with regard to technology applications and logistics operations. Logistics companies which have not yet implemented the BPR approach could refer to our findings to evaluate the advantages of such an undertaking and to take care of those BPR success factors affecting performance before conducting BPR projects.

  12. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
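
    The record does not reproduce the actual cost and quality models. Purely as a toy sketch of the general idea that an expected error discovery rate can be driven by size, criticality and process factors (all names and coefficients below are invented for illustration, not taken from the measurement system described above), such a model might look like:

        # Toy sketch of a size/criticality-driven quality model. Coefficients
        # and criticality factors are invented for illustration and do not
        # come from the measurement system described in the record.
        CRITICALITY_FACTOR = {"low": 1.0, "medium": 0.7, "high": 0.4}  # assumed

        def expected_errors(kloc: float, criticality: str,
                            base_rate: float = 6.0,
                            process_factor: float = 1.0) -> float:
            """Expected errors discovered during verification.

            base_rate      -- assumed nominal errors per KLOC
            process_factor -- >1.0 for a weaker process, <1.0 for a stronger one
            """
            return kloc * base_rate * CRITICALITY_FACTOR[criticality] * process_factor

        for crit in ("low", "medium", "high"):
            print(crit, round(expected_errors(kloc=50, criticality=crit), 1))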

  13. Evolving Intelligent Systems Methodology and Applications

    CERN Document Server

    Angelov, Plamen; Kasabov, Nik

    2010-01-01

    From theory to techniques, the first all-in-one resource for EIS. There is a clear demand in advanced process industries, defense, and Internet and communication (VoIP) applications for intelligent yet adaptive/evolving systems. Evolving Intelligent Systems is the first self- contained volume that covers this newly established concept in its entirety, from a systematic methodology to case studies to industrial applications. Featuring chapters written by leading world experts, it addresses the progress, trends, and major achievements in this emerging research field, with a strong emphasis on th

  14. Interactively Evolving Compositional Sound Synthesis Networks

    DEFF Research Database (Denmark)

    Jónsson, Björn Þór; Hoover, Amy K.; Risi, Sebastian

    2015-01-01

    While the success of electronic music often relies on the uniqueness and quality of selected timbres, many musicians struggle with complicated and expensive equipment and techniques to create their desired sounds. Instead, this paper presents a technique for producing novel timbres that are evolved ... the space of potential sounds that can be generated through such compositional sound synthesis networks (CSSNs). To study the effect of evolution on subjective appreciation, participants in a listener study ranked evolved timbres by personal preference, resulting in preferences skewed toward the first...

  15. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  16. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  17. On the Benefits of Divergent Search for Evolved Representations

    DEFF Research Database (Denmark)

    Lehman, Joel; Risi, Sebastian; Stanley, Kenneth O

    2012-01-01

    Evolved representations in evolutionary computation are often fragile, which can impede representation-dependent mechanisms such as self-adaptation. In contrast, evolved representations in nature are robust, evolvable, and creatively exploit available representational features. This paper provide...

  18. Preface: evolving rotifers, evolving science: Proceedings of the XIV International Rotifer Symposium

    Czech Academy of Sciences Publication Activity Database

    Devetter, Miloslav; Fontaneto, D.; Jersabek, Ch.D.; Welch, D.B.M.; May, L.; Walsh, E.J.

    2017-01-01

    Vol. 796, No. 1 (2017), pp. 1-6, ISSN 0018-8158. Institutional support: RVO:60077344. Keywords: evolving rotifers * 14th International Rotifer Symposium * evolving science. Subject RIV: EG - Zoology. OECD field: Zoology. Impact factor: 2.056, year: 2016

  19. Investigating interoperability of the LSST data management software stack with Astropy

    Science.gov (United States)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. The project has been phenomenally successful in the years since and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
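
    As a small illustration of the kind of core table, unit and coordinate functionality discussed above, the snippet below uses only standard Astropy interfaces; it is plain Astropy, not LSST stack code, and the catalogue values are made up:

        # Plain Astropy (not LSST stack code): unit-aware tables and coordinates
        # of the kind both code bases need. Catalogue values are made up.
        import numpy as np
        from astropy import units as u
        from astropy.coordinates import SkyCoord
        from astropy.table import QTable

        catalog = QTable({
            "ra":   np.array([10.68, 83.82]) * u.deg,
            "dec":  np.array([41.27, -5.39]) * u.deg,
            "flux": np.array([1.2e-3, 4.5e-3]) * u.Jy,
        })

        coords = SkyCoord(ra=catalog["ra"], dec=catalog["dec"])
        print(coords.to_string("hmsdms"))   # sexagesimal positions
        print(catalog["flux"].to(u.mJy))    # unit conversion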

  20. Views on Evolvability of Embedded Systems

    NARCIS (Netherlands)

    Laar, P. van de; Punter, T.

    2011-01-01

    Evolvability, the ability to respond effectively to change, represents a major challenge to today's high-end embedded systems, such as those developed in the medical domain by Philips Healthcare. These systems are typically developed by multi-disciplinary teams, located around the world, and are in

  1. Views on evolvability of embedded systems

    NARCIS (Netherlands)

    Laar, van de P.J.L.J.; Punter, H.T.

    2011-01-01

    Evolvability, the ability to respond effectively to change, represents a major challenge to today's high-end embedded systems, such as those developed in the medical domain by Philips Healthcare. These systems are typically developed by multi-disciplinary teams, located around the world, and are in

  2. Designing Garments to Evolve Over Time

    DEFF Research Database (Denmark)

    Riisberg, Vibeke; Grose, Lynda

    2017-01-01

    This paper proposes a REDO of the current fashion paradigm by investigating how garments might be designed to evolve over time. The purpose is to discuss ways of expanding the traditional role of the designer to include temporal dimensions of creating, producing and using clothes and to suggest ... to a REDO of design education, to further research and the future fashion and textile industry...

  3. EVOLVING AN EMPIRICAL METHODOLOGY FOR DETERMINING ...

    African Journals Online (AJOL)

    The uniqueness of this approach, is that it can be applied to any forest or dynamic feature on the earth, and can enjoy universal application as well. KEY WORDS: Evolving empirical methodology, innovative mathematical model, appropriate interval, remote sensing, forest environment planning and management. Global Jnl ...

  4. Continual Learning through Evolvable Neural Turing Machines

    DEFF Research Database (Denmark)

    Lüders, Benno; Schläger, Mikkel; Risi, Sebastian

    2016-01-01

    Continual learning, i.e. the ability to sequentially learn tasks without catastrophic forgetting of previously learned ones, is an important open challenge in machine learning. In this paper we take a step in this direction by showing that the recently proposed Evolving Neural Turing Machine (ENTM...

  5. Did Language Evolve Like the Vertebrate Eye?

    Science.gov (United States)

    Botha, Rudolf P.

    2002-01-01

    Offers a critical appraisal of the way in which the idea that human language or some of its features evolved like the vertebrate eye by natural selection is articulated in Pinker and Bloom's (1990) selectionist account of language evolution. Argues that this account is less than insightful because it fails to draw some of the conceptual…

  6. Lightweight Methods for Effective Verification of Software Product Lines with Off-the-Shelf Tools

    DEFF Research Database (Denmark)

    Iosif-Lazar, Alexandru Florin

    Certification is the process of assessing the quality of a product and whether it meets a set of requirements and adheres to functional and safety standards. It is often legally required to provide a guarantee of human safety and to make the product available on the market. The certification process relies on objective evidence of quality, which is produced by using qualified and state-of-the-art tools and verification and validation techniques. Software product line (SPL) engineering distributes costs among similar products that are developed simultaneously. However, SPL certification faces major ... SPL reengineering projects that involve complex source code transformations. To facilitate product (re)certification, the transformation must preserve certain qualitative properties such as code structure and semantics—a difficult task due to the complexity of the transformation and because certain...

  7. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  8. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  9. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  10. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  11. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  12. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  13. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimization criteria are identified and defined to help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes...

  14. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    Open Source Software Development. Walt Scacchi, Institute for Software Research, University of California, Irvine, CA 92697-3455, USA. ... it is sometimes appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world) in order...

  15. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  16. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  17. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  18. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  19. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  20. XES Software Communication Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining
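
    XES is an XML-based event log format; this record does not list the extension's specific attribute keys, so the sketch below only shows a generic, simplified XES-style log of software execution events built with the Python standard library (the attribute keys and method names are illustrative, not the official keys of the XES software extensions):

        # Simplified, illustrative XES-style event log for software execution
        # data. Attribute keys and event names are generic examples, not the
        # official keys defined by the XES software extensions in these records.
        import xml.etree.ElementTree as ET
        from datetime import datetime, timezone

        log = ET.Element("log", {"xes.version": "1.0"})
        trace = ET.SubElement(log, "trace")
        ET.SubElement(trace, "string", {"key": "concept:name", "value": "run-1"})

        for method in ("parseConfig", "openSocket", "sendRequest"):
            event = ET.SubElement(trace, "event")
            ET.SubElement(event, "string", {"key": "concept:name", "value": method})
            ET.SubElement(event, "date", {
                "key": "time:timestamp",
                "value": datetime.now(timezone.utc).isoformat(),
            })

        print(ET.tostring(log, encoding="unicode"))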

  1. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering has just been established. Software referenced there includes KUPLOT (data plotting and fitting software) and ILL/TAS (Matlab programs for analyzing triple-axis data).

  2. XES Software Event Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  3. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These...

  4. XES Software Telemetry Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  5. Specifications in software prototyping

    OpenAIRE

    Luqi; Chang, Carl K.; Zhu, Hong

    1998-01-01

    We explore the use of software specifications for software prototyping. This paper describes a process model for software prototyping, and shows how specifications can be used to support such a process via a cellular mobile phone switch example.

  6. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  7. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  8. El CAD en la actividad de reingeniería e ingeniería en los mantenimientos a centrales eléctricas // CAD in the reengineering and engineering activity in maintenance of power plants

    Directory of Open Access Journals (Sweden)

    R. García Ramírez

    2000-07-01

    Full Text Available This work presents experiences gained in engineering and reengineering activities during the maintenance of power plants using CAD (Computer Aided Design). It also describes the strategies followed to automate the reengineering activity on computers and to achieve economic improvements by reducing production costs. Keywords: CAD, reengineering, maintenance, boiler maintenance.

  9. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property that describes the standard a software product should meet. In a software project, quality is a key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  10. Reconfigurable network systems and software-defined networking

    OpenAIRE

    Zilberman, N.; Watts, P. M.; Rotsos, C.; Moore, A. W.

    2015-01-01

    Modern high-speed networks have evolved from relatively static networks to highly adaptive networks facilitating dynamic reconfiguration. This evolution has influenced all levels of network design and management, introducing increased programmability and configuration flexibility. This influence has extended from the lowest level of physical hardware interfaces to the highest level of network management by software. A key representative of this evolution is the emergence of software-defined n...

  11. An Implementation Methodology and Software Tool for an Entropy Based Engineering Model for Evolving Systems

    Science.gov (United States)

    2003-06-01

    ... Data Access (1980s): "What were unit sales in New England last March?" Relational databases (RDBMS), Structured Query Language (SQL) ... macros written in Visual Basic for Applications (VBA). [Figure 20, "Iteration two class diagram": Tech OASIS export script, import filter, data processing method, MS Excel and VBA macro components linked by contains, sends-data-to and executes relationships.] The...

  12. Evolving a Simulation Model Product Line Software Architecture from Heterogeneous Model Representations

    Science.gov (United States)

    2003-09-01

    ... large image database [WWF+98], and the Object-Oriented Method for Interoperability (OOMI) [You02c]. The [You02c] solution is a federated ... scientific programs, or cooperative public-private partnerships. NASA led the effort to collect data from space-based sensors, and continues to this day ... international scientific programs, or cooperative public-private partnerships. However, remotely sensed data from space may be difficult to share based on

  13. Role of performance measures in reengineering U.S. Department of Energy's management of environmental management programs

    International Nuclear Information System (INIS)

    Murthy, K.S.; Harroun, W.P.

    1996-01-01

    The Rocky Flats Environmental Technology Site (Rocky Flats) contributed to America's defense up to the end of the Cold War. It is one of several large US Department of Energy (DOE) nuclear industrial facilities currently undergoing cleanup and closure. The Site was constructed in a sparsely populated area along the Rocky Mountain Foothills, near Denver, in 1952. In the 45 years since, Denver has grown into a major metropolitan area. Over 2 million people live within the Site's 50-mile radius. The Site is directly upstream of water supplies that serve over 300,000 people. As a result, accelerated cleanup, consolidation, reuse, and closure of the Site are the current essentials. The Site has had three management and operating (M and O) contractors since inception. In keeping with the shift in the Site's paradigm from a weapon-parts production program to a cleanup and closure project, DOE changed its contracting philosophy for the Site from the M and O type of contract to a Performance-based Incentive Fee Integrating Management contract (PBIF IMC). DOE selected the Site's fourth contractor as an IMC contractor in July 1995. Kaiser-Hill Company L.L.C. was awarded the contract and assumed IMC responsibility for the Site on July 1, 1995. Integral to this contract is the establishment and implementation of a performance measures system. Performance measures are the bases for incentives that motivate the IMC and the subcontractors working at Rocky Flats. This paper provides an overview of the performance measures system practiced at Rocky Flats from July 1995 to December 1995. Also described are the developments in reengineering during the July 1995 to March 1996 interval.

  14. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  15. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  16. The evolving definition of systemic arterial hypertension.

    Science.gov (United States)

    Ram, C Venkata S; Giles, Thomas D

    2010-05-01

    Systemic hypertension is an important risk factor for premature cardiovascular disease. Hypertension also contributes to excessive morbidity and mortality. Whereas excellent therapeutic options are available to treat hypertension, there is an unsettled issue about the very definition of hypertension. At what level of blood pressure should we treat hypertension? Does the definition of hypertension change in the presence of co-morbid conditions? This article covers in detail the evolving concepts in the diagnosis and management of hypertension.

  17. The evolving epidemiology of inflammatory bowel disease.

    LENUS (Irish Health Repository)

    Shanahan, Fergus

    2009-07-01

    Epidemiologic studies in inflammatory bowel disease (IBD) include assessments of disease burden and evolving patterns of disease presentation. Although it is hoped that sound epidemiologic studies provide aetiological clues, traditional risk factor-based epidemiology has provided limited insights into either Crohn's disease or ulcerative colitis etiopathogenesis. In this update, we will summarize how the changing epidemiology of IBD associated with modernization can be reconciled with current concepts of disease mechanisms and will discuss studies of clinically significant comorbidity in IBD.

  18. Development and the evolvability of human limbs

    OpenAIRE

    Young, Nathan M.; Wagner, Günter P.; Hallgrímsson, Benedikt

    2010-01-01

    The long legs and short arms of humans are distinctive for a primate, the result of selection acting in opposite directions on each limb at different points in our evolutionary history. This mosaic pattern challenges our understanding of the relationship of development and evolvability because limbs are serially homologous and genetic correlations should act as a significant constraint on their independent evolution. Here we test a developmental model of limb covariation in anthropoid primate...

  19. Quantum games on evolving random networks

    OpenAIRE

    Pawela, Łukasz

    2015-01-01

    We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus our study on the two-player games: prisoner's dilemma, snowdrift and stag-hunt games. The obtained results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.

  20. The Evolving Leadership Path of Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Kluse, Michael; Peurrung, Anthony J.; Gracio, Deborah K.

    2012-01-02

    This is a requested book chapter for an internationally authored book on visual analytics and related fields, coordinated by a UK university and to be published by Springer in 2012. This chapter is an overview of the leadership strategies that PNNL's Jim Thomas and other stakeholders used to establish visual analytics as a field, and how those strategies may evolve in the future.

  1. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which the software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  2. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  3. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  4. CMIP6 Data Citation of Evolving Data

    Directory of Open Access Journals (Sweden)

    Martina Stockhause

    2017-06-01

    Full Text Available Data citations have become widely accepted. Technical infrastructures as well as principles and recommendations for data citation are in place but best practices or guidelines for their implementation are not yet available. On the other hand, the scientific climate community requests early citations on evolving data for credit, e.g. for CMIP6 (Coupled Model Intercomparison Project Phase 6. The data citation concept for CMIP6 is presented. The main challenges lie in limited resources, a strict project timeline and the dependency on changes of the data dissemination infrastructure ESGF (Earth System Grid Federation to meet the data citation requirements. Therefore a pragmatic, flexible and extendible approach for the CMIP6 data citation service was developed, consisting of a citation for the full evolving data superset and a data cart approach for citing the concrete used data subset. This two citation approach can be implemented according to the RDA recommendations for evolving data. Because of resource constraints and missing project policies, the implementation of the second part of the citation concept is postponed to CMIP7.

  5. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications

    Science.gov (United States)

    Patou, François; AlZahra’a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E.

    2016-01-01

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods. PMID:27598208
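
    The paper's hardware/software framework is not reproduced in this record; purely as a sketch of the change-absorbing interface idea (class and method names below are hypothetical), the mobile software layer might isolate interchangeable biosensor variants behind a small abstraction such as:

        # Hypothetical sketch of a change-absorbing software interface between a
        # mobile application layer and interchangeable lab-on-chip biosensor
        # modules. Class and method names are invented for illustration.
        from abc import ABC, abstractmethod
        from typing import List

        class BiosensorModule(ABC):
            """Stable interface the mobile layer programs against."""

            @abstractmethod
            def acquire(self, duration_s: float) -> List[float]:
                """Return raw readings for the requested acquisition window."""

        class AmperometricSensor(BiosensorModule):
            def acquire(self, duration_s: float) -> List[float]:
                # Placeholder for a current-based measurement (e.g. a dopamine assay).
                return [0.0] * int(duration_s * 10)

        class ImpedanceSensor(BiosensorModule):
            def acquire(self, duration_s: float) -> List[float]:
                # Placeholder for an impedance-based measurement.
                return [1.0] * int(duration_s * 10)

        def run_assay(sensor: BiosensorModule) -> float:
            readings = sensor.acquire(duration_s=2.0)
            return sum(readings) / len(readings)  # trivial post-analytic step

        for module in (AmperometricSensor(), ImpedanceSensor()):
            print(type(module).__name__, run_assay(module))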

  6. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications.

    Science.gov (United States)

    Patou, François; AlZahra'a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E

    2016-09-03

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.

  7. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications

    Directory of Open Access Journals (Sweden)

    François Patou

    2016-09-01

    Full Text Available The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.

  8. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  9. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  10. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  11. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios. Here in this paper we aim to identify challenges...
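
    As a hypothetical illustration of the kind of programmability discussed above (the controller address and endpoint path below are placeholders, not a specific controller's northbound API), reconfiguring the network from software might look like:

        # Hypothetical sketch of programmatic network reconfiguration through an
        # SDN controller's northbound REST interface. The controller URL and
        # endpoint path are placeholders, not a specific controller's API.
        import json
        import urllib.request

        CONTROLLER = "http://sdn-controller.example:8080"   # placeholder address

        def push_flow_rule(switch_id: str, match: dict, actions: list) -> int:
            """Install a flow rule; returns the HTTP status code."""
            payload = json.dumps({
                "switch": switch_id,
                "match": match,
                "actions": actions,
            }).encode("utf-8")
            req = urllib.request.Request(
                f"{CONTROLLER}/flows",                       # placeholder endpoint
                data=payload,
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                return resp.status

        status = push_flow_rule(
            switch_id="of:0000000000000001",
            match={"in_port": 1, "eth_type": "0x0800"},
            actions=[{"output": 2}],
        )
        print("controller responded with HTTP", status)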

  12. Intrinsic Motivation versus Signaling in Open Source Software Development

    DEFF Research Database (Denmark)

    Bitzer, J; Schrettl, W; Schröder, P

    This paper sheds light on the puzzling fact that even though open source software (OSS) is a public good, it is developed for free by highly qualified, young, motivated individuals, and evolves at a rapid pace. We show that when OSS development is understood as the private provision of a public...

  13. Status of REBUS fuel management software development for RERTR applications

    International Nuclear Information System (INIS)

    Olson, Arne P.

    2000-01-01

    The REBUS-5 burnup code has evolved substantially in order to meet the needs of the ANL RERTR Program. This paper presents a summary of the past changes and improvements in the capabilities of this software, and also identifies future plans. (author)

  14. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in large current software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case-study, a project with many similarities to those currently under way in HEP.

  15. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    Science.gov (United States)

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  16. Revealing evolved massive stars with Spitzer

    Science.gov (United States)

    Gvaramadze, V. V.; Kniazev, A. Y.; Fabrika, S.

    2010-06-01

    Massive evolved stars lose a large fraction of their mass via copious stellar wind or instant outbursts. During certain evolutionary phases, they can be identified by the presence of their circumstellar nebulae. In this paper, we present the results of a search for compact nebulae (reminiscent of circumstellar nebulae around evolved massive stars) using archival 24-μm data obtained with the Multiband Imaging Photometer for Spitzer. We have discovered 115 nebulae, most of which bear a striking resemblance to the circumstellar nebulae associated with luminous blue variables (LBVs) and late WN-type (WNL) Wolf-Rayet (WR) stars in the Milky Way and the Large Magellanic Cloud (LMC). We interpret this similarity as an indication that the central stars of detected nebulae are either LBVs or related evolved massive stars. Our interpretation is supported by follow-up spectroscopy of two dozen of these central stars, most of which turn out to be either candidate LBVs (cLBVs), blue supergiants or WNL stars. We expect that the forthcoming spectroscopy of the remaining objects from our list, accompanied by the spectrophotometric monitoring of the already discovered cLBVs, will further increase the known population of Galactic LBVs. This, in turn, will have profound consequences for better understanding the LBV phenomenon and its role in the transition between hydrogen-burning O stars and helium-burning WR stars. We also report on the detection of an arc-like structure attached to the cLBV HD 326823 and an arc associated with the LBV R99 (HD 269445) in the LMC. Partially based on observations collected at the German-Spanish Astronomical Centre, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC). E-mail: vgvaram@mx.iki.rssi.ru (VVG); akniazev@saao.ac.za (AYK); fabrika@sao.ru (SF)

  17. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  18. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  19. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse and domain engineering toolkits, and bases its information on the IEEE SWEBOK (Software Engineering Body of Knowledge). Slides for instructors are available online.

  20. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  1. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially those that use a file-based system for storing information rather than storing it in a more efficient and safer environment such as databases or Excel. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital which is based in Lagos, the commercial neurological cente...

  2. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  3. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere)

  4. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  5. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts...

  6. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example... that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  7. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  8. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  9. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  10. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  11. The CMS software performance at the start of data taking

    CERN Document Server

    Benelli, Gabriele

    2009-01-01

    The CMS software framework (CMSSW) is a complex project evolving very rapidly as the first LHC colliding beams approach. The computing requirements constrain performance in terms of CPU time, memory footprint and event size on disk to allow for planning and managing the computing infrastructure necessary to handle the needs of the experiment. A performance suite of tools has been developed to track all aspects of code performance, through the software release cycles, allowing for regression and guiding code development for optimization. In this talk, we describe the CMSSW performance suite tools used and present some sample performance results from the release integration process for the CMS software.

  12. Evolving Random Forest for Preference Learning

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Shaker, Noor

    2015-01-01

    This paper introduces a novel approach for pairwise preference learning through a combination of an evolutionary method and random forest. Grammatical evolution is used to describe the structure of the trees in the Random Forest (RF) and to handle the process of evolution. Evolved random forests ...... obtained for predicting pairwise self-reports of users for the three emotional states engagement, frustration and challenge show very promising results that are comparable and in some cases superior to those obtained from state-of-the-art methods....
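
    To make the pairwise preference setting concrete, the sketch below shows the standard reduction of preference learning to binary classification on feature differences, here with an off-the-shelf random forest from scikit-learn. It only illustrates the problem framing: the record's actual contribution is to evolve the forest structure itself with grammatical evolution, which is not reproduced here, and all feature names and data are invented.

    # A hypothetical illustration of the pairwise-preference framing only; the paper
    # evolves the forest structure itself with grammatical evolution, which is not
    # reproduced here. Feature names and data below are invented for the example.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Each play session is described by a feature vector (e.g. gameplay statistics).
    sessions = rng.normal(size=(200, 8))
    # Hidden "true" engagement score, used only to fabricate pairwise labels.
    true_score = sessions @ rng.normal(size=8)

    # Pairwise training data: for a pair (a, b) the label is 1 if a was preferred
    # over b, and the model sees the feature difference a - b.
    pairs_a = rng.integers(0, 200, size=1000)
    pairs_b = rng.integers(0, 200, size=1000)
    X = sessions[pairs_a] - sessions[pairs_b]
    y = (true_score[pairs_a] > true_score[pairs_b]).astype(int)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X[:800], y[:800])
    print("pairwise accuracy:", model.score(X[800:], y[800:]))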

  13. An evolving network model with community structure

    International Nuclear Information System (INIS)

    Li Chunguang; Maini, Philip K

    2005-01-01

    Many social and biological networks consist of communities-groups of nodes within which connections are dense, but between which connections are sparser. Recently, there has been considerable interest in designing algorithms for detecting community structures in real-world complex networks. In this paper, we propose an evolving network model which exhibits community structure. The network model is based on the inner-community preferential attachment and inter-community preferential attachment mechanisms. The degree distributions of this network model are analysed based on a mean-field method. Theoretical results and numerical simulations indicate that this network model has community structure and scale-free properties
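
    As an illustration of the mechanism described above, the following sketch grows a toy network by inner- and inter-community preferential attachment. The number of communities, seed size, and attachment probability are illustrative assumptions, not the parameters analysed in the paper.

    # A minimal sketch of growth by inner- and inter-community preferential
    # attachment, in the spirit of the model described above. Parameters are
    # illustrative assumptions only.
    import random

    random.seed(1)
    M = 4            # number of communities
    P_INNER = 0.8    # probability that a new edge stays inside the node's community

    community = {}   # node -> community id
    degree = {}      # node -> degree
    edges = []
    node_id = 0

    # Seed each community with a small triangle so preferential attachment can start.
    for c in range(M):
        members = list(range(node_id, node_id + 3))
        for n in members:
            community[n] = c
            degree[n] = 2
        edges += [(members[0], members[1]), (members[1], members[2]), (members[0], members[2])]
        node_id += 3

    def preferential_pick(candidates):
        # Pick a node with probability proportional to its degree (+1 avoids zero weights).
        weights = [degree[n] + 1 for n in candidates]
        return random.choices(candidates, weights=weights, k=1)[0]

    # Growth: each new node joins a random community and attaches two edges,
    # each kept inside its own community with probability P_INNER.
    for _ in range(500):
        c = random.randrange(M)
        new, node_id = node_id, node_id + 1
        community[new], degree[new] = c, 0
        for _ in range(2):
            inside = random.random() < P_INNER
            pool = [n for n in degree if n != new and (community[n] == c) == inside]
            target = preferential_pick(pool)
            edges.append((new, target))
            degree[new] += 1
            degree[target] += 1

    inner = sum(1 for a, b in edges if community[a] == community[b])
    print(f"{len(edges)} edges, {inner / len(edges):.2%} within communities")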

  14. Radio Imaging of Envelopes of Evolved Stars

    Science.gov (United States)

    Cotton, Bill

    2018-04-01

    This talk will cover imaging of stellar envelopes using radio VLBI techniques; special attention will be paid to the technical differences between radio and optical/IR interferometry. Radio heterodyne receivers allow a straightforward way to derive spectral cubes and full polarization observations. Milliarcsecond resolution of very bright, i.e. non-thermal, emission of molecular masers in the envelopes of evolved stars can be achieved using VLBI techniques with baselines of thousands of km. Emission from SiO, H2O and OH masers is commonly seen at increasing distance from the photosphere. The very narrow maser lines allow accurate measurements of the velocity field within the emitting region.

  15. Mobile computing acceptance grows as applications evolve.

    Science.gov (United States)

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years.

  16. Evolved Minimal Frustration in Multifunctional Biomolecules.

    Science.gov (United States)

    Röder, Konstantin; Wales, David J

    2018-05-25

    Protein folding is often viewed in terms of a funnelled potential or free energy landscape. A variety of experiments now indicate the existence of multifunnel landscapes, associated with multifunctional biomolecules. Here, we present evidence that these systems have evolved to exhibit the minimal number of funnels required to fulfil their cellular functions, suggesting an extension to the principle of minimum frustration. We find that minimal disruptive mutations result in additional funnels, and the associated structural ensembles become more diverse. The same trends are observed in an atomic cluster. These observations suggest guidelines for rational design of engineered multifunctional biomolecules.

  17. SALT Spectroscopy of Evolved Massive Stars

    Science.gov (United States)

    Kniazev, A. Y.; Gvaramadze, V. V.; Berdnikov, L. N.

    2017-06-01

    Long-slit spectroscopy with the Southern African Large Telescope (SALT) of central stars of mid-infrared nebulae detected with the Spitzer Space Telescope and Wide-Field Infrared Survey Explorer (WISE) led to the discovery of numerous candidate luminous blue variables (cLBVs) and other rare evolved massive stars. With the recent advent of the SALT fiber-fed high-resolution echelle spectrograph (HRS), a new perspective for the study of these interesting objects has appeared. Using the HRS we obtained spectra of a dozen newly identified massive stars. Some results on the recently identified cLBV Hen 3-729 are presented.

  18. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  19. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  20. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications

  1. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
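
    The scan-and-run pattern described above can be illustrated with a short Python sketch; this is not the actual dtest implementation, and the directory pattern, configuration file name, and its "name: command" line format are assumptions made for illustration.

    # A minimal sketch of the scan-and-run pattern described above; not the actual
    # dtest implementation. The directory pattern, configuration file name, and its
    # "name: command" line format are assumptions made for illustration.
    import os
    import subprocess
    from multiprocessing import Pool

    TEST_DIR_PATTERN = "test"    # run tests in directories whose name contains this
    CONFIG_NAME = "TESTS.cfg"    # simple configuration file listing the tests

    def collect_tests(root):
        # Walk root and gather (name, command, directory) tuples from config files.
        tests = []
        for dirpath, _dirnames, filenames in os.walk(root):
            if TEST_DIR_PATTERN in os.path.basename(dirpath) and CONFIG_NAME in filenames:
                with open(os.path.join(dirpath, CONFIG_NAME)) as cfg:
                    for line in cfg:
                        line = line.strip()
                        if line and not line.startswith("#"):
                            name, _, command = line.partition(":")
                            tests.append((name.strip(), command.strip(), dirpath))
        return tests

    def run_test(test):
        # Execute one test command in its own directory and report pass/fail.
        name, command, cwd = test
        result = subprocess.run(command, shell=True, cwd=cwd)
        return name, result.returncode == 0

    if __name__ == "__main__":
        all_tests = collect_tests(".")
        with Pool() as pool:                 # distribute tests over available CPU cores
            for name, passed in pool.map(run_test, all_tests):
                print("PASS" if passed else "FAIL", name)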

  2. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  3. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  4. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. The contribution attempts to identify the reasons for this situation as seen from the software development side. The concepts of correctness and reliability of programmes are explained as they are understood in today's specialist discussion. Measures and methods that are particularly relevant for obtaining fault-free and reliable programmes are discussed. Conclusions are drawn for the user of software so that he is in a position to judge for himself what can justly be expected from the product software compared to other products. (orig./LH) [de

  5. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – Software Innovation Research Lab (SIRL...) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research....

  6. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  7. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Additionally, recommend that DoN invest in software engineering, particularly as it complements commercial industry developments and promotes the application of systems engineering methodology...

  8. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  9. Decentralized Software Architecture

    National Research Council Canada - National Science Library

    Khare, Rohit

    2002-01-01

    .... While the term "decentralization" is familiar from political and economic contexts, it has been applied extensively, if indiscriminately, to describe recent trends in software architecture towards...

  10. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  11. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware makes it easy to adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.

  12. Plagiarism in the Context of Education and Evolving Detection Strategies.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Nurmashev, Bekaidar; Seksenbayev, Bakhytzhan; Trukhachev, Vladimir I; Kostyukova, Elena I; Kitas, George D

    2017-08-01

    Plagiarism may take place in any scientific journal despite currently employed anti-plagiarism tools. The absence of widely acceptable definitions of research misconduct and reliance solely on similarity checks do not allow journal editors to prevent most complex cases of recycling of scientific information and wasteful, or 'predatory,' publishing. This article analyses Scopus-based publication activity and evidence on poor writing, lack of related training, emerging anti-plagiarism strategies, and new forms of massive wasting of resources by publishing largely recycled items, which evade the 'red flags' of similarity checks. In some non-Anglophone countries 'copy-and-paste' writing still plagues pre- and postgraduate education. Poor research management, absence of courses on publication ethics, and limited access to quality sources confound plagiarism as a cross-cultural and multidisciplinary phenomenon. Over the past decade, the advent of anti-plagiarism software checks has helped uncover elementary forms of textual recycling across journals. But such a tool alone proves inefficient for preventing complex forms of plagiarism. Recent mass retractions of plagiarized articles by reputable open-access journals point to critical deficiencies of current anti-plagiarism software that does not recognize manipulative paraphrasing and editing. Manipulative editing also finds its way into predatory journals, ignoring adherence to publication ethics and accommodating nonsense plagiarized items. The evolving preventive strategies are increasingly relying on intelligent (semantic) digital technologies, comprehensively evaluating texts, keywords, graphics, and reference lists. It is the right time to enforce adherence to global editorial guidance and implement a comprehensive anti-plagiarism strategy by helping all stakeholders of scholarly communication. © 2017 The Korean Academy of Medical Sciences.

  13. Reengineering water treatment units for removal of Sr-90, I-129, Tc-99, and uranium from contaminated groundwater at the DOE's Savannah River Site

    International Nuclear Information System (INIS)

    Serkiz, S.M.

    2000-01-01

    The 33 years of active operation of the F- and H-Area Seepage Basins to dispose of liquid low-level radioactive waste at the Department of Energy's Savannah River Site has resulted in the contamination of the groundwater underlying these basins with a wide variety of radionuclides and stable metals. The current Resource Conservation and Recovery Act (RCRA) Part B permit requires the operation of a pump-and-treat system capable of (1) maintaining hydraulic control of a specified contaminated groundwater plume, (2) treatment of the extracted groundwater, and (3) reinjection of the treated water hydraulically upgradient of the basins. Two multimillion-dollar water treatment units (WTUs) were designed and built in 1997 and the basic design consists of (1) reverse osmosis concentration, (2) chemical addition, neutralization, precipitation, polymer addition, flocculation, and clarification of the reverse osmosis concentrate, and (3) final polishing of the clarified water by ion exchange (IX) and sorption. During startup of these units numerous process optimizations were identified and, therefore, the WTUs have been recently reengineered. A systematic approach of: (1) developing a technical baseline through laboratory studies, (2) scale-up and plant testing, (3) plant modification, and (4) system performance monitoring was the basis for reengineering the WTUs. Laboratory experiments were conducted in order to establish a technical baseline for further scale-up/plant testing and system modifications. These studies focused on the following three areas of the process: (1) contaminant removal during chemical addition, neutralization and precipitation, (2) solids separation by flocculation, coagulation, clarification, and filtration, and (3) contaminant polishing of the clarified liquid by IX/sorption. Using standard laboratory-scale jar tests, the influences of pH and Fe concentration on contaminant removal during precipitation/neutralization were evaluated. The results of

  14. Re-engineering the nuclear medicine residency curriculum in the new era of PET imaging: Perspectives on PET education and training in the Philippine context

    International Nuclear Information System (INIS)

    Pascual, T.N.; Santiago, J.F.; Leus, M.

    2007-01-01

    Full text: There is rapid development in PET Imaging and Molecular Nuclear Medicine. In the context of a residency training program, there is a need to incorporate these technologies into the existing Nuclear Medicine Residency Training Curriculum. This will ensure that trainees are constantly updated with the latest innovations in Nuclear Medicine, enabling them to apply this progress in their future practice and thereby achieve the goals and objectives of the curriculum. In residency training programs where no PET facilities exist, this may be remedied by re-engineering the curriculum to include mandatory/elective rotations to other hospitals where the facilities are available. In order to ensure the integrity of the training program in this process of development, a proper sequence of this re-engineering process adhering to educational principles is suggested. These steps reflect the adoption of innovations and developments in the field of Nuclear Medicine essential for nuclear medicine resident learning. Curriculum re-engineering is a scientific and logical method reflecting the processes of addressing changes in the curriculum in order to deliver the desired goals and objectives of the program as dictated by time and innovations. The essential steps in this curriculum re-engineering process, which in this case aim to incorporate and/or update PET Imaging and Molecular Nuclear Imaging education and training, include (1) Curriculum Conceptualization and Legitimatisation, (2) Curriculum Diagnosis, (3) Curriculum Engineering, Designing and Organization, (4) Curriculum Implementation, (5) Curriculum Evaluation, (6) Curriculum Maintenance, and (7) Curriculum Re-engineering. All of these sequences consider the participation of the different stakeholders of the training program. They help develop the curriculum, which seeks to promote student learning according to the dictates of the goals and objectives of the program and technology development. Once the

  15. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate context has been reported, e.g. as a way...

  16. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  17. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  18. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ...

    Science.gov (United States)

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, and it is not taking hundreds of years. If you are interested in these trends, and would like to find out more about where this is all headed and what it means to you, consider downloading Opening Science, edited by Sönke Bartling and Sascha Friesike, subtitled The Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing. In 26 chapters by various authors from a range of disciplines the book explores the developing world of open science, starting from the first scientific revolution and bringing us to the next scientific revolution, sometimes referred to as “Science 2.0”. Some of the articles deal with the impact of the changing landscape of how science is done, looking at the impact of open science on Academia, or journal publishing, or medical research. Many of the articles look at the uses, pitfalls, and impact of specific tools, like microblogging (think Twitter), social networking, and reference management. There is lots of discussion and definition of terms you might use or misuse like “altmetrics” and “impact factor”. Science will probably never be completely open, and Twitter will probably never replace the journal article,

  19. Evolving NASA's Earth Science Data Systems

    Science.gov (United States)

    Walter, J.; Behnke, J.; Murphy, K. J.; Lowe, D. R.

    2013-12-01

    NASA's Earth Science Data and Information System Project (ESDIS) is charged with managing, maintaining, and evolving NASA's Earth Observing System Data and Information System (EOSDIS) and is responsible for processing, archiving, and distributing NASA Earth science data. The system supports a multitude of missions and serves diverse science research and other user communities. Keeping up with ever-changing information technology and figuring out how to leverage those changes across such a large system in order to continuously improve and meet the needs of a diverse user community is a significant challenge. Maintaining and evolving the system architecture and infrastructure is a continuous and multi-layered effort. It requires a balance between a "top down" management paradigm that provides a coherent system view and maintaining the managerial, technological, and functional independence of the individual system elements. This presentation will describe some of the key elements of the current system architecture, some of the strategies and processes we employ to meet these challenges, current and future challenges, and some ideas for meeting those challenges.

  20. The Comet Cometh: Evolving Developmental Systems.

    Science.gov (United States)

    Jaeger, Johannes; Laubichler, Manfred; Callebaut, Werner

    In a recent opinion piece, Denis Duboule has claimed that the increasing shift towards systems biology is driving evolutionary and developmental biology apart, and that a true reunification of these two disciplines within the framework of evolutionary developmental biology (EvoDevo) may easily take another 100 years. He identifies methodological, epistemological, and social differences as causes for this supposed separation. Our article provides a contrasting view. We argue that Duboule's prediction is based on a one-sided understanding of systems biology as a science that is only interested in functional, not evolutionary, aspects of biological processes. Instead, we propose a research program for an evolutionary systems biology, which is based on local exploration of the configuration space in evolving developmental systems. We call this approach-which is based on reverse engineering, simulation, and mathematical analysis-the natural history of configuration space. We discuss a number of illustrative examples that demonstrate the past success of local exploration, as opposed to global mapping, in different biological contexts. We argue that this pragmatic mode of inquiry can be extended and applied to the mathematical analysis of the developmental repertoire and evolutionary potential of evolving developmental mechanisms and that evolutionary systems biology so conceived provides a pragmatic epistemological framework for the EvoDevo synthesis.

  1. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.

    2011-01-01

    Enhancing, modifying or adapting the software to new requirements increases the internal software complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort. However, software refactoring becomes quite a challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach provides computer-aided support for identifying ill-structured packages and offers suggestions to help the software designer balance intra-package cohesion against inter-package coupling. A comparative study is conducted applying three different clustering techniques on different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared the A-KNN technique with the other clustering techniques (viz. single linkage algorithm, complete linkage algorithm and weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.
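
    A minimal sketch of the underlying idea of clustering class dependency profiles into packages is given below. It uses classical complete-linkage hierarchical clustering from SciPy (one of the techniques the study compares against) rather than the authors' adaptive k-nearest neighbour algorithm, and the class names and dependency matrix are invented.

    # A minimal sketch of package-level regrouping by clustering class-dependency
    # profiles. Complete-linkage clustering is used instead of the A-KNN variant
    # described in the record; the class names and dependency matrix are invented.
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, fcluster

    classes = ["Parser", "Lexer", "Ast", "CodeGen", "Optimizer", "Cli", "Config"]

    # deps[i, j] = 1 if classes[i] uses classes[j]; each row is the class's
    # dependency profile, used as its feature vector.
    deps = np.array([
        [0, 1, 1, 0, 0, 0, 0],   # Parser
        [0, 0, 1, 0, 0, 0, 0],   # Lexer
        [0, 0, 0, 0, 0, 0, 0],   # Ast
        [0, 0, 1, 0, 1, 0, 0],   # CodeGen
        [0, 0, 1, 1, 0, 0, 0],   # Optimizer
        [1, 0, 0, 1, 0, 0, 1],   # Cli
        [0, 0, 0, 0, 0, 1, 0],   # Config
    ], dtype=bool)

    # Jaccard distance between dependency profiles: classes that depend on similar
    # things are close together and should end up in the same package.
    distances = pdist(deps, metric="jaccard")
    tree = linkage(distances, method="complete")
    packages = fcluster(tree, t=3, criterion="maxclust")   # ask for 3 packages

    for pkg in sorted(set(packages)):
        members = [c for c, p in zip(classes, packages) if p == pkg]
        print(f"package {pkg}: {', '.join(members)}")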

  2. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  3. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    Therefore, this work defines the concept of the software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix that software needs, which differs from that of other products, for it to succeed in the market.

  4. Sustainability in Software Engineering

    NARCIS (Netherlands)

    Wolfram, N.J.E.; Lago, P.; Osborne, Francesco

    2017-01-01

    The intersection between software engineering research and issues related to sustainability and green IT has been the subject of increasing attention. In spite of that, we observe that sustainability is still not clearly defined, or understood, in the field of software engineering. This lack of

  5. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  6. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.; Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  7. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  8. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  9. Software product family evaluation

    NARCIS (Netherlands)

    van der Linden, F; Bosch, J; Kamsties, E; Kansala, K; Krzanik, L; Obbink, H; VanDerLinden, F

    2004-01-01

    This paper proposes a 4-dimensional software product family engineering evaluation model. The 4 dimensions relate to the software engineering concerns of business, architecture, organisation and process. The evaluation model is meant to be used within organisations to determine the status of their

  10. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  11. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  12. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  13. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  14. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD) the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  15. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  16. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  17. Linux software for large topology optimization problems

    DEFF Research Database (Denmark)

    evolving product, which allows a parallel solution of the PDE, it lacks the important feature that the matrix-generation part of the computations is localized to each processor. This is well-known to be critical for obtaining a useful speedup on a Linux cluster and it motivates the search for a COMSOL......-like package for large topology optimization problems. One candidate for such software is developed for Linux by Sandia Nat’l Lab in the USA being the Sundance system. Sundance also uses a symbolic representation of the PDE and a scalable numerical solution is achieved by employing the underlying Trilinos...

  18. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used world wide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables

  19. Improving the Customer Configuration Update Process by Explicitly Managing Software Knowledge

    NARCIS (Netherlands)

    Slinger, S.R.L.

    2006-01-01

    The implementation and continuous support of a software product at a customer with evolving requirements is a complex task for a product software vendor. There are many customers for the vendor to serve, all of whom might require their own version or variant of the application. Furthermore, the

  20. Crossing the borders and the cultural gaps for educating PhDs in software engineering

    DEFF Research Database (Denmark)

    Knutas, Antti; Seffah, Ahmed; Sørensen, Lene Tolstrup

    2017-01-01

    Software systems have established themselves as the heart of business and everyday living, and as the pillar of the emerging global digital economy. This puts pressure on educational institutions to train people for the continuously evolving software industry, which creates additional demand for new...

  1. Evolving colon injury management: a review.

    Science.gov (United States)

    Greer, Lauren T; Gillern, Suzanne M; Vertrees, Amy E

    2013-02-01

    The colon is the second most commonly injured intra-abdominal organ in penetrating trauma. Management of traumatic colon injuries has evolved significantly over the past 200 years. Traumatic colon injuries can have a wide spectrum of severity, presentation, and management options. There is strong evidence that most non-destructive colon injuries can be successfully managed with primary repair or primary anastomosis. The management of destructive colon injuries remains controversial, with most favoring resection with primary anastomosis and others favoring colonic diversion in specific circumstances. The historical management of traumatic colon injuries and the common mechanisms of injury, demographics, presentation, assessment, diagnosis, management, and complications of these injuries in both civilian and military practice are reviewed. The damage control revolution has added another layer of complexity to management, with continued controversy.

  2. Pulmonary Sporotrichosis: An Evolving Clinical Paradigm.

    Science.gov (United States)

    Aung, Ar K; Spelman, Denis W; Thompson, Philip J

    2015-10-01

    In recent decades, sporotrichosis, caused by the thermally dimorphic fungi of the Sporothrix schenckii complex, has become an emerging infection in many parts of the world. Pulmonary infection with S. schenckii still remains relatively uncommon, possibly due to underrecognition. Pulmonary sporotrichosis presents with distinct clinical and radiological patterns in both immunocompetent and immunocompromised hosts and can often result in significant morbidity and mortality despite treatment. Current understanding regarding S. schenckii biology, epidemiology, immunopathology, clinical diagnostics, and treatment options has been evolving in recent years with the increased availability of molecular sequencing techniques. However, this changing knowledge has not yet been fully translated into a better understanding of the clinical aspects of pulmonary sporotrichosis; as such, current management guidelines remain unsupported by high-level clinical evidence. This article examines recent advances in the knowledge of sporotrichosis and its application to the difficult challenges of managing pulmonary sporotrichosis.

  3. Resiliently evolving supply-demand networks

    Science.gov (United States)

    Rubido, Nicolás; Grebogi, Celso; Baptista, Murilo S.

    2014-01-01

    The ability to design a transport network such that commodities are brought from suppliers to consumers in a steady, optimal, and stable way is of great importance for distribution systems nowadays. In this work, by using the circuit laws of Kirchhoff and Ohm, we provide the exact capacities of the edges that an optimal supply-demand network should have to operate stably under perturbations, i.e., without overloading. The perturbations we consider are the evolution of the connecting topology, the decentralization of hub sources or sinks, and the intermittence of supplier and consumer characteristics. We analyze these conditions and the impact of our results, both on the current United Kingdom power-grid structure and on numerically generated evolving archetypal network topologies.
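
    The exact-capacity idea above amounts to computing steady-state flows with Kirchhoff's and Ohm's laws and then sizing each edge for the flow it must carry. The sketch below (plain Python with NumPy) illustrates that calculation on a toy network; the example graph, unit conductances, and the 20% safety margin are illustrative assumptions, not values from the paper.

        import numpy as np

        # Undirected edges of a toy network (node 0 = supplier, node 3 = consumer).
        edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
        n = 4
        conductance = 1.0  # Ohm's law: edge flow = conductance * potential drop

        # Graph Laplacian assembled from the edge list.
        L = np.zeros((n, n))
        for a, b in edges:
            L[a, a] += conductance
            L[b, b] += conductance
            L[a, b] -= conductance
            L[b, a] -= conductance

        # Kirchhoff's current law: net injection +1 at the supplier, -1 at the consumer.
        injection = np.zeros(n)
        injection[0], injection[3] = 1.0, -1.0

        # Ground node 3 (fix its potential to zero) so the reduced system is solvable.
        keep = [0, 1, 2]
        potential = np.zeros(n)
        potential[keep] = np.linalg.solve(L[np.ix_(keep, keep)], injection[keep])

        # Edge flows from Ohm's law; capacities with 20% headroom avoid overloading.
        for a, b in edges:
            flow = conductance * abs(potential[a] - potential[b])
            print(f"edge {a}-{b}: flow {flow:.3f}, capacity {1.2 * flow:.3f}")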

  4. Development and the evolvability of human limbs.

    Science.gov (United States)

    Young, Nathan M; Wagner, Günter P; Hallgrímsson, Benedikt

    2010-02-23

    The long legs and short arms of humans are distinctive for a primate, the result of selection acting in opposite directions on each limb at different points in our evolutionary history. This mosaic pattern challenges our understanding of the relationship of development and evolvability because limbs are serially homologous and genetic correlations should act as a significant constraint on their independent evolution. Here we test a developmental model of limb covariation in anthropoid primates and demonstrate that both humans and apes exhibit significantly reduced integration between limbs when compared to quadrupedal monkeys. This result indicates that fossil hominins likely escaped constraints on independent limb variation via reductions to genetic pleiotropy in an ape-like last common ancestor (LCA). This critical change in integration among hominoids, which is reflected in macroevolutionary differences in the disparity between limb lengths, facilitated selection for modern human limb proportions and demonstrates how development helps shape evolutionary change.

  5. Evolving spiking networks with variable resistive memories.

    Science.gov (United States)

    Howard, Gerard; Bull, Larry; de Lacy Costello, Ben; Gale, Ella; Adamatzky, Andrew

    2014-01-01

    Neuromorphic computing is a brainlike information processing paradigm that requires adaptive learning mechanisms. A spiking neuro-evolutionary system is used for this purpose; plastic resistive memories are implemented as synapses in spiking neural networks. The evolutionary design process exploits parameter self-adaptation and allows the topology and synaptic weights to be evolved for each network in an autonomous manner. Variable resistive memories are the focus of this research; each synapse has its own conductance profile which modifies the plastic behaviour of the device and may be altered during evolution. These variable resistive networks are evaluated on a noisy robotic dynamic-reward scenario against two static resistive memories and a system containing standard connections only. The results indicate that the extra behavioural degrees of freedom available to the networks incorporating variable resistive memories enable them to outperform the comparative synapse types.
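
    As a concrete illustration of the parameter self-adaptation mentioned above, the following minimal Python sketch evolves a population in which each individual carries its own mutation step size that is itself mutated every generation; the fitness function, population size, and lognormal update rule are generic evolution-strategy choices, not the paper's spiking-network setup.

        import math
        import random

        POP, DIM, GENS = 20, 5, 100
        TAU = 1.0 / math.sqrt(DIM)  # learning rate of the lognormal self-adaptation rule

        def fitness(weights):
            # Placeholder objective: minimise the distance from the all-ones vector.
            return -sum((w - 1.0) ** 2 for w in weights)

        # Each individual is (solution vector, its own mutation step size sigma).
        population = [([random.uniform(-1.0, 1.0) for _ in range(DIM)], 0.5)
                      for _ in range(POP)]

        for _ in range(GENS):
            offspring = []
            for weights, sigma in population:
                # Self-adaptation: mutate the step size first (lognormal rule) ...
                child_sigma = sigma * math.exp(TAU * random.gauss(0.0, 1.0))
                # ... then mutate the solution using the new step size.
                child = [w + child_sigma * random.gauss(0.0, 1.0) for w in weights]
                offspring.append((child, child_sigma))
            # (mu + lambda) survivor selection keeps the best individuals.
            population = sorted(population + offspring,
                                key=lambda ind: fitness(ind[0]), reverse=True)[:POP]

        best_weights, best_sigma = population[0]
        print("best fitness:", round(fitness(best_weights), 4),
              "final sigma:", round(best_sigma, 4))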

  6. Life cycle planning: An evolving concept

    International Nuclear Information System (INIS)

    Moore, P.J.R.; Gorman, I.G.

    1994-01-01

    Life-cycle planning is an evolving concept in the management of oil and gas projects. BHP Petroleum now interprets this idea to include all development planning from discovery and field appraisal to final abandonment, covering safety, environmental, technical, plant, regulatory, and staffing issues. This article describes, in the context of the Timor Sea, how, despite initial successes and continuing facilities upgrades, BHPP came to perceive that current operations could become the victim of early development successes, particularly in the areas of corrosion and maintenance. The search for analogies elsewhere led to the UK North Sea, including the experiences of Britoil and BP, both of which performed detailed Life of Field studies in the late eighties. These materials have been used to construct a format and content for total life-cycle plans in general, as well as the social changes required to ensure their successful application in Timor Sea operations and deployment throughout Australia.

  7. Argentina and Brazil: an evolving nuclear relationship

    International Nuclear Information System (INIS)

    Redick, J.R.

    1990-01-01

    Argentina and Brazil have Latin America's most advanced nuclear research and power programs. Both nations reject the Non-Proliferation Treaty (NPT) and have not formally embraced the Tlatelolco Treaty creating a regional nuclear-weapon-free zone. Disturbing ambiguities persist regarding certain indigenous nuclear facilities and growing nuclear submarine and missile capabilities. For these and other reasons, the two nations are widely considered potential nuclear weapon states. However, both nations have been active supporters of the International Atomic Energy Agency (IAEA) and have, in recent years, assumed a generally responsible position in regard to their own nuclear export activities (requiring IAEA safeguards). Most important, however, has been the advent of bilateral nuclear cooperation. This paper considers the evolving nuclear relationship in the context of recent and dramatic political change in Argentina and Brazil. It discusses current political and nuclear developments and the prospects for maintaining and expanding present bilateral cooperation into an effective non-proliferation arrangement. (author)

  8. A software perspective of environmental data quality

    International Nuclear Information System (INIS)

    Banerjee, B.

    1995-01-01

    Because of the large amount of complex data in environmental projects, particularly large decontamination and decommissioning projects, the quality of the data has a profound impact on the success and cost of the mission. In every phase of the life cycle of the project, including regulatory intervention and legal proceedings, maintaining the quality of data and presenting data in a timely and meaningful manner are critical. In this paper, a systemic view of data quality management from a software engineering perspective is presented. A method of evaluation evolves from this view. This method complements the principles of the data quality objective. When graded adequately, the method of evaluation establishes a paradigm for ensuring data quality for new and renewed projects. This paper also demonstrates that incorporating good practices of software engineering into the data management process leads to continuous improvement of data quality

  9. The genotype-phenotype map of an evolving digital organism

    OpenAIRE

    Fortuna, Miguel A.; Zaman, Luis; Ofria, Charles; Wagner, Andreas

    2017-01-01

    To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms fr...

  10. Improving Software Sustainability: Lessons Learned from Profiles in Science.

    Science.gov (United States)

    Gallagher, Marie E

    2013-01-01

    The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment

  11. Towards an Ontology of Software

    OpenAIRE

    Wang, Xiaowei

    2016-01-01

    Software is permeating every aspect of our personal and social life. And yet, the cluster of concepts around the notion of software, such as the notions of a software product, software requirements, software specifications, are still poorly understood with no consensus on the horizon. For many, software is just code, something intangible best defined in contrast with hardware, but it is not particularly illuminating. This erroneous notion, software is just code, presents both in the ontology ...

  12. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  13. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  14. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  15. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways; here, a range of review conditions and software solutions are considered, for example facilitating contemporaneous collaboration across time and geographical space, in-built bias assessment tools, and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription, with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.
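
    The 'surveillance' step described above is essentially a text-classification task: score each newly published record for relevance to a review. The following minimal Python sketch shows one generic way to do this with a TF-IDF and logistic-regression pipeline; it is not the EPPI-Reviewer implementation, and the tiny training set is invented for illustration.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Invented training data: abstracts already screened as relevant (1) or not (0).
        labelled_abstracts = [
            ("randomised trial of drug X for hypertension", 1),
            ("cohort study of blood pressure lowering therapy", 1),
            ("survey of software testing practices in industry", 0),
            ("case report of a rare dermatological condition", 0),
        ]
        texts, labels = zip(*labelled_abstracts)

        # TF-IDF features feeding a logistic-regression relevance classifier.
        screener = make_pipeline(TfidfVectorizer(), LogisticRegression())
        screener.fit(texts, labels)

        # Score newly published records; high scores are queued for human screening.
        new_papers = ["trial comparing antihypertensive drugs",
                      "compiler optimisation benchmarks"]
        for title, prob in zip(new_papers, screener.predict_proba(new_papers)[:, 1]):
            print(f"relevance {prob:.2f}: {title}")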

  16. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  17. Software industrial flexible

    OpenAIRE

    Díaz Araya, Daniel; Muñoz, Leandro; Sirerol, Daniel; Oviedo, Sandra; Ibáñez, Francisco S.

    2012-01-01

    This work aims to investigate and propose techniques, methods, and technologies that enable the development of flexible software in industrial environments. The objective is to generate methods and techniques that facilitate the development of flexible software in industrial environments. The research areas are production scheduling systems, software generation for open hardware platforms, and innovation.

  18. Thyroid uptake software

    International Nuclear Information System (INIS)

    Alonso, Dolores; Arista, Eduardo

    2003-01-01

    The DETEC-PC software was developed as a complement to a measurement system (hardware) able to perform Iodine Thyroid Uptake studies. The software was designed according to the principles of object-oriented programming, using the C++ language. The software automatically sets the spectrometric measurement parameters and, besides patient measurement, also performs statistical analysis of a batch of samples. It includes a PARADOX database with all information on measured patients and a help system covering the system options and the medical concepts related to the thyroid uptake study.

  19. Criteria for software modularization

    Science.gov (United States)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.

  20. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  1. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  2. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth...... rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  3. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and has studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. (Auth.)

  4. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more!Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  5. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  6. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  7. Sobre software libre

    OpenAIRE

    Matellán Olivera, Vicente; González Barahona, Jesús; Heras Quirós, Pedro de las; Robles Martínez, Gregorio

    2004-01-01

    220 p. "Sobre software libre" brings together almost thirty essays on highly topical issues related to free software (of which Linux is the best-known exponent). The essays the reader will find are divided into thematic blocks that range from intellectual property and the economic and social questions raised by this model to its use in education and public administration, including one that reviews the history of free software in...

  8. The NOvA software testing framework

    International Nuclear Information System (INIS)

    Tamsett, M; Group, C

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of Python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software and thus ensure that data is available for physics analysis in a timely and robust manner. (paper)
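
    The wrap-and-monitor pattern described above can be sketched in a few lines of Python: launch one tier of the (here hypothetical) underlying framework, time it, capture its output, and emit a machine-readable record that a web front end could render. The command names and JSON report format below are illustrative assumptions, not NOvA's actual interfaces.

        import json
        import subprocess
        import time

        def run_tier(name, command, timeout=600):
            """Run one software tier, returning a small machine-readable result record."""
            start = time.time()
            try:
                proc = subprocess.run(command, capture_output=True, text=True,
                                      timeout=timeout)
                status = "pass" if proc.returncode == 0 else "fail"
                log_tail = proc.stdout[-2000:]  # keep only the end of the log
            except subprocess.TimeoutExpired:
                status, log_tail = "timeout", ""
            return {"tier": name, "status": status,
                    "wall_time_s": round(time.time() - start, 1),
                    "log_tail": log_tail}

        if __name__ == "__main__":
            # Hypothetical commands standing in for the real reconstruction chain.
            tiers = [("calibration", ["echo", "calibrated 100 events"]),
                     ("reconstruction", ["echo", "reconstructed 100 events"])]
            report = [run_tier(name, cmd) for name, cmd in tiers]
            print(json.dumps(report, indent=2))  # a web front end would consume this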

  9. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, which is an increasingly important subject as the world is globalizing and digitalizing. The special nature of software challenges intellectual property rights. The current protection of software is based on copyright protection, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  10. Center for Adaptive Optics | Software

    Science.gov (United States)

    The Center for Adaptive Optics acts as a clearing house for distributing software to institutes; it gives specialists in adaptive optics a place to distribute their software. All software is shared on an "as-is" basis and users should consult the software authors with any

  11. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality of the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to the SMART MMIS software, in terms of software testing organization, documentation, procedure, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied.

  12. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  13. Petroleum software profiles

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including, among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd.

  14. Next Generation Software Development

    National Research Council Canada - National Science Library

    Manna, Zohar

    2005-01-01

    Under this grant we have studied the development of a scientifically sound basis for software development that builds on widely used pragmatic methods but is firmly grounded in well-established formal...

  15. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management...... of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand...... the management of distributed software projects, but also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  16. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  17. Software didattico: integrazione scolastica

    Directory of Open Access Journals (Sweden)

    Lucia Ferlino

    1996-01-01

    Full Text Available Discussion of the use of educational software for school integration, which requires awareness of the software's potential effectiveness and recognition that effectiveness also lies in the choice of functional products.

  18. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  19. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  20. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more focused, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software for nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after the presentations of the scientific and technical papers. The papers are reproduced in full in the following sections.